WorldWideScience

Sample records for standard analysis techniques

  1. Standard techniques for presentation and analysis of crater size-frequency data

    Science.gov (United States)

    1978-01-01

    In September 1977, a crater studies workshop was held for the purpose of developing standard data analysis and presentation techniques. This report contains the unanimous recommendations of the participants. This first meeting considered primarily crater size-frequency data. Future meetings will treat other aspects of crater studies such as morphologies.

  2. Cost minimisation analysis of using acellular dermal matrix (Strattice™) for breast reconstruction compared with standard techniques.

    Science.gov (United States)

    Johnson, R K; Wright, C K; Gandhi, A; Charny, M C; Barr, L

    2013-03-01

    We performed a cost analysis (using UK 2011/12 NHS tariffs as a proxy for cost) comparing immediate breast reconstruction using the new one-stage technique of acellular dermal matrix (Strattice™) with implant versus the standard alternative techniques of tissue expander (TE)/implant as a two-stage procedure and latissimus dorsi (LD) flap reconstruction. Clinical report data were collected for operative time, length of stay, outpatient procedures, and number of elective and emergency admissions in our first consecutive 24 patients undergoing one-stage Strattice reconstruction. Total cost to the NHS based on tariff, assuming top-up payments to cover Strattice acquisition costs, was assessed and compared to the two historical control groups matched on key variables. Eleven patients having unilateral Strattice reconstruction were compared to 10 having TE/implant reconstruction and 10 having LD flap and implant reconstruction. Thirteen patients having bilateral Strattice reconstruction were compared to 12 having bilateral TE/implant reconstruction. Total costs were: unilateral Strattice, £3685; unilateral TE, £4985; unilateral LD and implant, £6321; bilateral TE, £5478; and bilateral Strattice, £6771. The cost analysis shows a financial advantage of using acellular dermal matrix (Strattice) in unilateral breast reconstruction versus alternative procedures. The reimbursement system in England (Payment by Results) is based on disease-related groups similar to that of many countries across Europe and tariffs are based on reported hospital costs, making this analysis of relevance in other countries. Copyright © 2013 Elsevier Ltd. All rights reserved.

  3. A Critical Appraisal of Techniques, Software Packages, and Standards for Quantitative Proteomic Analysis

    Science.gov (United States)

    Lawless, Craig; Hubbard, Simon J.; Fan, Jun; Bessant, Conrad; Hermjakob, Henning; Jones, Andrew R.

    2012-01-01

    Abstract New methods for performing quantitative proteome analyses based on differential labeling protocols or label-free techniques are reported in the literature on an almost monthly basis. In parallel, a correspondingly vast number of software tools for the analysis of quantitative proteomics data has also been described in the literature and produced by private companies. In this article we focus on the review of some of the most popular techniques in the field and present a critical appraisal of several software packages available to process and analyze the data produced. We also describe the importance of community standards to support the wide range of software, which may assist researchers in the analysis of data using different platforms and protocols. It is intended that this review will serve bench scientists both as a useful reference and a guide to the selection and use of different pipelines to perform quantitative proteomics data analysis. We have produced a web-based tool (http://www.proteosuite.org/?q=other_resources) to help researchers find appropriate software for their local instrumentation, available file formats, and quantitative methodology. PMID:22804616

  4. Standardization of proton-induced x-ray emission technique for analysis of thick samples

    Science.gov (United States)

    Ali, Shad; Zeb, Johar; Ahad, Abdul; Ahmad, Ishfaq; Haneef, M.; Akbar, Jehan

    2015-09-01

    This paper describes the standardization of the proton-induced x-ray emission (PIXE) technique for finding the elemental composition of thick samples. For the standardization, three different samples of standard reference materials (SRMs) were analyzed using this technique and the data were compared with the already known data of these certified SRMs. These samples were selected in order to cover the maximum range of elements in the periodic table. Each sample was irradiated for three different values of collected beam charges at three different times. A proton beam of 2.57 MeV obtained using 5UDH-II Pelletron accelerator was used for excitation of x-rays from the sample. The acquired experimental data were analyzed using the GUPIXWIN software. The results show that the SRM data and the data obtained using the PIXE technique are in good agreement.

  5. Standard techniques for presentation and analysis of crater size-frequency data. [on moon and planetary surfaces

    Science.gov (United States)

    Arvidson, R.; Boyce, J.; Chapman, C.; Cintala, M.; Fulchignoni, M.; Moore, H.; Soderblom, L.; Neukum, G.; Schultz, P.; Strom, R.

    1979-01-01

    In September 1977 a crater studies workshop was held for the purpose of developing standardized data analysis and presentation techniques. The present report contains the unanimous recommendations of the participants. Recommendations are devoted primarily to crater size-frequency data and refer to cumulative and relative size-frequency distribution plots and to morphological analysis.

  6. Extending and simplifying the standard Köhn-pipette technique for grain size analysis

    Science.gov (United States)

    Hirsch, Florian; Raab, Thomas

    2014-05-01

    Grain size distribution is a fundamental parameter for characterizing the physical properties of soils and sediments. Manifold approaches exist, and according to DIN ISO 11277 soil texture is analyzed by default with the combined pipette sieving and sedimentation method developed by Köhn. With this standard technique, subfractions of sand and silt as well as the total clay content can be determined, but the differentiation of clay subfractions is impossible. As the differentiation of the clay subfractions yields relevant information about pedogenesis, we present a protocol based on standard techniques of granulometry with easy-to-handle and low-cost equipment. The protocol was tested on a set of soil samples covering a range of grain size distributions. We used a three-step procedure for obtaining the grain size distribution of soil samples, taking into account the subfractions of sand, silt and clay by a combination of sedimentation, centrifugal sedimentation and wet sieving. The pipetting was done with a piston-stroke pipette instead of the complex pipette referred to in DIN ISO 11277. Our first results show that the applied protocol is less prone to operating errors than the standard Köhn-pipette technique. Furthermore, even a less experienced laboratory worker can handle 10 samples in one day. Analyses of a Luvisol profile, sampled at high spatial resolution, showed that the lessivation process is characterized by translocation of fine clay from the eluvial horizon to the illuvial horizon. Our protocol is therefore a fast alternative for detecting lessivation, which is otherwise only clearly identifiable by micromorphological investigation and not by the standard Köhn-pipette technique.
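
    The sedimentation steps in pipette-type protocols are governed by Stokes' law, which fixes the sampling depth and time for each grain-size fraction. The short sketch below is a generic illustration with assumed values (water at 20 °C, quartz particle density), not parameters taken from this protocol; it estimates the settling time for the <2 μm clay fraction at a 10 cm sampling depth.

      import math

      def stokes_settling_time(d, depth, rho_s=2650.0, rho_f=998.0, mu=1.002e-3, g=9.81):
          """Time (s) for a particle of diameter d (m) to settle through `depth` (m)."""
          v = g * (rho_s - rho_f) * d**2 / (18.0 * mu)   # Stokes settling velocity, m/s
          return depth / v

      t = stokes_settling_time(d=2e-6, depth=0.10)
      print(f"{t / 3600:.1f} h")   # roughly 7-8 hours for the <2 um fraction at 10 cm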

  7. Phytochemical analysis and standardization of Strychnos nux-vomica extract through HPTLC techniques

    Directory of Open Access Journals (Sweden)

    Dinesh Kumar Patel

    2012-05-01

    Objective: The objective is to develop a novel qualitative and quantitative method by which different phytoconstituents of Strychnos nux-vomica L. can be determined. Methods: To profile the phytoconstituents of Strychnos nux-vomica, a hydroalcoholic extract of Strychnos nux-vomica was subjected to preliminary phytochemical analysis, antimicrobial testing against certain pathogenic microorganisms, a solubility test, loss on drying and pH measurement. The extract was also subjected to quantitative analysis including total phenol, flavonoid and heavy metal analysis. Quantitative analysis was performed through HPTLC methods using strychnine and brucine as standard markers. Results: Phytochemical analysis revealed the presence of alkaloids, carbohydrates, tannins, steroids, triterpenoids and glycosides in the extract. Total flavonoid and phenol contents of the Strychnos nux-vomica L. extract were found to be 0.40% and 0.43%, respectively. Results showed that the levels of heavy metals (lead, arsenic, mercury and cadmium) complied with the standard limits. Total bacterial count, yeast and mould contents were found to be within the limits, whereas E. coli and Salmonella were found to be absent in the extract. Contents of strychnine and brucine were found to be 4.75% and 3.91%, respectively. Conclusions: These studies provide valuable information for correct identification and selection of the drug among various adulterants. In the future this study will be helpful for the quantitative analysis as well as standardization of Strychnos nux-vomica L.

  8. Comparative analysis of photograph-based clinical goniometry to standard techniques.

    Science.gov (United States)

    Crasto, Jared A; Sayari, Arash J; Gray, Robert R-L; Askari, Morad

    2015-06-01

    Assessment of joint range of motion (ROM) is an accepted evaluation of disability as well as an indicator of recovery from musculoskeletal injuries. Many goniometric techniques have been described to measure ROM, with variable validity due to inter-rater reliability. In this report, we assessed the validity of photograph-based goniometry in measurement of ROM and its inter-rater reliability and compared it to two other commonly used techniques. We examined three methods for measuring ROM in the upper extremity: manual goniometry (MG), visual estimations (VE), and photograph-based goniometry (PBG). Eight motions of the upper extremity were measured in 69 participants at an academic medical center. We found visual estimations and photograph-based goniometry to be clinically valid when tested against manual goniometry (r avg. 0.58, range 0.28 to 0.87). Photograph-based measurements afforded a satisfactory degree of inter-rater reliability (ICC avg. 0.77, range 0.28 to 0.96). Our study supports photograph-based goniometry as the new standard goniometric technique, as it has been clinically validated, is performed with greater consistency and better inter-rater reliability when compared with manual goniometry. It also allows for better documentation of measurements and potential incorporation into medical records in direct contrast to visual estimation.

  9. Remnant preservation in anterior cruciate ligament reconstruction versus standard techniques: a meta-analysis of randomized controlled trials.

    Science.gov (United States)

    Ma, Tianjun; Zeng, Chun; Pan, Jianying; Zhao, Chang; Fang, Hang; Cai, Daozhang

    2017-01-01

    Preserving the remnant during anterior cruciate ligament (ACL) reconstruction is considered beneficial for graft healing, but it might increase technical difficulty and complications. This study compared outcomes of remnant-preserving ACL reconstruction versus the standard procedure with debridement of the remnant. We searched PubMed, EMBASE and the Cochrane Library for randomized controlled trials comparing the outcomes of ACL reconstruction with and without remnant preservation. The risk of bias was assessed in accordance with the Cochrane Collaboration's risk of bias tool. Meta-analysis was performed to compare results. Six randomized controlled trials with 346 patients were included. Statistically significant differences in favor of remnant preservation were observed for Lysholm score, arthrometer measurements, and tibial tunnel enlargement. There was no significant difference between the remnant-preserving technique and the standard procedure with respect to the IKDC (International Knee Documentation Committee) grade, IKDC score, Lachman test, pivot-shift test, range of motion (ROM), and the incidence of cyclops lesion. This meta-analysis of randomized controlled trials showed that ACL reconstruction with remnant preservation did not provide superior clinical outcomes compared with the standard procedure.

  10. Evaluation of a standard breast tangent technique: a dose-volume analysis of tangential irradiation using three-dimensional tools

    International Nuclear Information System (INIS)

    Krasin, Matthew; McCall, Anne; King, Stephanie; Olson, Mary; Emami, Bahman

    2000-01-01

    Purpose: A thorough dose-volume analysis of a standard tangential radiation technique has not been published. We evaluated the adequacy of a tangential radiation technique in delivering dose to the breast and regional lymphatics, as well as dose delivered to underlying critical structures. Methods and Materials: Treatment plans of 25 consecutive women with breast cancer undergoing lumpectomy and adjuvant breast radiotherapy were studied. Patients underwent two-dimensional (2D) treatment planning followed by treatment with standard breast tangents. These 2D plans were reconstructed without modification on our three-dimensional treatment planning system and analyzed with regard to dose-volume parameters. Results: Adequate coverage of the breast (defined as 95% of the target receiving at least 95% of the prescribed dose) was achieved in 16 of 25 patients, with all patients having at least 85% of the breast volume treated to 95% of the prescribed dose. Only 1 patient (4%) had adequate coverage of the Level I axilla, and no patient had adequate coverage of the Level II axilla, Level III axilla, or the internal mammary lymph nodes. Conclusion: Three-dimensional treatment planning is superior in quantification of the dose received by the breast, regional lymphatics, and critical structures. The standard breast tangent technique delivers an adequate dose to the breast but does not therapeutically treat the regional lymph nodes in the majority of patients. If coverage of the axilla or internal mammary lymph nodes is desired, alternate beam arrangements or treatment fields will be necessary
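
    The coverage criterion used above (at least 95% of the target volume receiving at least 95% of the prescribed dose) reduces to a simple dose-volume computation. The snippet below is a generic illustration on synthetic per-voxel doses with an assumed prescription, not the study's planning data.

      import numpy as np

      prescribed_dose = 50.0                                  # Gy (assumed prescription)
      rng = np.random.default_rng(1)
      voxel_doses = rng.normal(50.5, 1.5, size=100_000)       # synthetic breast voxel doses

      v95 = np.mean(voxel_doses >= 0.95 * prescribed_dose)    # fraction of volume at >=95% dose
      print(f"V95 = {v95:.1%} ->", "adequate" if v95 >= 0.95 else "inadequate", "coverage")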

  11. Should hand-assisted retroperitoneoscopic nephrectomy replace the standard laparoscopic technique for living donor nephrectomy? A meta-analysis.

    Science.gov (United States)

    Elmaraezy, Ahmed; Abushouk, Abdelrahman Ibrahim; Kamel, Moaz; Negida, Ahmed; Naser, Omar

    2017-04-01

    We performed this meta-analysis to compare hand-assisted retroperitoneoscopic (HARP) and traditional laparoscopic (TLS) techniques for living donor nephrectomy. We searched PubMed, Cochrane Central, EMBASE, and Web of Science for prospective studies comparing the HARP and TLS techniques. Data were extracted from eligible studies and pooled as risk ratios (RR) or standardized mean differences (SMD), using RevMan software (version 5.3 for Windows). We performed a sensitivity analysis to test the robustness of our evidence and a subgroup analysis to stratify intraoperative complications by Clavien-Dindo score. Seven studies (498 patients) were included in the final analysis. HARP was superior to TLS in terms of shortening the operative duration (SMD = -0.84, 95% CI [-1.18 to -0.50]) and warm ischemia time (SMD = -0.93, 95% CI [-1.13 to -0.72]). There was no significant difference between HARP and TLS in terms of blood loss (SMD = 0.13, 95% CI [-0.50 to 0.76]), hospital stay (SMD = -0.27, 95% CI [-0.70 to 0.15]) or graft survival (RR = 0.97, 95% CI [0.92 to 1.02]). The overall risk ratio of intraoperative complications did not differ significantly between the two groups (RR = 0.62, 95% CI [0.31 to 1.21]). Our meta-analysis shows that HARP was associated with a shorter operative duration and shorter warm ischemia time than TLS. However, no significant differences were found between the two groups in terms of graft survival or intraoperative complication rates. We recommend HARP over TLS for living donor nephrectomy; however, future studies with larger sample sizes are recommended to compare both techniques in terms of operative safety and quality of life outcomes. Copyright © 2017 IJS Publishing Group Ltd. Published by Elsevier Ltd. All rights reserved.
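
    For readers unfamiliar with how pooled effects such as the SMDs quoted above are produced, the sketch below shows a generic inverse-variance (fixed-effect) pooling of study-level standardized mean differences. The study-level numbers are invented for illustration and are not the trial data analysed in RevMan.

      import numpy as np

      smd = np.array([-0.9, -0.7, -1.1, -0.6])   # per-study effect sizes (illustrative)
      se  = np.array([0.25, 0.30, 0.28, 0.22])   # their standard errors (illustrative)

      w = 1.0 / se**2                            # inverse-variance weights
      pooled = np.sum(w * smd) / np.sum(w)       # fixed-effect pooled SMD
      pooled_se = np.sqrt(1.0 / np.sum(w))
      ci = (pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se)
      print(round(pooled, 2), [round(x, 2) for x in ci])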

  12. Standard dilution analysis.

    Science.gov (United States)

    Jones, Willis B; Donati, George L; Calloway, Clifton P; Jones, Bradley T

    2015-02-17

    Standard dilution analysis (SDA) is a novel calibration method that may be applied to most instrumental techniques that will accept liquid samples and are capable of monitoring two wavelengths simultaneously. It combines the traditional methods of standard additions and internal standards. Therefore, it simultaneously corrects for matrix effects and for fluctuations due to changes in sample size, orientation, or instrumental parameters. SDA requires only 200 s per sample with inductively coupled plasma optical emission spectrometry (ICP OES). Neither the preparation of a series of standard solutions nor the construction of a universal calibration graph is required. The analysis is performed by combining two solutions in a single container: the first containing 50% sample and 50% standard mixture; the second containing 50% sample and 50% solvent. Data are collected in real time as the first solution is diluted by the second one. The results are used to prepare a plot of the analyte-to-internal standard signal ratio on the y-axis versus the inverse of the internal standard concentration on the x-axis. The analyte concentration in the sample is determined from the ratio of the slope and intercept of that plot. The method has been applied to the determination of FD&C dye Blue No. 1 in mouthwash by molecular absorption spectrometry and to the determination of eight metals in mouthwash, wine, cola, nitric acid, and water by ICP OES. Both the accuracy and precision for SDA are better than those observed for the external calibration, standard additions, and internal standard methods using ICP OES.
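
    The slope/intercept relationship described above can be made concrete with a small simulation. The sketch below uses illustrative concentrations and instrument sensitivities, with my own bookkeeping of the 50:50 mixing rather than the authors' exact notation, and recovers the known analyte concentration from the SDA plot.

      import numpy as np

      # Known quantities (all concentrations refer to the undiluted stock solutions)
      C_sample_true = 4.0    # analyte concentration in the sample (unknown in practice)
      C_A_std  = 10.0        # analyte concentration in the standard mixture
      C_IS_std = 20.0        # internal-standard concentration in the standard mixture
      m_A, m_IS = 120.0, 80.0    # instrument sensitivities (signal per unit concentration)

      # Solution 1 (50% sample + 50% standard) is progressively diluted by
      # Solution 2 (50% sample + 50% solvent): the sample fraction stays at 50%,
      # while the standard fraction f drops from 0.5 towards 0.
      f = np.linspace(0.5, 0.05, 25)
      c_A_added = f * C_A_std            # added analyte from the standard mixture
      c_IS      = f * C_IS_std           # internal-standard concentration
      c_A_sample = 0.5 * C_sample_true   # sample contribution, constant throughout

      S_A  = m_A  * (c_A_sample + c_A_added)   # analyte signal
      S_IS = m_IS * c_IS                       # internal-standard signal

      # SDA plot: analyte-to-internal-standard signal ratio versus 1/[IS]
      x = 1.0 / c_IS
      y = S_A / S_IS
      slope, intercept = np.polyfit(x, y, 1)

      # slope     = (m_A/m_IS) * 0.5 * C_sample_true
      # intercept = (m_A/m_IS) * C_A_std / C_IS_std
      C_sample_est = 2.0 * (slope / intercept) * (C_A_std / C_IS_std)
      print(f"estimated sample concentration: {C_sample_est:.3f}")   # ~4.0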

  13. Relationship between alveolar bone measured by 125I absorptiometry with analysis of standardized radiographs: 2. Bjorn technique

    International Nuclear Information System (INIS)

    Ortman, L.F.; McHenry, K.; Hausmann, E.

    1982-01-01

    The Bjorn technique is widely used in periodontal studies as a standardized measure of alveolar bone. Recent studies have demonstrated the feasibility of using 125 I absorptiometry to measure bone mass. The purpose of this study was to compare 125 I absorptiometry with the Bjorn technique in detecting small sequential losses of alveolary bone. Four periodontal-like defects of incrementally increasing size were produced in alveolar bone in the posterior segment of the maxilla of a human skull. An attempt was made to sequentially reduce the amount of bone in 10% increments until no bone remained, a through and through defect. The bone remaining at each step was measured using 125 I absorptiometry. At each site the 125 I absorptiometry measurements were made at the same location by fixing the photon source to a prefabricated precision-made occlusal splint. This site was just beneath the crest and midway between the borders of two adjacent teeth. Bone loss was also determined by the Bjorn technique. Standardized intraoral films were taken using a custom-fitted acrylic clutch, and bone measurements were made from the root apex to coronal height of the lamina dura. A comparison of the data indicates that: (1) in early bone loss, less than 30%, the Bjorn technique underestimates the amount of loss, and (2) in advanced bone loss, more than 60% the Bjorn technique overestimates it

  14. Chemical Separation Technique of Strontium-90 in the Soil Water as the Standard Methods for Environmental Radioactivity Analysis

    International Nuclear Information System (INIS)

    Ngasifudin-Hamdani; Suratman; Djoko-Sardjono, Ign; Winduanto-Wahyu SP

    2000-01-01

    A separation technique for strontium-90 from its material matrix using a chemical precipitation method has been investigated. The technique was applied to the detection of the radionuclide strontium-90 contained in the soil water near the P3TM BATAN nuclear reactor facility at three locations. The two important parameters used in this technique were the growth time of Y-90 and the stirring time. The results show that the activity of strontium-90 at pos-01 was between 1.801×10⁻¹⁹ and 9.616×10⁻¹⁷ μCi/cm³, at pos-02 between 8.448×10⁻¹⁹ and 1.003×10⁻¹⁶ μCi/cm³, and at pos-03 between 6.719×10⁻¹⁹ and 11.644×10⁻¹⁶ μCi/cm³. These data show that the activity of Sr-90 in the soil water near the P3TM BATAN nuclear reactor facility was still below the maximum permitted concentration limit, i.e. 4.0×10⁻⁷ - 3.5×10⁻⁶ μCi/cm³. A statistical test using two-factorial analysis of variance with a randomized block design showed that the activity of Sr-90 in the soil water was influenced by the interaction between the growth time of Y-90 and the stirring time. (author)

  15. Internal standardization--atomic spectrometry and geographical pattern recognition techniques for the multielement analysis and classification of Catalonian red wines.

    Science.gov (United States)

    Iglesias, Mònica; Besalú, Emili; Anticó, Enriqueta

    2007-01-24

    Major and minor (K, P, Ca, Mg, Na, Fe, Mn, Zn, and Sr) and trace (Ba, Ni, Pb, V, Co, Cd, and Sb) elements from wine samples from the Denomination of Origin (DO) Empordà-Costa Brava (Catalonia, Spain) were analyzed by inductively coupled plasma atomic emission spectrometry (ICP-AES) and mass spectrometry (ICP-MS) respectively. Previously, a comparison of different calibration methodologies and sample digestion treatments had been carried out using ANOVA statistical tool. The obtained results demonstrated that internal standardization provides reliable results with the advantage that no further manipulation of the sample is needed. A principal component analysis of the concentration data was performed to differentiate the samples of DO Empordà-Costa Brava from wine samples from other wine-producing regions in Spain (i.e., Penedès, Somontano, and Rioja). It was found that Sr and Ba contents discriminate the two DO groups. Moreover, a discriminant analysis function involving both variables distinguishes the two groups with a 100% classification rate. At the level of the leave-one-out cross-validation, all of the Empordà-Costa Brava samples were well classified, whereas the other DOs presented two borderline misclassifications.
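
    As a generic illustration of the pattern-recognition workflow described above (synthetic data, not the actual wine measurements), the sketch below runs a principal component analysis on an element-concentration matrix and then fits a linear discriminant on two columns standing in for Sr and Ba.

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

      rng = np.random.default_rng(0)
      X = rng.normal(size=(40, 16))              # 40 wines x 16 elements (synthetic)
      y = np.array([0] * 20 + [1] * 20)          # 0 = Empordà-Costa Brava, 1 = other DOs
      X[y == 1, :2] += 2.0                       # pretend columns 0-1 are Sr and Ba, shifted

      pca = PCA(n_components=2)
      scores = pca.fit_transform(X)              # exploratory projection of all elements
      print(pca.explained_variance_ratio_)

      lda = LinearDiscriminantAnalysis().fit(X[:, :2], y)   # discriminant on Sr/Ba only
      print(lda.score(X[:, :2], y))                         # classification rate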

  16. Strain analysis in CRT candidates using the novel segment length in cine (SLICE) post-processing technique on standard CMR cine images.

    Science.gov (United States)

    Zweerink, Alwin; Allaart, Cornelis P; Kuijer, Joost P A; Wu, LiNa; Beek, Aernout M; van de Ven, Peter M; Meine, Mathias; Croisille, Pierre; Clarysse, Patrick; van Rossum, Albert C; Nijveldt, Robin

    2017-12-01

    Although myocardial strain analysis is a potential tool to improve patient selection for cardiac resynchronization therapy (CRT), there is currently no validated clinical approach to derive segmental strains. We evaluated the novel segment length in cine (SLICE) technique to derive segmental strains from standard cardiovascular MR (CMR) cine images in CRT candidates. Twenty-seven patients with left bundle branch block underwent CMR examination including cine imaging and myocardial tagging (CMR-TAG). SLICE was performed by measuring segment length between anatomical landmarks throughout all phases on short-axis cines. This measure of frame-to-frame segment length change was compared to CMR-TAG circumferential strain measurements. Subsequently, conventional markers of CRT response were calculated. Segmental strains showed good to excellent agreement between SLICE and CMR-TAG (septum strain, intraclass correlation coefficient (ICC) 0.76; lateral wall strain, ICC 0.66). Conventional markers of CRT response also showed close agreement between both methods (ICC 0.61-0.78). Reproducibility of SLICE was excellent for intra-observer testing (all ICC ≥0.76) and good for interobserver testing (all ICC ≥0.61). The novel SLICE post-processing technique on standard CMR cine images offers both accurate and robust segmental strain measures compared to the 'gold standard' CMR-TAG technique, and has the advantage of being widely available. • Myocardial strain analysis could potentially improve patient selection for CRT. • Currently a well validated clinical approach to derive segmental strains is lacking. • The novel SLICE technique derives segmental strains from standard CMR cine images. • SLICE-derived strain markers of CRT response showed close agreement with CMR-TAG. • Future studies will focus on the prognostic value of SLICE in CRT candidates.
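
    The core of the SLICE measurement, the frame-to-frame change in segment length relative to an end-diastolic reference, reduces to a simple strain formula. The sketch below uses made-up segment lengths rather than patient data to show how a segmental strain curve would be derived.

      import numpy as np

      # Segment length (mm) between two anatomical landmarks across the cardiac cycle
      lengths = np.array([52.0, 50.1, 47.8, 45.9, 44.6, 45.5, 48.0, 50.6, 52.0])
      l0 = lengths[0]                           # end-diastolic reference length

      strain = (lengths - l0) / l0 * 100.0      # Lagrangian strain in percent, per frame
      peak_strain = strain.min()                # peak (most negative) shortening
      print(np.round(strain, 1), round(peak_strain, 1))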

  17. Strain analysis in CRT candidates using the novel segment length in cine (SLICE) post-processing technique on standard CMR cine images

    Energy Technology Data Exchange (ETDEWEB)

    Zweerink, Alwin; Allaart, Cornelis P.; Wu, LiNa; Beek, Aernout M.; Rossum, Albert C. van; Nijveldt, Robin [VU University Medical Center, Department of Cardiology, and Institute for Cardiovascular Research (ICaR-VU), Amsterdam (Netherlands); Kuijer, Joost P.A. [VU University Medical Center, Department of Physics and Medical Technology, Amsterdam (Netherlands); Ven, Peter M. van de [VU University Medical Center, Department of Epidemiology and Biostatistics, Amsterdam (Netherlands); Meine, Mathias [University Medical Center, Department of Cardiology, Utrecht (Netherlands); Croisille, Pierre; Clarysse, Patrick [Univ Lyon, UJM-Saint-Etienne, INSA, CNRS UMR 5520, INSERM U1206, CREATIS, Saint-Etienne (France)

    2017-12-15

    Although myocardial strain analysis is a potential tool to improve patient selection for cardiac resynchronization therapy (CRT), there is currently no validated clinical approach to derive segmental strains. We evaluated the novel segment length in cine (SLICE) technique to derive segmental strains from standard cardiovascular MR (CMR) cine images in CRT candidates. Twenty-seven patients with left bundle branch block underwent CMR examination including cine imaging and myocardial tagging (CMR-TAG). SLICE was performed by measuring segment length between anatomical landmarks throughout all phases on short-axis cines. This measure of frame-to-frame segment length change was compared to CMR-TAG circumferential strain measurements. Subsequently, conventional markers of CRT response were calculated. Segmental strains showed good to excellent agreement between SLICE and CMR-TAG (septum strain, intraclass correlation coefficient (ICC) 0.76; lateral wall strain, ICC 0.66). Conventional markers of CRT response also showed close agreement between both methods (ICC 0.61-0.78). Reproducibility of SLICE was excellent for intra-observer testing (all ICC ≥0.76) and good for interobserver testing (all ICC ≥0.61). The novel SLICE post-processing technique on standard CMR cine images offers both accurate and robust segmental strain measures compared to the 'gold standard' CMR-TAG technique, and has the advantage of being widely available. (orig.)

  18. ELISA technique standardization for strongyloidiasis diagnosis

    International Nuclear Information System (INIS)

    Huapaya, P.; Espinoza, I.; Huiza, A.; Universidad Nacional Mayor de San Marcos, Lima; Sevilla, C.

    2002-01-01

    To standardize the ELISA technique for the diagnosis of human Strongyloides stercoralis infection, a crude antigen was prepared using filariform larvae obtained from positive stool samples cultured with charcoal. Harvested larvae were crushed by sonication and washed by centrifugation in order to obtain protein extracts to be used as antigen. The final protein concentration was 600 μg/mL. Several kinds of ELISA plates were tested, and the antigen concentration, sera dilution, conjugate dilution and cut-off were determined to identify infection. Sera from patients with both hyper-infection syndrome and intestinal infection demonstrated by parasitological examination were positive controls, and sera from people living in non-endemic areas with no infection demonstrated by parasitological examination were negative controls. The best values were 5 μg/mL for antigen, 1/64 for sera and 1/1000 for conjugate; optical density values for positive samples were 1.2746 (1.1065-1.4206, SD = 0.3284) and for negative samples 0.4457 (0.3324-0.5538, SD = 0.2230). Twenty sera samples from positive subjects and one hundred from negative subjects were examined, obtaining 90% sensitivity and 88% specificity. The results show this technique could be useful as a strongyloidiasis screening test in population studies.
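
    The 90% sensitivity and 88% specificity quoted above follow directly from the confusion-matrix definitions. The arithmetic below assumes 18 of the 20 parasitologically positive sera tested ELISA-positive and 88 of the 100 negative sera tested ELISA-negative, one split consistent with the reported figures.

      tp, fn = 18, 2       # assumed split of the 20 positive sera
      tn, fp = 88, 12      # assumed split of the 100 negative sera

      sensitivity = tp / (tp + fn)    # proportion of infected subjects detected
      specificity = tn / (tn + fp)    # proportion of uninfected subjects correctly negative
      print(f"sensitivity = {sensitivity:.0%}, specificity = {specificity:.0%}")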

  19. Analysis of vitamin D metabolic markers by mass spectrometry: current techniques, limitations of the "gold standard" method, and anticipated future directions.

    Science.gov (United States)

    Volmer, Dietrich A; Mendes, Luana R B C; Stokes, Caroline S

    2015-01-01

    Vitamin D compounds belong to a group of secosteroids, which occur naturally as vitamin D3 in mammals and D2 in plants. Vitamin D is vital for bone health but recent studies have shown a much wider role in the pathologies of diseases such as diabetes, cancer, autoimmune, neurodegenerative, mental and cardiovascular diseases. Photosynthesis of vitamin D in the human skin and subsequent hepatic and renal metabolism generate a wide range of transformation products occurring over a large dynamic range spanning from picomolar to nanomolar levels. This necessitates selective and sensitive analytical methods to quantitatively capture these low concentration levels in relevant tissues such as blood. Ideally, vitamin D assessment would be performed using a universal and standardized analytical method available to clinical laboratories that provides reliable and accurate quantitative results for all relevant vitamin D metabolites with sufficiently high throughput. At present, LC-MS/MS assays are the most promising techniques for vitamin D analysis. The present review focuses on developments in mass spectrometry methodologies of the past 12 years. It will highlight detrimental influences of the biological matrix, epimer contributions, pitfalls of specific mass spectrometry data acquisition routines (in particular multiple reaction monitoring, MRM), influence of ionization source, derivatization reactions, inter-laboratory comparisons on precision, accuracy, and application range of vitamin D metabolites. © 2013 Wiley Periodicals, Inc.

  20. Is There a Cosmetic Advantage to Single-Incision Laparoscopic Surgical Techniques Over Standard Laparoscopic Surgery? A Systematic Review and Meta-analysis.

    Science.gov (United States)

    Evans, Luke; Manley, Kate

    2016-06-01

    Single-incision laparoscopic surgery represents an evolution of minimally invasive techniques, but it has been a controversial development. A cosmetic advantage is claimed by many authors, but it has not been found to be universally present, or even of considerable importance to patients. This systematic review and meta-analysis demonstrates that there is a cosmetic advantage to the technique regardless of the operation type. The treatment effect in terms of cosmetic improvement is of the order of 0.63.

  1. Strain analysis in CRT candidates using the novel segment length in cine (SLICE) post-processing technique on standard CMR cine images

    NARCIS (Netherlands)

    Zweerink, A.; Allaart, C.P.; Kuijer, J.P.A.; Wu, L.; Beek, A.M.; Ven, P.M. van de; Meine, M.; Croisille, P.; Clarysse, P.; Rossum, A.C. van; Nijveldt, R.

    2017-01-01

    OBJECTIVES: Although myocardial strain analysis is a potential tool to improve patient selection for cardiac resynchronization therapy (CRT), there is currently no validated clinical approach to derive segmental strains. We evaluated the novel segment length in cine (SLICE) technique to derive

  2. Video Coding Technique using MPEG Compression Standards ...

    African Journals Online (AJOL)

    Some application areas of video compression focus on the problem of optimizing storage space and transmission bandwidth (BW). The two-dimensional discrete cosine transform (2-D DCT) is an integral part of video and image compression, and is used in Moving Picture Experts Group (MPEG) encoding standards.
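
    As a generic illustration of the 2-D DCT mentioned above (not code from the paper), the sketch below builds the orthonormal 8×8 DCT-II matrix and applies it separably to a toy pixel block, the same transform structure used in MPEG/JPEG-style encoders.

      import numpy as np

      def dct_matrix(n=8):
          # Orthonormal DCT-II basis matrix C, so that coefficients = C @ block @ C.T
          c = np.zeros((n, n))
          for k in range(n):
              for i in range(n):
                  alpha = np.sqrt(1.0 / n) if k == 0 else np.sqrt(2.0 / n)
                  c[k, i] = alpha * np.cos(np.pi * (2 * i + 1) * k / (2 * n))
          return c

      block = np.arange(64, dtype=float).reshape(8, 8)   # toy 8x8 pixel block
      C = dct_matrix()
      coeffs = C @ block @ C.T            # forward 2-D DCT
      restored = C.T @ coeffs @ C         # inverse transform recovers the block
      print(np.allclose(block, restored))   # True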

  3. Quality assurance techniques for activation analysis

    International Nuclear Information System (INIS)

    Becker, D.A.

    1984-01-01

    The principles and techniques of quality assurance are applied to the measurement method of activation analysis. Quality assurance is defined to include quality control and quality assessment. Plans for quality assurance include consideration of: personnel; facilities; analytical design; sampling and sample preparation; the measurement process; standards; and documentation. Activation analysis concerns include: irradiation; chemical separation; counting/detection; data collection, and analysis; and calibration. Types of standards discussed include calibration materials and quality assessment materials

  4. Analysis of soil and sewage sludge by ICP-OES and the German standard DIN 38414 sample preparation technique (P3)

    International Nuclear Information System (INIS)

    Edlund, M.; Heitland, P.; Visser, H.

    2002-01-01

    The elemental analysis of soil and sewage sludge has developed into one of the main applications of ICP optical emission spectrometry (ICP-OES) and is described in many official procedures. These methods include different acid mixtures and digestion techniques. Even though the German standard DIN 38414 part 7 and the Dutch NEN 6465 do not guarantee complete recoveries for all elements, they are widely accepted in Europe. This paper describes sample preparation and line selection, and investigates precision, accuracy and limits of detection. The SPECTRO CIROS CCD EOP with axial plasma observation and the SPECTRO CIROS CCD SOP with radial observation were compared and evaluated for the analysis of soil and sewage sludge. Accuracy was investigated using the certified reference materials CRM-141 R, CRM-143 R and GSD 11. Both instruments show excellent performance in terms of speed, precision, accuracy and detection limits for the determination of trace metals in soil and sewage sludge. (author)

  5. Standardization of surgical techniques used in facial bone contouring.

    Science.gov (United States)

    Lee, Tae Sung

    2015-12-01

    Since the introduction of facial bone contouring surgery for cosmetic purposes, various surgical methods have been used to improve the aesthetics of facial contours. In general, by standardizing the surgical techniques, it is possible to decrease complication rates and achieve more predictable surgical outcomes, thereby increasing patient satisfaction. The technical strategies used by the author to standardize facial bone contouring procedures are introduced here. The author uses various pre-manufactured surgical tools and hardware for facial bone contouring. During a reduction malarplasty or genioplasty procedure, double-bladed reciprocating saws and pre-bent titanium plates customized for the zygomatic body, arch and chin are used. Various guarded oscillating saws are used for mandibular angleplasty. The use of double-bladed saws and pre-bent plates to perform reduction malarplasty reduces the chances of post-operative asymmetry or under- or overcorrection of the zygoma contours due to technical faults. Inferior alveolar nerve injury and post-operative jawline asymmetry or irregularity can be reduced by using a guarded saw during mandibular angleplasty. For genioplasty, final placement of the chin in accordance with preoperative quantitative analysis can be easily performed with pre-bent plates, and a double-bladed saw allows more procedural accuracy during osteotomies. Efforts by the surgeon to avoid unintentional faults are key to achieving satisfactory results and reducing the incidence of complications. The surgical techniques described in this study in conjunction with various in-house surgical tools and modified hardware can be used to standardize techniques to achieve aesthetically gratifying outcomes. Copyright © 2015 British Association of Plastic, Reconstructive and Aesthetic Surgeons. Published by Elsevier Ltd. All rights reserved.

  6. Human reliability assessment in a 99Mo/99mTc generator production facility using the standardized plant analysis risk-human (SPAR-H) technique.

    Science.gov (United States)

    Eyvazlou, Meysam; Dadashpour Ahangar, Ali; Rahimi, Azin; Davarpanah, Mohammad Reza; Sayyahi, Seyed Soheil; Mohebali, Mehdi

    2018-02-13

    Reducing human error is an important factor for enhancing safety protocols in various industries. Hence, analysis of the likelihood of human error in nuclear industries such as radiopharmaceutical production facilities has become more essential. This cross-sectional descriptive study was conducted to quantify the probability of human errors in a 99Mo/99mTc generator production facility in Iran. First, through expert interviews, the production process of the 99Mo/99mTc generator was analyzed using hierarchical task analysis (HTA). The standardized plant analysis risk-human (SPAR-H) method was then applied in order to calculate the probability of human error. Twenty tasks were determined using HTA. All eight performance shaping factors (PSFs) were evaluated for the tasks. The mean probability of human error was 0.320. The highest and the lowest probabilities of human error in the 99Mo/99mTc generator production process, related to the 'loading the generator with the molybdenum solution' task and the 'generator elution' task, were 0.858 and 0.059, respectively. Required measures for reducing the human error probability (HEP) were suggested. These measures were derived from the levels of the PSFs that were evaluated in this study.
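
    As a hedged sketch of the SPAR-H human error probability calculation, based on my reading of the published SPAR-H worksheets (NUREG/CR-6883): the exact PSF multipliers used in this study are not given in the abstract, so the numbers below are illustrative assumptions only.

      def spar_h_hep(psf_multipliers, nominal_hep=0.001):
          """Nominal HEP: 0.001 for action tasks, 0.01 for diagnosis tasks."""
          composite = 1.0
          negative = 0
          for m in psf_multipliers:
              composite *= m
              if m > 1.0:
                  negative += 1          # PSF assigned a performance-degrading level
          hep = nominal_hep * composite
          # SPAR-H adjustment when three or more PSFs degrade performance,
          # which keeps the result bounded below 1.0
          if negative >= 3:
              hep = (nominal_hep * composite) / (nominal_hep * (composite - 1.0) + 1.0)
          return hep

      # e.g. an action task with high stress (x2), complexity (x2) and poor ergonomics (x10)
      print(spar_h_hep([2, 2, 10]))   # ~0.038 with the adjustment formula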

  7. Multivariate analysis techniques

    Energy Technology Data Exchange (ETDEWEB)

    Bendavid, Josh [European Organization for Nuclear Research (CERN), Geneva (Switzerland); Fisher, Wade C. [Michigan State Univ., East Lansing, MI (United States); Junk, Thomas R. [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States)

    2016-01-01

    The end products of experimental data analysis are designed to be simple and easy to understand: hypothesis tests and measurements of parameters. But, the experimental data themselves are voluminous and complex. Furthermore, in modern collider experiments, many petabytes of data must be processed in search of rare new processes which occur together with much more copious background processes that are of less interest to the task at hand. The systematic uncertainties on the background may be larger than the expected signal in many cases. The statistical power of an analysis and its sensitivity to systematic uncertainty can therefore usually both be improved by separating signal events from background events with higher efficiency and purity.
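
    As a toy illustration of the kind of multivariate signal/background separation described above (synthetic two-variable data, not a collider analysis), the sketch below trains a simple boosted-tree classifier and reports signal efficiency and purity at a fixed selection threshold.

      import numpy as np
      from sklearn.ensemble import GradientBoostingClassifier

      rng = np.random.default_rng(0)
      n_sig, n_bkg = 2_000, 20_000                        # rare signal, copious background
      sig = rng.normal([1.0, 1.0], 1.0, size=(n_sig, 2))
      bkg = rng.normal([0.0, 0.0], 1.0, size=(n_bkg, 2))
      X = np.vstack([sig, bkg])
      y = np.concatenate([np.ones(n_sig), np.zeros(n_bkg)])

      clf = GradientBoostingClassifier().fit(X, y)
      score = clf.predict_proba(X)[:, 1]
      selected = score > 0.5                              # event selection cut
      efficiency = selected[y == 1].mean()                # fraction of signal kept
      purity = y[selected].mean() if selected.any() else 0.0   # signal fraction after the cut
      print(f"efficiency = {efficiency:.2f}, purity = {purity:.2f}")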

  8. Surface analysis the principal techniques

    CERN Document Server

    Vickerman, John C

    2009-01-01

    This completely updated and revised second edition of Surface Analysis: The Principal Techniques, deals with the characterisation and understanding of the outer layers of substrates, how they react, look and function which are all of interest to surface scientists. Within this comprehensive text, experts in each analysis area introduce the theory and practice of the principal techniques that have shown themselves to be effective in both basic research and in applied surface analysis. Examples of analysis are provided to facilitate the understanding of this topic and to show readers how they c

  9. Standard Establishment Through Scenarios (SETS): A new technique for occupational fitness standards.

    Science.gov (United States)

    Blacklock, R E; Reilly, T J; Spivock, M; Newton, P S; Olinek, S M

    2015-01-01

    An objective and scientific task analysis provides the basis for establishing legally defensible Physical Employment Standards (PES), based on common and essential occupational tasks. Infrequent performance of these tasks creates challenges when developing PES based on criterion, or content validity. Develop a systematic approach using Subject Matter Experts (SME) to provide tasks with 1) an occupationally relevant scenario considered common to all personnel; 2) a minimum performance standard defined by time, distance, load or work. Examples provided here relate to the development of a new PES for the Canadian Armed Forces (CAF). SME of various experience are selected based on their eligibility criteria. SME are required to define a reasonable scenario for each task from personal experience, provide occupational performance requirements of the scenario in sub-groups, and discuss and agree by consensus vote on the final standard based on the definition of essential. A common and essential task for the CAF is detailed as a case example of process application. Techniques to avoid common SME rating errors are discussed and advantages to the method described. The SETS method was developed as a systematic approach to setting occupational performance standards and qualifying information from SME.

  10. Standardization of P-33 by the TDCR efficiency calculation technique

    CSIR Research Space (South Africa)

    Simpson, BRS

    2004-02-01

    The activity ... allowed). 2. Review of the TDCR efficiency calculation technique: An LS source is viewed by a three-phototube detection system, and the double-tube coincidence counting rate Nd and the triple-tube rate Nt are recorded simultaneously. If N0 is the 33P...
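
    The truncated passage above refers to the triple-to-double coincidence ratio. As a heavily simplified, assumed illustration (three identical, independent phototubes with a single per-tube detection probability, ignoring the beta spectrum and the free-parameter model used in the real technique), the ratio Nt/Nd fixes that probability and hence the double-coincidence efficiency used to convert Nd into an activity.

      Nd, Nt = 5000.0, 3200.0        # measured double- and triple-coincidence rates (made up)
      tdcr = Nt / Nd                 # experimental triple-to-double ratio = p/(3-2p) here
      p = 3 * tdcr / (1 + 2 * tdcr)  # invert for the per-tube detection probability
      eff_double = 3 * p**2 - 2 * p**3   # probability that at least two tubes fire
      activity = Nd / eff_double         # source activity estimate (counts per second)
      print(round(p, 3), round(eff_double, 3), round(activity, 1))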

  11. [How to write an andrological paper: standardization, mechanics and techniques].

    Science.gov (United States)

    Huang, Yu-Feng; Lu, Jin-Chun

    2010-12-01

    Andrological research papers not only reflect the current status and academic level of andrology, but also constitute an important communication platform for researchers and clinicians engaged in this field and contribute significantly to the development of andrology. It would be made easier to write a high-quality andrological paper once the author observes the basic requirements of research papers, adheres to the use of standard scientific terminology, knows the special writing mechanics, and equips himself with some essential writing techniques. Based on the long experience of editorship, we present a detailed introduction of the standardization, mechanics and techniques of writing an andrological paper.

  12. Paediatric sutureless circumcision-an alternative to the standard technique.

    LENUS (Irish Health Repository)

    2012-01-31

    INTRODUCTION: Circumcision is one of the most commonly performed surgical procedures in male children. A range of surgical techniques exist for this commonly performed procedure. The aim of this study is to assess the safety, functional outcome and cosmetic appearance of a sutureless circumcision technique. METHODS: Over a 9-year period, 502 consecutive primary sutureless circumcisions were performed by a single surgeon. All 502 cases were entered prospectively into a database including all relevant clinical details and a review was performed. The technique used to perform the sutureless circumcision is a modification of the standard sleeve technique with the use of a bipolar diathermy and the application of 2-octyl cyanoacrylate (2-OCA) to approximate the tissue edges. RESULTS: All boys in this study were pre-pubescent and the ages ranged from 6 months to 12 years (mean age 3.5 years). All patients had this procedure performed as a day case and under general anaesthetic. Complications included: haemorrhage (2.2%), haematoma (1.4%), wound infection (4%), allergic reaction (0.2%) and wound dehiscence (0.8%). Only 9 (1.8%) parents or patients were dissatisfied with the cosmetic appearance. CONCLUSION: The use of 2-OCA as a tissue adhesive for sutureless circumcisions is an alternative to the standard suture technique. The use of this tissue adhesive, 2-OCA, results in comparable complication rates to the standard circumcision technique and results in excellent post-operative cosmetic satisfaction.

  13. A novel wireless modulation technique for inter-standard communications

    NARCIS (Netherlands)

    Bekkaoui, A.; Haartsen, J.C.

    2006-01-01

    A new technique is presented which permits transceivers of different radio standards to communicate with each other. Applications are found in unlicensed bands where radios using different technologies, for example Bluetooth, ZigBee, or WLAN IEEE 802.11, have to share the same radio spectrum. If

  14. Bulk analysis using nuclear techniques

    International Nuclear Information System (INIS)

    Borsaru, M.; Holmes, R.J.; Mathew, P.J.

    1983-01-01

    Bulk analysis techniques developed for the mining industry are reviewed. Using penetrating neutron and γ-radiations, measurements are obtained directly from a large volume of sample (3-30 kg). γ-ray techniques were used to determine the grade of iron ore and to detect shale on conveyor belts. Thermal neutron irradiation was developed for the simultaneous determination of iron and aluminium in iron ore on a conveyor belt. Thermal-neutron activation analysis includes the determination of alumina in bauxite, and manganese and alumina in manganese ore. Fast neutron activation analysis is used to determine silicon in iron ores, and alumina and silica in bauxite. Fast and thermal neutron activation has been used to determine the soil content of shredded sugar cane. (U.K.)

  15. NET European Network on Neutron Techniques Standardization for Structural Integrity

    International Nuclear Information System (INIS)

    Youtsos, A.

    2004-01-01

    Improved performance and safety of European energy production systems is essential for providing safe, clean and inexpensive electricity to the citizens of the enlarged EU. The state of the art in assessing internal stresses, micro-structure and defects in welded nuclear components - as well as their evolution due to complex thermo-mechanical loads and irradiation exposure - needs to be improved before relevant structural integrity assessment code requirements can safely become less conservative. This is valid for both experimental characterization techniques and predictive numerical algorithms. In the course of the last two decades, neutron methods have proven to be excellent means of providing valuable information required in structural integrity assessment of advanced engineering applications. However, European industry is hampered from broadly using neutron research due to a lack of harmonised and standardized testing methods. Thirty-five major European industrial and research/academic organizations have joined forces, under JRC coordination, to launch the NET European Network on Neutron Techniques Standardization for Structural Integrity in May 2002. The NET collaborative research initiative aims at further development and harmonisation of neutron scattering methods, in support of structural integrity assessment. This is pursued through a number of testing round robin campaigns on neutron diffraction and small angle neutron scattering (SANS), supported by data provided by other more conventional destructive and non-destructive methods, such as X-ray diffraction and deep and surface hole drilling. NET also strives to develop more reliable and harmonized simulation procedures for the prediction of residual stress and damage in steel welded power plant components. This is pursued through a number of computational round robin campaigns based on advanced FEM techniques, and on reliable data obtained by such novel and harmonized experimental methods. The final goal of

  16. Advanced Techniques of Stress Analysis

    Directory of Open Access Journals (Sweden)

    Simion TATARU

    2013-12-01

    This article aims to evaluate a stress analysis technique based on 3D models, making a comparison with the traditional technique in which a model is built directly in the stress analysis program. This comparison of the two methods is made with reference to the rear fuselage of the IAR-99 aircraft, a structure with a high degree of complexity which allows a meaningful evaluation of both approaches. Three updated databases are envisaged: the database containing the idealized model obtained using ANSYS, working directly from documentation without automatic generation of nodes and elements (with few exceptions); the rear fuselage database (produced at this stage) obtained with Pro/ENGINEER; and the one obtained by using ANSYS with the second database. Each of the three databases will then be used according to arising necessities. The main objective is to develop a parameterized model of the rear fuselage using the computer-aided design software Pro/ENGINEER. A review of research regarding the use of virtual reality with interactive analysis performed by the finite element method is made to show the state of the art achieved in this field.

  17. Techniques for Automated Performance Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Marcus, Ryan C. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2014-09-02

    The performance of a particular HPC code depends on a multitude of variables, including compiler selection, optimization flags, OpenMP pool size, file system load, memory usage, MPI configuration, etc. As a result of this complexity, current predictive models have limited applicability, especially at scale. We present a formulation of scientific codes, nodes, and clusters that reduces complex performance analysis to well-known mathematical techniques. Building accurate predictive models and enhancing our understanding of scientific codes at scale is an important step towards exascale computing.

  18. National Green Building Standard Analysis

    Energy Technology Data Exchange (ETDEWEB)

    none,

    2012-07-01

    DOE's Building America Program is a research and development program to improve the energy performance of new and existing homes. The ultimate goal of the Building America Program is to achieve examples of cost-effective, energy efficient solutions for all U.S. climate zones. Periodic maintenance of an ANSI standard by review of the entire document and action to revise or reaffirm it on a schedule not to exceed five years is required by ANSI. In compliance, a consensus group has once again been formed and the National Green Building Standard is currently being reviewed to comply with the periodic maintenance requirement of an ANSI standard.

  19. Standardized fluoroscopy-based technique to measure intraoperative cup anteversion.

    Science.gov (United States)

    Zingg, Matthieu; Boudabbous, Sana; Hannouche, Didier; Montet, Xavier; Boettner, Friedrich

    2017-10-01

    Direct anterior approach (DAA) with the patient lying supine has facilitated the use of intraoperative fluoroscopy and allows for standardized positioning of the patient. The current study presents a new technique to measure acetabular component anteversion using intraoperative fluoroscopy. The current paper describes a mathematical formula to calculate true acetabular component anteversion based on the acetabular component abduction angle and the c-arm tilt angle (CaT). The CaT is determined by tilting the c-arm until an external pelvic oblique radiograph with the equatorial plane of the acetabular component perpendicular to the fluoroscopy receptor is obtained. CaT is read directly on the c-arm device. The technique was validated using a radiopaque Synbone model, comparing the described technique to computed tomography anteversion measurement. The experiment was repeated 25 times. The difference in anteversion between the two measuring techniques was on average 0.2° (range -3.0° to 3.1°). The linear regression coefficients evaluating the agreement between the experimental and control methods were 0.99 (95%CI 0.88-1.10, p < 0.001) and 0.33 (95%CI -1.53-2.20, p = 0.713) for the slope and intercept, respectively. The current study confirms that the described three-step c-arm acetabular cup measuring technique can reproducibly and reliably assess acetabular component anteversion in the supine position, as compared to CT imaging. © 2017 Orthopaedic Research Society. Published by Wiley Periodicals, Inc. J Orthop Res 35:2307-2312, 2017.
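
    The abstract does not give the formula itself. The sketch below is my own geometric reconstruction under stated assumptions (supine patient, c-arm tilted about the cranio-caudal axis until the beam lies in the cup's equatorial plane, radiographic angle definitions), which yields anteversion = arctan(sin(abduction) × tan(CaT)); it may differ from the authors' published expression.

      import math

      def cup_anteversion_deg(abduction_deg, cat_deg):
          """Radiographic anteversion from cup abduction and the c-arm tilt (CaT).

          Derived from the condition that the x-ray beam is perpendicular to the cup
          axis when the equatorial plane projects as a line; an assumed reconstruction,
          not necessarily the paper's formula.
          """
          ab = math.radians(abduction_deg)
          cat = math.radians(cat_deg)
          return math.degrees(math.atan(math.sin(ab) * math.tan(cat)))

      print(round(cup_anteversion_deg(abduction_deg=42.0, cat_deg=30.0), 1))   # ~21.1 deg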

  20. Reef Fish Survey Techniques: Assessing the Potential for Standardizing Methodologies.

    Directory of Open Access Journals (Sweden)

    Zachary R Caldwell

    Dramatic changes in populations of fishes living on coral reefs have been documented globally and, in response, the research community has initiated efforts to assess and monitor reef fish assemblages. A variety of visual census techniques are employed; however, results are often incomparable due to differential methodological performance. Although comparability of data may promote improved assessment of fish populations, and thus management of often critically important nearshore fisheries, to date no standardized and agreed-upon survey method has emerged. This study describes the use of methods across the research community and identifies potential drivers of method selection. An online survey was distributed to researchers from academic, governmental, and non-governmental organizations internationally. Although many methods were identified, 89% of survey-based projects employed one of three methods: belt transect, stationary point count, or some variation of the timed swim method. The selection of survey method was independent of the research design (i.e., assessment goal and region of study) but was related to the researcher's home institution. While some researchers expressed willingness to modify their current survey protocols to more standardized protocols (76%), their willingness decreased when methodologies were tied to long-term datasets spanning five or more years. Willingness to modify current methodologies was also less common among academic researchers than resource managers. By understanding both the current application of methods and the reported motivations for method selection, we hope to focus discussions towards increasing the comparability of quantitative reef fish survey data.

  1. Compressed air injection technique to standardize block injection pressures.

    Science.gov (United States)

    Tsui, Ban C H; Li, Lisa X Y; Pillay, Jennifer J

    2006-11-01

    Presently, no standardized technique exists to monitor injection pressures during peripheral nerve blocks. Our objective was to determine if a compressed air injection technique, using an in vitro model based on Boyle's law and typical regional anesthesia equipment, could consistently maintain injection pressures below a 1293 mmHg level associated with clinically significant nerve injury. Injection pressures for 20 and 30 mL syringes with various needle sizes (18G, 20G, 21G, 22G, and 24G) were measured in a closed system. A set volume of air was aspirated into a saline-filled syringe and then compressed and maintained at various percentages while pressure was measured. The needle was inserted into the injection port of a pressure sensor, which had attached extension tubing with an injection plug clamped "off". Using linear regression with all data points, the pressure value and 99% confidence interval (CI) at 50% air compression was estimated. The linearity of Boyle's law was demonstrated with a high correlation, r = 0.99, and a slope of 0.984 (99% CI: 0.967-1.001). The net pressure generated at 50% compression was estimated as 744.8 mmHg, with the 99% CI between 729.6 and 760.0 mmHg. The various syringe/needle combinations had similar results. By creating and maintaining syringe air compression at 50% or less, injection pressures will be substantially below the 1293 mmHg threshold considered to be an associated risk factor for clinically significant nerve injury. This technique may allow simple, real-time and objective monitoring during local anesthetic injections while inherently reducing injection speed.
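
    The 50%-compression figure above is a direct consequence of Boyle's law: halving the trapped air volume at constant temperature doubles its absolute pressure, so the gauge (net) injection pressure is roughly one atmosphere, close to the 744.8 mmHg estimated from the regression. A worked check, with atmospheric pressure assumed at 760 mmHg:

      p_atm = 760.0                      # mmHg, assumed ambient absolute pressure
      compression = 0.50                 # air column compressed to 50% of its initial volume

      # Boyle's law at constant temperature: P1 * V1 = P2 * V2
      p_absolute = p_atm / (1.0 - compression)   # V2 = (1 - 0.5) * V1 -> P2 = 2 * P1
      p_gauge = p_absolute - p_atm               # net pressure applied to the injectate
      print(p_gauge, "mmHg")                     # 760 mmHg, well below the 1293 mmHg threshold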

  2. Algorithms Design Techniques and Analysis

    CERN Document Server

    Alsuwaiyel, M H

    1999-01-01

    Problem solving is an essential part of every scientific discipline. It has two components: (1) problem identification and formulation, and (2) solution of the formulated problem. One can solve a problem on its own using ad hoc techniques or follow those techniques that have produced efficient solutions to similar problems. This requires the understanding of various algorithm design techniques, how and when to use them to formulate solutions and the context appropriate for each of them. This book advocates the study of algorithm design techniques by presenting most of the useful algorithm desi

  3. A comparison between two lateral cephalometry techniques (standard and natural head position

    Directory of Open Access Journals (Sweden)

    Hedayati Z. Assistant Professor. Sang S. DMD

    2003-06-01

    Statement of Problem: Cephalometric findings are of high importance in making decisions about orthodontic treatment plans and orthognathic surgeries. Natural head position (NHP) is considered a useful and exact radiographic technique. Aim: The aim of the present study was to compare two techniques, namely standard and NHP, in lateral cephalometry. Materials and Methods: In this cross-sectional study, performed in Shiraz, 138 randomly selected students (70 males and 68 females, aged 13-15 years) were evaluated clinically and radiographically. None of them had a history of orthodontic treatment, head and face fracture, or surgical operations. Lateral cephalograms were taken with both the standard and the natural head position technique for each student. For statistical analysis, the t-test for paired samples was used. Results: This study showed that in the anterior-posterior dimension, among angles with significant differences between the two techniques, those of the standard technique were more reliable, whereas in the vertical dimension the statistical differences showed the natural technique to be the more useful one. Conclusion: Because natural head position cephalometry requires more patient cooperation, more time, and a more complex technique, it is not suggested, except when conflicting clinical and cephalometric findings are observed.

  4. Machine monitoring via current signature analysis techniques

    International Nuclear Information System (INIS)

    Smith, S.F.; Castleberry, K.N.; Nowlin, C.H.

    1992-01-01

    A significant need in the effort to provide increased production quality is to provide improved plant equipment monitoring capabilities. Unfortunately, in today's tight economy, even such monitoring instrumentation must be implemented in a recognizably cost effective manner. By analyzing the electric current drawn by motors, actuators, and other line-powered industrial equipment, significant insights into the operations of the movers, driven equipment, and even the power source can be obtained. The generic term 'current signature analysis' (CSA) has been coined to describe several techniques for extracting useful equipment or process monitoring information from the electrical power feed system. A patented method developed at Oak Ridge National Laboratory is described which recognizes the presence of line-current modulation produced by motors and actuators driving varying loads. The in-situ application of applicable linear demodulation techniques to the analysis of numerous motor-driven systems is also discussed. The use of high-quality amplitude and angle-demodulation circuitry has permitted remote status monitoring of several types of medium and high-power gas compressors in US DOE facilities driven by 3-phase induction motors rated from 100 to 3,500 hp, both with and without intervening speed increasers. Flow characteristics of the compressors, including various forms of abnormal behavior such as surging and rotating stall, produce at the output of the specialized detectors specific time and frequency signatures which can be easily identified for monitoring, control, and fault-prevention purposes. The resultant data are similar in form to information obtained via standard vibration-sensing techniques and can be analyzed using essentially identical methods. In addition, other machinery such as refrigeration compressors, brine pumps, vacuum pumps, fans, and electric motors have been characterized.
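
    The ORNL method itself is patented and not detailed in this abstract; the sketch below is only a generic illustration of the amplitude-demodulation step of current signature analysis, in which a Hilbert-transform envelope exposes a slow load modulation riding on the 60 Hz supply current (the sampling rate, modulation depth and 5 Hz modulation frequency are invented for the example):

        # Generic amplitude-demodulation sketch for current signature analysis (CSA).
        import numpy as np
        from scipy.signal import hilbert

        fs = 2000.0                                  # sampling rate, Hz (assumed)
        t = np.arange(0, 2.0, 1.0 / fs)
        line = np.sin(2 * np.pi * 60 * t)            # 60 Hz supply current
        modulation = 1.0 + 0.05 * np.sin(2 * np.pi * 5 * t)   # 5 % load modulation at 5 Hz
        current = modulation * line

        envelope = np.abs(hilbert(current))          # amplitude demodulation
        envelope -= envelope.mean()                  # remove the DC component

        spectrum = np.abs(np.fft.rfft(envelope))
        freqs = np.fft.rfftfreq(envelope.size, d=1.0 / fs)
        peak = freqs[np.argmax(spectrum[1:]) + 1]
        print(f"dominant modulation frequency ~ {peak:.1f} Hz")   # expect ~5 Hz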

  5. [Study on standardization of cupping technique: elucidation on the establishment of the National Standard Standardized Manipulation of Acupuncture and Moxibustion, Part V, Cupping].

    Science.gov (United States)

    Gao, Shu-zhong; Liu, Bing

    2010-02-01

    From the aspects of basis, technique descriptions, core contents, problems and solutions, and standard thinking in the standard setting process, this paper states experiences in the establishment of the national standard Standardized Manipulation of Acupuncture and Moxibustion, Part V, Cupping, focusing on the methodologies used in the cupping standard setting process, the method selection and operating instructions of cupping standardization, and the characteristics of standard TCM. In addition, this paper states the scope of application and precautions for this cupping standardization. This paper also explains tentative ideas on the research of standardized manipulation of acupuncture and moxibustion.

  6. Incorporating Experience Curves in Appliance Standards Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Garbesi, Karina; Chan, Peter; Greenblatt, Jeffery; Kantner, Colleen; Lekov, Alex; Meyers, Stephen; Rosenquist, Gregory; Buskirk, Robert Van; Yang, Hung-Chia; Desroches, Louis-Benoit

    2011-10-31

    The technical analyses in support of U.S. energy conservation standards for residential appliances and commercial equipment have typically assumed that manufacturing costs and retail prices remain constant during the projected 30-year analysis period. There is, however, considerable evidence that this assumption does not reflect real market prices. Costs and prices generally fall in relation to cumulative production, a phenomenon known as experience and modeled by a fairly robust empirical experience curve. Using price data from the Bureau of Labor Statistics, and shipment data obtained as part of the standards analysis process, we present U.S. experience curves for room air conditioners, clothes dryers, central air conditioners, furnaces, and refrigerators and freezers. These allow us to develop more representative appliance price projections than the assumption-based approach of constant prices. These experience curves were incorporated into recent energy conservation standards for these products. The impact on the national modeling can be significant, often increasing the net present value of potential standard levels in the analysis. In some cases a previously cost-negative potential standard level demonstrates a benefit when incorporating experience. These results imply that past energy conservation standards analyses may have undervalued the economic benefits of potential standard levels.
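
    As a hedged illustration of the experience-curve model P = P0 * (X/X0)^(-b), the sketch below fits the exponent by least squares in log-log space and converts it to a price reduction per doubling of cumulative production; the shipment and price numbers are invented, not the BLS data used in the analysis:

        # Sketch: fitting an experience (learning) curve to price vs. cumulative production.
        import numpy as np

        cumulative_shipments = np.array([1e6, 2e6, 5e6, 1e7, 3e7, 8e7])    # hypothetical units
        real_price = np.array([520.0, 470.0, 410.0, 375.0, 320.0, 275.0])  # hypothetical prices

        # log P = intercept - b * log X  (ordinary least squares in log-log space)
        slope, intercept = np.polyfit(np.log(cumulative_shipments), np.log(real_price), 1)
        b = -slope
        price_drop_per_doubling = 1.0 - 2.0 ** (-b)

        print(f"experience exponent b = {b:.3f}")
        print(f"price reduction per doubling of cumulative production ~ {price_drop_per_doubling:.1%}")

        # Projected price if cumulative shipments double once more:
        projected = np.exp(intercept) * (2 * cumulative_shipments[-1]) ** (-b)
        print(f"projected price at {2 * cumulative_shipments[-1]:.0e} units: ${projected:.0f}")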

  7. Analysis of archaeological pieces with nuclear techniques

    International Nuclear Information System (INIS)

    Tenorio, D.

    2002-01-01

    In this work, nuclear techniques such as neutron activation analysis, PIXE, X-ray fluorescence analysis, metallography, uranium series dating, and Rutherford backscattering, for use in the analysis of archaeological specimens and materials, are described. Some published works and theses on the analysis of different Mexican and Mesoamerican archaeological sites are also referenced. (Author)

  8. Development of communications analysis techniques

    Science.gov (United States)

    Shelton, R. D.

    1972-01-01

    Major results from the frequency analysis of system program (FASP) are reported. The FASP procedure was designed to analyze or design linear dynamic systems, but can be used to solve any problem that can be described by a system of linear time invariant differential equations. The program also shows plots of performance changes as design parameters are adjusted. Experimental results on narrowband FM distortion are also reported.

  9. Innovative Techniques Simplify Vibration Analysis

    Science.gov (United States)

    2010-01-01

    In the early years of development, Marshall Space Flight Center engineers encountered challenges related to components in the space shuttle main engine. To assess the problems, they evaluated the effects of vibration and oscillation. To enhance the method of vibration signal analysis, Marshall awarded Small Business Innovation Research (SBIR) contracts to AI Signal Research, Inc. (ASRI), in Huntsville, Alabama. ASRI developed a software package called PC-SIGNAL that NASA now employs on a daily basis, and in 2009, the PKP-Module won Marshall's Software of the Year award. The technology is also used in many industries: aircraft and helicopter, rocket engine manufacturing, transportation, and nuclear power.

  10. Incorporating experience curves in appliance standards analysis

    International Nuclear Information System (INIS)

    Desroches, Louis-Benoit; Garbesi, Karina; Kantner, Colleen; Van Buskirk, Robert; Yang, Hung-Chia

    2013-01-01

    There exists considerable evidence that manufacturing costs and consumer prices of residential appliances have decreased in real terms over the last several decades. This phenomenon is generally attributable to manufacturing efficiency gained with cumulative experience producing a certain good, and is modeled by an empirical experience curve. The technical analyses conducted in support of U.S. energy conservation standards for residential appliances and commercial equipment have, until recently, assumed that manufacturing costs and retail prices remain constant during the projected 30-year analysis period. This assumption does not reflect real market price dynamics. Using price data from the Bureau of Labor Statistics, we present U.S. experience curves for room air conditioners, clothes dryers, central air conditioners, furnaces, and refrigerators and freezers. These experience curves were incorporated into recent energy conservation standards analyses for these products. Including experience curves increases the national consumer net present value of potential standard levels. In some cases a potential standard level exhibits a net benefit when considering experience, whereas without experience it exhibits a net cost. These results highlight the importance of modeling more representative market prices. - Highlights: ► Past appliance standards analyses have assumed constant equipment prices. ► There is considerable evidence of consistent real price declines. ► We incorporate experience curves for several large appliances into the analysis. ► The revised analyses demonstrate larger net present values of potential standards. ► The results imply that past standards analyses may have undervalued benefits.

  11. Data analysis techniques for gravitational wave observations

    Indian Academy of Sciences (India)

    Data analysis techniques for gravitational wave observations. S V Dhurandhar, Inter-University Centre for Astronomy and Astrophysics, Post Bag 4, Ganeshkhind, Pune 411 007, India. ... The performance of some of these techniques on real data obtained will be discussed. Finally, some results on ...

  12. Event tree analysis using artificial intelligence techniques

    International Nuclear Information System (INIS)

    Dixon, B.W.; Hinton, M.F.

    1985-01-01

    Artificial Intelligence (AI) techniques used in Expert Systems and Object Oriented Programming are discussed as they apply to Event Tree Analysis. A SeQUence IMPortance calculator, SQUIMP, is presented to demonstrate the implementation of these techniques. Benefits of using AI methods include ease of programming, efficiency of execution, and flexibility of application. The importance of an appropriate user interface is stressed. 5 figs

  13. TV content analysis techniques and applications

    CERN Document Server

    Kompatsiaris, Yiannis

    2012-01-01

    The rapid advancement of digital multimedia technologies has not only revolutionized the production and distribution of audiovisual content, but also created the need to efficiently analyze TV programs to enable applications for content managers and consumers. Leaving no stone unturned, TV Content Analysis: Techniques and Applications provides a detailed exploration of TV program analysis techniques. Leading researchers and academics from around the world supply scientifically sound treatment of recent developments across the related subject areas--including systems, architectures, algorithms,

  14. Technique for fabrication of gradual standards of radiographic image blackening density

    International Nuclear Information System (INIS)

    Borovin, I.V.; Kondina, M.A.

    1987-01-01

    The technique of fabricating gradual standards of blackening density for industrial radiography by contact printing from a negative is described. The technique is designed for industrial radiation defectoscopy laboratories that possess no special-purpose sensitometric equipment.

  15. Constrained principal component analysis and related techniques

    CERN Document Server

    Takane, Yoshio

    2013-01-01

    In multivariate data analysis, regression techniques predict one set of variables from another while principal component analysis (PCA) finds a subspace of minimal dimensionality that captures the largest variability in the data. How can regression analysis and PCA be combined in a beneficial way? Why and when is it a good idea to combine them? What kind of benefits are we getting from them? Addressing these questions, Constrained Principal Component Analysis and Related Techniques shows how constrained PCA (CPCA) offers a unified framework for these approaches.The book begins with four concre

  16. Basic prediction techniques in modern video coding standards

    CERN Document Server

    Kim, Byung-Gyu

    2016-01-01

    This book discusses in detail the basic algorithms of video compression that are widely used in modern video codecs. The authors dissect complicated specifications and present material in a way that gets readers quickly up to speed by describing video compression algorithms succinctly, without going into the mathematical details and technical specifications. For accelerated learning, hybrid codec structure and inter- and intra-prediction techniques in MPEG-4, H.264/AVC, and HEVC are discussed together. In addition, the latest research on fast encoder design for HEVC and H.264/AVC is also included.

  17. Elemental analyses of groundwater: demonstrated advantage of low-flow sampling and trace-metal clean techniques over standard techniques

    Science.gov (United States)

    Creasey, C. L.; Flegal, A. R.

    The combined use of both (1) low-flow purging and sampling and (2) trace-metal clean techniques provides more representative measurements of trace-element concentrations in groundwater than results derived with standard techniques. The use of low-flow purging and sampling provides relatively undisturbed groundwater samples that are more representative of in situ conditions, and the use of trace-element clean techniques limits the inadvertent introduction of contaminants during sampling, storage, and analysis. When these techniques are applied, resultant trace-element concentrations are likely to be markedly lower than results based on standard sampling techniques. In a comparison of data derived from contaminated and control groundwater wells at a site in California, USA, trace-element concentrations from this study were 2-1000 times lower than those determined by the conventional techniques used in sampling of the same wells prior to (5 months) and subsequent to (1 month) the collections for this study. Specifically, the cadmium and chromium concentrations derived using standard sampling techniques exceed the California Maximum Contaminant Levels (MCL), whereas in this investigation concentrations of both of those elements are substantially below their MCLs. Consequently, the combined use of low-flow and trace-metal clean techniques may preclude erroneous reports of trace-element contamination in groundwater.

  18. Techniques for sensitivity analysis of SYVAC results

    International Nuclear Information System (INIS)

    Prust, J.O.

    1985-05-01

    Sensitivity analysis techniques may be required to examine the sensitivity of SYVAC model predictions to the input parameter values, the subjective probability distributions assigned to the input parameters, and to the relationship between dose and the probability of fatal cancers plus serious hereditary disease in the first two generations of offspring of a member of the critical group. This report mainly considers techniques for determining the sensitivity of dose and risk to the variable input parameters. The performance of a sensitivity analysis technique may be improved by decomposing the model and data into subsets for analysis, making use of existing information on sensitivity, and concentrating sampling in regions of the parameter space that generate high doses or risks. A number of sensitivity analysis techniques are reviewed for their application to the SYVAC model, including four techniques tested in an earlier study by CAP Scientific for the SYVAC project. This report recommends the development now of a method for evaluating the derivative of dose with respect to parameter value, and extending the Kruskal-Wallis technique to test for interactions between parameters. It is also recommended that the sensitivity of the output of each sub-model of SYVAC to input parameter values should be examined. (author)

  19. Preliminary detection of explosive standard components with Laser Raman Technique

    International Nuclear Information System (INIS)

    Botti, S.; Ciardi, R.

    2008-01-01

    Presently, our section is leader of the ISOTREX project (Integrated System for On-line TRace EXplosives detection in solid, liquid and vapour state), funded in the frame of the PASR 2006 action (Preparatory Action on the enhancement of the European industrial potential in the field of Security Research) of the 6th EC framework. The ISOTREX project will exploit the capabilities of different laser techniques such as LIBS (Laser Induced Breakdown Spectroscopy), LPA (Laser Photo Acoustic) and CRDS (Cavity Ring Down Spectroscopy) to monitor explosive traces. In this frame, we also extended our investigation to laser induced Raman spectroscopy, in order to investigate its capabilities and possible future integration. We analysed explosive samples in the bulk solid phase, in diluted liquid phase, and as evaporated films on a suitable substrate. In the following, we present the main results obtained, outlining preliminary conclusions.

  20. A numerical technique for reactor subchannel analysis

    International Nuclear Information System (INIS)

    Fath, Hassan E.S.

    1983-01-01

    A numerical technique is developed for the solution of the transient boundary layer equations with a moving liquid-vapour interface boundary. The technique uses the finite difference method with the velocity components defined over an Eulerian mesh. A system of interface massless markers is defined where the markers move with the flow field according to a simple kinematic relation between the interface geometry and the fluid velocity. Different applications of nuclear engineering interest are reported with some available results. The present technique is capable of predicting the interface profile near the wall which is important in the reactor subchannel analysis

  1. Standardization of Berberis aristata extract through conventional and modern HPTLC techniques

    Directory of Open Access Journals (Sweden)

    Dinesh K. Patel

    2012-05-01

    Objective: Berberis aristata (Berberidaceae) is an important medicinal plant, found in different regions of the world. It has significant medicinal value in the traditional Indian and Chinese systems of medicine. The aim of the present investigation includes qualitative and quantitative analysis of Berberis aristata extract. Methods: The present study includes phytochemical analysis, solubility tests, heavy metal analysis, antimicrobial study and quantitative analysis by the HPTLC method. Results: Preliminary phytochemical analysis showed the presence of carbohydrate, glycoside, alkaloid, protein, amino acid, saponin, tannin and flavonoid. Solubility was found to be 81.90% in water and 84.52% in 50% alcohol. Loss on drying was found to be 5.32%. Total phenol and flavonoid content were found to be 0.11% and 2.8%. Levels of lead, arsenic, mercury and cadmium complied with the standard limits. E. coli and Salmonella were found to be absent, whereas the total bacterial count and yeast and mould contents were found to be under the limits. Content of berberine was found to be 13.47% by the HPTLC technique. Conclusions: The results obtained from the present study could be used as a source of valuable information which can play an important role for food scientists, researchers and even consumers regarding its standards.

  2. A review of sensitivity analysis techniques

    Energy Technology Data Exchange (ETDEWEB)

    Hamby, D.M.

    1993-12-31

    Mathematical models are utilized to approximate various highly complex engineering, physical, environmental, social, and economic phenomena. Model parameters exerting the most influence on model results are identified through a "sensitivity analysis." A comprehensive review is presented of more than a dozen sensitivity analysis methods. The most fundamental of sensitivity techniques utilizes partial differentiation whereas the simplest approach requires varying parameter values one-at-a-time. Correlation analysis is used to determine relationships between independent and dependent variables. Regression analysis provides the most comprehensive sensitivity measure and is commonly utilized to build response surfaces that approximate complex models.
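
    A minimal sketch of the two simplest approaches mentioned, one-at-a-time perturbation and a finite-difference partial derivative, applied to an arbitrary toy model (the model and baseline parameter values are illustrative only):

        # Sketch: one-at-a-time (OAT) and partial-derivative sensitivity for a toy model.

        def model(params):
            a, b, c = params["a"], params["b"], params["c"]
            return a ** 2 * b + 3.0 * c          # arbitrary toy response

        baseline = {"a": 2.0, "b": 1.5, "c": 0.5}
        y0 = model(baseline)

        for name in baseline:
            # One-at-a-time: perturb a single parameter by +10 %.
            perturbed = dict(baseline)
            perturbed[name] *= 1.10
            oat_change = (model(perturbed) - y0) / y0

            # Normalised finite-difference sensitivity: (dy/dx) * (x0 / y0).
            h = 1e-6 * baseline[name]
            shifted = dict(baseline)
            shifted[name] += h
            normalised = (model(shifted) - y0) / h * baseline[name] / y0

            print(f"{name}: +10% OAT change = {oat_change:+.3%}, normalised sensitivity = {normalised:+.3f}")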

  3. Gold analysis by the gamma absorption technique

    International Nuclear Information System (INIS)

    Kurtoglu, Arzu; Tugrul, A.B.

    2003-01-01

    Gold (Au) analyses are generally performed using destructive techniques. In this study, the Gamma Absorption Technique has been employed for gold analysis. A series of different gold alloys of known gold content were analysed and a calibration curve was obtained. This curve was then used for the analysis of unknown samples. Gold analyses can be made non-destructively, easily and quickly by the gamma absorption technique. The mass attenuation coefficients of the alloys were measured around the K-shell absorption edge of Au. Theoretical mass attenuation coefficient values were obtained using the WinXCom program and comparison of the experimental results with the theoretical values showed generally good and acceptable agreement
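
    A sketch of the underlying calculation, Beer-Lambert attenuation converted to a gold fraction through a linear calibration curve; all intensities, densities and alloy compositions below are invented for illustration and are not values from the paper:

        # Sketch: gold content from gamma-ray attenuation via a calibration curve.
        # Mass attenuation coefficient: mu_m = ln(I0/I) / (rho * x)  (Beer-Lambert law).
        import numpy as np

        def mass_attenuation(I0, I, density_g_cm3, thickness_cm):
            return np.log(I0 / I) / (density_g_cm3 * thickness_cm)   # cm^2/g

        # Calibration alloys of known gold mass fraction (hypothetical measurements).
        gold_fraction = np.array([0.375, 0.585, 0.750, 0.916])
        mu_measured = np.array([2.10, 2.95, 3.60, 4.25])             # cm^2/g, hypothetical values

        slope, offset = np.polyfit(gold_fraction, mu_measured, 1)    # linear calibration curve

        # Unknown sample: convert its measured attenuation back to a gold fraction.
        mu_unknown = mass_attenuation(I0=1.0e5, I=6.0e2, density_g_cm3=15.5, thickness_cm=0.10)
        print(f"estimated gold mass fraction ~ {(mu_unknown - offset) / slope:.2f}")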

  4. Sensitivity analysis of hybrid thermoelastic techniques

    Science.gov (United States)

    W.A. Samad; J.M. Considine

    2017-01-01

    Stress functions have been used as a complementary tool to support experimental techniques, such as thermoelastic stress analysis (TSA) and digital image correlation (DIC), in an effort to evaluate the complete and separate full-field stresses of loaded structures. The need for such coupling between experimental data and stress functions is due to the fact that...

  5. Fourier Spectroscopy: A Simple Analysis Technique

    Science.gov (United States)

    Oelfke, William C.

    1975-01-01

    Presents a simple method of analysis in which the student can integrate, point by point, any interferogram to obtain its Fourier transform. The manual technique requires no special equipment and is based on relationships that most undergraduate physics students can derive from the Fourier integral equations. (Author/MLH)
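
    In the same spirit, the point-by-point integration can be written as a short numerical sketch: each spectral sample is the cosine-weighted sum of interferogram points, i.e. a discrete version of the Fourier integral (the interferogram below is synthesized from two known wavenumbers purely so the recovered spectrum can be checked; the step size and line positions are assumptions):

        # Sketch: point-by-point Fourier transform of an interferogram,
        # B(sigma) ~ sum over path difference x of I(x) * cos(2*pi*sigma*x) * dx.
        import numpy as np

        dx = 0.5e-4                            # path-difference step, cm (assumed)
        x = np.arange(-4096, 4096) * dx        # symmetric about zero path difference

        # Synthetic two-line source at 1000 and 1500 cm^-1 (illustrative only).
        interferogram = np.cos(2 * np.pi * 1000 * x) + 0.5 * np.cos(2 * np.pi * 1500 * x)

        sigmas = np.linspace(500, 2000, 751)   # wavenumber axis, cm^-1
        spectrum = np.array([np.sum(interferogram * np.cos(2 * np.pi * s * x)) * dx
                             for s in sigmas])

        peak1 = sigmas[np.argmax(spectrum)]
        masked = spectrum.copy()
        masked[np.abs(sigmas - peak1) < 20] = 0.0
        peak2 = sigmas[np.argmax(masked)]
        print(f"recovered lines near {peak1:.0f} and {peak2:.0f} cm^-1")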

  6. Neutron activation analysis for certification of standard reference materials

    International Nuclear Information System (INIS)

    Capote Rodriguez, G.; Perez Zayas, G.; Hernandez Rivero, A.; Ribeiro Guevara, S.

    1996-01-01

    Neutron activation analysis is used extensively as one of the analytical techniques in the certification of standard reference materials. Characteristics of neutron activation analysis which make it valuable in this role are: accuracy, multielemental capability, the ability to assess homogeneity, high sensitivity for many elements, and an essentially non-destructive method. This paper reports the concentrations of 30 elements (major, minor and trace elements) in four Cuban samples. The samples were irradiated in a thermal neutron flux of 10^12 - 10^13 n cm^-2 s^-1. The gamma ray spectra were measured by HPGe detectors and were analyzed using the ACTAN program, developed at the Center of Applied Studies for Nuclear Development.

  7. Non standard analysis, polymer models, quantum fields

    International Nuclear Information System (INIS)

    Albeverio, S.

    1984-01-01

    We give an elementary introduction to non-standard analysis and its applications to the theory of stochastic processes. This is based on a joint book with J.E. Fenstad, R. Hoeegh-Krohn and T. Lindstroem. In particular we give a discussion of a hyperfinite theory of Dirichlet forms with applications to the study of the Hamiltonian for a quantum mechanical particle in the potential created by a polymer. We also discuss new results on the existence of attractive polymer measures in dimension d = 1, 2 and on the ((phi^2)^2)_d-model of interacting quantum fields. (orig.)

  8. Accession Medical Standards Analysis and Research Activity

    Science.gov (United States)

    2010-01-01

    each Initial Entry Training (IET) site to USMEPCOM, but this reporting is not required by service regulations. The total numbers of reported ... classification and reporting from the IET sites to MEPCOM, which is still passive, should be mandated and standardized by DoD/service regulations. Analysis would ... [Table fragment: counts and attrition percentages for conditions including hernia, gastroesophageal reflux disease (GERD), and diabetes.]

  9. Microextraction sample preparation techniques in biomedical analysis.

    Science.gov (United States)

    Szultka, Malgorzata; Pomastowski, Pawel; Railean-Plugaru, Viorica; Buszewski, Boguslaw

    2014-11-01

    Biologically active compounds are found in biological samples at relatively low concentration levels. The sample preparation of target compounds from biological, pharmaceutical, environmental, and food matrices is one of the most time-consuming steps in the analytical procedure. The microextraction techniques are dominant. Metabolomic studies also require application of proper analytical technique for the determination of endogenic metabolites present in biological matrix on trace concentration levels. Due to the reproducibility of data, precision, relatively low cost of the appropriate analysis, simplicity of the determination, and the possibility of direct combination of those techniques with other methods (combination types on-line and off-line), they have become the most widespread in routine determinations. Additionally, sample pretreatment procedures have to be more selective, cheap, quick, and environmentally friendly. This review summarizes the current achievements and applications of microextraction techniques. The main aim is to deal with the utilization of different types of sorbents for microextraction and emphasize the use of new synthesized sorbents as well as to bring together studies concerning the systematic approach to method development. This review is dedicated to the description of microextraction techniques and their application in biomedical analysis. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  10. CRDM motion analysis using machine learning technique

    International Nuclear Information System (INIS)

    Nishimura, Takuya; Nakayama, Hiroyuki; Saitoh, Mayumi; Yaguchi, Seiji

    2017-01-01

    The magnetic jack type Control Rod Drive Mechanism (CRDM) for pressurized water reactor (PWR) plants operates control rods in response to electrical signals from the reactor control system. CRDM operability is evaluated by quantifying the armature's closed/opened response time, i.e., the interval between the coil energizing/de-energizing points and the armature closed/opened points. MHI has already developed an automatic CRDM motion analysis and applied it to actual plants. However, CRDM operational data show wide variation depending on characteristics such as plant condition, plant, etc. In the existing motion analysis, applying a single analysis technique to all plant conditions, plants, etc. raises an issue of analysis accuracy. In this study, MHI investigated motion analysis using machine learning (Random Forests), which flexibly accommodates CRDM operational data with wide variation and improves analysis accuracy. (author)
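
    The abstract does not give the actual features or data used; the sketch below is only a generic illustration of the idea, training a random-forest regressor on synthetic coil-current features to estimate an armature closed time (all feature names, numbers and the target relation are hypothetical):

        # Generic sketch: random-forest estimation of an armature closed time
        # from coil-current waveform features.  Data and features are synthetic.
        import numpy as np
        from sklearn.ensemble import RandomForestRegressor
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(0)
        n = 500
        peak_current = rng.normal(10.0, 1.0, n)    # A, hypothetical feature
        dip_depth = rng.normal(0.8, 0.1, n)        # A, armature-motion current dip
        plant_temp = rng.normal(290.0, 5.0, n)     # degC, plant-condition proxy

        # Hypothetical "true" closed time in ms, plus measurement noise.
        closed_time = (40.0 - 1.5 * dip_depth + 0.3 * (plant_temp - 290.0)
                       + 0.2 * (peak_current - 10.0) + rng.normal(0.0, 0.5, n))

        X = np.column_stack([peak_current, dip_depth, plant_temp])
        X_tr, X_te, y_tr, y_te = train_test_split(X, closed_time, random_state=0)

        model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)
        print(f"R^2 on held-out traces: {model.score(X_te, y_te):.2f}")
        print("feature importances:", np.round(model.feature_importances_, 2))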

  11. Technology and Technique Standards for Camera-Acquired Digital Dermatologic Images: A Systematic Review.

    Science.gov (United States)

    Quigley, Elizabeth A; Tokay, Barbara A; Jewell, Sarah T; Marchetti, Michael A; Halpern, Allan C

    2015-08-01

    Photographs are invaluable dermatologic diagnostic, management, research, teaching, and documentation tools. Digital Imaging and Communications in Medicine (DICOM) standards exist for many types of digital medical images, but there are no DICOM standards for camera-acquired dermatologic images to date. To identify and describe existing or proposed technology and technique standards for camera-acquired dermatologic images in the scientific literature. Systematic searches of the PubMed, EMBASE, and Cochrane databases were performed in January 2013 using photography and digital imaging, standardization, and medical specialty and medical illustration search terms and augmented by a gray literature search of 14 websites using Google. Two reviewers independently screened titles of 7371 unique publications, followed by 3 sequential full-text reviews, leading to the selection of 49 publications with the most recent (1985-2013) or detailed description of technology or technique standards related to the acquisition or use of images of skin disease (or related conditions). No universally accepted existing technology or technique standards for camera-based digital images in dermatology were identified. Recommendations are summarized for technology imaging standards, including spatial resolution, color resolution, reproduction (magnification) ratios, postacquisition image processing, color calibration, compression, output, archiving and storage, and security during storage and transmission. Recommendations are also summarized for technique imaging standards, including environmental conditions (lighting, background, and camera position), patient pose and standard view sets, and patient consent, privacy, and confidentiality. Proposed standards for specific-use cases in total body photography, teledermatology, and dermoscopy are described. The literature is replete with descriptions of obtaining photographs of skin disease, but universal imaging standards have not been developed

  12. Fault tree analysis: concepts and techniques

    International Nuclear Information System (INIS)

    Fussell, J.B.

    1976-01-01

    Concepts and techniques of fault tree analysis have been developed over the past decade and now predictions from this type analysis are important considerations in the design of many systems such as aircraft, ships and their electronic systems, missiles, and nuclear reactor systems. Routine, hardware-oriented fault tree construction can be automated; however, considerable effort is needed in this area to get the methodology into production status. When this status is achieved, the entire analysis of hardware systems will be automated except for the system definition step. Automated analysis is not undesirable; to the contrary, when verified on adequately complex systems, automated analysis could well become a routine analysis. It could also provide an excellent start for a more in-depth fault tree analysis that includes environmental effects, common mode failure, and human errors. The automated analysis is extremely fast and frees the analyst from the routine hardware-oriented fault tree construction, as well as eliminates logic errors and errors of oversight in this part of the analysis. Automated analysis then affords the analyst a powerful tool to allow his prime efforts to be devoted to unearthing more subtle aspects of the modes of failure of the system
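
    A minimal sketch of the quantitative step of such an analysis: evaluating a top-event probability from basic-event probabilities through AND/OR gates, assuming independent basic events (the tree structure and probabilities are illustrative, not taken from the paper):

        # Sketch: top-event probability of a small fault tree with independent basic events.
        # AND gate: product of inputs; OR gate: 1 - product of (1 - input).

        def evaluate(node, basic_probs):
            if isinstance(node, str):                     # leaf: a basic event
                return basic_probs[node]
            gate, children = node
            values = [evaluate(child, basic_probs) for child in children]
            if gate == "AND":
                p = 1.0
                for v in values:
                    p *= v
                return p
            if gate == "OR":
                q = 1.0
                for v in values:
                    q *= (1.0 - v)
                return 1.0 - q
            raise ValueError(f"unknown gate: {gate}")

        # Illustrative tree: TOP = (pump fails OR power lost) AND backup fails.
        tree = ("AND", [("OR", ["pump_fails", "power_lost"]), "backup_fails"])
        probs = {"pump_fails": 1e-3, "power_lost": 5e-4, "backup_fails": 2e-2}
        print(f"top-event probability ~ {evaluate(tree, probs):.2e}")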

  13. Applications of neutron activation analysis technique

    International Nuclear Information System (INIS)

    Jonah, S. A.

    2000-07-01

    The technique was developed as far back as 1936 by G. Hevesy and H. Levy for the analysis of Dy using an isotopic source. Approximately 40 elements can be analyzed by the instrumental neutron activation analysis (INAA) technique with neutrons from a nuclear reactor. By applying radiochemical separation, the number of elements that can be analysed may be increased to almost 70. Compared with other analytical methods used in environmental and industrial research, NAA has some unique features. These are multi-element capability, rapidity, reproducibility of results, complementarity to other methods, freedom from analytical blank, and independence of the chemical state of the elements. There are several types of neutron sources, namely nuclear reactors, accelerator-based and radioisotope-based sources, but nuclear reactors with high fluxes of neutrons from the fission of 235U give the most intense irradiation, and hence the highest available sensitivities for NAA. In this paper, the applications of NAA of socio-economic importance are discussed. The benefits of using NAA and related nuclear techniques for on-line applications in industrial process control are highlighted. A brief description of the NAA set-ups at CERT is given. Finally, NAA is compared with other leading analytical techniques.
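
    The quantitative basis of the method is the standard activation equation, A = N * sigma * phi * (1 - exp(-lambda * t_irr)); a short sketch with illustrative nuclear data and irradiation conditions (not values from the paper) is:

        # Sketch: induced activity in neutron activation analysis.
        import math

        AVOGADRO = 6.022e23
        mass_g = 1e-3               # 1 mg of the target element (hypothetical sample)
        atomic_mass = 55.85         # g/mol (iron, chosen only for illustration)
        abundance = 0.00282         # isotopic abundance of Fe-58
        sigma_cm2 = 1.15e-24        # thermal capture cross-section, ~1.15 barn
        phi = 1e13                  # thermal neutron flux, n cm^-2 s^-1
        half_life_s = 44.5 * 24 * 3600   # Fe-59 half-life, ~44.5 d
        t_irr_s = 8 * 3600          # 8 h irradiation

        N = mass_g / atomic_mass * AVOGADRO * abundance
        lam = math.log(2) / half_life_s
        activity_bq = N * sigma_cm2 * phi * (1.0 - math.exp(-lam * t_irr_s))
        print(f"induced activity after irradiation ~ {activity_bq:.2e} Bq")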

  14. Chromatographic Techniques for Rare Earth Elements Analysis

    Science.gov (United States)

    Chen, Beibei; He, Man; Zhang, Huashan; Jiang, Zucheng; Hu, Bin

    2017-04-01

    The present capability of rare earth element (REE) analysis has been achieved by the development of two instrumental techniques. The efficiency of spectroscopic methods was extraordinarily improved for the detection and determination of REE traces in various materials. On the other hand, the determination of REEs very often depends on the preconcentration and separation of REEs, and chromatographic techniques are very powerful tools for the separation of REEs. By coupling with sensitive detectors, many ambitious analytical tasks can be fulfilled. Liquid chromatography is the most widely used technique. Different combinations of stationary phases and mobile phases could be used in ion exchange chromatography, ion chromatography, ion-pair reverse-phase chromatography and some other techniques. The application of gas chromatography is limited because only volatile compounds of REEs can be separated. Thin-layer and paper chromatography are techniques that cannot be directly coupled with suitable detectors, which limit their applications. For special demands, separations can be performed by capillary electrophoresis, which has very high separation efficiency.

  15. ANALYSIS OF COMPUTER AIDED PROCESS PLANNING TECHNIQUES

    Directory of Open Access Journals (Sweden)

    Salim A. Saleh

    2013-05-01

    Computer Aided Process Planning (CAPP) has been recognized as playing a key role in Computer Integrated Manufacturing (CIM). It was used as a bridge to link CAD with CAM systems, in order to give the possibility of full integration in agreement with computer engineering to introduce CIM. The benefits of CAPP in the real industrial environment are still to be achieved. Due to different manufacturing applications, many different CAPP systems have been developed. The development of CAPP techniques calls for a summarized classification and a descriptive analysis. This paper presents the most important and famous techniques for the available CAPP systems, which are based on the variant, generative or semi-generative methods, and a descriptive analysis of their application possibilities.

  16. Artificial Intelligence techniques for big data analysis

    OpenAIRE

    Aditya Khatri

    2017-01-01

    During my stay in Salamanca (Spain), I was fortunate enough to participate in the BISITE Research Group of the University of Salamanca. The University of Salamanca is the oldest university in Spain and in 2018 it celebrates its 8th centenary. As a computer science researcher, I participated in one of the many international projects that the research group has active, especially in big data analysis using Artificial Intelligence (AI) techniques. AI is one of BISITE's main lines of rese...

  17. Accelerometer Data Analysis and Presentation Techniques

    Science.gov (United States)

    Rogers, Melissa J. B.; Hrovat, Kenneth; McPherson, Kevin; Moskowitz, Milton E.; Reckart, Timothy

    1997-01-01

    The NASA Lewis Research Center's Principal Investigator Microgravity Services project analyzes Orbital Acceleration Research Experiment and Space Acceleration Measurement System data for principal investigators of microgravity experiments. Principal investigators need a thorough understanding of data analysis techniques so that they can request appropriate analyses to best interpret accelerometer data. Accelerometer data sampling and filtering is introduced along with the related topics of resolution and aliasing. Specific information about the Orbital Acceleration Research Experiment and Space Acceleration Measurement System data sampling and filtering is given. Time domain data analysis techniques are discussed and example environment interpretations are made using plots of acceleration versus time, interval average acceleration versus time, interval root-mean-square acceleration versus time, trimmean acceleration versus time, quasi-steady three dimensional histograms, and prediction of quasi-steady levels at different locations. An introduction to Fourier transform theory and windowing is provided along with specific analysis techniques and data interpretations. The frequency domain analyses discussed are power spectral density versus frequency, cumulative root-mean-square acceleration versus frequency, root-mean-square acceleration versus frequency, one-third octave band root-mean-square acceleration versus frequency, and power spectral density versus frequency versus time (spectrogram). Instructions for accessing NASA Lewis Research Center accelerometer data and related information using the internet are provided.
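
    As a brief sketch of two of the quantities listed above, the snippet below computes an interval root-mean-square acceleration versus time and a power spectral density with Welch's method on a synthetic record (the sampling rate, interval length and signal content are assumptions for the example, not flight data):

        # Sketch: interval RMS acceleration and power spectral density of an accelerometer record.
        import numpy as np
        from scipy.signal import welch

        fs = 250.0                                        # samples per second (assumed)
        t = np.arange(0, 60.0, 1.0 / fs)                  # one minute of data
        rng = np.random.default_rng(1)
        accel = 1e-3 * np.sin(2 * np.pi * 17.0 * t) + 2e-4 * rng.standard_normal(t.size)

        # Interval RMS over consecutive, non-overlapping 5 s intervals.
        samples_per_interval = int(5.0 * fs)
        n_intervals = accel.size // samples_per_interval
        chunks = accel[: n_intervals * samples_per_interval].reshape(n_intervals, -1)
        interval_rms = np.sqrt(np.mean(chunks ** 2, axis=1))
        print("interval RMS (g):", np.round(interval_rms, 5))

        # Power spectral density via Welch's method.
        freqs, psd = welch(accel, fs=fs, nperseg=1024)
        print(f"dominant frequency ~ {freqs[np.argmax(psd)]:.1f} Hz")   # expect ~17 Hz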

  18. Comparison of analysis techniques for electromyographic data.

    Science.gov (United States)

    Johnson, J C

    1978-01-01

    Electromyography has been effectively employed to estimate the stress encountered by muscles in performing a variety of functions in the static environment. Such analysis provides the basis for modification of a man-machine system in order to optimize the performances of individual tasks by reducing muscle stress. Myriad analysis methods have been proposed and employed to convert raw electromyographic data into numerical indices of stress and, more specifically, muscle work. However, the type of analysis technique applied to the data can significantly affect the outcome of the experiment. In this study, four methods of analysis are employed to simultaneously process electromyographic data from the flexor muscles of the forearm. The methods of analysis include: 1) integrated EMG (three separate time constants), 2) root mean square voltage, 3) peak height discrimination (three level), and 4) turns counting (two methods). Mechanical stress input as applied to the arm of the subjects includes static load and vibration. The results of the study indicate the comparative sensitivity of each of the techniques to changes in EMG resulting from changes in static and dynamic load on the muscle.
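
    Three of the compared indices can be illustrated on a single synthetic signal; the sketch below computes an integrated EMG, an RMS voltage and a simple turns count (the signal, sampling rate and turn threshold are invented, and the turns definition is simplified relative to clinical practice):

        # Sketch: integrated EMG (IEMG), RMS voltage and a simple turns count.
        import numpy as np

        fs = 1000.0                                       # Hz (assumed)
        t = np.arange(0, 2.0, 1.0 / fs)
        rng = np.random.default_rng(0)
        emg = (0.2 * np.sin(2 * np.pi * 60 * t) * (1 + 0.5 * np.sin(2 * np.pi * 1.0 * t))
               + 0.02 * rng.standard_normal(t.size))      # mV, synthetic interference pattern

        iemg = np.trapz(np.abs(emg), dx=1.0 / fs)         # integrated rectified EMG, mV*s
        rms = np.sqrt(np.mean(emg ** 2))                  # RMS voltage, mV

        # Turns: slope reversals whose amplitude step exceeds a threshold.
        threshold_mv = 0.05
        d = np.diff(emg)
        turns = int(np.sum((np.sign(d[1:]) != np.sign(d[:-1])) & (np.abs(d[1:]) > threshold_mv)))

        print(f"IEMG = {iemg:.3f} mV*s, RMS = {rms:.3f} mV, turns = {turns}")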

  19. The development of human behavior analysis techniques

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jung Woon; Lee, Yong Hee; Park, Geun Ok; Cheon, Se Woo; Suh, Sang Moon; Oh, In Suk; Lee, Hyun Chul; Park, Jae Chang

    1997-07-01

    In this project, a study of man-machine interaction in Korean nuclear power plants, we developed SACOM (Simulation Analyzer with a Cognitive Operator Model), a tool for the assessment of task performance in control rooms using software simulation, and also developed human error analysis and application techniques. SACOM was developed to assess the operator's physical workload, workload in information navigation at VDU workstations, and cognitive workload in procedural tasks. We developed a trip analysis system, including a procedure based on man-machine interaction analysis and a classification system. We analyzed a total of 277 trips that occurred from 1978 to 1994 to produce trip summary information and, for 79 cases induced by human errors, time-lined man-machine interactions. INSTEC, a database system for our analysis results, was developed. MARSTEC, a multimedia authoring and representation system for trip information, was also developed, and techniques for human error detection in human factors experiments were established. (author). 121 refs., 38 tabs., 52 figs.

  20. Quantitative standard-less XRF analysis

    International Nuclear Information System (INIS)

    Ulitzka, S.

    2002-01-01

    Full text: For most analytical tasks in the mining and associated industries matrix-matched calibrations are used for the monitoring of ore grades and process control. In general, such calibrations are product specific (iron ore, bauxite, alumina, mineral sands, cement etc.) and apply to a relatively narrow concentration range but give the best precision and accuracy for those materials. A wide range of CRMs is available and for less common materials synthetic standards can be made up from 'pure' chemicals. At times, analysis of materials with varying matrices (powders, scales, fly ash, alloys, polymers, liquors, etc.) and diverse physical shapes (non-flat, metal drillings, thin layers on substrates etc.) is required that could also contain elements which are not part of a specific calibration. A qualitative analysis can provide information about the presence of certain elements and the relative intensities of element peaks in a scan can give a rough idea about their concentrations. More often however, quantitative values are required. The paper will look into the basics of quantitative standardless analysis and show results for some well-defined CRMs. Copyright (2002) Australian X-ray Analytical Association Inc

  1. Cleanup standards and pathways analysis methods

    International Nuclear Information System (INIS)

    Devgun, J.S.

    1993-01-01

    Remediation of a radioactively contaminated site requires that certain regulatory criteria be met before the site can be released for unrestricted future use. Since the ultimate objective of remediation is to protect the public health and safety, residual radioactivity levels remaining at a site after cleanup must be below certain preset limits or meet acceptable dose or risk criteria. Release of a decontaminated site requires proof that the radiological data obtained from the site meet the regulatory criteria for such a release. Typically, release criteria consist of a composite of acceptance limits that depend on the radionuclides, the media in which they are present, and federal and local regulations. In recent years, the US Department of Energy (DOE) has developed a pathways analysis model to determine site-specific soil activity concentration guidelines for radionuclides that do not have established generic acceptance limits. The DOE pathways analysis computer code (developed by Argonne National Laboratory for the DOE) is called RESRAD (Gilbert et al. 1989). Similar efforts have been initiated by the US Nuclear Regulatory Commission (NRC) to develop and use dose-related criteria based on generic pathways analyses rather than simplistic numerical limits on residual radioactivity. The focus of this paper is radionuclide-contaminated soil. Cleanup standards are reviewed, pathways analysis methods are described, and an example is presented in which RESRAD was used to derive cleanup guidelines.

  2. A new analysis technique for microsamples

    International Nuclear Information System (INIS)

    Boyer, R.; Journoux, J.P.; Duval, C.

    1989-01-01

    For many decades, isotopic analysis of uranium or plutonium has been performed by mass spectrometry. The most recent analytical techniques, using the counting method or a plasma torch combined with a mass spectrometer (ICP-MS), have not yet reached a greater degree of precision than the older methods in this field. The two means of ionization for isotopic analysis - electron bombardment of atoms or molecules (gas ion source) and thermal ionization (thermoionic source) - are compared, revealing some inconsistency between the quantity of sample necessary for analysis and the luminosity. In fact, the quantity of sample necessary for the gas source mass spectrometer is 10 to 20 times greater than that for the thermoionization spectrometer, while the sample consumption is 10^5 to 10^6 times greater. This proves that almost the entire sample is not necessary for the measurement; it is only required because of the introduction system of the gas spectrometer. The new analysis technique referred to as 'microfluorination' corrects this anomaly and exploits the advantages of the electron bombardment method of ionization.

  3. Forensic Analysis using Geological and Geochemical Techniques

    Science.gov (United States)

    Hoogewerff, J.

    2009-04-01

    Due to the globalisation of legal (and illegal) trade there is an increasing demand for techniques which can verify the geographical origin and transfer routes of many legal and illegal commodities and products. Although geological techniques have been used in forensic investigations since the emergence of forensics as a science in the late eighteen hundreds, the last decade has seen a marked increase in geo-scientists initiating concept studies using the latest analytical techniques, including studying natural abundance isotope variations, micro analysis with laser ablation ICPMS and geochemical mapping. Most of the concept studies have shown a good potential but uptake by the law enforcement and legal community has been limited due to concerns about the admissibility of the new methods. As an introduction to the UGU2009 session "Forensic Provenancing using Geological and Geochemical Techniques" I will give an overview of the state of the art of forensic geology and the issues that concern the admissibility of geological forensic evidence. I will use examples from the NITECRIME and FIRMS networks, the EU TRACE project and other projects and literature to illustrate the important issues at hand.

  4. Flash Infrared Thermography Contrast Data Analysis Technique

    Science.gov (United States)

    Koshti, Ajay

    2014-01-01

    This paper provides information on an IR Contrast technique that involves extracting normalized contrast versus time evolutions from the flash thermography inspection infrared video data. The analysis calculates thermal measurement features from the contrast evolution. In addition, simulation of the contrast evolution is achieved through calibration on measured contrast evolutions from many flat-bottom holes in the subject material. The measurement features and the contrast simulation are used to evaluate flash thermography data in order to characterize delamination-like anomalies. The thermal measurement features relate to the anomaly characteristics. The contrast evolution simulation is matched to the measured contrast evolution over an anomaly to provide an assessment of the anomaly depth and width which correspond to the depth and diameter of the equivalent flat-bottom hole (EFBH) similar to that used as input to the simulation. A similar analysis, in terms of diameter and depth of an equivalent uniform gap (EUG) providing a best match with the measured contrast evolution, is also provided. An edge detection technique called the half-max is used to measure width and length of the anomaly. Results of the half-max width and the EFBH/EUG diameter are compared to evaluate the anomaly. The information provided here is geared towards explaining the IR Contrast technique. Results from a limited amount of validation data on reinforced carbon-carbon (RCC) hardware are included in this paper.
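
    A minimal sketch of the normalized-contrast step: for each frame, the anomaly-region temperature rise is compared with a neighbouring sound (reference) region. The frames below are synthetic, and the particular normalization used here is one common choice rather than necessarily the exact form used in the paper:

        # Sketch: normalized contrast evolution from flash-thermography frames (synthetic data).
        import numpy as np

        n_frames, h, w = 200, 64, 64
        t = np.arange(n_frames) * 0.02                    # 50 Hz frame rate (assumed)

        # Synthetic cooling: the defect region retains extra heat around t ~ 1 s.
        T_pre = 20.0
        sound = T_pre + 5.0 * np.exp(-t / 0.8)
        defect = sound + 0.6 * np.exp(-((t - 1.0) ** 2) / 0.08)

        frames = np.tile(sound[:, None, None], (1, h, w))
        frames[:, 28:36, 28:36] = defect[:, None, None]   # square "delamination"

        anomaly = frames[:, 28:36, 28:36].mean(axis=(1, 2))
        reference = frames[:, 5:13, 5:13].mean(axis=(1, 2))
        contrast = (anomaly - reference) / (reference - T_pre + 1e-9)

        k = int(np.argmax(contrast))
        print(f"peak normalized contrast {contrast[k]:.3f} at t = {t[k]:.2f} s")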

  5. Biomechanical Analysis of Contemporary Throwing Technique Theory

    Directory of Open Access Journals (Sweden)

    Chen Jian

    2015-01-01

    Based on the movement process of throwing, and in order to further improve our country's throwing technique, this paper first illustrates the main factors that affect the shot distance by combining the equations of motion with geometrical analysis. It then gives the equation for the force that throwing athletes have to bear during the throwing movement, and derives the speed relationship between each joint (arthrosis) during throwing and batting, based on a kinetic analysis of the throwing athletes' arms. The paper obtains the momentum relationship of each of the athletes' joints by means of rotational inertia analysis, and then establishes a constrained particle dynamics equation from the Lagrange equation. The obtained result shows that the momentum of the throw depends on the momentum of the athletes' wrist joints while batting.

  6. Multielement analysis of biological standards by neutron activation analysis

    International Nuclear Information System (INIS)

    Nadkarni, R.A.

    1977-01-01

    Up to 28 elements were determined in two IAEA standards: Animal Muscle H4 and Fish Soluble A 6/74, and three NBS standards: Spinach: SRM-1570, Tomato Leaves: SRM-1573 and Pine Needles: SRM-1575 by instrumental neutron-activation analysis. Seven noble metals were determined in two NBS standards: Coal: SRM-1632 and Coal Fly Ash: SRM-1633 by radiochemical procedure while 11 rare earth elements were determined in NBS standard Orchard Leaves: SRM-1571 by instrumental neutron-activation analysis. The results are in good agreement with the certified and/or literature data where available. The irradiations were performed at the Cornell TRIGA Mark II nuclear reactor at a thermal neutron flux of 1-3x10^12 n cm^-2 sec^-1. The short-lived species were determined after a 2-minute irradiation in the pneumatic rabbit tube, and the longer-lived species after an 8-hour irradiation in the central thimble facility. The standards and samples were counted on a coaxial 56-cm^3 Ge(Li) detector. The system resolution was 1.96 keV (FWHM) with a peak to Compton ratio of 37:1 and counting efficiency of 13%, all compared to the 1.332 MeV photopeak of Co-60. (T.I.)

  7. Manual of Standard Operating Procedures for Veterinary Drug Residue Analysis

    International Nuclear Information System (INIS)

    2016-01-01

    Laboratories are crucial to national veterinary drug residue monitoring programmes. However, one of the main challenges laboratories encounter is obtaining access to relevant methods of analysis. Thus, in addition to training, providing technical advice and transferring technology, the Joint FAO/IAEA Division of Nuclear Techniques in Food and Agriculture has resolved to develop clear and practical manuals to support Member State laboratories. The Coordinated Research Project (CRP) on Development of Radiometric and Allied Analytical Methods to Strengthen Residue Control Programs for Antibiotic and Anthelmintic Veterinary Drug Residues has developed a number of analytical methods as standard operating procedures (SOPs), which are now compiled here. This publication contains SOPs on chromatographic and spectrometric techniques, as well as radioimmunoassay and associated screening techniques, for various anthelmintic and antimicrobial veterinary drug residue analysis. Some analytical method validation protocols are also included. The publication is primarily aimed at food and environmental safety laboratories involved in testing veterinary drug residues, including under organized national residue monitoring programmes. It is expected to enhance laboratory capacity building and competence through the use of radiometric and complementary tools and techniques. The publication is also relevant for applied research on residues of veterinary drugs in food and environmental samples

  8. Clustering Analysis within Text Classification Techniques

    Directory of Open Access Journals (Sweden)

    Madalina ZURINI

    2011-01-01

    The paper presents a personal approach to the main applications of classification in the knowledge-based society, by means of methods and techniques widely used in the literature. Text classification is discussed in chapter two, where the main techniques used are described along with an integrated taxonomy. The transition is made through the concept of spatial representation. Building on elementary geometry and artificial intelligence analysis, spatial representation models are presented. Using a parallel approach, the spatial dimension is introduced into the classification process. The main clustering methods are described in an aggregated taxonomy. As an example, spam and ham words are clustered and spatially represented, and the concepts of spam, ham, common and linkage words are presented and explained in the xOy space representation.
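
    A hedged sketch of the spam/ham example: each word is placed in a two-dimensional (xOy) space by two simple frequency coordinates and the points are grouped with k-means; the word list and frequencies are invented for the illustration and are not taken from the paper:

        # Sketch: clustering words in a 2-D (xOy) spatial representation.
        # x = relative frequency in spam messages, y = relative frequency in ham messages.
        import numpy as np
        from sklearn.cluster import KMeans

        words = ["free", "winner", "credit", "meeting", "report", "lunch", "offer", "agenda"]
        spam_freq = np.array([0.90, 0.85, 0.70, 0.05, 0.10, 0.02, 0.80, 0.04])
        ham_freq = np.array([0.10, 0.05, 0.15, 0.80, 0.75, 0.70, 0.12, 0.85])

        X = np.column_stack([spam_freq, ham_freq])        # points in the xOy plane
        labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)

        for cluster in range(2):
            members = [w for w, lab in zip(words, labels) if lab == cluster]
            print(f"cluster {cluster}: {members}")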

  9. Reliability analysis techniques for the design engineer

    International Nuclear Information System (INIS)

    Corran, E.R.; Witt, H.H.

    1982-01-01

    This paper describes a fault tree analysis package that eliminates most of the housekeeping tasks involved in proceeding from the initial construction of a fault tree to the final stage of presenting a reliability analysis in a safety report. It is suitable for designers with relatively little training in reliability analysis and computer operation. Users can rapidly investigate the reliability implications of various options at the design stage and evolve a system which meets specified reliability objectives. Later independent review is thus unlikely to reveal major shortcomings necessitating modification and project delays. The package operates interactively, allowing the user to concentrate on the creative task of developing the system fault tree, which may be modified and displayed graphically. For preliminary analysis, system data can be derived automatically from a generic data bank. As the analysis proceeds, improved estimates of critical failure rates and test and maintenance schedules can be inserted. The technique is applied to the reliability analysis of the recently upgraded HIFAR Containment Isolation System. (author)

  10. Are neck nodal volumes drawn on CT slices covered by standard three-field technique?

    Science.gov (United States)

    Sanguineti, Giuseppe; Culp, Laura R; Endres, Eugene J; Bayouth, John E

    2004-07-01

    Several definitions have been proposed in the past few years on how to contour the various neck nodal levels on CT slices. However, whether the resulting nodal volumes would have been covered by standard techniques is unknown. The purpose of this study was to clarify this issue. Eight patients (N0-N1) with head-and-neck cancer from various primary sites referred to us for definitive radiotherapy were included in this study. Two observers contoured the level Ib-V neck nodal volumes on planning CT according to seven reported definitions. Each observer also drew blocks on digitally reconstructed radiographs for the initial (on-cord) phase of a standard three-field technique (parallel opposed lateral fields and AP supraclavicular field) for three different clinical settings: "medium" larynx (to cover upper, mid, and low jugular nodes), "big" larynx (same as for medium, plus posterior cervical nodes), and "tonsil" (same as for big plus retropharyngeal nodes). Fields blocks were concentrically reduced 5 mm in all directions as a surrogate for the clinical target volume to planning target volume expansion. A plan was created for each of the clinical settings, delivering 2 Gy to the International Commission on Radiation Units and Measurements reference point. The coverage of the nodal levels according to the various definitions was investigated throughout the analysis of the volume receiving 50%, 80%, and 95% of the prescribed dose (V(50), V(80), and V(95), respectively) and dose covering at least 95% of the volume (D(95)) values extracted from their cumulative dose-volume histograms in the three clinical settings. The V(50) coverage of levels III and IV was adequate for all definitions and trials. For level V, about 3-5% of the volume was outside the 50% isodose of those trials that targeted the posterior cervical chain. Coverage of level Ib was highly dependent on the definition, with up to 21% of the volume outside the standard tonsillar fields. For level II, although

  11. Interferogram analysis using the Abel inversion technique

    International Nuclear Information System (INIS)

    Yusof Munajat; Mohamad Kadim Suaidi

    2000-01-01

    A high-speed, high-resolution optical detection system was used to capture the image of acoustic wave propagation. The frozen image, in the form of an interferogram, was analysed to calculate the transient pressure profile of the acoustic waves. The interferogram analysis was based on the fringe shift and the application of the Abel inversion technique. A simpler approach was taken by using the MathCAD program as the programming tool, which is nevertheless powerful enough to perform such calculation, plotting and file transfer. (Author)

  12. Interferogram analysis using Fourier transform techniques

    Science.gov (United States)

    Roddier, Claude; Roddier, Francois

    1987-01-01

    A method of interferogram analysis is described in which Fourier transform techniques are used to map the complex fringe visibility in several types of interferograms. Algorithms are developed for estimation of both the amplitude and the phase of the fringes (yielding the modulus and the phase of the holographically recorded object Fourier transform). The algorithms were applied to the reduction of interferometric seeing measurements (i.e., the estimation of the fringe amplitude only), and the reduction of interferometric tests (i.e., estimation of the fringe phase only). The method was used to analyze scatter-plate interferograms obtained at NOAO.
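    The sketch below illustrates the general carrier-frequency approach to fringe analysis described above: the fringe spectrum is computed with an FFT, one sideband around the carrier is isolated, and the inverse transform yields a complex fringe signal whose modulus and phase are the quantities of interest. The synthetic fringe pattern, carrier frequency, and filter band are assumptions made for illustration, not the authors' data or their exact algorithm.

```python
import numpy as np

# Synthetic 1-D fringe profile: amplitude * cos(carrier ramp + phase perturbation)
n = 1024
x = np.arange(n)
carrier = 0.05                                           # fringes per pixel (assumed)
phase_true = 1.5 * np.exp(-((x - 512) / 120.0) ** 2)     # slowly varying phase
amp_true = 1.0 + 0.2 * np.cos(2 * np.pi * x / n)         # slowly varying visibility
fringes = amp_true * np.cos(2 * np.pi * carrier * x + phase_true)

# Isolate the positive-frequency sideband around the carrier.
spectrum = np.fft.fft(fringes)
freqs = np.fft.fftfreq(n)
sideband = np.where((freqs > carrier / 2) & (freqs < 3 * carrier / 2), spectrum, 0)

# Inverse transform gives the complex fringe signal; the factor 2 restores amplitude.
analytic = 2 * np.fft.ifft(sideband)
fringe_amplitude = np.abs(analytic)                      # ~ amp_true (fringe visibility)
fringe_phase = np.unwrap(np.angle(analytic)) - 2 * np.pi * carrier * x
fringe_phase -= fringe_phase[0]                          # remove arbitrary offset
```

    Measuring only the recovered amplitude corresponds to the seeing-measurement case mentioned above, and measuring only the phase corresponds to the interferometric-test case.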

  13. UPLC-ICP-MS - a fast technique for speciation analysis

    DEFF Research Database (Denmark)

    Bendahl, L.; Sturup, S.; Gammelgaard, Bente

    2005-01-01

    Ultra performance liquid chromatography is a new development of the HPLC separation technique that allows separations on column materials at high pressures up to 10^8 Pa using particle diameters of 1.7 μm. This increases the efficiency, the resolution and the speed of the separation. Four...... aqueous selenium standards were separated within 1.2 min on a 1.00 mm i.d. x 50 mm reversed-phase column in an ion-pair chromatographic system using a flow rate of 200 μL min^-1. Hence, analysis times could be reduced to 1/10 compared with ordinary HPLC for aqueous standards. The precision and detection...... the use of short columns. Hence, analysis times could be halved without loss of separation efficiency in this biological sample...

  14. On discriminant analysis techniques and correlation structures in high dimensions

    DEFF Research Database (Denmark)

    Clemmensen, Line Katrine Harder

    This paper compares several recently proposed techniques for performing discriminant analysis in high dimensions, and illustrates that the various sparse methods differ in prediction ability depending on their underlying assumptions about the correlation structures in the data. The techniques generally focus on two things: obtaining sparsity (variable selection) and regularizing the estimate of the within-class covariance matrix. For high-dimensional data, this gives rise to increased interpretability and generalization ability over standard linear discriminant analysis. Here, we group the methods in two: those which assume independence between the variables and thus use a diagonal estimate of the within-class covariance matrix, and those which assume dependence between the variables and thus use an estimate of the within-class covariance matrix which also estimates the correlations between the variables...
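    A small sketch of the two groups of methods contrasted above, applied to simulated correlated data: one classifier assumes a diagonal within-class covariance (independence between variables), the other estimates a regularized full covariance. The dimensions, data-generating model, and shrinkage choice are illustrative assumptions, not the paper's experiments.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.naive_bayes import GaussianNB

rng = np.random.default_rng(3)
p = 200                                            # more variables than training samples
cov = 0.6 * np.ones((p, p)) + 0.4 * np.eye(p)      # strongly correlated features
mean1 = np.r_[0.8 * np.ones(5), np.zeros(p - 5)]   # classes differ in only 5 variables

def draw(n):
    """Draw n samples per class from two correlated Gaussian populations."""
    x0 = rng.multivariate_normal(np.zeros(p), cov, n)
    x1 = rng.multivariate_normal(mean1, cov, n)
    return np.vstack([x0, x1]), np.r_[np.zeros(n), np.ones(n)]

X_train, y_train = draw(40)
X_test, y_test = draw(200)

models = {
    "diagonal covariance (independence)": GaussianNB(),
    "regularized full covariance": LinearDiscriminantAnalysis(solver="lsqr", shrinkage="auto"),
}
for name, model in models.items():
    acc = model.fit(X_train, y_train).score(X_test, y_test)
    print(f"{name}: test accuracy = {acc:.3f}")
```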

  15. Standardizing Activation Analysis: New Software for Photon Activation Analysis

    International Nuclear Information System (INIS)

    Sun, Z. J.; Wells, D.; Green, J.; Segebade, C.

    2011-01-01

    Photon Activation Analysis (PAA) of environmental, archaeological and industrial samples requires extensive data analysis that is susceptible to error. For the purpose of saving time and manpower and minimizing error, a computer program was designed, built and implemented using SQL, Access 2007 and asp.net technology to automate this process. Based on the peak information of the spectrum and assisted by its PAA library, the program automatically identifies elements in the samples and calculates their concentrations and respective uncertainties. The software can also be operated in browser/server mode, which makes it usable anywhere the internet is accessible. By switching the underlying nuclide library and the related formulas, the new software can easily be extended to neutron activation analysis (NAA), charged particle activation analysis (CPAA) or proton-induced X-ray emission (PIXE). Implementation of this would standardize the analysis of nuclear activation data. Results from this software were compared to standard PAA analysis with excellent agreement. With minimum input from the user, the software has proven to be fast, user-friendly and reliable.
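    A minimal sketch of the peak-to-nuclide matching step described above: measured photopeak energies are compared against a library within an energy tolerance. The library entries and the tolerance are hypothetical placeholders, not the program's actual PAA library or matching rules.

```python
# Hypothetical photopeak library (keV) used only to illustrate the matching step.
PAA_LIBRARY_KEV = {
    "As-74": 595.8,
    "Co-58": 810.8,
    "Na-24": 1368.6,
}

def identify_peaks(peak_energies_kev, tolerance_kev=1.0):
    """Return candidate nuclides for each measured peak energy."""
    matches = {}
    for e in peak_energies_kev:
        hits = [name for name, ref in PAA_LIBRARY_KEV.items()
                if abs(e - ref) <= tolerance_kev]
        matches[e] = hits
    return matches

# Example: two peaks match library entries, one remains unidentified.
print(identify_peaks([595.5, 811.2, 120.0]))
```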

  16. LOGISTIC REGRESSION ANALYSIS WITH STANDARDIZED MARKERS.

    Science.gov (United States)

    Huang, Ying; Pepe, Margaret S; Feng, Ziding

    2013-09-01

    Two different approaches to analysis of data from diagnostic biomarker studies are commonly employed. Logistic regression is used to fit models for probability of disease given marker values, while ROC curves and risk distributions are used to evaluate classification performance. In this paper we present a method that simultaneously accomplishes both tasks. The key step is to standardize markers relative to the non-diseased population before including them in the logistic regression model. Among the advantages of this method are: (i) ensuring that results from regression and performance assessments are consistent with each other; (ii) allowing covariate adjustment and covariate effects on ROC curves to be handled in a familiar way; and (iii) providing a mechanism to incorporate important assumptions about structure in the ROC curve into the fitted risk model. We develop the method in detail for the problem of combining biomarker datasets derived from multiple studies, populations or biomarker measurement platforms, when ROC curves are similar across data sources. The methods are applicable to both cohort and case-control sampling designs. The dataset motivating this application concerns Prostate Cancer Antigen 3 (PCA3) for diagnosis of prostate cancer in patients with or without previous negative biopsy, where the ROC curves for PCA3 are found to be the same in the two populations. Constrained maximum likelihood and empirical likelihood estimators are derived. The estimators are compared in simulation studies and the methods are illustrated with the PCA3 dataset.
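    The sketch below illustrates the key step described above, standardizing a marker to its empirical percentile in the non-diseased population before fitting the logistic model. The simulated data and the use of scikit-learn are illustrative assumptions; this is not the authors' estimator or the PCA3 analysis.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
controls = rng.normal(0.0, 1.0, 500)     # non-diseased marker values (simulated)
cases = rng.normal(1.0, 1.2, 300)        # diseased marker values (simulated)

def standardize_to_controls(values, reference):
    """Empirical percentile of each value within the control (non-diseased) distribution."""
    return np.searchsorted(np.sort(reference), values) / len(reference)

y = np.concatenate([np.zeros(len(controls)), np.ones(len(cases))])
x = np.concatenate([controls, cases])
x_std = standardize_to_controls(x, controls)       # standardized marker in [0, 1]

model = LogisticRegression().fit(x_std.reshape(-1, 1), y)
print("coefficient on standardized marker:", model.coef_[0][0])
```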

  17. ICT Standardization and use of ICT standards: a firm level analysis

    OpenAIRE

    Riillo, Cesare Fabio Antonio

    2014-01-01

    Standards perform some fundamental economic functions and their relevance for ICT is acknowledged by firms, researchers and policy-makers. This paper investigates the driving forces of formal ICT standards setting (i.e. standardization). Previous quantitative studies have neglected that ICT standards use and engagement in ICT standardization are related activities. Leveraging upon a unique module of the ICT usage survey 2013 for Luxembourg, the analysis explicitly takes into account the use o...

  18. Low energy analysis techniques for CUORE

    Energy Technology Data Exchange (ETDEWEB)

    Alduino, C.; Avignone, F.T.; Chott, N.; Creswick, R.J.; Rosenfeld, C.; Wilson, J. [University of South Carolina, Department of Physics and Astronomy, Columbia, SC (United States); Alfonso, K.; Huang, H.Z.; Sakai, M.; Schmidt, J. [University of California, Department of Physics and Astronomy, Los Angeles, CA (United States); Artusa, D.R.; Rusconi, C. [University of South Carolina, Department of Physics and Astronomy, Columbia, SC (United States); INFN-Laboratori Nazionali del Gran Sasso, L' Aquila (Italy); Azzolini, O.; Camacho, A.; Keppel, G.; Palmieri, V.; Pira, C. [INFN-Laboratori Nazionali di Legnaro, Padua (Italy); Bari, G.; Deninno, M.M. [INFN-Sezione di Bologna, Bologna (Italy); Beeman, J.W. [Lawrence Berkeley National Laboratory, Materials Science Division, Berkeley, CA (United States); Bellini, F.; Cosmelli, C.; Ferroni, F.; Piperno, G. [Sapienza Universita di Roma, Dipartimento di Fisica, Rome (Italy); INFN-Sezione di Roma, Rome (Italy); Benato, G.; Singh, V. [University of California, Department of Physics, Berkeley, CA (United States); Bersani, A.; Caminata, A. [INFN-Sezione di Genova, Genoa (Italy); Biassoni, M.; Brofferio, C.; Capelli, S.; Carniti, P.; Cassina, L.; Chiesa, D.; Clemenza, M.; Faverzani, M.; Fiorini, E.; Gironi, L.; Gotti, C.; Maino, M.; Nastasi, M.; Nucciotti, A.; Pavan, M.; Pozzi, S.; Sisti, M.; Terranova, F.; Zanotti, L. [Universita di Milano-Bicocca, Dipartimento di Fisica, Milan (Italy); INFN-Sezione di Milano Bicocca, Milan (Italy); Branca, A.; Taffarello, L. [INFN-Sezione di Padova, Padua (Italy); Bucci, C.; Cappelli, L.; D' Addabbo, A.; Gorla, P.; Pattavina, L.; Pirro, S. [INFN-Laboratori Nazionali del Gran Sasso, L' Aquila (Italy); Canonica, L. [INFN-Laboratori Nazionali del Gran Sasso, L' Aquila (Italy); Massachusetts Institute of Technology, Cambridge, MA (United States); Cao, X.G.; Fang, D.Q.; Ma, Y.G.; Wang, H.W.; Zhang, G.Q. [Shanghai Institute of Applied Physics, Chinese Academy of Sciences, Shanghai (China); Cardani, L.; Casali, N.; Dafinei, I.; Morganti, S.; Mosteiro, P.J.; Tomei, C.; Vignati, M. [INFN-Sezione di Roma, Rome (Italy); Copello, S.; Di Domizio, S.; Marini, L.; Pallavicini, M. [INFN-Sezione di Genova, Genoa (Italy); Universita di Genova, Dipartimento di Fisica, Genoa (Italy); Cremonesi, O.; Ferri, E.; Giachero, A.; Pessina, G.; Previtali, E. [INFN-Sezione di Milano Bicocca, Milan (Italy); Cushman, J.S.; Davis, C.J.; Heeger, K.M.; Lim, K.E.; Maruyama, R.H. [Yale University, Department of Physics, New Haven, CT (United States); D' Aguanno, D.; Pagliarone, C.E. [INFN-Laboratori Nazionali del Gran Sasso, L' Aquila (Italy); Universita degli Studi di Cassino e del Lazio Meridionale, Dipartimento di Ingegneria Civile e Meccanica, Cassino (Italy); Dell' Oro, S. [INFN-Laboratori Nazionali del Gran Sasso, L' Aquila (Italy); INFN-Gran Sasso Science Institute, L' Aquila (Italy); Di Vacri, M.L.; Santone, D. [INFN-Laboratori Nazionali del Gran Sasso, L' Aquila (Italy); Universita dell' Aquila, Dipartimento di Scienze Fisiche e Chimiche, L' Aquila (Italy); Drobizhev, A.; Hennings-Yeomans, R.; Kolomensky, Yu.G.; Wagaarachchi, S.L. [University of California, Department of Physics, Berkeley, CA (United States); Lawrence Berkeley National Laboratory, Nuclear Science Division, Berkeley, CA (United States); Franceschi, M.A.; Ligi, C.; Napolitano, T. [INFN-Laboratori Nazionali di Frascati, Rome (Italy); Freedman, S.J. 
[University of California, Department of Physics, Berkeley, CA (United States); Lawrence Berkeley National Laboratory, Nuclear Science Division, Berkeley, CA (United States); Fujikawa, B.K.; Mei, Y.; Schmidt, B.; Smith, A.R.; Welliver, B. [Lawrence Berkeley National Laboratory, Nuclear Science Division, Berkeley, CA (United States); Giuliani, A.; Novati, V. [Universite Paris-Saclay, CSNSM, Univ. Paris-Sud, CNRS/IN2P3, Orsay (France); Gladstone, L.; Leder, A.; Ouellet, J.L.; Winslow, L.A. [Massachusetts Institute of Technology, Cambridge, MA (United States); Gutierrez, T.D. [California Polytechnic State University, Physics Department, San Luis Obispo, CA (United States); Haller, E.E. [Lawrence Berkeley National Laboratory, Materials Science Division, Berkeley, CA (United States); University of California, Department of Materials Science and Engineering, Berkeley, CA (United States); Han, K. [Shanghai Jiao Tong University, Department of Physics and Astronomy, Shanghai (China); Hansen, E. [University of California, Department of Physics and Astronomy, Los Angeles, CA (United States); Massachusetts Institute of Technology, Cambridge, MA (United States); Kadel, R. [Lawrence Berkeley National Laboratory, Physics Division, Berkeley, CA (United States); Martinez, M. [Sapienza Universita di Roma, Dipartimento di Fisica, Rome (Italy); INFN-Sezione di Roma, Rome (Italy); Universidad de Zaragoza, Laboratorio de Fisica Nuclear y Astroparticulas, Saragossa (Spain); Moggi, N.; Zucchelli, S. [INFN-Sezione di Bologna, Bologna (Italy); Universita di Bologna - Alma Mater Studiorum, Dipartimento di Fisica e Astronomia, Bologna (IT); Nones, C. [CEA/Saclay, Service de Physique des Particules, Gif-sur-Yvette (FR); Norman, E.B.; Wang, B.S. [Lawrence Livermore National Laboratory, Livermore, CA (US); University of California, Department of Nuclear Engineering, Berkeley, CA (US); O' Donnell, T. [Virginia Polytechnic Institute and State University, Center for Neutrino Physics, Blacksburg, VA (US); Sangiorgio, S.; Scielzo, N.D. [Lawrence Livermore National Laboratory, Livermore, CA (US); Wise, T. [Yale University, Department of Physics, New Haven, CT (US); University of Wisconsin, Department of Physics, Madison, WI (US); Woodcraft, A. [University of Edinburgh, SUPA, Institute for Astronomy, Edinburgh (GB); Zimmermann, S. [Lawrence Berkeley National Laboratory, Engineering Division, Berkeley, CA (US)

    2017-12-15

    CUORE is a tonne-scale cryogenic detector operating at the Laboratori Nazionali del Gran Sasso (LNGS) that uses tellurium dioxide bolometers to search for neutrinoless double-beta decay of {sup 130}Te. CUORE is also suitable to search for low energy rare events such as solar axions or WIMP scattering, thanks to its ultra-low background and large target mass. However, to conduct such sensitive searches requires improving the energy threshold to 10 keV. In this paper, we describe the analysis techniques developed for the low energy analysis of CUORE-like detectors, using the data acquired from November 2013 to March 2015 by CUORE-0, a single-tower prototype designed to validate the assembly procedure and new cleaning techniques of CUORE. We explain the energy threshold optimization, continuous monitoring of the trigger efficiency, data and event selection, and energy calibration at low energies in detail. We also present the low energy background spectrum of CUORE-0 below 60 keV. Finally, we report the sensitivity of CUORE to WIMP annual modulation using the CUORE-0 energy threshold and background, as well as an estimate of the uncertainty on the nuclear quenching factor from nuclear recoils in CUORE-0. (orig.)

  19. Flow Diversion versus Standard Endovascular Techniques for the Treatment of Unruptured Carotid-Ophthalmic Aneurysms.

    Science.gov (United States)

    Di Maria, F; Pistocchi, S; Clarençon, F; Bartolini, B; Blanc, R; Biondi, A; Redjem, H; Chiras, J; Sourour, N; Piotin, M

    2015-12-01

    Over the past few years, flow diversion has been increasingly adopted for the treatment of intracranial aneurysms, especially in the paraclinoid and paraophthalmic carotid segment. We compared clinical and angiographic outcomes and complication rates in 2 groups of patients with unruptured carotid-ophthalmic aneurysms treated for 7 years by either standard coil-based techniques or flow diversion. From February 2006 to December 2013, 162 unruptured carotid-ophthalmic aneurysms were treated endovascularly in 138 patients. Sixty-seven aneurysms were treated by coil-based techniques in 61 patients. Flow diverters were deployed in 95 unruptured aneurysms (77 patients), with additional coiling in 27 patients. Complication rates, clinical outcome, and immediate and long-term angiographic results were retrospectively analyzed. No procedure-related deaths occurred. Four procedure-related thromboembolic events (6.6%) leading to permanent morbidity in 1 case (1.6%) occurred in the coiling group. Neurologic complications were observed in 6 patients (7.8%) in the flow-diversion group, resulting in 3.9% permanent morbidity. No statistically significant difference was found between complication (P = .9) and morbidity rates (P = .6). In the coiling group (median follow-up, 31.5 ± 24.5 months), recanalization occurred at 1 year in 23/50 (54%) aneurysms and 27/55 aneurysms (50.9%) at the latest follow-up, leading to retreatment in 6 patients (9%). In the flow-diversion group (mean follow-up, 13.5 ± 10.8 months), 85.3% (35/41) of all aneurysms were occluded after 12 months, and 74.6% (50/67) on latest follow-up. The retreatment rate was 2.1%. Occlusion rates between the 2 groups differed significantly at 12 months (P < .001) and at the latest follow-up (P < .005). Our retrospective analysis shows better long-term occlusion of carotid-ophthalmic aneurysms after use of flow diverters compared with standard coil-based techniques, without significant differences in permanent morbidity

  20. MIMO wireless networks channels, techniques and standards for multi-antenna, multi-user and multi-cell systems

    CERN Document Server

    Clerckx, Bruno

    2013-01-01

    This book is unique in presenting channels, techniques and standards for the next generation of MIMO wireless networks. Through a unified framework, it emphasizes how propagation mechanisms impact the system performance under realistic power constraints. Combining a solid mathematical analysis with a physical and intuitive approach to space-time signal processing, the book progressively derives innovative designs for space-time coding and precoding as well as multi-user and multi-cell techniques, taking into consideration that MIMO channels are often far from ideal. Reflecting developments

  1. Laryngeal mask airway insertion in children: comparison between rotational, lateral and standard technique.

    Science.gov (United States)

    Ghai, Babita; Makkar, Jeetinder Kaur; Bhardwaj, Neerja; Wig, Jyotsna

    2008-04-01

    The purpose of the study was to compare the success and ease of insertion of three techniques of laryngeal mask airway (LMA) insertion; the standard Brain technique, a lateral technique with cuff partially inflated and a rotational technique with cuff partially inflated. One hundred and sixty-eight ASA I and II children aged 6 months to 6 years undergoing short elective surgical procedures lasting 40-60 min were included in the study. A standard anesthesia protocol was followed for all patients. Patients were randomly allocated into one of the three groups i.e. standard (S), rotational (R) and lateral (L). The primary outcome measure of the study was success rate at the first attempt using three techniques of LMA insertion. Secondary outcomes measures studied were overall success rate, time before successful LMA insertion, complications and maneuvers used to relieve airway obstruction. Successful insertion at the first attempt was significantly higher in group R (96%) compared with group L (84%) and group S (80%) (P = 0.03). Overall success rate (i.e. successful insertion with two attempts) was 100% for group R, 93% for group L and 87% for group S (P = 0.03). Time for successful insertion was significantly lower in group R compared with groups L and S. The rotational technique thus showed the highest rate of successful insertion and the lowest incidence of complications, and could be the technique of first choice for LMA insertion in pediatric patients.

  2. Population estimation techniques for routing analysis

    International Nuclear Information System (INIS)

    Sathisan, S.K.; Chagari, A.K.

    1994-01-01

    A number of on-site and off-site factors affect the potential siting of a radioactive materials repository at Yucca Mountain, Nevada. Transportation related issues such as route selection and design are among them. These involve evaluation of potential risks and impacts, including those related to population. Population characteristics (total population and density) are critical factors in the risk assessment, emergency preparedness and response planning, and ultimately in route designation. This paper presents an application of Geographic Information System (GIS) technology to facilitate such analyses. Specifically, techniques to estimate critical population information are presented. A case study using the highway network in Nevada is used to illustrate the analyses. TIGER coverages are used as the basis for population information at a block level. The data are then synthesized at tract, county and state levels of aggregation. Of particular interest are population estimates for various corridor widths along transport corridors -- ranging from 0.5 miles to 20 miles in this paper. A sensitivity analysis based on the level of data aggregation is also presented. The results of these analyses indicate that specific characteristics of the area and its population could be used as indicators to aggregate data appropriately for the analysis
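    As a rough illustration of the corridor population estimates described above, the sketch below sums block-centroid populations within a set of corridor half-widths around a simple polyline route. The planar geometry, route coordinates, and block data are invented for illustration; a real analysis would work from TIGER block coverages within a GIS.

```python
import numpy as np

def point_to_segment_distance(p, a, b):
    """Shortest planar distance from point p to the segment ab."""
    ab, ap = b - a, p - a
    t = np.clip(np.dot(ap, ab) / np.dot(ab, ab), 0.0, 1.0)
    return np.linalg.norm(p - (a + t * ab))

def corridor_population(route, block_xy, block_pop, half_width):
    """Total population of blocks whose centroid lies within half_width of the route."""
    total = 0.0
    for p, pop in zip(block_xy, block_pop):
        d = min(point_to_segment_distance(p, route[i], route[i + 1])
                for i in range(len(route) - 1))
        if d <= half_width:
            total += pop
    return total

# Hypothetical route vertices and block centroids (coordinates in miles).
route = np.array([[0.0, 0.0], [10.0, 0.0], [20.0, 5.0]])
block_xy = np.array([[2.0, 0.3], [8.0, 4.0], [15.0, 2.0], [18.0, 9.0]])
block_pop = np.array([1200, 800, 450, 300])

for w in (0.5, 2.5, 5.0, 20.0):          # corridor half-widths in miles
    print(w, corridor_population(route, block_xy, block_pop, w))
```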

  3. Cost analysis and estimating tools and techniques

    CERN Document Server

    Nussbaum, Daniel

    1990-01-01

    Changes in production processes reflect the technological advances permeating our products and services. U. S. industry is modernizing and automating. In parallel, direct labor is fading as the primary cost driver while engineering and technology related cost elements loom ever larger. Traditional, labor-based approaches to estimating costs are losing their relevance. Old methods require augmentation with new estimating tools and techniques that capture the emerging environment. This volume represents one of many responses to this challenge by the cost analysis profession. The Institute of Cost Analysis (ICA) is dedicated to improving the effectiveness of cost and price analysis and enhancing the professional competence of its members. We encourage and promote exchange of research findings and applications between the academic community and cost professionals in industry and government. The 1990 National Meeting in Los Angeles, jointly sponsored by ICA and the National Estimating Society (NES),...

  4. Data Analysis Techniques for Ligo Detector Characterization

    Science.gov (United States)

    Valdes Sanchez, Guillermo A.

    Gravitational-wave astronomy is a branch of astronomy which aims to use gravitational waves to collect observational data about astronomical objects and events such as black holes, neutron stars, supernovae, and processes including those of the early universe shortly after the Big Bang. Einstein first predicted gravitational waves in the early twentieth century, but it was not until September 14, 2015, that the Laser Interferometer Gravitational-Wave Observatory (LIGO) directly observed the first gravitational waves in history. LIGO consists of two twin detectors, one in Livingston, Louisiana and another in Hanford, Washington. Instrumental and sporadic noises limit the sensitivity of the detectors. Scientists conduct Data Quality studies to distinguish a gravitational-wave signal from the noise, and new techniques are continuously developed to identify, mitigate, and veto unwanted noise. This work presents the application of data analysis techniques, such as the Hilbert-Huang transform (HHT) and Kalman filtering (KF), in LIGO detector characterization. We investigated the application of HHT to characterize the gravitational-wave signal of the first detection, we also demonstrated the functionality of HHT in identifying noise originating from light being scattered by perturbed surfaces, and we estimated thermo-optical aberration using KF. We paid particular attention to the scattering application, for which a tool was developed to identify disturbed surfaces originating scattering noise. The results considerably reduced the time needed to search for the scattering surface and helped LIGO commissioners to mitigate the noise.
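    A minimal scalar Kalman filter of the kind mentioned above, tracking a slowly varying quantity (for example, a thermo-optical aberration estimate) from noisy readouts under a random-walk model. The signal, noise levels, and model are illustrative assumptions, not LIGO data or the actual aberration estimator.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
truth = np.cumsum(rng.normal(0, 0.02, n))   # slow random-walk signal (synthetic)
meas = truth + rng.normal(0, 0.5, n)        # noisy measurements of it

q, r = 0.02 ** 2, 0.5 ** 2                  # process and measurement noise variances
x, p = 0.0, 1.0                             # state estimate and its variance
estimates = []
for z in meas:
    # Predict: the random-walk model carries the state over, uncertainty grows.
    p += q
    # Update with the new measurement.
    k = p / (p + r)                         # Kalman gain
    x += k * (z - x)
    p *= (1 - k)
    estimates.append(x)
```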

  5. Chromatographic screening techniques in systematic toxicological analysis.

    Science.gov (United States)

    Drummer, O H

    1999-10-15

    A review of techniques used to screen biological specimens for the presence of drugs was conducted with particular reference to systematic toxicological analysis. Extraction systems of both the liquid-liquid and solid-phase type show little apparent difference in their relative ability to extract a range of drugs according to their physico-chemical properties, although mixed-phase SPE extraction is a preferred technique for GC-based applications, and liquid-liquid extraction was preferred for HPLC-based applications. No one chromatographic system has been shown to be capable of detecting a full range of common drugs of abuse and common ethical drugs, hence two or more assays are required for laboratories wishing to cover a reasonably comprehensive range of drugs of toxicological significance. While immunoassays are invariably used to screen for drugs of abuse, chromatographic systems relying on derivatization and capable of extracting both acidic and basic drugs would be capable of screening a limited range of targeted drugs. Drugs most difficult to detect in systematic toxicological analysis include LSD, psilocin, THC and its metabolites, fentanyl and its designer derivatives, some potent opiates, potent benzodiazepines and some potent neuroleptics, many of the newer anti-convulsants, alkaloids such as colchicine, amanitins, aflatoxins, antineoplastics, coumarin-based anti-coagulants, and a number of cardiovascular drugs. The widespread use of LC-MS and LC-MS-MS for specific drug detection and the emergence of capillary electrophoresis linked to MS and MS-MS provide an exciting possibility for the future to increase the range of drugs detected in any one chromatographic screening system.

  6. Analysis of diatomaceous earth by x-ray fluorescence techniques

    International Nuclear Information System (INIS)

    Parker, J.

    1985-01-01

    The use of diatomaceous earth in industry as filtering aids, mineral fillers, catalyst carriers, chromatographic supports, and paint additives is well documented. The diatomite matrix is well suited to x-ray analysis, but this application has not been cited in the literature. In our laboratory, x-ray fluorescence spectrometry has been used to support the analytical needs of diatomite product development. Lithium borate fusion and pressed powder techniques have been used to determine major, minor, and trace elements in diatomite and synthetic silicate samples. Conventional matrix correction models and fundamental parameters have been used to reduce x-ray measurements to accurate chemical analyses. Described are sample and standard preparation techniques, data reduction methods, applications, and results

  7. Qualitative analysis of Orzooiyeh plain groundwater resources using GIS techniques

    Directory of Open Access Journals (Sweden)

    Mohsen Pourkhosravani

    2016-09-01

    Full Text Available Background: Unsustainable development of human societies, especially in arid and semi-arid areas, is one of the most important environmental hazards and requires preservation of groundwater resources and permanent study of their qualitative and quantitative changes through sampling. Accordingly, this research attempts to assess and analyze the spatial variation of quantitative and qualitative indicators of Orzooiyeh groundwater resources in Kerman province by using a geographic information system (GIS). Methods: This study surveys the spatial analysis of these indices using GIS techniques, besides evaluating the groundwater resource quality in the study area. For this purpose, quality indicators and statistics such as electrical conductivity, pH, sulphate, total dissolved solids (TDS), sodium, calcium, magnesium and chloride from 28 selected wells sampled by the Kerman regional water organization were used. Results: A comparison of the present research results with the standards of the Institute of Standards and Industrial Research of Iran and of the World Health Organization (WHO) shows that, among the measured indices, electrical conductivity and TDS in the chosen samples are higher than the Iranian national standard and the WHO guideline, but the other indices are more favourable. Conclusion: Results showed that for the electrical conductivity index, 64.3% of the samples were at an optimal level, 71.4% were within the Iranian national standard limit, and only 3.6% met the WHO standard. The TDS index, too, did not reach the national standard in any of the samples, and in 82.1% of the samples this index was at the national standard limit. For this index, only 32.1% of the samples met the WHO standard.
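    A small sketch of the type of comparison reported above: each sampled value of an index is checked against a national limit and a WHO guideline, and the percentage of compliant samples is reported. The measurements and the limit values below are placeholders, not the study's data or the actual standard values.

```python
import numpy as np

# Invented TDS measurements for a handful of wells (mg/L).
tds_samples = np.array([900, 1450, 2100, 1750, 980, 3100, 1250])
limit_national = 1500.0   # hypothetical national limit, mg/L
limit_who = 1000.0        # hypothetical WHO guideline value, mg/L

within_national = np.mean(tds_samples <= limit_national) * 100
within_who = np.mean(tds_samples <= limit_who) * 100
print(f"within national limit: {within_national:.1f}% of samples")
print(f"within WHO guideline:  {within_who:.1f}% of samples")
```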

  8. Flame analysis using image processing techniques

    Science.gov (United States)

    Her Jie, Albert Chang; Zamli, Ahmad Faizal Ahmad; Zulazlan Shah Zulkifli, Ahmad; Yee, Joanne Lim Mun; Lim, Mooktzeng

    2018-04-01

    This paper presents image processing techniques with the use of fuzzy logic and a neural network approach to perform flame analysis. Flame diagnostics is important in industry for extracting relevant information from flame images. Experimental tests were carried out in a model industrial burner with different flow rates. Flame features such as luminous and spectral parameters are extracted using image processing and the Fast Fourier Transform (FFT). Flame images are acquired using a FLIR infrared camera. Non-linearities such as thermo-acoustic oscillations and background noise affect flame stability. Flame velocity is one of the important characteristics that determine flame stability. In this paper, an image processing method is proposed to determine flame velocity. A power spectral density (PSD) graph is a good tool for vibration analysis from which flame stability can be approximated. However, a more intelligent diagnostic system is needed to automatically determine flame stability. In this paper, flame features at different flow rates are compared and analyzed. The selected flame features are used as inputs to the proposed fuzzy inference system to determine flame stability. A neural network is used to test the performance of the fuzzy inference system.
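    The sketch below illustrates the PSD step mentioned above: a flame-luminosity time series is analyzed with Welch's method, and the fraction of fluctuation power in an oscillation band serves as a crude stability indicator. The synthetic signal, sampling rate, and frequency band are assumptions made for illustration only, not the authors' experimental setup.

```python
import numpy as np
from scipy.signal import welch

fs = 1000.0                                   # assumed frame rate (frames per second)
t = np.arange(0, 5, 1 / fs)
rng = np.random.default_rng(2)
# Synthetic luminosity: mean level + 120 Hz thermo-acoustic oscillation + noise.
luminosity = 1.0 + 0.3 * np.sin(2 * np.pi * 120 * t) + 0.05 * rng.standard_normal(t.size)

freqs, psd = welch(luminosity, fs=fs, nperseg=1024)
band = (freqs > 50) & (freqs < 300)           # oscillation band of interest (assumed)
# Bin spacing is uniform, so plain sums suffice for a power ratio.
oscillation_power = psd[band].sum()
total_power = psd[freqs > 1].sum()
print("fraction of fluctuation power in 50-300 Hz:", oscillation_power / total_power)
```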

  9. Analysis of obsidians by PIXE technique

    International Nuclear Information System (INIS)

    Nuncio Q, A.E.

    1998-01-01

    This work presents the characterization of obsidian samples from different mineral sites in Mexico, undertaken by an ion beam analysis technique: PIXE (Proton Induced X-ray Emission). As part of an intensive investigation of obsidian in Mesoamerica by anthropologists from Mexico's National Institute of Anthropology and History, 818 samples were collected from different volcanic sources in central Mexico for the purpose of establishing a data bank of element concentrations for each source. Part of this collection was analyzed by neutron activation analysis and most of the important element concentrations were reported. In this work, a non-destructive IBA technique (PIXE) is used to analyze obsidian samples. The application of this technique was carried out at the laboratories of the ININ Nuclear Center facilities. The samples consisted of obsidians from ten different volcanic sources. The pieces were mounted on a sample holder designed for the purpose of exposing each sample to the proton beam. The PIXE analysis was carried out with an ET Tandem Accelerator at the ININ. X-ray spectrometry was carried out with an external beam facility employing a Si(Li) detector set at 52.5 degrees in relation to the target normal (parallel to the beam direction) and 4.2 cm away from the target center. A filter was set in front of the detector to determine the best attenuation conditions for obtaining most of the elements, taking into account that X-ray spectra from obsidians are dominated by intense major-element lines. Thus, a 28 μm-thick aluminium foil absorber was selected and used to reduce the intensity of the major lines as well as pile-up effects. The mean proton energy was 2.62 MeV, and the beam profile was about 4 mm in diameter. As results, elemental concentrations were obtained for a set of samples from ten different sources: Altotonga (Veracruz), Penjamo (Guanajuato), Otumba (Mexico), Zinapecuaro (Michoacan), Ucareo (Michoacan), Tres Cabezas (Puebla), Sierra Navajas (Hidalgo), Zaragoza

  10. Skin Absorbed Doses from Full Mouth Standard Intraoral Radiography in Bisecting Angle and Paralleling techniques

    Energy Technology Data Exchange (ETDEWEB)

    Nah, Kyung Soo; Kim, Ae Ji [Dept. of Oral Radiology, College of Dentistry, Pusan National University, Pusan (Korea, Republic of); Doh, Shi Hong [Dept. of Applied physics . National Fisheries University of Pusan Department of Radiotherapy, Pusan (Korea, Republic of); Kim, Hyun Ja [Dept. of Oral Radiology, Baptist Hospital, Pusan (Korea, Republic of); Yoo, Meong Jin [Dept. of Radiology, College of Dentistry, Pusan National University, Pusan (Korea, Republic of)

    1990-08-15

    This study was performed to measure the skin absorbed doses from full mouth standard intraoral radiography (14 exposures) in bisecting angle and paralleling techniques. Thermoluminescent dosimeters were used in a phantom. A circular tube collimator (60 mm in diameter, 20 cm in length) and a rectangular collimator (35 mm X 44 mm, 40 cm in length) were set for the bisecting angle and paralleling techniques respectively. All measurement sites were classified into 8 groups according to distance from each point of central rays. The results were as follows: 1. The skin absorbed doses from the paralleling technique were significantly decreased compared with those from the bisecting technique, both at points at the central ray and at points away from the central ray. The percentage rates of decrease were greater at points away from the central ray than at points at the central ray. 2. The skin absorbed doses at the lens of the eye, parotid gland, submandibular gland and thyroid region were significantly decreased in the paralleling technique, but those at the midline of the palate remained similar in both techniques. 3. The highest doses were measured at the site 20 mm above the point of the central ray for the mandibular premolars in the bisecting angle technique and at the point of the central ray for the mandibular premolars in the paralleling technique. The lowest doses were measured at the thyroid region in both techniques.

  11. Skin Absorbed Doses from Full Mouth Standard Intraoral Radiography in Bisecting Angle and Paralleling techniques

    International Nuclear Information System (INIS)

    Nah, Kyung Soo; Kim, Ae Ji; Doh, Shi Hong; Kim, Hyun Ja; Yoo, Meong Jin

    1990-01-01

    This study was performed to measure the skin absorbed doses from full mouth standard intraoral radiography (14 exposures) in bisecting angle and paralleling techniques. Thermoluminescent dosimeters were used in a phantom. A circular tube collimator (60 mm in diameter, 20 cm in length) and a rectangular collimator (35 mm X 44 mm, 40 cm in length) were set for the bisecting angle and paralleling techniques respectively. All measurement sites were classified into 8 groups according to distance from each point of central rays. The results were as follows: 1. The skin absorbed doses from the paralleling technique were significantly decreased compared with those from the bisecting technique, both at points at the central ray and at points away from the central ray. The percentage rates of decrease were greater at points away from the central ray than at points at the central ray. 2. The skin absorbed doses at the lens of the eye, parotid gland, submandibular gland and thyroid region were significantly decreased in the paralleling technique, but those at the midline of the palate remained similar in both techniques. 3. The highest doses were measured at the site 20 mm above the point of the central ray for the mandibular premolars in the bisecting angle technique and at the point of the central ray for the mandibular premolars in the paralleling technique. The lowest doses were measured at the thyroid region in both techniques.

  12. Handbook of Qualitative Research Techniques and Analysis in Entrepreneurship

    DEFF Research Database (Denmark)

    Neergaard, Helle; Leitch, Claire

    2015-01-01

    One of the most challenging tasks in the research design process is choosing the most appropriate data collection and analysis techniques. This Handbook provides a detailed introduction to five qualitative data collection and analysis techniques pertinent to exploring entrepreneurial phenomena....

  13. Handbook of Qualitative Research Techniques and Analysis in Entrepreneurship

    DEFF Research Database (Denmark)

    One of the most challenging tasks in the research design process is choosing the most appropriate data collection and analysis techniques. This Handbook provides a detailed introduction to five qualitative data collection and analysis techniques pertinent to exploring entrepreneurial phenomena....

  14. Radiological error: analysis, standard setting, targeted instruction and teamworking

    International Nuclear Information System (INIS)

    FitzGerald, Richard

    2005-01-01

    Diagnostic radiology does not have objective benchmarks for acceptable levels of missed diagnoses [1]. Until now, data collection of radiological discrepancies has been very time consuming. The culture within the specialty did not encourage it. However, public concern about patient safety is increasing. There have been recent innovations in compiling radiological interpretive discrepancy rates which may facilitate radiological standard setting. However standard setting alone will not optimise radiologists' performance or patient safety. We must use these new techniques in radiological discrepancy detection to stimulate greater knowledge sharing, targeted instruction and teamworking among radiologists. Not all radiological discrepancies are errors. Radiological discrepancy programmes must not be abused as an instrument for discrediting individual radiologists. Discrepancy rates must not be distorted as a weapon in turf battles. Radiological errors may be due to many causes and are often multifactorial. A systems approach to radiological error is required. Meaningful analysis of radiological discrepancies and errors is challenging. Valid standard setting will take time. Meanwhile, we need to develop top-up training, mentoring and rehabilitation programmes. (orig.)

  15. Standardization of the Descemet membrane endothelial keratoplasty technique: Outcomes of the first 450 consecutive cases.

    Science.gov (United States)

    Satué, M; Rodríguez-Calvo-de-Mora, M; Naveiras, M; Cabrerizo, J; Dapena, I; Melles, G R J

    2015-08-01

    To evaluate the clinical outcome of the first 450 consecutive cases after Descemet membrane endothelial keratoplasty (DMEK), as well as the effect of standardization of the technique. Comparison between 3 groups: Group I (cases 1-125), the extended learning curve; Group II (cases 126-250), transition to technique standardization; Group III (cases 251-450), surgery with the standardized technique. Best corrected visual acuity, endothelial cell density, pachymetry and intra- and postoperative complications were evaluated before, and 1, 3 and 6 months after DMEK. At 6 months after surgery, 79% of eyes reached a best corrected visual acuity of ≥0.8 and 43% ≥1.0. Mean endothelial cell density was 2,530±220 cells/mm2 preoperatively and 1,613±495 cells/mm2 at 6 months after surgery. Mean pachymetry measured 668±92 μm and 526±46 μm pre- and (6 months) postoperatively, respectively. There were no significant differences in best corrected visual acuity, endothelial cell density and pachymetry between the 3 groups (P > .05). Graft detachment presented in 17.3% of the eyes. The detachment rate declined from 24% to 12%, and the rate of secondary surgeries from 9.6% to 3.5%, from group I to III respectively. Visual outcomes and endothelial cell density after DMEK are independent of the technique standardization. However, technique standardization may have contributed to a lower graft detachment rate and a relatively low number of secondary interventions required. As such, DMEK may become the first choice of treatment in corneal endothelial disease. Copyright © 2014 Sociedad Española de Oftalmología. Published by Elsevier España, S.L.U. All rights reserved.

  16. Visual Evaluation Techniques for Skill Analysis.

    Science.gov (United States)

    Brown, Eugene W.

    1982-01-01

    Visual evaluation techniques provide the kinesiologist with a method of evaluating physical skill performance. The techniques are divided into five categories: (1) vantage point; (2) movement simplification; (3) balance and stability; (4) movement relationships; and (5) range of movement. (JN)

  17. European standardization activities on residual stress analysis by neutron diffraction

    CERN Document Server

    Youtsos, A G

    2002-01-01

    A main objective of a recently completed European research project, RESTAND - residual stress standard using neutron diffraction, was to develop industrial confidence in the application of the neutron-diffraction technique for residual stress measurement and its principal deliverable was a relevant draft code of practice. In fact this draft standard was jointly developed within RESTAND and VAMAS TWA 20 - an international pre-normative research activity. As no such standard is yet available, on the basis of this draft standard document the European Standards Committee on Non-Destructive Testing (CEN TC/138) has established a new ad hoc Work Group (AHG7). The objective of this group is the development of a European pre-standard on a 'test method for measurement of residual stress by neutron diffraction'. The document contains the proposed protocol for making the measurements. It includes the scope of the method, an outline of the technique, the calibration and measurement procedures recommended, and details of ...

  18. Techniques and Applications of Urban Data Analysis

    KAUST Repository

    AlHalawani, Sawsan N.

    2016-05-26

    Digitization and characterization of urban spaces are essential components as we move to an ever-growing ’always connected’ world. Accurate analysis of such digital urban spaces has become more important as we continue to get spatial and social context-aware feedback and recommendations in our daily activities. Modeling and reconstruction of urban environments have thus gained unprecedented importance in the last few years. Such analysis typically spans multiple disciplines, such as computer graphics, and computer vision as well as architecture, geoscience, and remote sensing. Reconstructing an urban environment usually requires an entire pipeline consisting of different tasks. In such a pipeline, data analysis plays a strong role in acquiring meaningful insights from the raw data. This dissertation primarily focuses on the analysis of various forms of urban data and proposes a set of techniques to extract useful information, which is then used for different applications. The first part of this dissertation presents a semi-automatic framework to analyze facade images to recover individual windows along with their functional configurations such as open or (partially) closed states. The main advantage of recovering both the repetition patterns of windows and their individual deformation parameters is to produce a factored facade representation. Such a factored representation enables a range of applications including interactive facade images, improved multi-view stereo reconstruction, facade-level change detection, and novel image editing possibilities. The second part of this dissertation demonstrates the importance of a layout configuration on its performance. As a specific application scenario, I investigate the interior layout of warehouses wherein the goal is to assign items to their storage locations while reducing flow congestion and enhancing the speed of order picking processes. The third part of the dissertation proposes a method to classify cities

  19. Dependency Coefficient in Computerized GALS Examination Utilizing Motion Analysis Techniques

    Directory of Open Access Journals (Sweden)

    Hamed Shahidian

    2013-04-01

    Full Text Available Objectives: The GALS (Gait, Arms, Legs and Spine examination is a compact version of standard procedures used by rheumatologists to determine musculoskeletal disorders in patients. Computerization of such a clinical procedure is necessary to ensure an objective evaluation. This article presents the first steps in such an approach by outlining a procedure to use motion analysis techniques as a new method for GALS examination. Methods: A 3D motion pattern was obtained from two subject groups using a six camera motion analysis system. The range of motion associated with GALS was consequently determined using a MATLAB program. Results: The range of motion (ROM of the two subject groups was determined, the validity of the approach was outlined, and the symmetry of movement on both sides of the body was quantified through introduction of a dependency coefficient. Discussion: Analysis of GALS examination and diagnosis of musculoskeletal problems could be addressed more accurately and reliably by adopting motion analysis techniques. Furthermore, introduction of a dependency coefficient offers a wide spectrum of prospective applications in neuromuscular studies .

  20. A survey on development of neutron standards and neutron measurement technique corresponding to various fields

    International Nuclear Information System (INIS)

    Matsumoto, Tetsuro

    2007-01-01

    Various uses of neutrons are attracting attention in many fields such as industry, medical technology and radiation protection. In particular, high energy neutrons above 15 MeV are important for the radiation exposure of aircraft crews and for soft errors in semiconductors. Therefore, neutron fluence standards for the high energy region are very important. However, such standards are hardly available anywhere in the world. Three main reasons are considered: (1) limited measurement techniques for high energy neutrons, (2) a small number of high energy neutron facilities and (3) lack of nuclear data for high energy particle reactions. In this paper, the present status of the measurement techniques, the facilities and the nuclear data is investigated and discussed. In NMIJ/AIST, the 19.0-MeV neutron fluence standard will be established by 2010, and the development of high energy neutron standards above 20 MeV is also examined. An outline of the development of the high energy neutron standards is also given in this paper. (author)

  1. Evaluation of the overlapping of posterior teeth in two techniques of improved interproximal panoramic program and standard panoramic

    Directory of Open Access Journals (Sweden)

    Goodarzi pour D

    2010-06-01

    Full Text Available "nBackground and Aims: Overlapping of the proximal surfaces of posterior teeth in the panoramic radiography is a major concern. Therefore, an option has been developed in the panoramic unit of Planmeca Promax, namely improved interproximal mode. This mode causes lower horizental angle with the teeth contact region during the unit rotation decreasing overlapping of the panoramic images of the posterior teeth especially premolar teeth. The present study was done to compare the overlapping of posterior teeth using two techniques of improved interproximal panoramic program and standard panoramic. "nMaterials and Methods: In this diagnostic study, 32 patients requiring panoramic radiographies at their posterior teeth during their routine diagnosis and treatment process with the mean age of 27.3 years were participated. No patients showed crowding of posterior teeth or missed and restored posterior teeth. The participants' panoramic radiographies were randomly taken by two techniques of improved interproximal panoramic and standard panoramic using Planmeca Promax device. The overlapping of the panoramic images was blindly assessed by an oral radiologist. The overlapping in both techniques was reported by frequency and percentage. The comparisons were done by Chi-square test between two techniques and the odds ratio of overlapping was estimated using regression analysis. "nResults: In standard panoramic techniques, 38.5% (148 contacts of 384 contacts of the proximal surfaces overlapped while the overlapping of the proximal surfaces was observed in 18.8% (72 contacts of 384 overall contacts in improved interproximal technique. Significant differences were noted between two techniques regarding overlapping (P<0.001. Also 66.4% and 39.1% of 4-5 teeth contacts overlapped in standard and improved techniques. The values were reported to be 39.1% and 12.5% in contacts of 5-6 teeth and 10.2% and 4.7% in the contacts of 6-7 teeth in both techniques

  2. A no-gold-standard technique for objective assessment of quantitative nuclear-medicine imaging methods.

    Science.gov (United States)

    Jha, Abhinav K; Caffo, Brian; Frey, Eric C

    2016-04-07

    The objective optimization and evaluation of nuclear-medicine quantitative imaging methods using patient data is highly desirable but often hindered by the lack of a gold standard. Previously, a regression-without-truth (RWT) approach has been proposed for evaluating quantitative imaging methods in the absence of a gold standard, but this approach implicitly assumes that bounds on the distribution of true values are known. Several quantitative imaging methods in nuclear-medicine imaging measure parameters where these bounds are not known, such as the activity concentration in an organ or the volume of a tumor. We extended upon the RWT approach to develop a no-gold-standard (NGS) technique for objectively evaluating such quantitative nuclear-medicine imaging methods with patient data in the absence of any ground truth. Using the parameters estimated with the NGS technique, a figure of merit, the noise-to-slope ratio (NSR), can be computed, which can rank the methods on the basis of precision. An issue with NGS evaluation techniques is the requirement of a large number of patient studies. To reduce this requirement, the proposed method explored the use of multiple quantitative measurements from the same patient, such as the activity concentration values from different organs in the same patient. The proposed technique was evaluated using rigorous numerical experiments and using data from realistic simulation studies. The numerical experiments demonstrated that the NSR was estimated accurately using the proposed NGS technique when the bounds on the distribution of true values were not precisely known, thus serving as a very reliable metric for ranking the methods on the basis of precision. In the realistic simulation study, the NGS technique was used to rank reconstruction methods for quantitative single-photon emission computed tomography (SPECT) based on their performance on the task of estimating the mean activity concentration within a known volume of interest
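    A minimal sketch of how the figure of merit described above can be used once a no-gold-standard fit has produced, for each imaging method, a slope linking measured to true values and a noise standard deviation: methods are ranked by the noise-to-slope ratio, with lower values indicating better precision. The fitted numbers below are invented for illustration and are not the paper's results.

```python
# Hypothetical output of a no-gold-standard fit for three quantitation methods:
# slope b (measured vs. true) and noise standard deviation sigma for each.
fits = {
    "method_A": {"slope": 0.95, "sigma": 0.12},
    "method_B": {"slope": 0.80, "sigma": 0.07},
    "method_C": {"slope": 1.10, "sigma": 0.20},
}

# Noise-to-slope ratio (NSR): lower means better precision.
nsr = {m: f["sigma"] / f["slope"] for m, f in fits.items()}
for method, value in sorted(nsr.items(), key=lambda kv: kv[1]):
    print(f"{method}: NSR = {value:.3f}")
```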

  3. A no-gold-standard technique for objective assessment of quantitative nuclear-medicine imaging methods

    International Nuclear Information System (INIS)

    Jha, Abhinav K; Frey, Eric C; Caffo, Brian

    2016-01-01

    The objective optimization and evaluation of nuclear-medicine quantitative imaging methods using patient data is highly desirable but often hindered by the lack of a gold standard. Previously, a regression-without-truth (RWT) approach has been proposed for evaluating quantitative imaging methods in the absence of a gold standard, but this approach implicitly assumes that bounds on the distribution of true values are known. Several quantitative imaging methods in nuclear-medicine imaging measure parameters where these bounds are not known, such as the activity concentration in an organ or the volume of a tumor. We extended upon the RWT approach to develop a no-gold-standard (NGS) technique for objectively evaluating such quantitative nuclear-medicine imaging methods with patient data in the absence of any ground truth. Using the parameters estimated with the NGS technique, a figure of merit, the noise-to-slope ratio (NSR), can be computed, which can rank the methods on the basis of precision. An issue with NGS evaluation techniques is the requirement of a large number of patient studies. To reduce this requirement, the proposed method explored the use of multiple quantitative measurements from the same patient, such as the activity concentration values from different organs in the same patient. The proposed technique was evaluated using rigorous numerical experiments and using data from realistic simulation studies. The numerical experiments demonstrated that the NSR was estimated accurately using the proposed NGS technique when the bounds on the distribution of true values were not precisely known, thus serving as a very reliable metric for ranking the methods on the basis of precision. In the realistic simulation study, the NGS technique was used to rank reconstruction methods for quantitative single-photon emission computed tomography (SPECT) based on their performance on the task of estimating the mean activity concentration within a known volume of interest

  4. Quench correction in liquid scintillation counting by a combined internal standard-samples channels ratio technique

    International Nuclear Information System (INIS)

    Dahlberg, E.

    1982-01-01

    A well-known problem in liquid scintillation counting (LSC) is that radioactivity cannot be measured with 100% efficiency, e.g., due to "quenching", which thus needs to be corrected for. Three methods (viz., internal standard (IS), samples channels ratio (SCR), and external standard channels ratio (ESCR)) are in common use to accomplish quench correction. None of these methods is ideal. This paper shows that a combination of the IS and SCR methods (IS-SCR) ameliorates the major disadvantages of both techniques; in particular, the disadvantage of the SCR technique at low count rates is eliminated in the IS-SCR method, which also has a low volume dependence compared with the IS and ESCR methods. The IS-SCR method is not affected by time-dependent diffusion of solutes and solvents into the walls of plastic counting vials, which is a major drawback of the ESCR technique. Used with a simple linear regression technique, the IS-SCR quench curves may be linearized over wide ranges of efficiencies. In view of the widespread application of LSC, the IS-SCR technique is therefore likely to be useful to many investigators. 2 figures, 2 tables
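    A rough sketch of a combined internal-standard / channels-ratio workflow of the kind described above: quenched standards are counted, spiked with a known activity of internal standard and recounted to obtain counting efficiencies, which are then regressed against the sample channels ratio to form a quench curve applied to unknowns. All counts, activities, and the linear curve form are illustrative assumptions, not the paper's data or exact procedure.

```python
import numpy as np

# (counts_before_spike_cpm, counts_after_spike_cpm, sample_channels_ratio)
# for a few quenched standards; the spike activity is known.
spike_dpm = 100000.0
standards = [
    (500.0, 95500.0, 0.72),
    (480.0, 88480.0, 0.61),
    (450.0, 78450.0, 0.48),
]

# Counting efficiency from the internal-standard spike of each standard.
efficiencies = [(after - before) / spike_dpm for before, after, _ in standards]
scr_values = [scr for _, _, scr in standards]

# Linear quench curve: efficiency = a * SCR + b (adequate over a limited range).
a, b = np.polyfit(scr_values, efficiencies, 1)

def dpm_from_counts(net_cpm, scr):
    """Convert a sample's net count rate to activity using the quench curve."""
    eff = a * scr + b
    return net_cpm / eff

print(dpm_from_counts(1200.0, 0.65))
```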

  5. Assessment of Snared-Loop Technique When Standard Retrieval of Inferior Vena Cava Filters Fails

    International Nuclear Information System (INIS)

    Doody, Orla; Noe, Geertje; Given, Mark F.; Foley, Peter T.; Lyon, Stuart M.

    2009-01-01

    Purpose To identify the success and complications related to a variant technique used to retrieve inferior vena cava filters when simple snare approach has failed. Methods A retrospective review of all Cook Guenther Tulip filters and Cook Celect filters retrieved between July 2006 and February 2008 was performed. During this period, 130 filter retrievals were attempted. In 33 cases, the standard retrieval technique failed. Retrieval was subsequently attempted with our modified retrieval technique. Results The retrieval was successful in 23 cases (mean dwell time, 171.84 days; range, 5-505 days) and unsuccessful in 10 cases (mean dwell time, 162.2 days; range, 94-360 days). Our filter retrievability rates increased from 74.6% with the standard retrieval method to 92.3% when the snared-loop technique was used. Unsuccessful retrieval was due to significant endothelialization (n = 9) and caval penetration by the filter (n = 1). A single complication occurred in the group, in a patient developing pulmonary emboli after attempted retrieval. Conclusion The technique we describe increased the retrievability of the two filters studied. Hook endothelialization is the main factor resulting in failed retrieval and continues to be a limitation with these filters.

  6. Annual Book of ASTM Standards, Part 23: Water; Atmospheric Analysis.

    Science.gov (United States)

    American Society for Testing and Materials, Philadelphia, PA.

    Standards for water and atmospheric analysis are compiled in this segment, Part 23, of the American Society for Testing and Materials (ASTM) annual book of standards. It contains all current formally approved ASTM standard and tentative test methods, definitions, recommended practices, proposed methods, classifications, and specifications. One…

  7. Hurdles run technique analysis in the 400m hurdles

    OpenAIRE

    Drtina, Martin

    2010-01-01

    Hurdles run technique analysis in the 400 m hurdles. Thesis objectives: The main objective is to compare the hurdles run technique at race tempo over the 400 m hurdles track in the selected probands. The tasks are to identify kinematic parameters separately for each proband and to identify weaknesses in their technique. Method: The analysis of the hurdles run technique was done using 3D kinematic analysis. The observed space-time events were recorded with two digital cameras. The records were transferred to a suitable...

  8. Computer simulation, nuclear techniques and surface analysis

    Directory of Open Access Journals (Sweden)

    Reis, A. D.

    2010-02-01

    Full Text Available This article is about computer simulation and surface analysis by nuclear techniques, which are non-destructive. The “energy method of analysis” for nuclear reactions is used. Energy spectra are computer simulated and compared with experimental data, giving target composition and concentration profile information. Details of the prediction stages are given for thick flat target yields. Predictions are made for non-flat targets having asymmetric triangular surface contours. The method is successfully applied to depth profiling of 12C and 18O nuclei in thick targets, by deuteron (d,p) and proton (p,α) induced reactions, respectively.

  9. Advanced Packaging Materials and Techniques for High Power TR Module: Standard Flight vs. Advanced Packaging

    Science.gov (United States)

    Hoffman, James Patrick; Del Castillo, Linda; Miller, Jennifer; Jenabi, Masud; Hunter, Donald; Birur, Gajanana

    2011-01-01

    The higher output power densities required of modern radar architectures, such as the proposed DESDynI [Deformation, Ecosystem Structure, and Dynamics of Ice] SAR [Synthetic Aperture Radar] Instrument (or DSI) require increasingly dense high power electronics. To enable these higher power densities, while maintaining or even improving hardware reliability, requires advances in integrating advanced thermal packaging technologies into radar transmit/receive (TR) modules. New materials and techniques have been studied and compared to standard technologies.

  10. Risk Analysis as Regulatory Science: Toward The Establishment of Standards.

    Science.gov (United States)

    Murakami, Michio

    2016-09-01

    Understanding how to establish standards is essential for risk communication and also provides perspectives for further study. In this paper, the concept of risk analysis as regulatory science for the establishment of standards is demonstrated through examples of standards for evacuation and provisional regulation values in foods and drinking water. Moreover, academic needs for further studies related to standards are extracted. The concepts of the traditional 'Standard I', which has a paternalistic orientation, and 'Standard II', established through stakeholder consensus, are then systemized by introducing the current status of the new standards-related movement that developed after the Fukushima nuclear power plant accident, and the perspectives of the standards are discussed. Preparation of standards on the basis of stakeholder consensus through intensive risk dialogue before a potential nuclear power plant accident is suggested to be a promising approach to ensure a safe society and enhance subjective well-being. © The Author 2016. Published by Oxford University Press.

  11. Risk Analysis as Regulatory Science: Toward The Establishment of Standards

    Science.gov (United States)

    Murakami, Michio

    2016-01-01

    Understanding how to establish standards is essential for risk communication and also provides perspectives for further study. In this paper, the concept of risk analysis as regulatory science for the establishment of standards is demonstrated through examples of standards for evacuation and provisional regulation values in foods and drinking water. Moreover, academic needs for further studies related to standards are extracted. The concepts of the traditional ‘Standard I’, which has a paternalistic orientation, and ‘Standard II’, established through stakeholder consensus, are then systemized by introducing the current status of the new standards-related movement that developed after the Fukushima nuclear power plant accident, and the perspectives of the standards are discussed. Preparation of standards on the basis of stakeholder consensus through intensive risk dialogue before a potential nuclear power plant accident is suggested to be a promising approach to ensure a safe society and enhance subjective well-being. PMID:27475751

  12. Impact of Standardized Communication Techniques on Errors during Simulated Neonatal Resuscitation.

    Science.gov (United States)

    Yamada, Nicole K; Fuerch, Janene H; Halamek, Louis P

    2016-03-01

    Current patterns of communication in high-risk clinical situations, such as resuscitation, are imprecise and prone to error. We hypothesized that the use of standardized communication techniques would decrease the errors committed by resuscitation teams during neonatal resuscitation. In a prospective, single-blinded, matched pairs design with block randomization, 13 subjects performed as a lead resuscitator in two simulated complex neonatal resuscitations. Two nurses assisted each subject during the simulated resuscitation scenarios. In one scenario, the nurses used nonstandard communication; in the other, they used standardized communication techniques. The performance of the subjects was scored to determine errors committed (defined relative to the Neonatal Resuscitation Program algorithm), time to initiation of positive pressure ventilation (PPV), and time to initiation of chest compressions (CC). In scenarios in which subjects were exposed to standardized communication techniques, there was a trend toward decreased error rate, time to initiation of PPV, and time to initiation of CC. While not statistically significant, there was a 1.7-second improvement in time to initiation of PPV and a 7.9-second improvement in time to initiation of CC. Should these improvements in human performance be replicated in the care of real newborn infants, they could improve patient outcomes and enhance patient safety. Thieme Medical Publishers 333 Seventh Avenue, New York, NY 10001, USA.

  13. Multivariate analysis of remote LIBS spectra using partial least squares, principal component analysis, and related techniques

    Energy Technology Data Exchange (ETDEWEB)

    Clegg, Samuel M [Los Alamos National Laboratory]; Barefield, James E [Los Alamos National Laboratory]; Wiens, Roger C [Los Alamos National Laboratory]; Sklute, Elizabeth [MT HOLYOKE COLLEGE]; Dyare, Melinda D [MT HOLYOKE COLLEGE]

    2008-01-01

    Quantitative analysis with LIBS traditionally employs calibration curves that are complicated by chemical matrix effects. These chemical matrix effects influence the LIBS plasma and the ratio of elemental composition to elemental emission line intensity. Consequently, LIBS calibration typically requires a priori knowledge of the unknown, in order for a series of calibration standards similar to the unknown to be employed. In this paper, three new Multivariate Analysis (MVA) techniques are employed to analyze the LIBS spectra of 18 disparate igneous and highly-metamorphosed rock samples. Partial Least Squares (PLS) analysis is used to generate a calibration model from which unknown samples can be analyzed. Principal Components Analysis (PCA) and Soft Independent Modeling of Class Analogy (SIMCA) are employed to generate a model and predict the rock type of the samples. These MVA techniques appear to exploit the matrix effects associated with the chemistries of these 18 samples.
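
    As an illustration of the kind of multivariate pipeline described above (not the authors' implementation), a PLS calibration and a PCA projection of spectra might look like the sketch below; the array shapes, component counts, and use of scikit-learn are assumptions.

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(0)

    # Placeholder data: 18 spectra with 2048 intensity channels each, and a
    # known composition matrix (e.g., wt% of five major element oxides).
    spectra = rng.random((18, 2048))
    composition = rng.random((18, 5))

    # Partial least squares calibration: predict composition from spectra.
    pls = PLSRegression(n_components=6)
    pls.fit(spectra, composition)
    predicted = pls.predict(spectra)     # would be applied to unknowns in practice

    # Principal component analysis: project the spectra into a low-dimensional
    # space where samples of similar chemistry (rock type) cluster together.
    pca = PCA(n_components=3)
    scores = pca.fit_transform(spectra)
    print(pca.explained_variance_ratio_)
    ```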

  14. Analysis of Lipoaspirate Following Centrifugation: Wet Versus Dry Harvesting Technique.

    Science.gov (United States)

    Agostini, Tommaso; Spinelli, Giuseppe; Perello, Raffella; Bani, Daniele; Boccalini, Giulia

    2016-09-01

    The success of lipotransfer strongly depends on the harvesting, processing, and placement of the lipoaspirated samples. This study was designed to assess the histomorphometric characteristics and viability of fat harvested using different techniques (wet and dry) following centrifugation, as described by Coleman. The study enrolled 85 consecutive, nonrandomized, healthy patients from March 2010 to December 2014 (45 males and 40 females). The mean age was 40 years (range, 18-59 years), and the mean body mass index was 25.8 (range, 24-32). The authors performed a histological analysis (hematoxylin/eosin), morphometry (ImageJ 1.33 free-share image analysis software), and a viability assessment (Trypan Blue exclusion test; Sigma-Aldrich, Milan, Italy) of the lipoaspirated samples. The hematoxylin and eosin-stained sections exhibited similar features; in particular, clear-cut morphological signs of adipocyte disruption, apoptosis, or necrosis were not detected in the examined samples. Morphometry confirmed the visual findings, and the values of the mean surface area of the adipocyte vacuoles were not significantly different. Additionally, the adipocyte viability was not significantly different in the analyzed fat tissue samples. The results from this study showed, for the first time, that there is no reduction in the viability of fat grafts harvested with the dry or wet technique following centrifugation according to the Coleman technique. Both methods of fat harvesting collect viable cells, which are not influenced by standard centrifugation. The fat grafts harvested and processed by this technique could be used in clinical settings without increasing the reabsorption rate.

  15. Neutron Activation Analysis with k0 Standardization

    International Nuclear Information System (INIS)

    Pomme, S.

    1998-01-01

    SCK-CEN's programme on Neutron Activation Analysis with k0-standardisation aims to: (1) develop and implement the k0-standardisation method for NAA; (2) exploit the inherent qualities of NAA such as accuracy, traceability, and multi-element capability; (3) acquire technical spin-off for nuclear measurement services. Main achievements in 1997 are reported.

  16. Cochlear implant simulator for surgical technique analysis

    Science.gov (United States)

    Turok, Rebecca L.; Labadie, Robert F.; Wanna, George B.; Dawant, Benoit M.; Noble, Jack H.

    2014-03-01

    Cochlear Implant (CI) surgery is a procedure in which an electrode array is inserted into the cochlea. The electrode array is used to stimulate auditory nerve fibers and restore hearing for people with severe to profound hearing loss. The primary goals when placing the electrode array are to fully insert the array into the cochlea while minimizing trauma to the cochlea. Studying the relationship between surgical outcome and various surgical techniques has been difficult since trauma and electrode placement are generally unknown without histology. Our group has created a CI placement simulator that combines an interactive 3D visualization environment with a haptic-feedback-enabled controller. Surgical techniques and patient anatomy can be varied between simulations so that outcomes can be studied under varied conditions. With this system, we envision that through numerous trials we will be able to statistically analyze how outcomes relate to surgical techniques. As a first test of this system, in this work, we have designed an experiment in which we compare the spatial distribution of forces imparted to the cochlea in the array insertion procedure when using two different but commonly used surgical techniques for cochlear access, called round window and cochleostomy access. Our results suggest that CIs implanted using round window access may cause less trauma to deeper intracochlear structures than cochleostomy techniques. This result is of interest because it challenges traditional thinking in the otological community but might offer an explanation for recent anecdotal evidence that suggests that round window access techniques lead to better outcomes.

  17. Multidimensional scaling technique for analysis of magnetic storms ...

    Indian Academy of Sciences (India)

    Multidimensional scaling is a powerful technique for analysis of data. The latitudinal dependence of the geomagnetic field variation in the horizontal component (H) during magnetic storms is analysed in this paper by employing this technique.
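
    A minimal sketch of how multidimensional scaling could be applied to such data follows; the station count, the dissimilarity measure, and the use of scikit-learn are illustrative assumptions, not the authors' method.

    ```python
    import numpy as np
    from sklearn.manifold import MDS

    rng = np.random.default_rng(1)

    # Placeholder: H-component variation during a storm recorded at 10
    # observatories at different latitudes (10 stations x 72 hourly samples).
    h_variation = rng.standard_normal((10, 72))

    # Dissimilarity between stations: Euclidean distance between their
    # storm-time profiles.
    diff = h_variation[:, None, :] - h_variation[None, :, :]
    dissimilarity = np.sqrt((diff ** 2).sum(axis=-1))

    # Embed the stations in 2-D so that similar storm responses plot close together.
    mds = MDS(n_components=2, dissimilarity="precomputed", random_state=0)
    coords = mds.fit_transform(dissimilarity)
    print(coords)
    ```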

  18. Real analysis modern techniques and their applications

    CERN Document Server

    Folland, Gerald B

    1999-01-01

    An in-depth look at real analysis and its applications-now expanded and revised.This new edition of the widely used analysis book continues to cover real analysis in greater detail and at a more advanced level than most books on the subject. Encompassing several subjects that underlie much of modern analysis, the book focuses on measure and integration theory, point set topology, and the basics of functional analysis. It illustrates the use of the general theories and introduces readers to other branches of analysis such as Fourier analysis, distribution theory, and probability theory.This edi

  19. Development of sulfide calibration standards for the laser ablation inductively-coupled plasma mass spectrometry technique

    Science.gov (United States)

    Wilson, S.A.; Ridley, W.I.; Koenig, A.E.

    2002-01-01

    The requirements of standard materials for LA-ICP-MS analysis have been difficult to meet for the determination of trace elements in sulfides. We describe a method for the production of synthetic sulfides by precipitation from solution. The method is detailed by the production of approximately 200 g of a material, PS-1, with a suite of chalcophilic trace elements in an Fe-Zn-Cu-S matrix. Preliminary composition data, together with an evaluation of the homogeneity for individual elements, suggests that this type of material meets the requirements for a sulfide calibration standard that allows for quantitative analysis. Contamination of the standard with Na suggests that H2S gas may prove a better sulfur source for future experiments. We recommend that calibration data be collected in whatever mode is closest to that employed for the analysis of the unknown material, because of variable fractionation effects as a function of analytical mode. For instance, if individual spot analyses are attempted on unknown sample, then a raster of several individual spot analyses, not a continuous scan, should be collected and averaged for the standard. Hg and Au are exceptions to the above and calibration data should always be collected in a scanning mode. Au is more heterogeneously distributed than other trace metals and large-area scans are required to provide an average value for calibration purposes. We emphasize that the values given in Table 1 are preliminary values. Further chemical characterization of this standard, through a round-robin analysis program, will allow the USGS to provide both certified and recommended values for individual elements. The USGS has developed PS-1 as a potential new LA-ICP-MS standard for use by the analytical community, and requests for this material should be addressed to S. Wilson. However, it is stressed that an important aspect of the method described here is the flexibility for individual investigators to produce sulfides with a wide range

  20. Surface analysis and techniques in biology

    CERN Document Server

    Smentkowski, Vincent S

    2014-01-01

    This book highlights state-of-the-art surface analytical instrumentation, advanced data analysis tools, and the use of complementary surface analytical instrumentation to perform a complete analysis of biological systems.

  1. Recent trends in particle size analysis techniques

    Science.gov (United States)

    Kang, S. H.

    1984-01-01

    Recent advances and developments in particle-sizing technologies are briefly reviewed in terms of three operating principles, including descriptions of particle size and shape. Significant trends in recently developed particle size analysis equipment show that compact electronic circuitry and rapid data processing systems have mainly been adopted in instrument design. Some newly developed techniques for characterizing particulate systems are also introduced.

  2. Inverse Filtering Techniques in Speech Analysis | Nwachuku ...

    African Journals Online (AJOL)

    inverse filtering' has been applied. The unifying features of these techniques are presented, namely: 1. a basis in the source-filter theory of speech production, 2. the use of a network whose transfer function is the inverse of the transfer function of ...

  3. Survey of immunoassay techniques for biological analysis

    International Nuclear Information System (INIS)

    Burtis, C.A.

    1986-10-01

    Immunoassay is a very specific, sensitive, and widely applicable analytical technique. Recent advances in genetic engineering have led to the development of monoclonal antibodies, which further improve the specificity of immunoassays. Originally, radioisotopes were used to label the antigens and antibodies used in immunoassays. However, in the last decade, numerous types of immunoassays have been developed which utilize enzymes and fluorescent dyes as labels. Given the technical, safety, health, and disposal problems associated with using radioisotopes, immunoassays that utilize enzyme and fluorescent labels are rapidly replacing those using radioisotope labels. These newer techniques are as sensitive, are easily automated, have stable reagents, and do not have a disposal problem. 6 refs., 1 fig., 2 tabs

  4. Standard practice for monitoring atmospheric SO2 using the sulfation plate technique

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    1997-01-01

    1.1 This practice covers a weighted average effective SO2 level for a 30-day interval through the use of the sulfation plate method, a technique for estimating the effective SO2 content of the atmosphere, and especially with regard to the atmospheric corrosion of stationary structures or panels. This practice is aimed at determining SO2 levels rather than sulfuric acid aerosol or acid precipitation. 1.2 The results of this practice correlate approximately with volumetric SO2 concentrations, although the presence of dew or condensed moisture tends to enhance the capture of SO2 into the plate. 1.3 This standard does not purport to address all of the safety concerns, if any, associated with its use. It is the responsibility of the user of this standard to establish appropriate safety and health practices and determine the applicability of regulatory limitations prior to use.

  5. An analysis of induction motor testing techniques

    International Nuclear Information System (INIS)

    Soergel, S.

    1996-01-01

    There are two main failure mechanisms in induction motors: bearing related and stator related. The Electric Power Research Institute (EPRI) conducted a study, completed in 1985, which found that nearly 37% of all failures were attributed to stator problems. Another data source for motor failures is the Nuclear Plant Reliability Data System (NPRDS). This database reveals that approximately 55% of all motors were identified as being degraded before failure occurred. Of these, approximately 35% were due to electrical faults. These are the faults which this paper will attempt to identify through testing techniques. This paper is a discussion of the current techniques used to predict incipient failure of induction motors. In the past, the main tests were those to assess the integrity of the ground insulation. However, most insulation failures are believed to involve turn or strand insulation, which makes traditional tests alone inadequate for condition assessment. Furthermore, these tests have several limitations which need consideration when interpreting the results. This paper will concentrate on predictive maintenance techniques which detect electrical problems. It will present appropriate methods and tests, and discuss the strengths and weaknesses of each.

  6. [Analysis on standardization of patient posture for acupuncture treatment].

    Science.gov (United States)

    Lu, Yonghui

    2018-02-12

    The standardization of patient posture for acupuncture treatment was discussed. According to the opinions in Neijing (Inner Canon of Huangdi), combined with the clinical practice of acupuncture, it was believed that the patient posture for acupuncture treatment should be standardized based on Neijing. The standardized patient posture is the foundation of acupuncture, a requirement for blood flow and a requirement of acupuncture technique. The combination of these three elements is beneficial for the travelling of spirit-qi through the meridians and acupoints, which can regulate the balance of yin and yang to treat disease. In addition, the principles and methods of standardization of patient posture were proposed, and the important clinical significance of standardization of patient posture for acupuncture treatment was highlighted.

  7. Comparisons of neural networks to standard techniques for image classification and correlation

    Science.gov (United States)

    Paola, Justin D.; Schowengerdt, Robert A.

    1994-01-01

    Neural network techniques for multispectral image classification and spatial pattern detection are compared to the standard techniques of maximum-likelihood classification and spatial correlation. The neural network produced a more accurate classification than maximum-likelihood of a Landsat scene of Tucson, Arizona. Some of the errors in the maximum-likelihood classification are illustrated using decision region and class probability density plots. As expected, the main drawback to the neural network method is the long time required for the training stage. The network was trained using several different hidden layer sizes to optimize both the classification accuracy and training speed, and it was found that one node per class was optimal. The performance improved when 3x3 local windows of image data were entered into the net. This modification introduces texture into the classification without explicit calculation of a texture measure. Larger windows were successfully used for the detection of spatial features in Landsat and Magellan synthetic aperture radar imagery.
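
    A toy version of such a comparison (not the authors' code; the band count, class count, and use of scikit-learn are assumptions) could contrast a small neural network with a Gaussian maximum-likelihood classifier, which quadratic discriminant analysis closely approximates.

    ```python
    import numpy as np
    from sklearn.neural_network import MLPClassifier
    from sklearn.discriminant_analysis import QuadraticDiscriminantAnalysis
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)

    # Placeholder multispectral data: 6-band pixel vectors from 4 land-cover classes.
    n_per_class, n_bands, n_classes = 500, 6, 4
    X = np.vstack([rng.normal(loc=c, scale=1.0, size=(n_per_class, n_bands))
                   for c in range(n_classes)])
    y = np.repeat(np.arange(n_classes), n_per_class)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    # Gaussian maximum-likelihood classification (per-class Gaussian densities).
    ml = QuadraticDiscriminantAnalysis().fit(X_train, y_train)

    # Neural network with one hidden node per class, as the paper found optimal.
    nn = MLPClassifier(hidden_layer_sizes=(n_classes,), max_iter=2000,
                       random_state=0).fit(X_train, y_train)

    print("ML accuracy:", ml.score(X_test, y_test))
    print("NN accuracy:", nn.score(X_test, y_test))
    ```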

  8. Data analysis techniques for gravitational wave observations

    Indian Academy of Sciences (India)

    Astrophysical sources of gravitational waves fall broadly into three categories: (i) transient and bursts, (ii) periodic or continuous wave and (iii) stochastic. Each type of source requires a different type of data analysis strategy. In this talk various data analysis strategies will be reviewed. Optimal filtering is used for extracting ...
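
    As a concrete illustration of optimal filtering for a transient signal buried in noise, the sketch below correlates data against a known template; the waveform shape, sampling rate, and white-noise assumption are illustrative only.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    fs = 4096                              # sample rate in Hz, illustrative
    t_sig = np.arange(0, 0.25, 1 / fs)     # 0.25 s template
    template = np.exp(-t_sig * 20) * np.sin(2 * np.pi * 150 * t_sig)

    # Simulated data: 1 s of white Gaussian noise with the template injected at 0.3 s.
    data = rng.standard_normal(fs)
    start = int(0.3 * fs)
    data[start:start + template.size] += template

    # Matched filter (optimal for white noise): slide the template along the
    # data and correlate; the peak marks the most likely arrival time.
    mf = np.correlate(data, template, mode="valid") / np.sqrt(np.sum(template ** 2))
    peak = np.argmax(np.abs(mf))
    print(f"Peak matched-filter output at t = {peak / fs:.3f} s")
    ```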

  9. A Dimensionality Reduction Technique for Efficient Time Series Similarity Analysis

    Science.gov (United States)

    Wang, Qiang; Megalooikonomou, Vasileios

    2008-01-01

    We propose a dimensionality reduction technique for time series analysis that significantly improves the efficiency and accuracy of similarity searches. In contrast to piecewise constant approximation (PCA) techniques that approximate each time series with constant value segments, the proposed method--Piecewise Vector Quantized Approximation--uses the closest (based on a distance measure) codeword from a codebook of key-sequences to represent each segment. The new representation is symbolic and it allows for the application of text-based retrieval techniques into time series similarity analysis. Experiments on real and simulated datasets show that the proposed technique generally outperforms PCA techniques in clustering and similarity searches. PMID:18496587

  10. A Dimensionality Reduction Technique for Efficient Time Series Similarity Analysis.

    Science.gov (United States)

    Wang, Qiang; Megalooikonomou, Vasileios

    2008-03-01

    We propose a dimensionality reduction technique for time series analysis that significantly improves the efficiency and accuracy of similarity searches. In contrast to piecewise constant approximation (PCA) techniques that approximate each time series with constant value segments, the proposed method--Piecewise Vector Quantized Approximation--uses the closest (based on a distance measure) codeword from a codebook of key-sequences to represent each segment. The new representation is symbolic and it allows for the application of text-based retrieval techniques into time series similarity analysis. Experiments on real and simulated datasets show that the proposed technique generally outperforms PCA techniques in clustering and similarity searches.
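
    A rough sketch of the vector-quantized piecewise representation idea follows; it illustrates the concept rather than reproducing the authors' algorithm, and the segment length, codebook size, and k-means codebook are assumptions.

    ```python
    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(0)
    seg_len, n_codewords = 16, 32

    # Placeholder training set: 200 time series of length 256 (random walks).
    series = np.cumsum(rng.standard_normal((200, 256)), axis=1)

    # Build a codebook of key-sequences from all training segments.
    segments = series.reshape(-1, seg_len)
    codebook = KMeans(n_clusters=n_codewords, n_init=10, random_state=0).fit(segments)

    def encode(ts):
        """Replace each segment of a series by the index of its closest
        codeword, giving a short symbolic representation."""
        return codebook.predict(ts.reshape(-1, seg_len))

    def approx_distance(code_a, code_b):
        """Distance between two encoded series, computed on codewords only."""
        centers = codebook.cluster_centers_
        return np.sqrt(np.sum((centers[code_a] - centers[code_b]) ** 2))

    a, b = encode(series[0]), encode(series[1])
    print(a[:8], approx_distance(a, b))
    ```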

  11. Trend Filtering Techniques for Time Series Analysis

    OpenAIRE

    López Arias, Daniel

    2016-01-01

    Time series can be found almost everywhere in our lives, so being able to analyse them is an important task. Most of the time series we encounter are quite noisy, and this noise is one of the main obstacles to extracting information from them. In this work we use trend filtering techniques to try to remove this noise from a series and to understand its underlying trend, which gives us information about the behaviour of the series aside from the particular...
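
    One common member of this family of methods penalizes the second differences of the estimated trend (the Hodrick-Prescott filter). Whether the thesis uses exactly this variant is not stated, so the sketch below is only an example of the general idea.

    ```python
    import numpy as np

    def hp_trend(y, lam=1600.0):
        """Hodrick-Prescott trend: minimize ||y - x||^2 + lam * ||D2 x||^2,
        where D2 is the second-difference operator; solved in closed form."""
        n = y.size
        eye = np.eye(n)
        d2 = np.diff(eye, n=2, axis=0)          # (n-2) x n second-difference matrix
        return np.linalg.solve(eye + lam * d2.T @ d2, y)

    # Noisy example series: a smooth trend plus white noise.
    rng = np.random.default_rng(0)
    t = np.linspace(0, 4 * np.pi, 300)
    y = np.sin(t) + 0.1 * t + 0.4 * rng.standard_normal(t.size)

    trend = hp_trend(y, lam=400.0)
    print(trend[:5])
    ```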

  12. Techniques for Intelligence Analysis of Networks

    National Research Council Canada - National Science Library

    Cares, Jeffrey R

    2005-01-01

    ...) there are significant intelligence analysis manifestations of these properties; and (4) a more satisfying theory of Networked Competition than currently exists for NCW/NCO is emerging from this research...

  13. Standardization: using comparative maintenance costs in an economic analysis

    OpenAIRE

    Clark, Roger Nelson

    1987-01-01

    Approved for public release; distribution is unlimited This thesis investigates the use of comparative maintenance costs of functionally interchangeable equipments in similar U.S. Navy shipboard applications in an economic analysis of standardization. The economics of standardization, life-cycle costing, and the Navy 3-M System are discussed in general. An analysis of 3-M System maintenance costs for a selected equipment, diesel engines, is conducted. The potential use of comparative ma...

  14. Dynamic speckle analysis using multivariate techniques

    International Nuclear Information System (INIS)

    López-Alonso, José M; Alda, Javier; Rabal, Héctor; Grumel, Eduardo; Trivi, Marcelo

    2015-01-01

    In this work we use principal components analysis to characterize dynamic speckle patterns. This analysis quantitatively identifies different dynamics that could be associated to physical phenomena occurring in the sample. We also found the contribution explained by each principal component, or by a group of them. The method analyzes the paint drying process over a hidden topography. It can be used for fast screening and identification of different dynamics in biological or industrial samples by means of dynamic speckle interferometry. (paper)
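
    As a purely illustrative sketch of applying principal components analysis to a dynamic speckle sequence (frame size, frame count, and the scikit-learn implementation are assumptions, not the authors' setup):

    ```python
    import numpy as np
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(0)

    # Placeholder stack of 200 speckle frames, 64 x 64 pixels each.
    frames = rng.random((200, 64, 64))

    # Treat each pixel's temporal evolution as one observation:
    # rows = pixels, columns = time samples.
    X = frames.reshape(200, -1).T               # shape (4096, 200)

    pca = PCA(n_components=5)
    scores = pca.fit_transform(X)               # per-pixel activity in each component

    # Maps of the pixels dominated by each dynamic regime, plus the fraction of
    # total variance explained by each component (or a group of them).
    activity_maps = scores.T.reshape(5, 64, 64)
    print(pca.explained_variance_ratio_)
    ```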

  15. Advanced Imaging Techniques for Multiphase Flows Analysis

    Science.gov (United States)

    Amoresano, A.; Langella, G.; Di Santo, M.; Iodice, P.

    2017-08-01

    Advanced numerical techniques, such as fuzzy logic and neural networks, have been applied in this work to digital images acquired in two applications, a centrifugal pump and a stationary spray, in order to define the gas-liquid interface evolution in a stochastic way. Starting from the numeric matrix representing the image, it is possible to characterize geometrical parameters and the time evolution of the jet. The algorithm works with the fuzzy logic concept to binarize the chromaticity of the pixels, exploiting the difference in light scattering between the gas and the liquid phase. Starting from a primary fixed threshold, the applied technique can separate the ‘gas’ pixels from the ‘liquid’ pixels, so that the first, most probable boundary lines of the spray can be defined. By acquiring the images continuously at a fixed frame rate, a finer threshold can be selected and, in the limit, the most probable geometrical parameters of the jet can be detected.
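
    A minimal sketch of the thresholding idea described above; the membership function, threshold values, and image source are illustrative assumptions rather than the authors' algorithm.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Placeholder grey-level image of a spray (bright pixels = liquid, dark = gas).
    image = rng.integers(0, 256, size=(128, 128)).astype(float)

    def liquid_membership(img, low=80.0, high=170.0):
        """Fuzzy membership of each pixel in the 'liquid' class: 0 below `low`,
        1 above `high`, linear in between."""
        return np.clip((img - low) / (high - low), 0.0, 1.0)

    membership = liquid_membership(image)

    # Crisp binarization at a primary threshold of 0.5; the boundary between the
    # resulting regions approximates the gas-liquid interface of the jet.
    liquid_mask = membership >= 0.5
    print("liquid fraction:", liquid_mask.mean())
    ```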

  16. Uncertainty analysis technique for OMEGA Dante measurements

    Science.gov (United States)

    May, M. J.; Widmann, K.; Sorce, C.; Park, H.-S.; Schneider, M.

    2010-10-01

    The Dante is an 18 channel x-ray filtered diode array which records the spectrally and temporally resolved radiation flux from various targets (e.g., hohlraums, etc.) at x-ray energies between 50 eV and 10 keV. It is a main diagnostic installed on the OMEGA laser facility at the Laboratory for Laser Energetics, University of Rochester. The absolute flux is determined from the photometric calibration of the x-ray diodes, filters and mirrors, and an unfold algorithm. Understanding the errors on this absolute measurement is critical for understanding hohlraum energetic physics. We present a new method for quantifying the uncertainties on the determined flux using a Monte Carlo parameter variation technique. This technique combines the uncertainties in both the unfold algorithm and the error from the absolute calibration of each channel into a one sigma Gaussian error function. One thousand test voltage sets are created using these error functions and processed by the unfold algorithm to produce individual spectra and fluxes. Statistical methods are applied to the resultant set of fluxes to estimate error bars on the measurements.

  17. Uncertainty analysis technique for OMEGA Dante measurements

    International Nuclear Information System (INIS)

    May, M. J.; Widmann, K.; Sorce, C.; Park, H.-S.; Schneider, M.

    2010-01-01

    The Dante is an 18 channel x-ray filtered diode array which records the spectrally and temporally resolved radiation flux from various targets (e.g., hohlraums, etc.) at x-ray energies between 50 eV and 10 keV. It is a main diagnostic installed on the OMEGA laser facility at the Laboratory for Laser Energetics, University of Rochester. The absolute flux is determined from the photometric calibration of the x-ray diodes, filters and mirrors, and an unfold algorithm. Understanding the errors on this absolute measurement is critical for understanding hohlraum energetic physics. We present a new method for quantifying the uncertainties on the determined flux using a Monte Carlo parameter variation technique. This technique combines the uncertainties in both the unfold algorithm and the error from the absolute calibration of each channel into a one sigma Gaussian error function. One thousand test voltage sets are created using these error functions and processed by the unfold algorithm to produce individual spectra and fluxes. Statistical methods are applied to the resultant set of fluxes to estimate error bars on the measurements.

  18. Uncertainty Analysis Technique for OMEGA Dante Measurements

    International Nuclear Information System (INIS)

    May, M.J.; Widmann, K.; Sorce, C.; Park, H.; Schneider, M.

    2010-01-01

    The Dante is an 18 channel X-ray filtered diode array which records the spectrally and temporally resolved radiation flux from various targets (e.g. hohlraums, etc.) at X-ray energies between 50 eV and 10 keV. It is a main diagnostic installed on the OMEGA laser facility at the Laboratory for Laser Energetics, University of Rochester. The absolute flux is determined from the photometric calibration of the X-ray diodes, filters and mirrors and an unfold algorithm. Understanding the errors on this absolute measurement is critical for understanding hohlraum energetic physics. We present a new method for quantifying the uncertainties on the determined flux using a Monte-Carlo parameter variation technique. This technique combines the uncertainties in both the unfold algorithm and the error from the absolute calibration of each channel into a one sigma Gaussian error function. One thousand test voltage sets are created using these error functions and processed by the unfold algorithm to produce individual spectra and fluxes. Statistical methods are applied to the resultant set of fluxes to estimate error bars on the measurements.
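
    The Monte Carlo parameter-variation idea can be illustrated with a much-simplified stand-in for the unfold step; the channel count matches the text, but the error sizes and the toy "unfold" below are placeholders, not the Dante algorithm.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n_channels, n_trials = 18, 1000

    # Nominal channel voltages and their one-sigma fractional calibration errors.
    v_nominal = rng.uniform(0.5, 2.0, n_channels)
    calib_sigma = np.full(n_channels, 0.05)          # assumed 5% per channel

    def unfold(voltages):
        """Stand-in for the spectral unfold: a weighted sum yielding a flux."""
        weights = np.linspace(1.0, 2.0, voltages.size)
        return np.trapz(weights * voltages)

    # Create test voltage sets by perturbing each channel within its error
    # function, run each set through the unfold, and collect the fluxes.
    fluxes = np.array([
        unfold(v_nominal * (1.0 + calib_sigma * rng.standard_normal(n_channels)))
        for _ in range(n_trials)
    ])

    print(f"flux = {fluxes.mean():.3f} +/- {fluxes.std():.3f}")
    ```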

  19. 48 CFR 215.404-1 - Proposal analysis techniques.

    Science.gov (United States)

    2010-10-01

    ... 48 Federal Acquisition Regulations System 3 2010-10-01 2010-10-01 false Proposal analysis techniques. 215.404-1 Section 215.404-1 Federal Acquisition Regulations System DEFENSE ACQUISITION... Contract Pricing 215.404-1 Proposal analysis techniques. (1) Follow the procedures at PGI 215.404-1 for...

  20. 48 CFR 15.404-1 - Proposal analysis techniques.

    Science.gov (United States)

    2010-10-01

    ... ensure a fair and reasonable price. Examples of such techniques include, but are not limited to, the... to the cost or price analysis of the service or product being proposed should also be included in the... techniques. (a) General. The objective of proposal analysis is to ensure that the final agreed-to price is...

  1. Outcome of a standardized technique of preputial preservation surgery for phimosis: A single institutional experience

    Directory of Open Access Journals (Sweden)

    Kamalesh Pal

    2014-01-01

    Full Text Available Introduction: Pathological phimosis or preputial stenosis is a distressing problem in children, leading to recurrent balanoposthitis, ballooning of the prepuce, and rarely back-pressure changes in the urinary tract. Circumcision has been the standard of care for such situations, although recently various alternatives to circumcision have been reported in the literature. Most of these techniques are complex and are met with poor acceptance. Moreover, besides personal preferences (in Hindus), the advantages of preputial preservation are increasingly being recognized. Materials and Methods: A prospective study was carried out in which a simple standardized technique of preputioplasty (dorsal slit) was used in 40 pediatric preputial stenosis cases. The outcome of this procedure, including cosmesis and parental satisfaction, was evaluated. Results: The average duration of the procedure was 10 to 25 min with no intraoperative complications. The cosmetic outcome was good in 62.5%, satisfactory in 30%, and poor in 7.5% of cases. All of the boys had a retractable prepuce with no functional problems. There was 100% parental satisfaction. None of the patients required a redo procedure or circumcision. Conclusion: A dorsal slit of adequate length, i.e., one-third of the length from the corona to the tip, leads to a satisfactory cosmetic outcome in more than 92% of cases. Preputioplasty is a safe and simple alternative to the more radical procedure of circumcision.

  2. European standardization activities on residual stress analysis by neutron diffraction

    International Nuclear Information System (INIS)

    Youtsos, A.G.; Ohms, C.

    2002-01-01

    A main objective of a recently completed European research project, RESTAND - residual stress standard using neutron diffraction, was to develop industrial confidence in the application of the neutron-diffraction technique for residual stress measurement and its principal deliverable was a relevant draft code of practice. In fact this draft standard was jointly developed within RESTAND and VAMAS TWA 20 - an international pre-normative research activity. As no such standard is yet available, on the basis of this draft standard document the European Standards Committee on Non-Destructive Testing (CEN TC/138) has established a new ad hoc Work Group (AHG7). The objective of this group is the development of a European pre-standard on a 'test method for measurement of residual stress by neutron diffraction'. The document contains the proposed protocol for making the measurements. It includes the scope of the method, an outline of the technique, the calibration and measurement procedures recommended, and details of how the strain data should be analysed to calculate stresses and establish the reliability of the results obtained. (orig.)

  3. European standardization activities on residual stress analysis by neutron diffraction

    Science.gov (United States)

    Youtsos, A. G.; Ohms, C.

    A main objective of a recently completed European research project, RESTAND - residual stress standard using neutron diffraction, was to develop industrial confidence in the application of the neutron-diffraction technique for residual stress measurement and its principal deliverable was a relevant draft code of practice. In fact this draft standard was jointly developed within RESTAND and VAMAS TWA 20 - an international pre-normative research activity. As no such standard is yet available, on the basis of this draft standard document the European Standards Committee on Non-Destructive Testing (CEN TC/138) has established a new ad hoc Work Group (AHG7). The objective of this group is the development of a European pre-standard on a `test method for measurement of residual stress by neutron diffraction'. The document contains the proposed protocol for making the measurements. It includes the scope of the method, an outline of the technique, the calibration and measurement procedures recommended, and details of how the strain data should be analysed to calculate stresses and establish the reliability of the results obtained.

  4. Preliminary results of standard quantitative analysis by ED-XRF

    International Nuclear Information System (INIS)

    Lara, Alessandro L. de; Franca, Alana C.; Neto, Manoel R.M.; Paschuk, Sergei A.

    2013-01-01

    A comparison was performed between the elemental concentrations given by the XRS-FP software, using data obtained by the EDXRF technique, and those obtained by stoichiometric calculation. For this purpose, five standard samples of known compounds were produced: two of lead oxide, plus magnesium chloride and iodine in controlled amounts. The compounds were subsequently mixed and compressed to form tablets. The samples were irradiated at three points each, according to a set orientation. The measurements were performed at the Radiological Laboratory of UTFPR using the Amptek X-123SDD detector and an X-ray tube with a silver target from the same manufacturer. The tube operating conditions were a 5 μA current at a 40 kV voltage. Finally, the 15 spectra were analyzed with the software to determine the concentrations of chlorine, iodine and lead. Data from this analysis were compared with the results expected from the stoichiometric calculations. The data provided by the program showed a convergence of results, indicating homogeneity of the samples. Compared with the stoichiometric calculation, a considerable discrepancy was found, which may be the result of a misconfiguration or contamination of the sample. Finally, a proposal for continuing the work was created, in which an auxiliary calculation should be developed in the next step.

  5. Comparison of four different techniques to evaluate the elastic properties of phantom in elastography: is there a gold standard?

    Science.gov (United States)

    Oudry, Jennifer; Lynch, Ted; Vappou, Jonathan; Sandrin, Laurent; Miette, Véronique

    2014-10-07

    Elastographic techniques used in addition to imaging techniques (ultrasound, resonance magnetic or optical) provide new clinical information on the pathological state of soft tissues. However, system-dependent variation in elastographic measurements may limit the clinical utility of these measurements by introducing uncertainty into the measurement. This work is aimed at showing differences in the evaluation of the elastic properties of phantoms performed by four different techniques: quasi-static compression, dynamic mechanical analysis, vibration-controlled transient elastography and hyper-frequency viscoelastic spectroscopy. Four Zerdine® gel materials were tested and formulated to yield a Young's modulus over the range of normal and cirrhotic liver stiffnesses. The Young's modulus and the shear wave speed obtained with each technique were compared. Results suggest a bias in elastic property measurement which varies with systems and highlight the difficulty in finding a reference method to determine and assess the elastic properties of tissue-mimicking materials. Additional studies are needed to determine the source of this variation, and control for them so that accurate, reproducible reference standards can be made for the absolute measurement of soft tissue elasticity.
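
    For context, elastography commonly converts shear wave speed to Young's modulus assuming an incompressible, isotropic, purely elastic medium (E = 3ρc_s²); whether each of the four techniques applies exactly this relation is an assumption here, and the speeds below are arbitrary examples.

    ```python
    def youngs_modulus_kpa(shear_wave_speed_m_s, density_kg_m3=1000.0):
        """Young's modulus in kPa from shear wave speed, assuming an
        incompressible, isotropic, purely elastic medium: E = 3 * rho * c_s**2."""
        return 3.0 * density_kg_m3 * shear_wave_speed_m_s ** 2 / 1000.0

    # Example: shear wave speeds spanning roughly normal to cirrhotic liver stiffness.
    for c in (1.0, 1.5, 2.5, 3.5):
        print(f"c_s = {c:.1f} m/s  ->  E = {youngs_modulus_kpa(c):.1f} kPa")
    ```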

  6. Comparison of ankle-brachial index measured by an automated oscillometric apparatus with that by standard Doppler technique in vascular patients

    DEFF Research Database (Denmark)

    Korno, M.; Eldrup, N.; Sillesen, H.

    2009-01-01

    was calculated twice using both methods on both legs. MATERIALS AND METHODS: We tested the automated oscillometric blood pressure device, CASMED 740, for measuring ankle and arm blood pressure and compared it with the current gold standard, the hand-held Doppler technique, using Bland-Altman analysis. RESULTS: Using the Doppler-derived ABI as the gold standard, the sensitivity and specificity of the oscillometric method for determining an ABI ...

  7. Analysis of Jordanian Cigarettes Using XRF Techniques

    International Nuclear Information System (INIS)

    Kullab, M.; Ismail, A.; AL-kofahi, M.

    2002-01-01

    Sixteen brands of Jordanian cigarettes were analyzed using X-ray fluorescence (XRF) techniques. These cigarettes were found to contain the elements Si, S, Cl, K, Ca, P, Ti, Mn, Fe, Cu, Zn, Br, Rb and Sr. The major elements, with concentrations of more than 1% by weight, were Cl, K and Ca. The elements with minor concentrations, between 0.1 and 1% by weight, were Si, S and P. The trace elements with concentrations below 0.1% by weight were Ti, Mn, Fe, Cu, Zn, Br, Rb and Sr. The toxicity of some trace elements, like Br, Rb and Sr, which are present in some brands of Jordanian cigarettes, is discussed. (Author's) 24 refs., 1 tab., 1 fig

  8. Establishing working standards of chromosome aberrations analysis for biological dosimetry

    International Nuclear Information System (INIS)

    Bui Thi Kim Luyen; Tran Que; Pham Ngoc Duy; Nguyen Thi Kim Anh; Ha Thi Ngoc Lien

    2015-01-01

    Biological dosimetry is a dose assessment method using specific biomarkers of radiation. The IAEA (International Atomic Energy Agency) and ISO (International Organization for Standardization) have defined the dicentric chromosome as specific to radiation, and it is a gold standard for biodosimetry. Along with the documents published by IAEA, WHO, ISO and OECD, our results on the chromosome aberrations induced by radiation were organized systematically into nine standards dealing with the chromosome aberration test and the micronucleus test in human peripheral blood lymphocytes in vitro. These standards address: the reference dose-effect relationship for dose estimation, the minimum detection levels, cell culture, slide preparation, the scoring procedure for chromosome aberrations used for biodosimetry, the criteria for converting aberration frequency into absorbed dose, and the reporting of results. Following these standards, the automatic analysis devices were calibrated to improve the biological dosimetry method. These standards will be used to acquire and maintain accreditation of the Biological Dosimetry laboratory at the Nuclear Research Institute. (author)

  9. Evaluating Dynamic Analysis Techniques for Program Comprehension

    NARCIS (Netherlands)

    Cornelissen, S.G.M.

    2009-01-01

    Program comprehension is an essential part of software development and software maintenance, as software must be sufficiently understood before it can be properly modified. One of the common approaches in getting to understand a program is the study of its execution, also known as dynamic analysis.

  10. INVERSE FILTERING TECHNIQUES IN SPEECH ANALYSIS

    African Journals Online (AJOL)

    Dr Obe

    features in the speech process: (i) the resonant structure of the vocal-tract transfer function, i.e., formant analysis, (ii) the glottal wave, (iii) the fundamental frequency or pitch of the sound. During the production of speech, the configuration of the articulators: the vocal tract, tongue, teeth, lips, etc., changes from one sound to...

  11. Microstructure analysis using SAXS/USAXS techniques

    International Nuclear Information System (INIS)

    Okuda, Hiroshi; Ochiai, Shojiro

    2010-01-01

    An introduction to small-angle X-ray scattering (SAXS) and ultra small-angle X-ray scattering (USAXS) is presented. SAXS is useful for microstructure analysis of age-hardenable alloys containing precipitates from several to several tens of nanometers in size. On the other hand, USAXS is appropriate for examining much larger microstructural heterogeneities, such as inclusions, voids, and large precipitates whose size is typically around one micrometer. By combining these two scattering methods, and sometimes also diffraction, it is possible to assess the hierarchical structure of the samples in-situ and nondestructively, ranging from phase identification and quantitative analysis of precipitation structures up to their mesoscopic aggregates, large voids and inclusions. From a technical viewpoint, USAXS requires some specific instrumentation for its optics. However, once a reasonable measurement has been made, the analysis of the intensity is the same as that for conventional SAXS. In the present article, a short introduction to conventional SAXS is presented, and the analysis is then applied to a couple of USAXS data sets obtained for well-defined oxide particles whose average diameters are expected to be about 0.3 micrometers. (author)

  12. Closing the gap: accelerating the translational process in nanomedicine by proposing standardized characterization techniques

    Science.gov (United States)

    Khorasani, Ali A; Weaver, James L; Salvador-Morales, Carolina

    2014-01-01

    On the cusp of widespread permeation of nanomedicine, academia, industry, and government have invested substantial financial resources in developing new ways to better treat diseases. Materials have unique physical and chemical properties at the nanoscale compared with their bulk or small-molecule analogs. These unique properties have been greatly advantageous in providing innovative solutions for medical treatments at the bench level. However, nanomedicine research has not yet fully permeated the clinical setting because of several limitations. Among these limitations are the lack of universal standards for characterizing nanomaterials and the limited knowledge that we possess regarding the interactions between nanomaterials and biological entities such as proteins. In this review, we report on recent developments in the characterization of nanomaterials as well as the newest information about the interactions between nanomaterials and proteins in the human body. We propose a standard set of techniques for universal characterization of nanomaterials. We also address relevant regulatory issues involved in the translational process for the development of drug molecules and drug delivery systems. Adherence and refinement of a universal standard in nanomaterial characterization as well as the acquisition of a deeper understanding of nanomaterials and proteins will likely accelerate the use of nanomedicine in common practice to a great extent. PMID:25525356

  13. Study of the standard direct costs of various techniques of advanced endoscopy. Comparison with surgical alternatives.

    Science.gov (United States)

    Loras, Carme; Mayor, Vicenç; Fernández-Bañares, Fernando; Esteve, Maria

    2018-03-12

    The increasing complexity of endoscopy has brought an increase in cost that has a direct effect on healthcare systems. However, few studies have analyzed the cost of advanced endoscopic procedures (AEP). The aims were to calculate the standard direct costs of AEP and to make a financial comparison with their surgical alternatives. The standard direct cost of carrying out each procedure was calculated. An endoscopist detailed the time, personnel, materials, consumables, recovery room time, stents, pathology and medication used. The cost of surgical procedures was the average cost recorded in the hospital. Thirty-eight AEP were analyzed. The technique showing the lowest cost was gastroscopy + APC (€116.57), while that with the greatest cost was ERCP with cholangioscopy + stent placement (€5083.65). Some 34.2% of the procedures registered average costs of €1000-2000. In 57% of cases the endoscopic alternative was 2-5 times more cost-efficient than surgery; in 31% of cases it was indistinguishable from, or up to 1.4 times more costly than, surgery. The standard direct cost of the majority of AEP is reported using a methodology that enables easy application in other centers. For the most part, endoscopic procedures are more cost-efficient than the corresponding surgical procedure. Copyright © 2018 The Author(s). Published by Elsevier Ltd. All rights reserved.

  14. Closing the gap: accelerating the translational process in nanomedicine by proposing standardized characterization techniques.

    Science.gov (United States)

    Khorasani, Ali A; Weaver, James L; Salvador-Morales, Carolina

    2014-01-01

    On the cusp of widespread permeation of nanomedicine, academia, industry, and government have invested substantial financial resources in developing new ways to better treat diseases. Materials have unique physical and chemical properties at the nanoscale compared with their bulk or small-molecule analogs. These unique properties have been greatly advantageous in providing innovative solutions for medical treatments at the bench level. However, nanomedicine research has not yet fully permeated the clinical setting because of several limitations. Among these limitations are the lack of universal standards for characterizing nanomaterials and the limited knowledge that we possess regarding the interactions between nanomaterials and biological entities such as proteins. In this review, we report on recent developments in the characterization of nanomaterials as well as the newest information about the interactions between nanomaterials and proteins in the human body. We propose a standard set of techniques for universal characterization of nanomaterials. We also address relevant regulatory issues involved in the translational process for the development of drug molecules and drug delivery systems. Adherence and refinement of a universal standard in nanomaterial characterization as well as the acquisition of a deeper understanding of nanomaterials and proteins will likely accelerate the use of nanomedicine in common practice to a great extent.

  15. Standard Compliant Hazard and Threat Analysis for the Automotive Domain

    Directory of Open Access Journals (Sweden)

    Kristian Beckers

    2016-06-01

    Full Text Available The automotive industry has successfully collaborated to release the ISO 26262 standard for developing safe software for cars. The standard describes in detail how to conduct hazard analysis and risk assessments to determine the necessary safety measures for each feature. However, the standard does not concern threat analysis for malicious attackers or how to select appropriate security countermeasures. We propose the application of ISO 27001 for this purpose and show how it can be applied together with ISO 26262. We show how ISO 26262 documentation can be re-used and enhanced to satisfy the analysis and documentation demands of the ISO 27001 standard. We illustrate our approach based on an electronic steering column lock system.

  16. Open Partial Nephrectomy in Renal Cancer: A Feasible Gold Standard Technique in All Hospitals

    Directory of Open Access Journals (Sweden)

    J. M. Cozar

    2008-01-01

    Full Text Available Introduction. Partial nephrectomy (PN) is playing an increasingly important role in localized renal cell carcinoma (RCC) as a true alternative to radical nephrectomy. With the greater experience and expertise of surgical teams, it has become an alternative to radical nephrectomy in young patients when the tumor diameter is 4 cm or less in almost all hospitals since cancer-specific survival outcomes are similar to those obtained with radical nephrectomy. Materials and Methods. The authors comment on their own experience and review the literature, reporting current indications and outcomes including complications. The surgical technique of open partial nephrectomy is outlined. Conclusions. Nowadays, open PN is the gold standard technique to treat small renal masses, and all nonablative techniques must pass the test of time to be compared to PN. It is not ethical for patients to undergo radical surgery just because the urologists involved do not have adequate experience with PN. Patients should be involved in the final treatment decision and, when appropriate, referred to specialized centers with experience in open or laparoscopic partial nephrectomies.

  17. MATLAB Toolboxes for Reference Electrode Standardization Technique (REST) of Scalp EEG.

    Science.gov (United States)

    Dong, Li; Li, Fali; Liu, Qiang; Wen, Xin; Lai, Yongxiu; Xu, Peng; Yao, Dezhong

    2017-01-01

    Reference electrode standardization technique (REST) has been increasingly acknowledged and applied as a re-referencing technique to transform actual multi-channel recordings to approximately zero-reference ones in the electroencephalography/event-related potentials (EEG/ERPs) community around the world in recent years. However, an easy-to-use toolbox for re-referencing scalp EEG data to the zero reference has been lacking. We have therefore developed two open-source MATLAB toolboxes for REST of scalp EEG. One version of REST is closely integrated into EEGLAB, which is a popular MATLAB toolbox for processing EEG data; the other is a batch version intended to be more convenient and efficient for experienced users. Both are designed to be easy to use for novice researchers while offering flexibility for experienced researchers. All versions of the REST toolboxes can be freely downloaded at http://www.neuro.uestc.edu.cn/rest/Down.html, and detailed information including publications, comments and documents on REST can also be found on this website. An example of usage is given with comparative results of REST and average reference. We hope these user-friendly REST toolboxes will make the relatively novel technique of REST easier to study, especially for applications in various EEG studies.

  18. 10th Australian conference on nuclear techniques of analysis. Proceedings

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-06-01

    These proceedings contain abstracts and extended abstracts of 80 lectures and posters presented at the 10th Australian conference on nuclear techniques of analysis hosted by the Australian National University in Canberra, Australia from 24-26 November 1997. The conference was divided into sessions on the following topics: ion beam analysis and its applications; surface science; novel nuclear techniques of analysis; characterization of thin films; electronic and optoelectronic materials formed by ion implantation; nanometre science and technology; plasma science and technology. A special session was dedicated to new nuclear techniques of analysis, future trends and developments. Separate abstracts were prepared for the individual presentations included in this volume.

  19. 10th Australian conference on nuclear techniques of analysis. Proceedings

    International Nuclear Information System (INIS)

    1998-01-01

    These proceedings contain abstracts and extended abstracts of 80 lectures and posters presented at the 10th Australian conference on nuclear techniques of analysis hosted by the Australian National University in Canberra, Australia from 24-26 November 1997. The conference was divided into sessions on the following topics: ion beam analysis and its applications; surface science; novel nuclear techniques of analysis; characterization of thin films; electronic and optoelectronic materials formed by ion implantation; nanometre science and technology; plasma science and technology. A special session was dedicated to new nuclear techniques of analysis, future trends and developments. Separate abstracts were prepared for the individual presentations included in this volume.

  20. A methodological comparison of customer service analysis techniques

    Science.gov (United States)

    James Absher; Alan Graefe; Robert Burns

    2003-01-01

    Techniques used to analyze customer service data need to be studied. Two primary analysis protocols, importance-performance analysis (IP) and gap score analysis (GA), are compared side by side using data from two major customer service research projects. A central concern is what, if any, conclusion might be different due solely to the analysis...
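
    For readers unfamiliar with the two protocols, the following minimal sketch shows how gap scores and importance-performance quadrants are typically computed from respondent rating matrices; the quadrant labels and the use of grand means as crosshairs are conventional choices, not details taken from this study.

```python
import numpy as np

def gap_scores(importance, performance):
    """Gap score analysis: mean performance minus mean importance per attribute."""
    imp = np.asarray(importance, dtype=float)
    perf = np.asarray(performance, dtype=float)
    return perf.mean(axis=0) - imp.mean(axis=0)

def ip_quadrants(importance, performance):
    """Importance-performance analysis: place each attribute in a quadrant
    defined by the grand means of importance and performance."""
    imp = np.asarray(importance, dtype=float).mean(axis=0)
    perf = np.asarray(performance, dtype=float).mean(axis=0)
    imp_cut, perf_cut = imp.mean(), perf.mean()   # crosshairs of the IP grid
    labels = []
    for i, p in zip(imp, perf):
        if i >= imp_cut and p >= perf_cut:
            labels.append("keep up the good work")
        elif i >= imp_cut:
            labels.append("concentrate here")
        elif p >= perf_cut:
            labels.append("possible overkill")
        else:
            labels.append("low priority")
    return labels
```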

  1. Evaluation of Visualization Using a 50/50 (Contrast Media/Glucose 5% Solution) Technique for Radioembolization as an Alternative to a Standard Sandwich Technique.

    Science.gov (United States)

    Paprottka, Karolin J; Todica, Andrei; Ilhan, Harun; Rübenthaler, Johannes; Schoeppe, Franziska; Michl, Marlies; Heinemann, Volker; Bartenstein, Peter; Reiser, Maximilian F; Paprottka, Philipp M

    2017-11-01

    Radioembolization (RE) with yttrium-90 (90Y) resin microspheres generally employs a sandwich technique with separate sequential administration of contrast medium (CM), followed by vehicle (e.g., glucose 5% [G5] solution), then 90Y resin microspheres (in G5), then G5, and then CM again, to avoid contact of CM and microspheres under fluoroscopic guidance. This study evaluates the visualization quality and safety of a modified sandwich technique with a 50/50 mixture of CM (Imeron 300) and G5 for administration of 90Y resin microspheres. A retrospective analysis of 81 RE procedures in patients with primary or secondary liver tumors was performed. The quality of angiographic visualization of the hepatic vessels was assessed before the first injection and immediately before the whole dose had been injected. Visualization and flow rate were graded on a 5-point scale: 1 = very good to 5 = not visible/no antegrade flow. Univariate logistic regression models and multiple linear regression models were used to evaluate the prognostic variables associated with visualization and flow scores. Visualization quality was inversely related to flow rate: the lower the flow rate, the better the grade of visualization. Visualization quality was also inversely related to body mass index (BMI). Performing RE with the 50/50 CM/G5 mixture resulted in a mean injection time of 15 min for 1 GBq. No clinically significant adverse events, including radiation-induced liver disease, were reported. RE with a 50/50 mixture of CM and G5 for administration of 90Y resin microspheres in a modified sandwich technique is a safe administration alternative and provides good visualization of hepatic vessels, which is inversely dependent on flow rate and BMI. Injection time was reduced compared with our experience with the standard sandwich technique.

  2. Nuclear techniques for analysis of environmental samples

    International Nuclear Information System (INIS)

    1986-12-01

    The main purposes of this meeting were to establish the state-of-the-art in the field, to identify new research and development that is required to provide an adequate framework for analysis of environmental samples and to assess needs and possibilities for international cooperation in problem areas. This technical report was prepared on the subject based on the contributions made by the participants. A separate abstract was prepared for each of the 9 papers

  3. Concrete blocks. Analysis of UNE, ISO en standards and comparison with other international standards

    Directory of Open Access Journals (Sweden)

    Álvarez Alonso, Marina

    1990-12-01

    This paper attempts to describe the recently approved UNE standards through a systematic analysis of the main specifications therein contained and the values considered for each of them, as well as the drafts for the ISO and EN concrete block standards. Furthermore, the study tries to place the set of UNE standards in the international environment through a comparative analysis against a representative sample of the standards prevailing in various geographical regions of the globe, to determine the analogies and differences among them. KEY WORDS: masonry, system analysis, concrete blocks, masonry walls, standards

    This work describes the recently approved UNE standards, systematically analysing the main specifications contemplated and the values considered for each of them, as well as the draft ISO and EN standards on concrete blocks. It also attempts to situate the UNE standards in the international context by means of a comparative analysis against a representative selection of standards from different geographical regions of the world, determining their analogies and differences.

  4. Soft computing techniques in voltage security analysis

    CERN Document Server

    Chakraborty, Kabir

    2015-01-01

    This book focuses on soft computing techniques for enhancing voltage security in electrical power networks. Artificial neural networks (ANNs) have been chosen as a soft computing tool, since such networks are eminently suitable for the study of voltage security. The different architectures of the ANNs used in this book are selected on the basis of intelligent criteria rather than by a “brute force” method of trial and error. The fundamental aim of this book is to present a comprehensive treatise on power system security and the simulation of power system security. The core concepts are substantiated by suitable illustrations and computer methods. The book describes analytical aspects of operation and characteristics of power systems from the viewpoint of voltage security. The text is self-contained and thorough. It is intended for senior undergraduate students and postgraduate students in electrical engineering. Practicing engineers, Electrical Control Center (ECC) operators and researchers will also...

  5. Optimized inspection techniques and structural analysis in lifetime management

    International Nuclear Information System (INIS)

    Aguado, M.T.; Marcelles, I.

    1993-01-01

    Preservation of the option of extending the service lifetime of a nuclear power plant beyond its normal design lifetime requires correct remaining lifetime management from the very beginning of plant operation. The methodology used in plant remaining lifetime management is essentially based on the use of standard inspections, surveillance and monitoring programs and calculations, such as thermal-stress and fracture mechanics analysis. The inspection techniques should be continuously optimized, in order to be able to detect and dimension existing defects with the highest possible degree of accuracy. The information obtained during the inspection is combined with the historical data of the components: design, quality, operation, maintenance, and transients, and with the results of destructive testing, fracture mechanics and thermal fatigue analysis. These data are used to estimate the remaining lifetime of nuclear power plant components, systems and structures with the highest possible degree of accuracy. The use of this methodology allows component repairs and replacements to be reduced or avoided and increases the safety levels and availability of the nuclear power plant. Use of this strategy avoids the need for heavy investments at the end of the licensing period.

  6. SPI Trend Analysis of New Zealand Applying the ITA Technique

    Directory of Open Access Journals (Sweden)

    Tommaso Caloiero

    2018-03-01

    A natural temporary imbalance of water availability, consisting of persistent lower-than-average or higher-than-average precipitation, can cause extreme dry and wet conditions that adversely impact agricultural yields, water resources, infrastructure, and human systems. In this study, dry and wet periods in New Zealand were expressed using the Standardized Precipitation Index (SPI). First, both the short-term (3 and 6 months) and the long-term (12 and 24 months) SPI were estimated, and then possible trends in the SPI values were detected by means of a new graphical technique, the Innovative Trend Analysis (ITA), which allows the trend identification of the low, medium, and high values of a series. Results show that, in every area currently subject to drought, an increase in this phenomenon can be expected. Specifically, the results of this paper highlight that agricultural regions on the eastern side of the South Island, as well as the north-eastern regions of the North Island, are the most consistently vulnerable areas. In fact, in these regions, the trend analysis mainly showed a general reduction in all the values of the SPI: that is, a tendency toward heavier droughts and weaker wet periods.
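
    As a rough illustration of the graphical method named in this record, the sketch below splits a series into two halves, sorts each, and reports the mean departure of the second half from the 1:1 (no-trend) line; positive departures suggest an increasing trend. This is a minimal reading of ITA, not the authors' implementation, and the function name is invented for the example.

```python
import numpy as np

def innovative_trend_analysis(series):
    """Compare the sorted first and second halves of a series (ITA-style).

    Points of the second half lying above the 1:1 line indicate an increasing
    trend in that part of the distribution (low, medium or high values).
    """
    x = np.asarray(series, dtype=float)
    half = len(x) // 2
    first = np.sort(x[:half])
    second = np.sort(x[half:2 * half])   # trim to equal length if n is odd
    departure = second - first           # signed distance from the 1:1 line
    return first, second, departure.mean()
```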

  7. Standardization of Image Quality Analysis – ISO 19264

    DEFF Research Database (Denmark)

    Kejser, Ulla Bøgvad; Wüller, Dietmar

    2016-01-01

    There are a variety of image quality analysis tools available for the archiving world, which are based on different test charts and analysis algorithms. ISO formed a working group in 2012 to harmonize these approaches and create a standard way of analyzing the image quality for archiving...

  8. Bootstrap Standard Error Estimates in Dynamic Factor Analysis

    Science.gov (United States)

    Zhang, Guangjian; Browne, Michael W.

    2010-01-01

    Dynamic factor analysis summarizes changes in scores on a battery of manifest variables over repeated measurements in terms of a time series in a substantially smaller number of latent factors. Algebraic formulae for standard errors of parameter estimates are more difficult to obtain than in the usual intersubject factor analysis because of the…
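
    The problem raised in this abstract can be illustrated with a generic resampling scheme: when closed-form standard errors are hard to derive, bootstrap replicates of the estimator provide an empirical standard error. The sketch below uses a moving-block bootstrap to respect serial dependence in time-ordered data; it is a generic illustration under assumed inputs, not the procedure used in the paper.

```python
import numpy as np

def block_bootstrap_se(data, estimator, n_boot=500, block_len=20, seed=0):
    """Moving-block bootstrap standard error for a statistic of time-ordered data.

    Resampling whole blocks preserves the short-run serial dependence that a
    dynamic (time-series) factor model is meant to describe.
    """
    data = np.asarray(data, dtype=float)
    rng = np.random.default_rng(seed)
    n = len(data)
    n_blocks = int(np.ceil(n / block_len))
    estimates = []
    for _ in range(n_boot):
        starts = rng.integers(0, n - block_len + 1, size=n_blocks)
        idx = np.concatenate([np.arange(s, s + block_len) for s in starts])[:n]
        estimates.append(estimator(data[idx]))
    return np.std(estimates, axis=0, ddof=1)
```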

  9. New analytical techniques for cuticle chemical analysis

    Energy Technology Data Exchange (ETDEWEB)

    Schulten, H.R. [Fachhochschule Fresenius, Dept. of Trace Analysis, Wiesbaden (Germany)]

    1994-12-31

    1) The analytical methodology of pyrolysis-gas chromatography/mass spectrometry (Py-GC/MS) and direct pyrolysis-mass spectrometry (Py-MS) using soft ionization techniques by high electric fields (FL) are briefly described. Recent advances of Py-GC/MS and Py-FIMS for the analyses of complex organic matter such as plant materials, humic substances, dissolved organic matter in water (DOM) and soil organic matter (SOM) in agricultural and forest soils are given to illustrate the potential and limitations of the applied methods. 2) Novel applications of Py-GC/MS and Py-MS in combination with conventional analytical data in an integrated, chemometric approach to investigate the dynamics of plant lipids are reported. This includes multivariate statistical investigations on maturation, senescence, humus genesis, and environmental damages in spruce ecosystems. 3) The focal point is the author's integrated investigations on emission-induced changes of selected conifer plant constituents. Pattern recognition of Py-MS data of desiccated spruce needles provides a method for distinguishing needles damaged in different ways and determining the cause. Spruce needles were collected from both controls and trees treated with sulphur dioxide (acid rain), nitrogen dioxide, and ozone under controlled conditions. Py-MS and chemometric data evaluation are employed to characterize and classify leaves and their epicuticular waxes. Preliminary mass spectrometric evaluations of isolated cuticles of different plants such as spruce, ivy, holly, and philodendron, as well as ivy cuticles treated in vivo with air pollutants such as surfactants and pesticides are given. (orig.)

  10. New analytical techniques for cuticle chemical analysis

    International Nuclear Information System (INIS)

    Schulten, H.R.

    1994-01-01

    1) The analytical methodology of pyrolysis-gas chromatography/mass spectrometry (Py-GC/MS) and direct pyrolysis-mass spectrometry (Py-MS) using soft ionization techniques by high electric fields (FL) are briefly described. Recent advances of Py-GC/MS and Py-FIMS for the analyses of complex organic matter such as plant materials, humic substances, dissolved organic matter in water (DOM) and soil organic matter (SOM) in agricultural and forest soils are given to illustrate the potential and limitations of the applied methods. 2) Novel applications of Py-GC/MS and Py-MS in combination with conventional analytical data in an integrated, chemometric approach to investigate the dynamics of plant lipids are reported. This includes multivariate statistical investigations on maturation, senescence, humus genesis, and environmental damages in spruce ecosystems. 3) The focal point is the author's integrated investigations on emission-induced changes of selected conifer plant constituents. Pattern recognition of Py-MS data of desiccated spruce needles provides a method for distinguishing needles damaged in different ways and determining the cause. Spruce needles were collected from both controls and trees treated with sulphur dioxide (acid rain), nitrogen dioxide, and ozone under controlled conditions. Py-MS and chemometric data evaluation are employed to characterize and classify leaves and their epicuticular waxes. Preliminary mass spectrometric evaluations of isolated cuticles of different plants such as spruce, ivy, holly, and philodendron, as well as ivy cuticles treated in vivo with air pollutants such as surfactants and pesticides are given. (orig.)

  11. A technique for human error analysis (ATHEANA)

    Energy Technology Data Exchange (ETDEWEB)

    Cooper, S.E.; Ramey-Smith, A.M.; Wreathall, J.; Parry, G.W. [and others]

    1996-05-01

    Probabilistic risk assessment (PRA) has become an important tool in the nuclear power industry, both for the Nuclear Regulatory Commission (NRC) and the operating utilities. Human reliability analysis (HRA) is a critical element of PRA; however, limitations in the analysis of human actions in PRAs have long been recognized as a constraint when using PRA. A multidisciplinary HRA framework has been developed with the objective of providing a structured approach for analyzing operating experience and understanding nuclear plant safety, human error, and the underlying factors that affect them. The concepts of the framework have matured into a rudimentary working HRA method. A trial application of the method has demonstrated that it is possible to identify potentially significant human failure events from actual operating experience which are not generally included in current PRAs, as well as to identify associated performance shaping factors and plant conditions that have an observable impact on the frequency of core damage. A general process was developed, albeit in preliminary form, that addresses the iterative steps of defining human failure events and estimating their probabilities using search schemes. Additionally, a knowledge base was developed which describes the links between performance shaping factors and resulting unsafe actions.

  12. A technique for human error analysis (ATHEANA)

    International Nuclear Information System (INIS)

    Cooper, S.E.; Ramey-Smith, A.M.; Wreathall, J.; Parry, G.W.

    1996-05-01

    Probabilistic risk assessment (PRA) has become an important tool in the nuclear power industry, both for the Nuclear Regulatory Commission (NRC) and the operating utilities. Human reliability analysis (HRA) is a critical element of PRA; however, limitations in the analysis of human actions in PRAs have long been recognized as a constraint when using PRA. A multidisciplinary HRA framework has been developed with the objective of providing a structured approach for analyzing operating experience and understanding nuclear plant safety, human error, and the underlying factors that affect them. The concepts of the framework have matured into a rudimentary working HRA method. A trial application of the method has demonstrated that it is possible to identify potentially significant human failure events from actual operating experience which are not generally included in current PRAs, as well as to identify associated performance shaping factors and plant conditions that have an observable impact on the frequency of core damage. A general process was developed, albeit in preliminary form, that addresses the iterative steps of defining human failure events and estimating their probabilities using search schemes. Additionally, a knowledge base was developed which describes the links between performance shaping factors and resulting unsafe actions.

  13. Wavelet transform techniques and signal analysis

    International Nuclear Information System (INIS)

    Perez, R.B.; Mattingly, J.K.; Tennessee Univ., Knoxville, TN; Perez, J.S.

    1993-01-01

    Traditionally, the most widely used signal analysis tool is the Fourier transform which, by producing power spectral densities (PSDs), allows time-dependent signals to be studied in the frequency domain. However, the Fourier transform is global -- it extends over the entire time domain -- which makes it ill-suited to study nonstationary signals which exhibit local temporal changes in the signal's frequency content. To analyze nonstationary signals, the family of transforms commonly designated as short-time Fourier transforms (STFTs), capable of identifying temporally localized changes in the signal's frequency content, was developed by employing window functions to isolate temporal regions of the signal. For example, the Gabor STFT uses a Gaussian window. However, the applicability of STFTs is limited by various inadequacies. The wavelet transform (WT), recently developed by Grossman and Morlet and explored in depth by Daubechies (2) and Mallat, remedies the inadequacies of STFTs. Like the Fourier transform, the WT can be implemented as a discrete transform (DWT) or as a continuous (integral) transform (CWT). This paper briefly illustrates some of the potential applications of the wavelet transform algorithms to signal analysis.
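
    A one-level Haar transform is the simplest concrete example of the wavelet idea discussed here: each pair of samples is split into a local average and a local difference, so changes are localised in time as well as scale. The sketch below is a textbook illustration, not code from the paper.

```python
import numpy as np

def haar_dwt(signal):
    """One level of the discrete Haar wavelet transform.

    Returns approximation (low-pass) and detail (high-pass) coefficients, which
    localise changes in both time and scale, unlike a global Fourier transform.
    """
    x = np.asarray(signal, dtype=float)
    if len(x) % 2:
        x = x[:-1]                       # need an even number of samples
    even, odd = x[0::2], x[1::2]
    approx = (even + odd) / np.sqrt(2)   # smoothed trend at this scale
    detail = (even - odd) / np.sqrt(2)   # localised fluctuations
    return approx, detail

# Applying haar_dwt repeatedly to the approximation coefficients yields a
# multi-level decomposition of a nonstationary signal.
```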

  14. Metabolomic analysis using porcine skin: a pilot study of analytical techniques

    OpenAIRE

    Wu, Julie; Fiehn, Oliver; Armstrong, April W

    2014-01-01

    Background: Metabolic byproducts serve as indicators of the chemical processes and can provide valuable information on pathogenesis by measuring the amplified output. Standardized techniques for metabolome extraction of skin samples serve as a critical foundation to this field but have not been developed. Objectives: We sought to determine the optimal cell lysis techniques for skin sample preparation and to compare GC-TOF-MS and UHPLC-QTOF-MS for metabolomic analysis. ...

  15. A standardized infrared imaging technique that specifically detects UCP1-mediated thermogenesis in vivo.

    Science.gov (United States)

    Crane, Justin D; Mottillo, Emilio P; Farncombe, Troy H; Morrison, Katherine M; Steinberg, Gregory R

    2014-07-01

    The activation and expansion of brown adipose tissue (BAT) has emerged as a promising strategy to counter obesity and the metabolic syndrome by increasing energy expenditure. The subsequent testing and validation of novel agents that augment BAT necessitates accurate pre-clinical measurements in rodents regarding the capacity for BAT-derived thermogenesis. We present a novel method to measure BAT thermogenesis using infrared imaging following β3-adrenoreceptor stimulation in mice. We show that the increased body surface temperature observed using this method is due solely to uncoupling protein-1 (UCP1)-mediated thermogenesis and that this technique is able to discern differences in BAT activity in mice acclimated to 23 °C or thermoneutrality (30 °C). These findings represent the first standardized method utilizing infrared imaging to specifically detect UCP1 activity in vivo.

  16. Techniques in micromagnetic simulation and analysis

    Science.gov (United States)

    Kumar, D.; Adeyeye, A. O.

    2017-08-01

    Advances in nanofabrication now allow us to manipulate magnetic material at micro- and nanoscales. As the steps of design, modelling and simulation typically precede that of fabrication, these improvements have also granted a significant boost to the methods of micromagnetic simulations (MSs) and analyses. The increased availability of massive computational resources has been another major contributing factor. Magnetization dynamics at micro- and nanoscale is described by the Landau-Lifshitz-Gilbert (LLG) equation, which is an ordinary differential equation (ODE) in time. Several finite difference method (FDM) and finite element method (FEM) based LLG solvers are now widely used to solve different kinds of micromagnetic problems. In this review, we present a few patterns in the ways MSs are being used in the pursuit of new physics. An important objective of this review is to allow one to make a well-informed decision on the details of simulation and analysis procedures needed to accomplish a given task using computational micromagnetics. We also examine the effect of different simulation parameters to underscore and extend some best practices. Lastly, we examine different methods of micromagnetic analyses which are used to process simulation results in order to extract physically meaningful and valuable information.
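
    At the heart of the solvers mentioned in this record is the LLG equation. The following macrospin sketch integrates it for a single unit magnetization vector with an explicit Euler step; the damping value is an assumption, and a real FDM/FEM solver would also compute an effective field from exchange, anisotropy and demagnetizing contributions.

```python
import numpy as np

GAMMA = 2.211e5   # gyromagnetic ratio in m/(A*s)
ALPHA = 0.02      # Gilbert damping constant (assumed value)

def llg_step(m, h_eff, dt):
    """One explicit Euler step of the Landau-Lifshitz-Gilbert equation for a
    single macrospin: dm/dt = -gamma/(1+alpha^2) [m x H + alpha m x (m x H)]."""
    precession = np.cross(m, h_eff)
    damping = np.cross(m, precession)
    dmdt = -GAMMA / (1.0 + ALPHA**2) * (precession + ALPHA * damping)
    m_new = m + dt * dmdt
    return m_new / np.linalg.norm(m_new)   # keep |m| = 1
```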

  17. Development of chemical analysis techniques: pt. 3

    International Nuclear Information System (INIS)

    Kim, K.J.; Chi, K.Y.; Choi, G.C.

    1981-01-01

    For the purpose of determining trace rare earths, a spectrofluorimetric method has been studied. Except for Ce and Tb, the fluorescence intensities are not sufficient to allow satisfactory analysis. Complexing agents such as tungstate and hexafluoroacetylacetone should be employed to increase fluorescence intensities. As a preliminary experiment for the separation of individual rare earth elements and uranium, the distribution coefficients, % S here, were obtained on Dowex 50 W against HCl concentration by a batch method. These % S data are utilized to obtain elution curves. The % S data showed a minimum at around 4 M HCl. To understand this previously known phenomenon, the adsorption of Cl- on Dowex 50 W was examined as a function of HCl concentration and found to be decreasing while the % S of rare earths was increasing. It is interpreted that Cl- and rare earth ions move into the resin phase separately and that the charge and the charge densities of these ions are responsible for the different % S curves. Dehydration appears to play an important role in the upturn of the % S curves at higher HCl concentrations.

  18. A standardized technique for high-pressure cooling of protein crystals.

    Science.gov (United States)

    Quirnheim Pais, David; Rathmann, Barbara; Koepke, Juergen; Tomova, Cveta; Wurzinger, Paul; Thielmann, Yvonne

    2017-12-01

    Cryogenic temperatures slow down secondary radiation damage during data collection from macromolecular crystals. In 1973, cooling at high pressure was identified as a method for cryopreserving crystals in their mother liquor [Thomanek et al. (1973). Acta Cryst. A29, 263-265]. Results from different groups studying different crystal systems indicated that the approach had merit, although difficulties in making the process work have limited its widespread use. Therefore, a simplified and reliable technique has been developed termed high-pressure cooling (HPC). An essential requirement for HPC is to protect crystals in capillaries. These capillaries form part of new sample holders with SPINE standard dimensions. Crystals are harvested with the capillary, cooled at high pressure (220 MPa) and stored in a cryovial. This system also allows the usage of the standard automation at the synchrotron. Crystals of hen egg-white lysozyme and concanavalin A have been successfully cryopreserved and yielded data sets to resolutions of 1.45 and 1.35 Å, respectively. Extensive work has been performed to define the useful working range of HPC in capillaries with 250 µm inner diameter. Three different 96-well crystallization screens that are most frequently used in our crystallization facility were chosen to study the formation of amorphous ice in this cooling setup. More than 89% of the screening solutions were directly suitable for HPC. This achievement represents a drastic improvement for crystals that suffered from cryoprotection or were not previously eligible for cryoprotection.

  19. Dosimetric comparison of intensity modulated radiotherapy techniques and standard wedged tangents for whole breast radiotherapy

    International Nuclear Information System (INIS)

    Fong, Andrew; Bromley, Regina; Beat, Mardi; Vien, Din; Dineley, Jude; Morgan, Graeme

    2009-01-01

    Prior to introducing intensity modulated radiotherapy (IMRT) for whole breast radiotherapy (WBRT) into our department, we undertook a comparison of the dose parameters of several IMRT techniques and standard wedged tangents (SWT). Our aim was to improve the dose distribution to the breast and to decrease the dose to organs at risk (OAR): heart, lung and contralateral breast (Contra Br). Treatment plans for 20 women (10 right-sided and 10 left-sided) previously treated with SWT for WBRT were used to compare (a) SWT; (b) electronic compensator IMRT (E-IMRT); (c) tangential beam IMRT (T-IMRT); (d) coplanar multi-field IMRT (CP-IMRT); and (e) non-coplanar multi-field IMRT (NCP-IMRT). Plans for the breast were compared for (i) dose homogeneity (DH); (ii) conformity index (CI); (iii) mean dose; (iv) maximum dose; (v) minimum dose; and dose to OAR were calculated for (vi) heart; (vii) lung and (viii) Contra Br. Compared with SWT, all plans except CP-IMRT gave improvement in at least two of the seven parameters evaluated. T-IMRT and NCP-IMRT resulted in significant improvement in all parameters except DH, and both gave significant reduction in doses to OAR. As initial evaluation indicates that NCP-IMRT is likely to be too time-consuming to introduce on a large scale, T-IMRT is the preferred technique for WBRT for use in our department.

  20. Network modeling and analysis technique for the evaluation of nuclear safeguards systems effectiveness

    International Nuclear Information System (INIS)

    Grant, F.H. III; Miner, R.J.; Engi, D.

    1978-01-01

    Nuclear safeguards systems are concerned with the physical protection and control of nuclear materials. The Safeguards Network Analysis Procedure (SNAP) provides a convenient and standard analysis methodology for the evaluation of safeguards system effectiveness. This is achieved through a standard set of symbols which characterize the various elements of safeguards systems and an analysis program to execute simulation models built using the SNAP symbology. The reports provided by the SNAP simulation program enable analysts to evaluate existing sites as well as alternative design possibilities. This paper describes the SNAP modeling technique and provides an example illustrating its use

  1. Network modeling and analysis technique for the evaluation of nuclear safeguards systems effectiveness

    International Nuclear Information System (INIS)

    Grant, F.H. III; Miner, R.J.; Engi, D.

    1979-02-01

    Nuclear safeguards systems are concerned with the physical protection and control of nuclear materials. The Safeguards Network Analysis Procedure (SNAP) provides a convenient and standard analysis methodology for the evaluation of safeguards system effectiveness. This is achieved through a standard set of symbols which characterize the various elements of safeguards systems and an analysis program to execute simulation models built using the SNAP symbology. The reports provided by the SNAP simulation program enable analysts to evaluate existing sites as well as alternative design possibilities. This paper describes the SNAP modeling technique and provides an example illustrating its use

  2. 48 CFR 815.404-1 - Proposal analysis techniques.

    Science.gov (United States)

    2010-10-01

    48 CFR 815.404-1, Proposal analysis techniques (Federal Acquisition Regulations System, Department of Veterans Affairs): (a) Contracting officers are responsible for the technical and administrative sufficiency of the...

  3. Canalplasty: the technique and the analysis of its results

    NARCIS (Netherlands)

    van Spronsen, Erik; Ebbens, Fenna A.; Mirck, Peter G. B.; van Wettum, Cathelijne H. M.; van der Baan, Sieberen

    2013-01-01

    To describe the technique for canalplasty as performed in the Academic Medical Center, Amsterdam, the Netherlands, and to present the results of this technique. Retrospective chart analysis. Charts of patients who underwent a canalplasty procedure between 2001 and 2010 were reviewed for indication

  4. An operator expansion technique for path integral analysis

    International Nuclear Information System (INIS)

    Tsvetkov, I.V.

    1995-01-01

    A new method of path integral analysis in the framework of a power series technique is presented. The method is based on the operator expansion of an exponential. A regular procedure to calculate the correction terms is found. (orig.)

  5. Search for the top quark using multivariate analysis techniques

    International Nuclear Information System (INIS)

    Bhat, P.C.

    1994-08-01

    The D0 collaboration is developing top search strategies using multivariate analysis techniques. We report here on applications of the H-matrix method to the eμ channel and neural networks to the e+jets channel

  6. The “excluding” suture technique for surgical closure of ventricular septal defects: A retrospective study comparing the standard technique

    Directory of Open Access Journals (Sweden)

    Roy Varghese

    2016-01-01

    Conclusion: Surgical closure of VSDs can be accomplished by placing sutures along the margins or away from them, with comparable results. The incidence of CHB, however, seems to be lower when the “excluding” technique is employed.

  7. New analysis technique for K-edge densitometry spectra

    International Nuclear Information System (INIS)

    Hsue, Sin-Tao; Collins, M.L.

    1995-01-01

    A method for simulating absorption edge densitometry has been developed. This program enables one to simulate spectra containing any combination of special nuclear materials (SNM) in solution. The method has been validated with an analysis method using a single SNM in solution or a combination of two types of SNM separated by a Z of 2. A new analysis technique for mixed solutions has been developed. This new technique has broader applications and eliminates the need for bias correction

  8. Reduced Rate of Dehiscence After Implementation of a Standardized Fascial Closure Technique in Patients Undergoing Emergency Laparotomy

    DEFF Research Database (Denmark)

    Tolstrup, Mai-Britt; Watt, Sara Kehlet; Gögenur, Ismail

    2017-01-01

    is lacking. We aimed to investigate whether this technique would reduce the rate of dehiscence. METHODS: A standardized procedure of closing the midline laparotomy by using a "small steps" technique of continuous suturing with a slowly absorbable (polydioxanone) suture material in a wound-suture ratio...

  9. Research on digital multi-channel pulse height analysis techniques

    International Nuclear Information System (INIS)

    Xiao Wuyun; Wei Yixiang; Ai Xianyun; Ao Qi

    2005-01-01

    Multi-channel pulse height analysis techniques are developing in the direction of digitization. Based on digital signal processing techniques, digital multi-channel analyzers are characterized by powerful pulse processing ability, high throughput, improved stability and flexibility. This paper analyzes key techniques of digital nuclear pulse processing. With MATLAB software, the main algorithms are simulated, such as trapezoidal shaping, digital baseline estimation, digital pole-zero/zero-pole compensation, and pole and zero identification. The preliminary general scheme of a digital MCA is discussed, as well as some other important aspects of its engineering design. All of this lays the foundation for developing homemade digital nuclear spectrometers. (authors)
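
    Trapezoidal shaping, the first algorithm listed above, is commonly implemented with a short recursive filter in the style of Jordanov's algorithm. The sketch below is a vectorised illustration under the assumption that the record is much longer than the shaping times; it is not the code from the paper, and the amplitude normalization is only approximate.

```python
import numpy as np

def trapezoidal_shaper(x, k, m, tau):
    """Trapezoidal shaping of exponentially decaying pulses.

    k   : rise time in samples
    m   : flat-top length in samples
    tau : decay constant of the input pulses in samples
    """
    x = np.asarray(x, dtype=float)
    l = k + m
    big_m = 1.0 / (np.exp(1.0 / tau) - 1.0)   # deconvolves the exponential tail
    d = x.copy()                              # d(n) = x(n) - x(n-k) - x(n-l) + x(n-k-l)
    d[k:] -= x[:-k]
    d[l:] -= x[:-l]
    d[k + l:] += x[:-(k + l)]
    p = np.cumsum(d)                          # first accumulator
    s = np.cumsum(p + big_m * d)              # second accumulator -> trapezoid
    return s / (big_m * k)                    # approximate amplitude normalization
```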

  10. Commercial Discount Rate Estimation for Efficiency Standards Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Fujita, K. Sydny [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)]

    2016-04-13

    Underlying each of the Department of Energy's (DOE's) federal appliance and equipment standards are a set of complex analyses of the projected costs and benefits of regulation. Any new or amended standard must be designed to achieve significant additional energy conservation, provided that it is technologically feasible and economically justified (42 U.S.C. 6295(o)(2)(A)). A proposed standard is considered economically justified when its benefits exceed its burdens, as represented by the projected net present value of costs and benefits. DOE performs multiple analyses to evaluate the balance of costs and benefits of commercial appliance and equipment efficiency standards, at the national and individual building or business level, each framed to capture different nuances of the complex impact of standards on the commercial end user population. The Life-Cycle Cost (LCC) analysis models the combined impact of appliance first cost and operating cost changes on a representative commercial building sample in order to identify the fraction of customers achieving LCC savings or incurring net cost at the considered efficiency levels. Thus, the choice of commercial discount rate value(s) used to calculate the present value of energy cost savings within the Life-Cycle Cost model implicitly plays a key role in estimating the economic impact of potential standard levels. This report is intended to provide a more in-depth discussion of the commercial discount rate estimation process than can be readily included in standard rulemaking Technical Support Documents (TSDs).
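
    The role of the discount rate in the LCC calculation can be made concrete with a toy example: the higher the discount rate, the less future energy savings count against a higher first cost. All numbers below are illustrative assumptions, not values from the DOE analysis.

```python
def lifecycle_cost(first_cost, annual_energy_cost, lifetime_years, discount_rate):
    """Life-cycle cost = first cost plus the present value of annual energy costs."""
    pv_energy = sum(annual_energy_cost / (1.0 + discount_rate) ** t
                    for t in range(1, lifetime_years + 1))
    return first_cost + pv_energy

# Comparing a baseline and a more efficient design at an assumed 7% commercial
# discount rate (all figures are made up for illustration).
baseline = lifecycle_cost(first_cost=1000.0, annual_energy_cost=250.0,
                          lifetime_years=15, discount_rate=0.07)
efficient = lifecycle_cost(first_cost=1300.0, annual_energy_cost=180.0,
                           lifetime_years=15, discount_rate=0.07)
print(f"LCC savings of the efficient design: {baseline - efficient:.2f}")
```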

  11. Standard Guide for Wet Sieve Analysis of Ceramic Whiteware Clays

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2007-01-01

    1.1 This guide covers the wet sieve analysis of ceramic whiteware clays. This guide is intended for use in testing shipments of clay as well as for plant control tests. 1.2 The values stated in inch-pound units are to be regarded as standard. The values given in parentheses are mathematical conversions to SI units that are provided for information only and are not considered standard. 1.3 This standard does not purport to address all of the safety concerns, if any, associated with its use. It is the responsibility of the user of this standard to establish appropriate safety and health practices and determine the applicability of regulatory limitations prior to use.

  12. Development of environmental sample analysis techniques for safeguards

    International Nuclear Information System (INIS)

    Magara, Masaaki; Hanzawa, Yukiko; Esaka, Fumitaka

    1999-01-01

    JAERI has been developing environmental sample analysis techniques for safeguards and is preparing a clean chemistry laboratory with clean rooms. The methods being developed are bulk analysis and particle analysis. In the bulk analysis, an Inductively-Coupled Plasma Mass Spectrometer or a Thermal Ionization Mass Spectrometer is used to measure nuclear materials after chemical treatment of the sample. In the particle analysis, an Electron Probe Micro Analyzer and a Secondary Ion Mass Spectrometer are used for elemental analysis and isotopic analysis, respectively. The design of the clean chemistry laboratory has been carried out and construction will be completed by the end of March, 2001. (author)

  13. Sensitivity analysis and related analysis : A survey of statistical techniques

    NARCIS (Netherlands)

    Kleijnen, J.P.C.

    1995-01-01

    This paper reviews the state of the art in five related types of analysis, namely (i) sensitivity or what-if analysis, (ii) uncertainty or risk analysis, (iii) screening, (iv) validation, and (v) optimization. The main question is: when should which type of analysis be applied; which statistical

  14. Key-space analysis of double random phase encryption technique

    Science.gov (United States)

    Monaghan, David S.; Gopinathan, Unnikrishnan; Naughton, Thomas J.; Sheridan, John T.

    2007-09-01

    We perform a numerical analysis on the double random phase encryption/decryption technique. The key-space of an encryption technique is the set of possible keys that can be used to encode data using that technique. In the case of a strong encryption scheme, many keys must be tried in any brute-force attack on that technique. Traditionally, designers of optical image encryption systems demonstrate only how a small number of arbitrary keys cannot decrypt a chosen encrypted image in their system. However, this type of demonstration does not discuss the properties of the key-space nor refute the feasibility of an efficient brute-force attack. To clarify these issues we present a key-space analysis of the technique. For a range of problem instances we plot the distribution of decryption errors in the key-space indicating the lack of feasibility of a simple brute-force attack.
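
    For orientation, the double random phase technique itself can be written compactly with two FFTs: a random phase mask in the input plane and a second mask in the Fourier plane, inverted with the conjugate keys. The sketch below is a standard textbook formulation, not the authors' code; a brute-force key-space probe would loop over candidate Fourier-plane keys and record the decryption error for each. The image and phase masks are assumed to be real NumPy arrays of the same shape, with phases in [0, 1).

```python
import numpy as np

def drpe_encrypt(image, phase1, phase2):
    """Double random phase encoding: one random phase mask in the input plane,
    a second one in the Fourier plane."""
    field = image * np.exp(2j * np.pi * phase1)
    spectrum = np.fft.fft2(field) * np.exp(2j * np.pi * phase2)
    return np.fft.ifft2(spectrum)

def drpe_decrypt(cipher, phase1, phase2):
    """Invert the encoding with the conjugate phase keys."""
    spectrum = np.fft.fft2(cipher) * np.exp(-2j * np.pi * phase2)
    field = np.fft.ifft2(spectrum) * np.exp(-2j * np.pi * phase1)
    return np.abs(field)

# A key-space probe: decryption error for a guessed Fourier-plane key, e.g.
# np.mean((plain - drpe_decrypt(cipher, phase1, guess)) ** 2)
```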

  15. National Bureau of Standards coal flyash (SRM 1633a) as a multielement standard for instrumental neutron activation analysis

    International Nuclear Information System (INIS)

    Korotev, R.L.

    1987-01-01

    The U.S. National Bureau of Standards standard reference material 1633a (coal flyash) was standardized for the concentrations of 29 elements against chemical standards by instrumental neutron activation analysis. United States Geological Survey basalt standard BCR-1 was analyzed concurrently as a check. SRM 1633a is a good multielement comparator standard for geochemical analysis for 25 of the elements analyzed and is a better standard than rock-powder SRMs commonly used. Analytical data for USGS DTS-1, PCC-1, GSP-1, BIR-1, DNC-1, and W-2; NBS SRMs 278 and 688; and GIT-IWG (French) anorthosite AN-G are also presented. (author)
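
    Comparator-standard NAA of the kind described here reduces, for each element, to a ratio of count rates per unit mass between sample and standard, provided irradiation, decay and counting conditions are matched. The sketch below shows only that ratio and omits decay and geometry corrections; it is an illustration, not the authors' procedure.

```python
def comparator_concentration(counts_sample, mass_sample,
                             counts_standard, mass_standard, conc_standard):
    """Relative (comparator) NAA: with matched irradiation, decay and counting
    conditions, detector efficiency and nuclear data cancel, leaving a ratio."""
    specific_sample = counts_sample / mass_sample
    specific_standard = counts_standard / mass_standard
    return conc_standard * specific_sample / specific_standard
```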

  16. Quantitative Analysis of Polyacrylamide Grafted on Polylactide Film Surfaces Employing Spectroscopic Techniques.

    Science.gov (United States)

    Rahman, Mijanur; Opaprakasit, Pakorn

    2017-11-01

    Standard techniques for quantitative measurement of polyacrylamide (PAm) contents grafted on polylactide (PLA) film substrates, P(LA-g-Am-co-MBAm), which are commonly used as cell culture substrates or scaffolds and pH-sensitive absorbents, have been developed with X-ray photoelectron (XPS), proton nuclear magnetic resonance (1H-NMR), and Fourier transform infrared (FT-IR) spectroscopy. The techniques are then applied to examine P(LA-g-Am-co-MBAm) samples prepared from two separate photo-initiator/co-initiator systems. Efficiency and accuracy of the techniques are compared. The results from all techniques are in good agreement, indicating high analysis precision, although the FT-IR technique provides additional advantages in terms of short analysis time, ease of sample preparation, and accessibility of the instrument. The results indicate that the riboflavin (RF) initiator system has higher grafting efficiency than its camphorquinone (CQ) counterpart. These standard techniques can be applied in the analysis of these materials and further modified for quantitative analysis of other grafting systems.

  17. Applicability of contact angle techniques used in the analysis of contact lenses, part 1: comparative methodologies.

    Science.gov (United States)

    Campbell, Darren; Carnell, Sarah Maria; Eden, Russell John

    2013-05-01

    Contact angle, as a representative measure of surface wettability, is often employed to interpret contact lens surface properties. The literature is often contradictory and can lead to confusion. This literature review is part of a series regarding the analysis of hydrogel contact lenses using contact angle techniques. Here we present an overview of contact angle terminology, methodology, and analysis. Having discussed this background material, subsequent parts of the series will discuss the analysis of contact lens contact angles and evaluate differences in published laboratory results. The concepts of contact angle, wettability and wetting are presented as an introduction. Contact angle hysteresis is outlined and highlights the advantages in using dynamic analytical techniques over static methods. The surface free energy of a material illustrates how contact angle analysis is capable of providing supplementary surface characterization. Although single values are able to distinguish individual material differences, surface free energy and dynamic methods provide an improved understanding of material behavior. The frequently used sessile drop, captive bubble, and Wilhelmy plate techniques are discussed. Their use as both dynamic and static methods, along with the advantages and disadvantages of each technique, is explained. No single contact angle technique fully characterizes the wettability of a material surface, and the application of complementary methods allows increased characterization. At present, there is not an ISO standard method designed for soft materials. It is important that each contact angle technique has a standard protocol, as small protocol differences between laboratories often contribute to a variety of published data that are not easily comparable.

  18. Assessment of maceration techniques used to remove soft tissue from bone in cut mark analysis.

    Science.gov (United States)

    King, Christine; Birch, Wendy

    2015-01-01

    Maceration techniques employed in forensics must be effective without compromising the bone's integrity and morphology, and prevent destruction of evidence. Techniques must also be fast, safe, easily obtainable and inexpensive; not all techniques currently employed are appropriate for forensic use. To evaluate the most suitable approach, seven techniques including current and new methodologies were applied to fresh, fleshed porcine ribs exhibiting cut marks. A sample size of 30 specimens per technique was examined under scanning electron microscopy at the cut mark and the surrounding uncompromised regions; a scoring system of effectiveness was applied. The previously unpublished microwave method fared best for bone and cut mark preservation. Sodium hypochlorite destroyed cut marks, and was deemed unsuitable for forensic analysis. No single technique fulfilled all criteria; however, this study provides a benchmark for forensic anthropologists to select the most appropriate method for their situation, while maintaining the high standards required by forensic science. © 2015 American Academy of Forensic Sciences.

  19. Preparation of uranium standard solutions for x-ray fluorescence analysis

    International Nuclear Information System (INIS)

    Wong, C.M.; Cate, J.L.; Pickles, W.L.

    1978-03-01

    A method has been developed for gravimetrically preparing uranium nitrate standards with an estimated mean error of 0.1% (1 sigma) and a maximum error of 0.2% (1 sigma) for the total uranium weight. Two source materials, depleted uranium dioxide powder and NBS Standard Reference Material 960 uranium metal, were used to prepare stock solutions. The NBS metal proved to be superior because of the small but inherent uncertainty in the stoichiometry of the uranium oxide. These solutions were used to prepare standards in a freeze-dried configuration suitable for x-ray fluorescence analysis. Both gravimetric and freeze-drying techniques are presented. Volumetric preparation was found to be unsatisfactory for 0.1% precision for the sample size of interest. One of the primary considerations in preparing uranium standards for x-ray fluorescence analysis is the development of a technique for dispensing a 50-μl aliquot of a standard solution with a precision of 0.1% and an accuracy of 0.1%. The method developed corrects for variation in aliquoting and for evaporation loss during weighing. Two sets, each containing 50 standards, have been produced. One set has been retained by LLL and one set by the Savannah River project.

  20. Standard model for safety analysis report of fuel reprocessing plants

    International Nuclear Information System (INIS)

    1979-12-01

    A standard model for a safety analysis report of fuel reprocessing plants is established. This model shows the presentation format, the origin, and the details of the minimal information required by CNEN (Comissao Nacional de Energia Nuclear) aiming to evaluate the requests of construction permits and operation licenses made according to the legislation in force. (E.G.) [pt

  1. Standard model for safety analysis report of fuel fabrication plants

    International Nuclear Information System (INIS)

    1980-09-01

    A standard model for a safety analysis report of fuel fabrication plants is established. This model shows the presentation format, the origin, and the details of the minimal information required by CNEN (Comissao Nacional de Energia Nuclear) aiming to evaluate the requests of construction permits and operation licenses made according to the legislation in force. (E.G.) [pt

  2. A brain impact stress analysis using advanced discretization meshless techniques.

    Science.gov (United States)

    Marques, Marco; Belinha, Jorge; Dinis, Lúcia Maria Js; Natal Jorge, Renato

    2018-03-01

    This work compares the mechanical behaviour of a brain under impact analysed with an alternative numerical meshless technique. Thus, a discrete geometrical model of a brain was constructed using medical images. This technique allows a discretization with realistic geometry to be achieved, in which the mechanical properties are defined locally according to the colour scale of the medical images. After defining the discrete geometrical model of the brain, the essential and natural boundary conditions were imposed to reproduce a sudden impact force. The analysis was performed using both finite element analysis and the radial point interpolation method, an advanced discretization technique, and the results of the two techniques are compared. When compared with finite element analysis, it was verified that meshless methods possess a higher convergence rate and are capable of producing smoother variable fields.

  3. [Abdominothoracic esophageal resection according to Ivor Lewis with intrathoracic anastomosis : standardized totally minimally invasive technique].

    Science.gov (United States)

    Runkel, N; Walz, M; Ketelhut, M

    2015-05-01

    The clinical and scientific interest in minimally invasive techniques for esophagectomy (MIE) is increasing; however, the intrathoracic esophagogastric anastomosis remains a surgical challenge and lacks standardization. Surgeons either transpose the anastomosis to the cervical region or perform a hybrid thoracotomy for stapler access. This article reports technical details and early experiences with a completely laparoscopic-thoracoscopic approach for Ivor Lewis esophagectomy without additional thoracotomy. The extent of radical dissection follows clinical guidelines. Laparoscopy is performed with the patient in a beach chair position and thoracoscopy in a left lateral decubitus position using single lung ventilation. The anvil of the circular stapler is placed transorally into the esophageal stump. The specimen and gastric conduit are exteriorized through a subcostal rectus muscle split incision. The stapler body is placed into the gastric conduit and both are advanced through the abdominal mini-incision transhiatally into the right thoracic cavity, where the anastomosis is constructed. Data were collected prospectively and analyzed retrospectively. A total of 23 non-selected consecutive patients (mean age 69 years, range 46-80 years) with adenocarcinoma (n = 19) or squamous cell carcinoma (n = 4) were surgically treated between June 2010 and July 2013. Neoadjuvant therapy was performed in 15 patients, resulting in 10 partial and 4 complete remissions. There were no technical complications and no conversions. Mean operative time was 305 min (range 220-441 min). The median lymph node count was 16 (range 4-42). An R0 resection was achieved in 91 % of patients, and 3 anastomotic leaks occurred which were successfully managed endoscopically. There were no postoperative deaths. The intrathoracic esophagogastric anastomosis during minimally invasive Ivor Lewis esophagectomy can be constructed in a standardized fashion without an additional thoracotomy.

  4. Methods for preparing comparative standards and field samples for neutron activation analysis of soil

    International Nuclear Information System (INIS)

    Glasgow, D.C.; Dyer, F.F.; Robinson, L.

    1994-01-01

    One of the more difficult problems associated with comparative neutron activation analysis (CNAA) is the preparation of standards which are tailor-made to the desired irradiation and counting conditions. Frequently, there simply is not a suitable standard available commercially, or the resulting gamma spectrum is convoluted with interferences. In a recent soil analysis project, the need arose for standards which contained about 35 elements. In response, a computer spreadsheet was developed to calculate the appropriate amount of each element so that the resulting gamma spectrum is relatively free of interferences. Incorporated in the program are options for calculating all of the irradiation and counting parameters including activity produced, necessary flux/bombardment time, counting time, and appropriate source-to-detector distance. The result is multi-element standards for CNAA which have optimal concentrations. The program retains ease of use without sacrificing capability. In addition to optimized standard production, a novel soil homogenization technique was developed which is a low cost, highly efficient alternative to commercially available homogenization systems. Comparative neutron activation analysis for large scale projects has been made easier through these advancements. This paper contains details of the design and function of the NAA spreadsheet and innovative sample handling techniques
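
    The activity calculation mentioned among the spreadsheet's options is, in its simplest form, the standard activation equation A = N * sigma * phi * (1 - exp(-lambda * t_irr)). The sketch below implements that textbook relation with assumed argument units; the actual spreadsheet handles many more parameters (counting time, source-to-detector distance, interference checks).

```python
import numpy as np

N_A = 6.022e23   # Avogadro's number

def induced_activity(mass_g, atomic_mass, abundance, sigma_barn,
                     flux_n_cm2_s, t_irr_s, half_life_s):
    """Activity (Bq) at the end of irradiation for a single target isotope."""
    n_atoms = mass_g / atomic_mass * N_A * abundance   # number of target atoms
    sigma_cm2 = sigma_barn * 1e-24                      # barns -> cm^2
    lam = np.log(2.0) / half_life_s                     # decay constant
    return n_atoms * sigma_cm2 * flux_n_cm2_s * (1.0 - np.exp(-lam * t_irr_s))
```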

  5. Standardization of MIP technique in three-dimensional CT portography: usefulness in evaluation of portosystemic collaterals in cirrhotic patients

    International Nuclear Information System (INIS)

    Kim, Jong Gi; Kim, Yong; Kim, Chang Won; Lee, Jun Woo; Lee, Suk Hong

    2003-01-01

    To assess the usefulness of three-dimensional CT portography using a standardized maximum intensity projection (MIP) technique for the evaluation of portosystemic collaterals in cirrhotic patients. In 25 cirrhotic patients with portosystemic collaterals, three-phase CT using a multidetector-row helical CT scanner was performed to evaluate liver disease. Late arterial-phase images were transferred to an Advantage Windows 3.1 workstation (General Electric). Axial images were reconstructed by means of three-dimensional CT portography, using both a standardized and a non-standardized MIP technique, and the respective reconstruction times were determined. Three-dimensional CT portography with the standardized technique involved eight planes, namely the spleno-portal confluence axis (coronal, lordotic coronal, lordotic coronal RAO 30°, and lordotic coronal LAO 30°), the left renal vein axis (lordotic coronal), and axial MIP images (lower esophagus level, gastric fundus level and splenic hilum). The eight MIP images obtained in each case were interpreted by two radiologists, who reached a consensus in their evaluation. The portosystemic collaterals evaluated were as follows: left gastric vein dilatation; esophageal, paraesophageal, gastric, and splenic varix; paraumbilical vein dilatation; gastro-renal, spleno-renal, and gastro-spleno-renal shunt; mesenteric, retroperitoneal, and omental collaterals. The average reconstruction time using the non-standardized MIP technique was 11 minutes 23 seconds, and with the standardized technique, the time was 6 minutes 5 seconds. Three-dimensional CT portography with the standardized technique demonstrated left gastric vein dilatation (n=25), esophageal varix (n=18), paraesophageal varix (n=13), gastric varix (n=4), splenic varix (n=4), paraumbilical vein dilatation (n=4), gastro-renal shunt (n=3), spleno-renal shunt (n=3), and gastro-spleno-renal shunt (n=1). Using three-dimensional CT portography and the non-standardized

  6. Standardization of the radioimmunoassay technique for the determination of human gastrin and its clinical application

    International Nuclear Information System (INIS)

    Peig Ginabredra, M.G.

    1989-01-01

    A radioimmunoassay system for the determination of gastrin was developed and standardized, employing synthetic human gastrin for radioiodination and for the preparation of standards, as well as a specific antibody raised in rabbits. The hormone was labeled with 125I by the Chloramine T technique and purified by anion exchange chromatography on QAE-Sephadex A-25, and its specific activity was determined. The tracer thus obtained was submitted to purity analysis by polyacrylamide gel electrophoresis and precipitation of proteins by trichloroacetic acid. Its stability was evaluated as a function of storage time, and its purity and suitability for use in radioimmunoassay were also compared with those of a tracer obtained from a commercial diagnostic kit. The assays were performed by incubation of radioiodinated gastrin, standard gastrin prepared in plasma free from this hormone (from zero to 500 pmol/l) or samples to be assayed with the antiserum for 4 days at 4 °C. The separation between free gastrin and gastrin bound to the antibody was carried out by adsorption of the free hormone onto charcoal, whose ideal concentration was previously determined. Plasma free from gastrin was obtained from time-expired blood bank plasma submitted to extraction with charcoal. Quality control showed this radioimmunoassay to be specific, accurate, precise and sensitive, allowing the performance of valid assays. Its validity was further confirmed by clear discrimination of gastrin concentrations not only in subjects with very low levels (gastrectomized) and extremely high levels (Zollinger-Ellison syndrome), but also in subjects with other diseases, such as Chagas disease, pernicious anemia and chronic renal failure. (author) [pt

  7. Development, improvement and calibration of neutronic reaction rates measurements: elaboration of a standard techniques basis

    International Nuclear Information System (INIS)

    Hudelot, J.P.

    1998-06-01

    In order to improve and to validate the neutronics calculation schemes, perfecting integral measurements of neutronics parameters is necessary. This thesis focuses on the conception, improvement and development of neutronic reaction rate measurements, and aims at building a base of standard techniques. Two subjects are discussed. The first one deals with direct measurements by fission chambers. A short presentation of the different usual techniques is given. Then, these techniques are applied through the example of doubling time measurements on the EOLE facility during the MISTRAL 1 experimental programme. Two calibration devices for fission chambers were developed: a thermal column located in the central part of the MINERVE facility, and a calibration cell using a pulsed high-flux neutron generator and based on the discrimination of the energy of the neutrons with a time-of-flight method. This second device will soon allow the mass of fission chambers to be measured with a precision of about 1%. Finally, the necessity of those calibrations is shown through spectral index measurements in core MISTRAL 1 (UO2) and MISTRAL 2 (MOX) of the EOLE facility. In each case, the associated calculation schemes, performed using the Monte Carlo MCNP code with the ENDF-BV library, are validated. Concerning the second subject, the goal is to develop a method for measuring the modified conversion ratio of 238U (defined as the ratio of the 238U capture rate to the total fission rate) by gamma-ray spectrometry of fuel rods. Within the framework of the MISTRAL 1 and MISTRAL 2 programmes, the measurement device, the experimental results and the spectrometer calibration are described. Furthermore, the MCNP calculations of neutron self-shielding and gamma self-absorption are validated. It is finally shown that measurement uncertainties are better than 1%. The extension of this technique to future modified conversion ratio measurements for 242Pu (on MOX rods) and 232Th (on

  8. Application of nuclear analysis techniques in ancient chinese porcelain

    International Nuclear Information System (INIS)

    Feng Songlin; Xu Qing; Feng Xiangqian; Lei Yong; Cheng Lin; Wang Yanqing

    2005-01-01

    Ancient ceramics were fired from porcelain clay and therefore carry provenance information and age characteristics. Analysing and researching ancient ceramics with modern analytical methods is the scientific foundation for studying Chinese porcelain. The functions and applications of nuclear analysis techniques are discussed according to their properties. (authors)

  9. Meta-analysis in a nutshell: Techniques and general findings

    DEFF Research Database (Denmark)

    Paldam, Martin

    2015-01-01

    The purpose of this article is to introduce the technique and main findings of meta-analysis to the reader, who is unfamiliar with the field and has the usual objections. A meta-analysis is a quantitative survey of a literature reporting estimates of the same parameter. The funnel showing...

  10. Regional environmental analysis and management: New techniques for current problems

    Science.gov (United States)

    Honea, R. B.; Paludan, C. T. N.

    1974-01-01

    Advances in data acquisition and processing procedures for regional environmental analysis are discussed. Automated and semi-automated techniques employing Earth Resources Technology Satellite data and conventional data sources are presented. Experiences are summarized. The ERTS computer compatible tapes provide a very complete and flexible record of earth resources data and represent a viable medium to enhance regional environmental analysis research.

  11. [TXRF technique and quantitative analysis of mollusc teeth].

    Science.gov (United States)

    Tian, Y; Liu, K; Wu, X; Zheng, S

    1999-06-01

    Total reflection X-ray fluorescence (TXRF) analysis technique and the instrument, with a short path, high efficiency, low power and small volume, are briefly presented. The detection limit of the system is at the pg level for Cu and Mo target excitation. Teeth of a marine mollusc were measured quantitatively and the spectrum and analysis results are given.

  12. SWOT ANALYSIS-MANAGEMENT TECHNIQUES TO STREAMLINE PUBLIC BUSINESS MANAGEMENT

    OpenAIRE

    Rodica IVORSCHI

    2012-01-01

    SWOT analysis is one of the most important management techniques for understanding the strategic position of an organization. The objective of SWOT analysis is to recommend strategies that ensure the best alignment between the internal and external environment; choosing the right strategy can benefit the organization by adapting its strengths to opportunities, minimizing risks and eliminating weaknesses.

  13. SWOT ANALYSIS-MANAGEMENT TECHNIQUES TO STREAMLINE PUBLIC BUSINESS MANAGEMENT

    Directory of Open Access Journals (Sweden)

    Rodica IVORSCHI

    2012-06-01

    Full Text Available SWOT analysis is one of the most important management techniques for understanding the strategic position of an organization. The objective of SWOT analysis is to recommend strategies that ensure the best alignment between the internal and external environment; choosing the right strategy can benefit the organization by adapting its strengths to opportunities, minimizing risks and eliminating weaknesses.

  14. Compassionate care? A critical discourse analysis of accreditation standards.

    Science.gov (United States)

    Whitehead, Cynthia; Kuper, Ayelet; Freeman, Risa; Grundland, Batya; Webster, Fiona

    2014-06-01

    We rely upon formal accreditation and curricular standards to articulate the priorities of professional training. The language used in standards affords value to certain constructs and makes others less apparent. Leveraging standards can be a useful way for educators to incorporate certain elements into training. This research was designed to look for ways to embed the teaching and practice of compassionate care into Canadian family medicine residency training. We conducted a Foucauldian critical discourse analysis of compassionate care in recent formal family medicine residency training documents. Critical discourse analysis is premised on the notion that language is connected to practices and to what is accorded value and power. We assembled an archive of texts and examined them to analyse how compassionate care is constructed, how notions of compassionate care relate to other key ideas in the texts, and the implications of these framings. There were very few words, metaphors or statements that related to concepts of compassionate care in our archive. Even potential proxies, notably the doctor-patient relationship and patient-centred care, were not primarily depicted in ways that linked them to ideas of compassion or caring. There was a reduction in language related to compassionate care in the 2013 standards compared with the standards published in 2006. Our research revealed negative findings and a relative absence of the construct of compassionate care in our archival documents. This work demonstrates how a shift in curricular focus can have the unintended consequence of making values that are taken for granted less visible. Given that standards shape training, we must pay attention not only to what we include, but also to what we leave out of formal documents. We risk losing important professional values from training programmes if they are not explicitly highlighted in our standards. © 2014 John Wiley & Sons Ltd.

  15. Kinematic analysis of the 720° fouetté technique in classical ballet.

    Directory of Open Access Journals (Sweden)

    Li Bo

    2011-07-01

    Full Text Available Athletic practice has shown that the more complex the element, the more difficult the technique of the exercise. The 720° fouetté is one of the most difficult types of fouetté; its execution depends on a highly refined technique during the performer's rotation. Performing this element requires not only good physical condition but also the dancer's mastery of correct technique. On the basis of the corresponding kinematic theory, this study presents a qualitative analysis and quantitative assessment of 720° fouettés performed by the best Chinese dancers. Stereoscopic imaging and theoretical analysis were used as the methods of analysis.

  16. Development of standards for probabilistic safety analysis

    International Nuclear Information System (INIS)

    Nelson, P. F.; Gonzalez C, M.

    2008-01-01

    The American Society of Mechanical Engineers (ASME) standard for Probabilistic Safety Analysis (PSA), for applications in nuclear plants, was originally limited to a Level 1 PSA of internal events. However, recent efforts by the ASME committee on nuclear risk management, together with the American Nuclear Society (ANS) committee for risk-informed standards, have produced an improved standard that combines the original ASME Level 1 internal-events standard with fires inside the plant and external events, with a place reserved for events occurring at low power and shutdown. This integrated standard will be used by nuclear plants and regulators to carry out risk-informed applications. The use of PSA has matured to the point that risk management programmes have been developed and PSA is being used as part of decision making in nuclear facilities. The standard provides criteria to evaluate the technical capabilities of a PSA relative to a particular matter, allowing PSA specialists to determine whether the elements of the PSA are technically adequate with regard to a particular risk-informed application. Risk-informed applications such as in-service inspection and risk-informed technical specifications save time and resources, not only for the plants but also for the regulator. (Author)

  17. Standardization of pulmonary ventilation technique using volume-controlled ventilators in rats with congenital diaphragmatic hernia

    Directory of Open Access Journals (Sweden)

    Rodrigo Melo Gallindo

    Full Text Available OBJECTIVE: To standardize a technique for ventilating rat fetuses with Congenital Diaphragmatic Hernia (CDH) using a volume-controlled ventilator. METHODS: Pregnant rats were divided into the following groups: a) control (C); b) exposed to nitrofen with CDH (CDH); and c) exposed to nitrofen without CDH (N-). Fetuses of the three groups were randomly divided into the subgroups ventilated (V) and non-ventilated (N-V). Fetuses were collected on day 21.5 of gestation, weighed and ventilated for 30 minutes using a volume-controlled ventilator. The lungs were then collected for histological study. We evaluated: body weight (BW), total lung weight (TLW), left lung weight (LLW), the ratios TLW/BW and LLW/BW, the histological morphology of the airways, and the causes of ventilation failure. RESULTS: BW, TLW, LLW, TLW/BW and LLW/BW were higher in C compared with N- (p < 0.05). The morphology of the pulmonary airways showed hypoplasia in the N- and CDH groups, with no difference between V and N-V (p > 0.05). The C and N- groups could be successfully ventilated using a tidal volume of 75 µl, but ventilation failure in the CDH group decreased only when ventilated with 50 µl. CONCLUSION: Volume ventilation is possible in rats with CDH for a short period and does not alter fetal or lung morphology.

  18. On criteria for examining analysis quality with standard reference material

    International Nuclear Information System (INIS)

    Yang Huating

    1997-01-01

    The advantages, disadvantages and applicability of some criteria for examining analysis quality with standard reference materials are discussed. The combination of the uncertainties of the instrument being examined and of the reference material should be determined on the basis of the specific situation. When the instrument's uncertainty is not available, it is acceptable to substitute the standard deviation multiplied by an appropriate factor for the uncertainty. The examination should not cause routine measurements to report more error than is really present. Overly strict examination should also be avoided
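
    The criterion discussed (combining the uncertainty of the instrument under examination with that of the reference material) can be written as a simple acceptance test. The sketch below assumes an En-number-style check with coverage factor k; it is one common formulation, not necessarily the exact criterion of the paper, and all values are invented.

```python
# Hedged sketch of a reference-material acceptance test: the measured value is
# consistent with the certified value if the difference is covered by the
# combined (instrument + reference) uncertainty times a coverage factor k.
import math

def passes_srm_check(measured, certified, u_instrument, u_reference, k=2.0):
    u_combined = math.sqrt(u_instrument ** 2 + u_reference ** 2)
    return abs(measured - certified) <= k * u_combined

# If the instrument uncertainty is unknown, the paper suggests substituting the
# standard deviation of replicate measurements multiplied by a chosen factor.
replicates = [10.2, 9.8, 10.1, 10.4, 9.9]
mean = sum(replicates) / len(replicates)
s = math.sqrt(sum((x - mean) ** 2 for x in replicates) / (len(replicates) - 1))

print(passes_srm_check(measured=mean, certified=10.0,
                       u_instrument=2 * s, u_reference=0.15))
```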

  19. Soil texture analysis by laser diffraction - standardization needed

    DEFF Research Database (Denmark)

    Callesen, Ingeborg; Palviainen, M.; Kjønaas, O. Janne

    2017-01-01

    Soil texture is a central soil quality property. Laser diffraction (LD) for determination of particle size distribution (PSD) is now widespread due to easy analysis and low cost. However, pretreatment methods and interpretation of the resulting soil PSDs are not standardized. Comparison of LD data... and many newer; ISO 13320:2009). PSD uncertainty caused by pretreatments and PSD bias caused by plate-shaped clay particles still call for more method standardization work. If LD is used more generally, new pedotransfer functions for other soil properties (e.g. water retention) based on sieving...

  20. Managing Software Project Risks (Analysis Phase) with Proposed Fuzzy Regression Analysis Modelling Techniques with Fuzzy Concepts

    OpenAIRE

    Elzamly, Abdelrafe; Hussin, Burairah

    2014-01-01

    The aim of this paper is to propose new mining techniques with which we can study the impact of different risk management techniques and different software risk factors on software analysis development projects. The new mining technique uses fuzzy multiple regression analysis with fuzzy concepts to manage the risks in a software project and to mitigate risk through software process improvement. The top ten software risk factors in the analysis phase and thirty risk management techni...

  1. Provenience studies using neutron activation analysis: the role of standardization

    Energy Technology Data Exchange (ETDEWEB)

    Harbottle, G

    1980-01-01

    This paper covers the historical background of chemical analysis of archaeological artifacts which dates back to 1790 to the first application of neutron activation analysis to archaeological ceramics and goes on to elaborate on the present day status of neutron activation analysis in provenience studies, and the role of standardization. In principle, the concentrations of elements in a neutron-activated specimen can be calculated from an exact knowledge of neutron flux, its intensity, duration and spectral (energy) distribution, plus an exact gamma ray count calibrated for efficiency, corrected for branching rates, etc. However, in practice it is far easier to compare one's unknown to a standard of known or assumed composition. The practice has been for different laboratories to use different standards. With analyses being run in the thousands throughout the world, a great benefit would be derived if analyses could be exchanged among all users and/or generators of data. The emphasis of this paper is on interlaboratory comparability of ceramic data; how far are we from it, what has been proposed in the past to achieve this goal, and what is being proposed. All of this may be summarized under the general heading of Analytical Quality Control - i.e., how to achieve precise and accurate analysis. The author proposes that anyone wishing to analyze archaeological ceramics should simply use his own standard, but attempt to calibrate that standard as nearly as possible to absolute (i.e., accurate) concentration values. The relationship of Analytical Quality Control to provenience location is also examined.

  2. Provenience studies using neutron activation analysis: the role of standardization

    International Nuclear Information System (INIS)

    Harbottle, G.

    1980-01-01

    This paper covers the historical background of chemical analysis of archaeological artifacts which dates back to 1790 to the first application of neutron activation analysis to archaeological ceramics and goes on to elaborate on the present day status of neutron activation analysis in provenience studies, and the role of standardization. In principle, the concentrations of elements in a neutron-activated specimen can be calculated from an exact knowledge of neutron flux, its intensity, duration and spectral (energy) distribution, plus an exact gamma ray count calibrated for efficiency, corrected for branching rates, etc. However, in practice it is far easier to compare one's unknown to a standard of known or assumed composition. The practice has been for different laboratories to use different standards. With analyses being run in the thousands throughout the world, a great benefit would be derived if analyses could be exchanged among all users and/or generators of data. The emphasis of this paper is on interlaboratory comparability of ceramic data; how far are we from it, what has been proposed in the past to achieve this goal, and what is being proposed. All of this may be summarized under the general heading of Analytical Quality Control - i.e., how to achieve precise and accurate analysis. The author proposes that anyone wishing to analyze archaeological ceramics should simply use his own standard, but attempt to calibrate that standard as nearly as possible to absolute (i.e., accurate) concentration values. The relationship of Analytical Quality Control to provenience location is also examined
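
    The comparator approach described in both records above (comparing the unknown with a co-irradiated standard of known composition rather than computing concentrations from absolute flux and efficiency) reduces to a ratio of decay-corrected specific count rates. The sketch below is a generic illustration with invented counts, masses and half-life; it is not the author's procedure.

```python
# Minimal sketch of the comparator method in neutron activation analysis:
# C_unknown = C_standard * (specific count rate of unknown) / (specific count
# rate of standard), with both rates decay-corrected to the end of irradiation.
import math

def decay_corrected_rate(counts, live_time_s, delay_s, half_life_s):
    """Count rate corrected back to the end of irradiation."""
    lam = math.log(2.0) / half_life_s
    return (counts / live_time_s) * math.exp(lam * delay_s)

def comparator_concentration(c_std_ppm, mass_std_g, rate_std, mass_unk_g, rate_unk):
    return c_std_ppm * (rate_unk / mass_unk_g) / (rate_std / mass_std_g)

# Illustrative numbers only (e.g. a 1.5-h half-life activation product).
rate_std = decay_corrected_rate(counts=52000, live_time_s=600, delay_s=1800, half_life_s=5400)
rate_unk = decay_corrected_rate(counts=34000, live_time_s=600, delay_s=2400, half_life_s=5400)
print(comparator_concentration(c_std_ppm=120.0, mass_std_g=0.100,
                               rate_std=rate_std, mass_unk_g=0.095, rate_unk=rate_unk))
```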

  3. Modal Analysis Based on the Random Decrement Technique

    DEFF Research Database (Denmark)

    Asmussen, J. C.

    The thesis describes and develops the theoretical foundations of the Random Decrement technique, while giving several examples of modal analysis of large building constructions (bridges). The connection between modal parameters and Random Decrement functions is described theoretically....... The efficiency of the Random Decrement technique for the estimation of correlation functions is compared to other equivalent methods (FFT, Direct method). It is shown that the Random Decrement technique can be as much as a hundred times faster than other methods. The theory behind the Random Decrement technique...... is expanded to include both a vector formulation that increases speed considerably, and a new method for the prediction of the variance of the estimated Random Decrement functions. The thesis closes with a number of examples of modal analysis of bridges exposed to natural (ambient) load....

  4. Recommendations for a proposed standard for performing systems analysis

    International Nuclear Information System (INIS)

    LaChance, J.; Whitehead, D.; Drouin, M.

    1998-01-01

    In August 1995, the Nuclear Regulatory Commission (NRC) issued a policy statement proposing improved regulatory decisionmaking by increasing the use of PRA [probabilistic risk assessment] in all regulatory matters to the extent supported by the state-of-the-art in PRA methods and data. A key aspect in using PRA in risk-informed regulatory activities is establishing the appropriate scope and attributes of the PRA. In this regard, ASME decided to develop a consensus PRA Standard. The objective is to develop a PRA Standard such that the technical quality of nuclear plant PRAs will be sufficient to support risk-informed regulatory applications. This paper presents example recommendations for the systems analysis element of a PRA for incorporation into the ASME PRA Standard

  5. Modification of the cranial closing wedge ostectomy technique for the treatment of canine cruciate disease. Description and comparison with standard technique.

    Science.gov (United States)

    Wallace, A M; Addison, E S; Smith, B A; Radke, H; Hobbs, S J

    2011-01-01

    To describe a modification of the cranial closing wedge ostectomy (CCWO) technique and to compare its efficacy to the standard technique on cadaveric specimens. The standard and modified CCWO technique were applied to eight pairs of cadaveric tibiae. The following parameters were compared following the ostectomy: degrees of plateau levelling achieved (degrees), tibial long axis shift (degrees), reduction in tibial length (mm), area of bone wedge removed (cm²), and the area of proximal fragment (cm²). The size of the removed wedge of bone and the reduction in tibial length were significantly less with the modified CCWO technique. The modified CCWO has two main advantages. Firstly a smaller wedge is removed, allowing a greater preservation of bone stock in the proximal tibia, which is advantageous for implant placement. Secondly, the tibia is shortened to a lesser degree, which might reduce the risk of recurvatum, fibular fracture and patella desmitis. These factors are particularly propitious for the application of this technique to Terrier breeds with excessive tibial plateau angle, where large angular corrections are required. The modified CCWO is equally effective for plateau levelling and results in an equivalent tibial long-axis shift. A disadvantage with the modified technique is that not all of the cross sectional area of the distal fragment contributes to load sharing at the osteotomy.

  6. Nuclear analysis techniques as a component of thermoluminescence dating

    Energy Technology Data Exchange (ETDEWEB)

    Prescott, J.R.; Hutton, J.T.; Habermehl, M.A. [Adelaide Univ., SA (Australia); Van Moort, J. [Tasmania Univ., Sandy Bay, TAS (Australia)

    1996-12-31

    In luminescence dating, an age is found by first measuring the dose accumulated since the event being dated, then dividing by the annual dose rate. Analyses of minor and trace elements performed by nuclear techniques have long formed an essential component of dating. Results from some Australian sites are reported to illustrate the application of nuclear techniques of analysis in this context. In particular, a variety of methods for finding dose rates are compared, an example is given of a site where radioactive disequilibrium is significant, and a brief summary is given of a problem which was not resolved by nuclear techniques. 5 refs., 2 tabs.
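
    The age equation stated in the abstract (age equals accumulated dose divided by annual dose rate) can be illustrated directly, together with first-order propagation of the two uncertainties. The dose and dose-rate values below are invented.

```python
# Luminescence age = equivalent (accumulated) dose / annual dose rate,
# with a first-order uncertainty propagated from both terms. Values are invented.
import math

def luminescence_age(dose_gy, dose_err, rate_gy_per_ka, rate_err):
    age = dose_gy / rate_gy_per_ka
    rel_err = math.sqrt((dose_err / dose_gy) ** 2 + (rate_err / rate_gy_per_ka) ** 2)
    return age, age * rel_err

age_ka, err_ka = luminescence_age(dose_gy=45.0, dose_err=2.5,
                                  rate_gy_per_ka=3.1, rate_err=0.2)
print("Age: %.1f +/- %.1f ka" % (age_ka, err_ka))
```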

  7. Application of pattern recognition techniques to crime analysis

    Energy Technology Data Exchange (ETDEWEB)

    Bender, C.F.; Cox, L.A. Jr.; Chappell, G.A.

    1976-08-15

    The initial goal was to evaluate the capabilities of current pattern recognition techniques when applied to existing computerized crime data. Performance was to be evaluated both in terms of the system's capability to predict crimes and to optimize police manpower allocation. A relation was sought to predict the crime's susceptibility to solution, based on knowledge of the crime type, location, time, etc. The preliminary results of this work are discussed. They indicate that automatic crime analysis involving pattern recognition techniques is feasible, and that efforts to determine optimum variables and techniques are warranted. 47 figures (RWR)

  8. Quantitative Image Analysis Techniques with High-Speed Schlieren Photography

    Science.gov (United States)

    Pollard, Victoria J.; Herron, Andrew J.

    2017-01-01

    Optical flow visualization techniques such as schlieren and shadowgraph photography are essential to understanding fluid flow when interpreting acquired wind tunnel test data. Output of the standard implementations of these visualization techniques in test facilities are often limited only to qualitative interpretation of the resulting images. Although various quantitative optical techniques have been developed, these techniques often require special equipment or are focused on obtaining very precise and accurate data about the visualized flow. These systems are not practical in small, production wind tunnel test facilities. However, high-speed photography capability has become a common upgrade to many test facilities in order to better capture images of unsteady flow phenomena such as oscillating shocks and flow separation. This paper describes novel techniques utilized by the authors to analyze captured high-speed schlieren and shadowgraph imagery from wind tunnel testing for quantification of observed unsteady flow frequency content. Such techniques have applications in parametric geometry studies and in small facilities where more specialized equipment may not be available.
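
    One plausible way to quantify unsteady-flow frequency content from a high-speed schlieren sequence, in the spirit of what the abstract describes, is to track the mean intensity of a small interrogation window through the frame stack and take its power spectrum. The sketch below uses a synthetic frame stack and an assumed frame rate; it is not the authors' implementation.

```python
# Hedged sketch: extract a pixel-intensity time series from a stack of
# high-speed schlieren frames and estimate its dominant frequency with an FFT.
# The frame stack here is synthetic (a 1.2 kHz oscillation plus noise).
import numpy as np

fps = 10000.0                                   # camera frame rate (assumed)
t = np.arange(2048) / fps
frames = np.random.rand(t.size, 64, 64) * 0.1   # synthetic image stack
frames[:, 30:34, 30:34] += 0.5 * np.sin(2 * np.pi * 1200.0 * t)[:, None, None]

# Mean intensity in an interrogation window placed over the oscillating feature.
signal = frames[:, 30:34, 30:34].mean(axis=(1, 2))
signal -= signal.mean()

spectrum = np.abs(np.fft.rfft(signal)) ** 2
freqs = np.fft.rfftfreq(signal.size, d=1.0 / fps)
print("Dominant frequency: %.0f Hz" % freqs[np.argmax(spectrum)])
```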

  9. Image analysis techniques for the study of turbulent flows

    Science.gov (United States)

    Ferrari, Simone

    In this paper, a brief review of Digital Image Analysis techniques employed in Fluid Mechanics for the study of turbulent flows is given. Particularly the focus is on the techniques developed by the research teams the Author worked in, that can be considered relatively "low cost" techniques. Digital Image Analysis techniques have the advantage, when compared to the traditional techniques employing physical point probes, to be non-intrusive and quasi-continuous in space, as every pixel on the camera sensor works as a single probe: consequently, they allow to obtain two-dimensional or three-dimensional fields of the measured quantity in less time. Traditionally, the disadvantages are related to the frequency of acquisition, but modern high-speed cameras are typically able to acquire at frequencies from the order of 1 KHz to the order of 1 MHz. Digital Image Analysis techniques can be employed to measure concentration, temperature, position, displacement, velocity, acceleration and pressure fields with similar equipment and setups, and can be consequently considered as a flexible and powerful tool for measurements on turbulent flows.

  10. Image analysis techniques for the study of turbulent flows

    Directory of Open Access Journals (Sweden)

    Ferrari Simone

    2017-01-01

    Full Text Available In this paper, a brief review of Digital Image Analysis techniques employed in Fluid Mechanics for the study of turbulent flows is given. Particularly the focus is on the techniques developed by the research teams the Author worked in, that can be considered relatively “low cost” techniques. Digital Image Analysis techniques have the advantage, when compared to the traditional techniques employing physical point probes, to be non-intrusive and quasi-continuous in space, as every pixel on the camera sensor works as a single probe: consequently, they allow to obtain two-dimensional or three-dimensional fields of the measured quantity in less time. Traditionally, the disadvantages are related to the frequency of acquisition, but modern high-speed cameras are typically able to acquire at frequencies from the order of 1 KHz to the order of 1 MHz. Digital Image Analysis techniques can be employed to measure concentration, temperature, position, displacement, velocity, acceleration and pressure fields with similar equipment and setups, and can be consequently considered as a flexible and powerful tool for measurements on turbulent flows.

  11. ERROR ANALYSIS FOR THE AIRBORNE DIRECT GEOREFERINCING TECHNIQUE

    Directory of Open Access Journals (Sweden)

    A. S. Elsharkawy

    2016-10-01

    Full Text Available Direct georeferencing has been shown to be an important alternative to standard indirect image orientation using classical or GPS-supported aerial triangulation. Since direct georeferencing without ground control relies on an extrapolation process only, particular focus has to be placed on the overall system calibration procedure. The accuracy performance of integrated GPS/inertial systems for direct georeferencing in airborne photogrammetric environments has been tested extensively in recent years. In this approach, the limiting factor is a correct overall system calibration, including the GPS/inertial component as well as the imaging sensor itself; remaining errors in the system calibration will therefore significantly decrease the quality of object point determination. This research paper presents an error analysis for the airborne direct georeferencing technique, in which integrated GPS/IMU positioning and navigation systems are used in conjunction with aerial cameras for airborne mapping, compared with GPS/INS-supported aerial triangulation, through the introduction of a certain amount of error into the EOP and boresight parameters and the study of the effect of these errors on the final ground coordinates. The data set is a block of 32 images distributed over six flight lines; the interior orientation parameters (IOP) are known from a careful camera calibration procedure, and 37 ground control points are known from a terrestrial surveying procedure. The exact location of the camera station at the time of exposure, the exterior orientation parameters (EOP), is known through the GPS/INS integration process. The preliminary results show that, firstly, DG and GPS-supported AT have similar accuracy and, compared with the conventional aerial photography method, the two technologies reduce the dependence on ground control (used only for quality control purposes). Secondly, in DG, correcting the overall system calibration, including the GPS/inertial component as well as the

  12. An integrated technique for the analysis of skin bite marks.

    Science.gov (United States)

    Bernitz, Herman; Owen, Johanna H; van Heerden, Willie F P; Solheim, Tore

    2008-01-01

    The high number of murder, rape, and child abuse cases in South Africa has led to increased numbers of bite mark cases being heard in high courts. Objective analysis to match perpetrators to bite marks at crime scenes must be able to withstand vigorous cross-examination to be of value in conviction of perpetrators. An analysis technique is described in four stages, namely determination of the mark to be a human bite mark, pattern association analysis, metric analysis and comparison with the population data, and illustrated by a real case study. New and accepted techniques are combined to determine the likelihood ratio of guilt expressed as one of a range of conclusions described in the paper. Each stage of the analysis adds to the confirmation (or rejection) of concordance between the dental features present on the victim and the dentition of the suspect. The results illustrate identification to a high degree of certainty.

  13. Standardization of the PCR technique for the detection of delta toxin in Staphylococcus spp.

    Directory of Open Access Journals (Sweden)

    C. Marconi

    2005-06-01

    Full Text Available Coagulase-negative staphylococci (CNS), components of the normal flora of neonates, have emerged as important opportunistic pathogens of nosocomial infections that occur in neonatal intensive care units. Some authors have reported the ability of some CNS strains, particularly Staphylococcus epidermidis, to produce a toxin similar to S. aureus delta toxin. This toxin is an exoprotein that has a detergent action on the membranes of various cell types, resulting in rapid cell lysis. The objectives of the present study were to standardize the Polymerase Chain Reaction (PCR) technique for the detection of the gene responsible for the production of delta toxin (the hld gene) in staphylococcal species isolated from catheters and blood cultures obtained from neonates, and to compare the results to those obtained with the phenotypic synergistic hemolysis method. Detection of delta toxin by the phenotypic and genotypic methods yielded similar results for the S. aureus isolates. However, in S. epidermidis, a higher positivity was observed for PCR (97.4%) compared to the synergistic hemolysis method (86.8%). Among CNS, S. epidermidis was the most frequent isolate and was a delta toxin producer. Staphylococcus simulans and S. warneri tested positive by the phenotypic method, but their positivity was not confirmed by PCR for hld gene detection. These results indicate that different genes might be responsible for the production of this toxin in different CNS species, requiring highly specific primers for their detection. PCR was found to be a rapid and reliable method for the detection of the hld gene in S. aureus and S. epidermidis.

  14. Standardizing Handoff Communication: Content Analysis of 27 Handoff Mnemonics.

    Science.gov (United States)

    Nasarwanji, Mahiyar F; Badir, Aysel; Gurses, Ayse P

    2016-01-01

    This study synthesizes information contained in 27 mnemonics to identify what information should be communicated during a handoff. Clustering and content analysis resulted in 12 primary information clusters that should be communicated. Given the large amount of information identified, it would be beneficial to use a structured handoff communication tool developed using a participatory approach. In addition, we recommend local standardization of information communicated during handoffs with variation across settings.

  15. 1997 Accession Medical Standards Analysis & Research Activity (AMSARA) Annual Report

    Science.gov (United States)

    1998-05-01

    [Abstract not available; the record text consists of table fragments extracted from the report, listing medical conditions with counts (chronic knee pain, adjustment disorder, hearing, bone injury of the lower extremities, eating disorder, hypertension), a cited reference (Genetic Influences in Childhood-Onset Psychiatric Disorders: Autism and Attention-Deficit/Hyperactivity Disorder. Am J Hum Genet 1997;60:1276-1282), and expansions of report abbreviations (Attention-Deficit/Hyperactivity Disorder, Armed Forces Qualifying Test, Academic Skills Defect, Accession Medical Standards Analysis and Research Activity).]

  16. Using Machine Learning Techniques in the Analysis of Oceanographic Data

    Science.gov (United States)

    Falcinelli, K. E.; Abuomar, S.

    2017-12-01

    Acoustic Doppler Current Profilers (ADCPs) are oceanographic tools capable of collecting large amounts of current profile data. Using unsupervised machine learning techniques such as principal component analysis, fuzzy c-means clustering, and self-organizing maps, patterns and trends in an ADCP dataset are found. Cluster validity algorithms such as visual assessment of cluster tendency and clustering index are used to determine the optimal number of clusters in the ADCP dataset. These techniques prove to be useful in analysis of ADCP data and demonstrate potential for future use in other oceanographic applications.
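
    A minimal version of the workflow described (dimension reduction of current profiles followed by unsupervised clustering) can be sketched with scikit-learn. The profiles below are synthetic stand-ins for ADCP data, k-means stands in for fuzzy c-means, and the number of clusters is assumed rather than chosen with the validity indices mentioned in the abstract.

```python
# Sketch of unsupervised analysis of current-profile data: PCA for dimension
# reduction followed by k-means clustering (standing in for fuzzy c-means).
# The profiles are synthetic; a real ADCP dataset would replace them.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
depths = np.linspace(0, 50, 25)
# Two synthetic flow regimes: surface-intensified and nearly uniform profiles.
regime_a = 0.8 * np.exp(-depths / 15.0) + 0.05 * rng.standard_normal((200, 25))
regime_b = 0.3 * np.ones_like(depths) + 0.05 * rng.standard_normal((200, 25))
profiles = np.vstack([regime_a, regime_b])        # shape (n_profiles, n_bins)

scores = PCA(n_components=3).fit_transform(profiles)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(scores)
print("Profiles assigned to each cluster:", np.bincount(labels))
```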

  17. "Heidelberg standard examination" and "Heidelberg standard procedures" - Development of faculty-wide standards for physical examination techniques and clinical procedures in undergraduate medical education.

    Science.gov (United States)

    Nikendei, C; Ganschow, P; Groener, J B; Huwendiek, S; Köchel, A; Köhl-Hackert, N; Pjontek, R; Rodrian, J; Scheibe, F; Stadler, A-K; Steiner, T; Stiepak, J; Tabatabai, J; Utz, A; Kadmon, M

    2016-01-01

    The competent physical examination of patients and the safe and professional implementation of clinical procedures constitute essential components of medical practice in nearly all areas of medicine. The central objective of the projects "Heidelberg standard examination" and "Heidelberg standard procedures", which were initiated by students, was to establish uniform interdisciplinary standards for physical examination and clinical procedures, and to distribute them in coordination with all clinical disciplines at the Heidelberg University Hospital. The presented project report illuminates the background of the initiative and its methodological implementation. Moreover, it describes the multimedia documentation in the form of pocketbooks and a multimedia internet-based platform, as well as the integration into the curriculum. The project presentation aims to provide orientation and action guidelines to facilitate similar processes in other faculties.

  18. A Comparative Analysis of Three Proposed Federal Renewable Electricity Standards

    Energy Technology Data Exchange (ETDEWEB)

    Sullivan, Patrick [National Renewable Energy Lab. (NREL), Golden, CO (United States); Logan, Jeffrey [National Renewable Energy Lab. (NREL), Golden, CO (United States); Bird, Lori [National Renewable Energy Lab. (NREL), Golden, CO (United States); Short, Walter [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2009-05-01

    This paper analyzes potential impacts of proposed national renewable electricity standard (RES) legislation. An RES is a mandate requiring certain electricity retailers to provide a minimum share of their electricity sales from qualifying renewable power generation. The analysis focuses on draft bills introduced individually by Senator Jeff Bingaman and Representative Edward Markey, and jointly by Representative Henry Waxman and Markey. The analysis uses NREL's Regional Energy Deployment System (ReEDS) model to evaluate the impacts of the proposed RES requirements on the U.S. energy sector in four scenarios.

  19. Comparative Analysis of Three Proposed Federal Renewable Electricity Standards

    Energy Technology Data Exchange (ETDEWEB)

    Sullivan, P.; Logan, J.; Bird, L.; Short, W.

    2009-05-01

    This paper analyzes potential impacts of proposed national renewable electricity standard (RES) legislation. An RES is a mandate requiring certain electricity retailers to provide a minimum share of their electricity sales from qualifying renewable power generation. The analysis focuses on draft bills introduced individually by Senator Jeff Bingaman and Representative Edward Markey, and jointly by Representative Henry Waxman and Markey. The analysis uses NREL's Regional Energy Deployment System (ReEDS) model to evaluate the impacts of the proposed RES requirements on the U.S. energy sector in four scenarios.

  20. Standard Test Method for Oxygen Content Using a 14-MeV Neutron Activation and Direct-Counting Technique

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2007-01-01

    1.1 This test method covers the measurement of oxygen concentration in almost any matrix by using a 14-MeV neutron activation and direct-counting technique. Essentially, the same system may be used to determine oxygen concentrations ranging from over 50 % to about 10 μg/g, or less, depending on the sample size and available 14-MeV neutron fluence rates. Note 1 - The range of analysis may be extended by using higher neutron fluence rates, larger samples, and higher counting efficiency detectors. 1.2 This test method may be used on either solid or liquid samples, provided that they can be made to conform in size, shape, and macroscopic density during irradiation and counting to a standard sample of known oxygen content. Several variants of this method have been described in the technical literature. A monograph is available which provides a comprehensive description of the principles of activation analysis using a neutron generator (1). 1.3 The values stated in either SI or inch-pound units are to be regarded...
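
    Because the method compares the sample with a standard of known oxygen content irradiated and counted under matched conditions, the concentration calculation is essentially a ratio of specific count rates. The sketch below is an illustrative relative calculation with invented numbers; it is not the ASTM procedure itself.

```python
# Illustrative relative calculation for 14-MeV activation analysis of oxygen:
# the unknown is compared with a standard of known oxygen content measured
# under the same irradiation/counting geometry. Numbers are invented.
def oxygen_content(net_counts_sample, mass_sample_g,
                   net_counts_standard, mass_standard_g, oxygen_standard_pct):
    specific_sample = net_counts_sample / mass_sample_g
    specific_standard = net_counts_standard / mass_standard_g
    return oxygen_standard_pct * specific_sample / specific_standard

print(oxygen_content(net_counts_sample=18500, mass_sample_g=5.02,
                     net_counts_standard=41200, mass_standard_g=5.00,
                     oxygen_standard_pct=46.7))
```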

  1. Windows forensic analysis toolkit advanced analysis techniques for Windows 7

    CERN Document Server

    Carvey, Harlan

    2012-01-01

    Now in its third edition, Harlan Carvey has updated "Windows Forensic Analysis Toolkit" to cover Windows 7 systems. The primary focus of this edition is on analyzing Windows 7 systems and on processes using free and open-source tools. The book covers live response, file analysis, malware detection, timeline, and much more. The author presents real-life experiences from the trenches, making the material realistic and showing the why behind the how. New to this edition, the companion and toolkit materials are now hosted online. This material consists of electronic printable checklists, cheat sheets, free custom tools, and walk-through demos. This edition complements "Windows Forensic Analysis Toolkit, 2nd Edition", (ISBN: 9781597494229), which focuses primarily on XP. It includes complete coverage and examples on Windows 7 systems. It contains Lessons from the Field, Case Studies, and War Stories. It features companion online material, including electronic printable checklists, cheat sheets, free custom tools, ...

  2. Comparison of anthropometry with photogrammetry based on a standardized clinical photographic technique using a cephalostat and chair.

    Science.gov (United States)

    Han, Kihwan; Kwon, Hyuk Joon; Choi, Tae Hyun; Kim, Jun Hyung; Son, Daegu

    2010-03-01

    The aim of this study was to standardize clinical photogrammetric techniques, and to compare anthropometry with photogrammetry. To standardize clinical photography, we have developed a photographic cephalostat and chair. We investigated the repeatability of the standardized clinical photogrammetric technique. Then, with 40 landmarks, a total of 96 anthropometric measurement items was obtained from 100 Koreans. Ninety six photogrammetric measurements from the same subjects were also obtained from standardized clinical photographs using Adobe Photoshop version 7.0 (Adobe Systems Corporation, San Jose, CA, USA). The photogrammetric and anthropometric measurement data (mm, degree) were then compared. A coefficient was obtained by dividing the anthropometric measurements by the photogrammetric measurements. The repeatability of the standardized photography was statistically significantly high (p=0.463). Among the 96 measurement items, 44 items were reliable; for these items the photogrammetric measurements were not different to the anthropometric measurements. The remaining 52 items must be classified as unreliable. By developing a photographic cephalostat and chair, we have standardized clinical photogrammetric techniques. The reliable set of measurement items can be used as anthropometric measurements. For unreliable measurement items, applying a suitable coefficient to the photogrammetric measurement allows the anthropometric measurement to be obtained indirectly.

  3. Conference on Techniques of Nuclear and Conventional Analysis and Applications

    International Nuclear Information System (INIS)

    2012-01-01

    Full text: With their wide scope, particularly in the areas of environment, geology, mining, industry and the life sciences, analysis techniques are of great importance in both fundamental and applied research. The Conference on Techniques for Nuclear and Conventional Analysis and Applications (TANCA) is registered in the national strategy of opening the University and national research centres to their local, national and international environments. This conference aims to: promote nuclear and conventional analytical techniques; contribute to the creation of synergy between the different players involved in these techniques, including universities, research organizations, regulatory authorities, economic operators, NGOs and others; inform and educate potential users about the performance of these techniques; strengthen exchanges and links between researchers, industry and policy makers; implement a programme of inter-laboratory comparison between Moroccan laboratories on the one hand and their foreign counterparts on the other; and contribute to the research training of doctoral students and postdoctoral scholars. Given the relevance and importance of issues related to the environment and its impact on cultural heritage, this fourth edition of TANCA is devoted to the application of conventional and nuclear analytical techniques to questions tied to the environment and its impact on cultural heritage.

  4. Performance Analysis of Modified Drain Gating Techniques for Low Power and High Speed Arithmetic Circuits

    Directory of Open Access Journals (Sweden)

    Shikha Panwar

    2014-01-01

    Full Text Available This paper presents several high performance and low power techniques for CMOS circuits. In these design methodologies, drain gating technique and its variations are modified by adding an additional NMOS sleep transistor at the output node which helps in faster discharge and thereby providing higher speed. In order to achieve high performance, the proposed design techniques trade power for performance in the delay critical sections of the circuit. Intensive simulations are performed using Cadence Virtuoso in a 45 nm standard CMOS technology at room temperature with supply voltage of 1.2 V. Comparative analysis of the present circuits with standard CMOS circuits shows smaller propagation delay and lesser power consumption.

  5. Instrumental Neutron Activation Analysis Technique using Subsecond Radionuclides

    DEFF Research Database (Denmark)

    Nielsen, H.K.; Schmidt, J.O.

    1987-01-01

    The fast irradiation facility Mach-1 installed at the Danish DR 3 reactor has been used for boron determinations by means of Instrumental Neutron Activation Analysis using ¹²B with its 20-ms half-life. The performance characteristics of the system are presented and boron determinations of NBS standard...
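
    For a 20-ms product such as ¹²B, timing corrections dominate the measurement. The sketch below evaluates the standard saturation, decay and counting factors for a short-lived activation product; the relation is generic activation-analysis bookkeeping and the timing values are invented, not taken from the Mach-1 facility.

```python
# Saturation (S), decay (D) and counting (C) factors for a short-lived
# activation product such as 12B (half-life ~20 ms). Timing values are invented.
import math

def timing_factors(half_life_s, t_irr_s, t_decay_s, t_count_s):
    lam = math.log(2.0) / half_life_s
    S = 1.0 - math.exp(-lam * t_irr_s)                            # build-up during irradiation
    D = math.exp(-lam * t_decay_s)                                # loss during transfer/delay
    C = (1.0 - math.exp(-lam * t_count_s)) / (lam * t_count_s)    # decay while counting
    return S, D, C

S, D, C = timing_factors(half_life_s=0.020, t_irr_s=0.050,
                         t_decay_s=0.010, t_count_s=0.100)
print("S=%.3f  D=%.3f  C=%.3f  combined=%.4f" % (S, D, C, S * D * C))
```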

  6. Comparison of the THERP quantitative tables with the human reliability analysis techniques of second generation

    International Nuclear Information System (INIS)

    Alvarenga, Marco Antonio Bayout; Fonseca, Renato Alves

    2009-01-01

    The THERP methodology is classified as a first-generation Human Reliability Analysis (HRA) technique, and its emergence was an important initial step for the development of HRA techniques in the industry. Because it is a first-generation technique, the THERP human-error quantification tables are based on a taxonomy that does not take human error mechanisms into account. With respect to the three cognitive levels in the Rasmussen framework for cognitive information processing in human beings, THERP deals in most cases with errors that occur at the perceptual-motor level (stimulus-response). At the rule level, the technique can work better by using the time-dependent probability curves of diagnosis errors obtained in nuclear power plant simulators; nevertheless, this is done without treating any error mechanisms. Another deficiency is that the performance shaping factors are limited in number. Furthermore, the influences (predictable or not) of the operational context, arising from operational deviations from the most probable (in terms of occurrence probability) standard scenarios, together with the consequent operational tendencies (operator actions), are not estimated. This work makes a critical analysis of these deficiencies and points out possible solutions for modifying the THERP tables, seeking a realistic quantification that neither underestimates nor overestimates human error probabilities when HRA techniques are applied to nuclear power plants. The critical analysis is accomplished through a qualitative comparison of THERP, a first-generation HRA technique, with CREAM and ATHEANA, which are second-generation HRA techniques. (author)

  7. Analysis and suggestions on standard system for general nuclear instruments

    International Nuclear Information System (INIS)

    Xiong Zhenglong

    1999-08-01

    The standard system for general nuclear instruments is analysed and the following suggestions are put forward to address its problems: earnestly adopt international standards and promote Chinese standards internationally; appropriately regularize the system's framework and the configuration of the standards to make them more scientific, complete and applicable; strengthen the development of technical and basic standards and promote the standardization of nuclear instruments as a whole; and supplement the standards for testing methods and straighten out the levels of the standards to further complete the standard system. In short, all of these measures aim to enhance the quality, readability and operability of the standards and to bring their effects fully into play

  8. Comparing dynamical systems concepts and techniques for biomechanical analysis

    OpenAIRE

    van Emmerik, Richard E.A.; Ducharme, Scott W.; Amado, Avelino C.; Hamill, Joseph

    2016-01-01

    Traditional biomechanical analyses of human movement are generally derived from linear mathematics. While these methods can be useful in many situations, they do not describe behaviors in human systems that are predominately nonlinear. For this reason, nonlinear analysis methods based on a dynamical systems approach have become more prevalent in recent literature. These analysis techniques have provided new insights into how systems (1) maintain pattern stability, (2) transition into new stat...

  9. Development of a systematic methodology to select hazard analysis techniques for nuclear facilities

    Energy Technology Data Exchange (ETDEWEB)

    Vasconcelos, Vanderley de; Reis, Sergio Carneiro dos; Costa, Antonio Carlos Lopes da [Centro de Desenvolvimento da Tecnologia Nuclear (CDTN/CNEN-MG), Belo Horizonte, MG (Brazil)]. E-mails: vasconv@cdtn.br; reissc@cdtn.br; aclc@cdtn.br; Jordao, Elizabete [Universidade Estadual de Campinas (UNICAMP), SP (Brazil). Faculdade de Engenharia Quimica]. E-mail: bete@feq.unicamp.br

    2008-07-01

    In order to comply with licensing requirements of regulatory bodies risk assessments of nuclear facilities should be carried out. In Brazil, such assessments are part of the Safety Analysis Reports, required by CNEN (Brazilian Nuclear Energy Commission), and of the Risk Analysis Studies, required by the competent environmental bodies. A risk assessment generally includes the identification of the hazards and accident sequences that can occur, as well as the estimation of the frequencies and effects of these unwanted events on the plant, people, and environment. The hazard identification and analysis are also particularly important when implementing an Integrated Safety, Health, and Environment Management System following ISO 14001, BS 8800 and OHSAS 18001 standards. Among the myriad of tools that help the process of hazard analysis can be highlighted: CCA (Cause- Consequence Analysis); CL (Checklist Analysis); ETA (Event Tree Analysis); FMEA (Failure Mode and Effects Analysis); FMECA (Failure Mode, Effects and Criticality Analysis); FTA (Fault Tree Analysis); HAZOP (Hazard and Operability Study); HRA (Human Reliability Analysis); Pareto Analysis; PHA (Preliminary Hazard Analysis); RR (Relative Ranking); SR (Safety Review); WI (What-If); and WI/CL (What-If/Checklist Analysis). The choice of a particular technique or a combination of techniques depends on many factors like motivation of the analysis, available data, complexity of the process being analyzed, expertise available on hazard analysis, and initial perception of the involved risks. This paper presents a systematic methodology to select the most suitable set of tools to conduct the hazard analysis, taking into account the mentioned involved factors. Considering that non-reactor nuclear facilities are, to a large extent, chemical processing plants, the developed approach can also be applied to analysis of chemical and petrochemical plants. The selected hazard analysis techniques can support cost

  10. Development of a systematic methodology to select hazard analysis techniques for nuclear facilities

    International Nuclear Information System (INIS)

    Vasconcelos, Vanderley de; Reis, Sergio Carneiro dos; Costa, Antonio Carlos Lopes da; Jordao, Elizabete

    2008-01-01

    In order to comply with licensing requirements of regulatory bodies risk assessments of nuclear facilities should be carried out. In Brazil, such assessments are part of the Safety Analysis Reports, required by CNEN (Brazilian Nuclear Energy Commission), and of the Risk Analysis Studies, required by the competent environmental bodies. A risk assessment generally includes the identification of the hazards and accident sequences that can occur, as well as the estimation of the frequencies and effects of these unwanted events on the plant, people, and environment. The hazard identification and analysis are also particularly important when implementing an Integrated Safety, Health, and Environment Management System following ISO 14001, BS 8800 and OHSAS 18001 standards. Among the myriad of tools that help the process of hazard analysis can be highlighted: CCA (Cause- Consequence Analysis); CL (Checklist Analysis); ETA (Event Tree Analysis); FMEA (Failure Mode and Effects Analysis); FMECA (Failure Mode, Effects and Criticality Analysis); FTA (Fault Tree Analysis); HAZOP (Hazard and Operability Study); HRA (Human Reliability Analysis); Pareto Analysis; PHA (Preliminary Hazard Analysis); RR (Relative Ranking); SR (Safety Review); WI (What-If); and WI/CL (What-If/Checklist Analysis). The choice of a particular technique or a combination of techniques depends on many factors like motivation of the analysis, available data, complexity of the process being analyzed, expertise available on hazard analysis, and initial perception of the involved risks. This paper presents a systematic methodology to select the most suitable set of tools to conduct the hazard analysis, taking into account the mentioned involved factors. Considering that non-reactor nuclear facilities are, to a large extent, chemical processing plants, the developed approach can also be applied to analysis of chemical and petrochemical plants. The selected hazard analysis techniques can support cost

  11. Multidimensional scaling technique for analysis of magnetic storms ...

    Indian Academy of Sciences (India)

    Multidimensional scaling is a powerful technique for the analysis of data. The latitudinal dependence of the geomagnetic field ... at best an approximation of the real situation, but still it may contain a surprising amount of useful ... (oscillations) is a function of latitude and local time. Close to the dip equator just south of Trivan-

  12. Regression Commonality Analysis: A Technique for Quantitative Theory Building

    Science.gov (United States)

    Nimon, Kim; Reio, Thomas G., Jr.

    2011-01-01

    When it comes to multiple linear regression analysis (MLR), it is common for social and behavioral science researchers to rely predominately on beta weights when evaluating how predictors contribute to a regression model. Presenting an underutilized statistical technique, this article describes how organizational researchers can use commonality…
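
    For the two-predictor case, the commonality partition referred to in the abstract can be computed directly from the R² values of the full model and of each single-predictor model. The sketch below uses synthetic data and plain least squares; the variable names and coefficients are invented.

```python
# Two-predictor commonality analysis: partition the full-model R^2 into the
# unique contribution of each predictor and their common (shared) contribution.
# Data are synthetic.
import numpy as np

rng = np.random.default_rng(1)
n = 500
x1 = rng.standard_normal(n)
x2 = 0.6 * x1 + 0.8 * rng.standard_normal(n)        # correlated predictors
y = 1.0 * x1 + 0.5 * x2 + rng.standard_normal(n)

def r_squared(predictors, y):
    X = np.column_stack([np.ones(len(y))] + list(predictors))
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1.0 - resid.var() / y.var()

r2_full = r_squared([x1, x2], y)
r2_x1, r2_x2 = r_squared([x1], y), r_squared([x2], y)

print("unique x1:", r2_full - r2_x2)
print("unique x2:", r2_full - r2_x1)
print("common   :", r2_x1 + r2_x2 - r2_full)
```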

  13. Study and analysis of wavelet based image compression techniques ...

    African Journals Online (AJOL)

    This paper presents a comprehensive study, with performance analysis, of recent wavelet-transform-based image compression techniques. Image compression is a necessity for efficient communication. The goals of image compression are to minimize the storage requirement and the communication bandwidth.
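
    The core idea behind the wavelet-based compression schemes surveyed can be illustrated with a single-level 2-D Haar transform followed by thresholding of the detail coefficients. The sketch below is a generic illustration, not one of the reviewed techniques, and uses a random image as input.

```python
# Minimal illustration of wavelet-style image compression: a single-level 2-D
# Haar transform, hard thresholding of small coefficients, and reconstruction.
import numpy as np

def haar2d(img):
    a = (img[0::2, :] + img[1::2, :]) / 2.0          # row pairs: average
    d = (img[0::2, :] - img[1::2, :]) / 2.0          # row pairs: detail
    ll, lh = (a[:, 0::2] + a[:, 1::2]) / 2.0, (a[:, 0::2] - a[:, 1::2]) / 2.0
    hl, hh = (d[:, 0::2] + d[:, 1::2]) / 2.0, (d[:, 0::2] - d[:, 1::2]) / 2.0
    return ll, lh, hl, hh

def ihaar2d(ll, lh, hl, hh):
    a = np.zeros((ll.shape[0], ll.shape[1] * 2)); d = np.zeros_like(a)
    a[:, 0::2], a[:, 1::2] = ll + lh, ll - lh
    d[:, 0::2], d[:, 1::2] = hl + hh, hl - hh
    img = np.zeros((a.shape[0] * 2, a.shape[1]))
    img[0::2, :], img[1::2, :] = a + d, a - d
    return img

image = np.random.rand(64, 64)
ll, lh, hl, hh = haar2d(image)
threshold = 0.05
lh, hl, hh = [np.where(np.abs(c) > threshold, c, 0.0) for c in (lh, hl, hh)]
kept = sum(np.count_nonzero(c) for c in (ll, lh, hl, hh))
reconstructed = ihaar2d(ll, lh, hl, hh)
print("coefficients kept: %d / %d" % (kept, image.size))
print("max reconstruction error: %.4f" % np.abs(image - reconstructed).max())
```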

  14. Evolution of the sedimentation technique for particle size distribution analysis

    International Nuclear Information System (INIS)

    Maley, R.

    1998-01-01

    After an introduction on the significance of particle size measurements, sedimentation methods are described, with emphasis on the evolution of the gravitational approach. The gravitational technique based on mass determination by X-ray absorption allows fast analysis through automation and easy data handling, in addition to providing the accuracy required by quality control and research applications.

  15. Modal Analysis Based on the Random Decrement Technique

    DEFF Research Database (Denmark)

    Asmussen, J. C.; Brincker, Rune

    1998-01-01

    This article describes the work carried out within the project: Modal Analysis Based on the Random Decrement Technique - Application to Civil Engineering Structures. The project is part of the research programme Dynamics of Structures, sponsored by the Danish Technical Research Council. The planned...

  16. Metric Distance Ranking Technique for Fuzzy Critical Path Analysis ...

    African Journals Online (AJOL)

    In this paper, fuzzy critical path analysis of a project network is carried out. Metric distance ranking technique is used to order fuzzy numbers during the forward and backward pass computations to obtain the earliest start, earliest finish, latest start and latest finish times of the project's activities. A numerical example is ...
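
    The forward and backward passes mentioned in the abstract can be illustrated on a small activity network. In the sketch below, crisp durations stand in for defuzzified (metric-distance-ranked) fuzzy durations, so it shows only the pass logic, not the ranking technique itself; the network and values are invented.

```python
# Forward/backward-pass sketch for critical path analysis on a small activity
# network. Durations are crisp numbers standing in for defuzzified fuzzy
# durations; the network and values are invented.
durations = {"A": 3, "B": 2, "C": 4, "D": 2, "E": 3}
predecessors = {"A": [], "B": ["A"], "C": ["A"], "D": ["B", "C"], "E": ["D"]}
order = ["A", "B", "C", "D", "E"]                     # topological order

earliest_start, earliest_finish = {}, {}
for act in order:                                     # forward pass
    earliest_start[act] = max([earliest_finish[p] for p in predecessors[act]], default=0)
    earliest_finish[act] = earliest_start[act] + durations[act]

project_end = max(earliest_finish.values())
latest_finish, latest_start = {}, {}
for act in reversed(order):                           # backward pass
    successors = [s for s in order if act in predecessors[s]]
    latest_finish[act] = min([latest_start[s] for s in successors], default=project_end)
    latest_start[act] = latest_finish[act] - durations[act]

critical = [a for a in order if earliest_start[a] == latest_start[a]]
print("Critical path:", " -> ".join(critical), "  duration:", project_end)
```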

  17. Technologies and microstructures for separation techniques in chemical analysis

    NARCIS (Netherlands)

    Spiering, Vincent L.; Spiering, V.L.; Lammerink, Theodorus S.J.; Jansen, Henricus V.; van den Berg, Albert; Fluitman, J.H.J.

    1996-01-01

    The possibilities for microtechnology in chemical analysis and separation techniques are discussed. The combination of the materials and the dimensions of structures can limit the sample and waste volumes on the one hand, but also increases the performance of the chemical systems. Especially in high

  18. Analytical techniques for wine analysis: An African perspective; a review

    International Nuclear Information System (INIS)

    Villiers, André de; Alberts, Phillipus; Tredoux, Andreas G.J.; Nieuwoudt, Hélène H.

    2012-01-01

    Highlights: ► Analytical techniques developed for grape and wine analysis in Africa are reviewed. ► The utility of infrared spectroscopic methods is demonstrated. ► An overview of separation of wine constituents by GC, HPLC, CE is presented. ► Novel LC and GC sample preparation methods for LC and GC are presented. ► Emerging methods for grape and wine analysis in Africa are discussed. - Abstract: Analytical chemistry is playing an ever-increasingly important role in the global wine industry. Chemical analysis of wine is essential in ensuring product safety and conformity to regulatory laws governing the international market, as well as understanding the fundamental aspects of grape and wine production to improve manufacturing processes. Within this field, advanced instrumental analysis methods have been exploited more extensively in recent years. Important advances in instrumental analytical techniques have also found application in the wine industry. This review aims to highlight the most important developments in the field of instrumental wine and grape analysis in the African context. The focus of this overview is specifically on the application of advanced instrumental techniques, including spectroscopic and chromatographic methods. Recent developments in wine and grape analysis and their application in the African context are highlighted, and future trends are discussed in terms of their potential contribution to the industry.

  19. Evaluation of Analysis Techniques for Fluted-Core Sandwich Cylinders

    Science.gov (United States)

    Lovejoy, Andrew E.; Schultz, Marc R.

    2012-01-01

    Buckling-critical launch-vehicle structures require structural concepts that have high bending stiffness and low mass. Fluted-core, also known as truss-core, sandwich construction is one such concept. In an effort to identify an analysis method appropriate for the preliminary design of fluted-core cylinders, the current paper presents and compares results from several analysis techniques applied to a specific composite fluted-core test article. The analysis techniques are evaluated in terms of their ease of use and for their appropriateness at certain stages throughout a design analysis cycle (DAC). Current analysis techniques that provide accurate determination of the global buckling load are not readily applicable early in the DAC, such as during preliminary design, because they are too costly to run. An analytical approach that neglects transverse-shear deformation is easily applied during preliminary design, but the lack of transverse-shear deformation results in global buckling load predictions that are significantly higher than those from more detailed analysis methods. The current state of the art is either too complex to be applied for preliminary design, or is incapable of the accuracy required to determine global buckling loads for fluted-core cylinders. Therefore, it is necessary to develop an analytical method for calculating global buckling loads of fluted-core cylinders that includes transverse-shear deformations, and that can be easily incorporated in preliminary design.

  20. Analytical techniques for wine analysis: An African perspective; a review

    Energy Technology Data Exchange (ETDEWEB)

    Villiers, Andre de, E-mail: ajdevill@sun.ac.za [Department of Chemistry and Polymer Science, Stellenbosch University, Private Bag X1, Matieland 7602, Stellenbosch (South Africa); Alberts, Phillipus [Department of Chemistry and Polymer Science, Stellenbosch University, Private Bag X1, Matieland 7602, Stellenbosch (South Africa); Tredoux, Andreas G.J.; Nieuwoudt, Helene H. [Institute for Wine Biotechnology, Department of Viticulture and Oenology, Stellenbosch University, Private Bag X1, Matieland 7602, Stellenbosch (South Africa)

    2012-06-12

    Highlights: ► Analytical techniques developed for grape and wine analysis in Africa are reviewed. ► The utility of infrared spectroscopic methods is demonstrated. ► An overview of separation of wine constituents by GC, HPLC, CE is presented. ► Novel sample preparation methods for LC and GC are presented. ► Emerging methods for grape and wine analysis in Africa are discussed. - Abstract: Analytical chemistry is playing an increasingly important role in the global wine industry. Chemical analysis of wine is essential in ensuring product safety and conformity to regulatory laws governing the international market, as well as understanding the fundamental aspects of grape and wine production to improve manufacturing processes. Within this field, advanced instrumental analysis methods have been exploited more extensively in recent years. Important advances in instrumental analytical techniques have also found application in the wine industry. This review aims to highlight the most important developments in the field of instrumental wine and grape analysis in the African context. The focus of this overview is specifically on the application of advanced instrumental techniques, including spectroscopic and chromatographic methods. Recent developments in wine and grape analysis and their application in the African context are highlighted, and future trends are discussed in terms of their potential contribution to the industry.

  1. A robust X-ray fluorescence technique for multielemental analysis of solid samples.

    Science.gov (United States)

    Kallithrakas-Kontos, Nikolaos; Foteinis, Spyros; Paigniotaki, Katherine; Papadogiannakis, Minos

    2016-02-01

    X-ray fluorescence (XRF) quantitation software programs are widely used for analyzing environmental samples due to their versatility but at the expense of accuracy. In this work, we propose an accurate, robust, and versatile technique for multielemental X-ray fluorescence analytical applications, by spiking solid matrices with standard solutions. National Institute of Standards and Technology (NIST)-certified soil standards were spiked with standard solutions, mixed well, desiccated, and analyzed by an energy dispersive XRF spectrometer. Homogeneous targets were produced, and low-error calibration curves were obtained both for the added elements and for neighboring elements that were not added. With the addition of a few elements, the technique provides reliable multielemental analysis, even for concentrations of the order of milligrams per kilogram (ppm). When results were compared to the ones obtained from XRF commercial quantitation software programs, which are widely used in environmental monitoring and assessment applications, they were found to fit certified values better. Moreover, in all examined cases, results were reliable. Hence, this technique can also be used to overcome difficulties associated with interlaboratory consistency and for cross-validating results. The technique was applied to samples of environmental interest, collected from a ship/boat repainting area. Increased copper, zinc, and lead loads were observed (284, 270, and 688 mg/kg maximum concentrations in soil, respectively), due to vessels being paint stripped and repainted.
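
    The calibration step described here can be pictured with a short sketch: fit a straight line of measured intensity against the spiked concentration and invert it for unknown samples. The element, intensities and concentrations below are hypothetical and only stand in for the authors' data; this is not their software.

        # Hedged sketch: least-squares calibration curves from spiked-standard XRF data.
        # The intensities and added concentrations below are hypothetical; they only
        # illustrate the calibration step described in the abstract.
        import numpy as np

        # Added concentrations (mg/kg) spiked into the certified soil standard, and the
        # net peak intensities (counts) measured by EDXRF for one element.
        added_mg_per_kg = np.array([0.0, 50.0, 100.0, 200.0, 400.0])
        net_intensity   = np.array([120.0, 410.0, 690.0, 1280.0, 2450.0])

        # Fit a straight-line calibration: intensity = slope * concentration + intercept.
        slope, intercept = np.polyfit(added_mg_per_kg, net_intensity, deg=1)

        def concentration_from_intensity(intensity: float) -> float:
            """Invert the calibration line to estimate a sample concentration (mg/kg)."""
            return (intensity - intercept) / slope

        # Example: estimate the concentration of an unknown soil sample.
        print(f"slope = {slope:.2f} counts per (mg/kg), intercept = {intercept:.1f} counts")
        print(f"unknown sample ~ {concentration_from_intensity(1750.0):.0f} mg/kg")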

  2. State of the art in feedstuff analysis: a technique-oriented perspective.

    Science.gov (United States)

    Cheli, Federica; Battaglia, Debora; Pinotti, Luciano; Baldi, Antonella

    2012-09-26

    The need for global feed supply traceability, the high-throughput testing demands of the feed industry, and regulatory enforcement drive the need for feed analysis and make the control and evaluation of feed quality, safety, and functional properties extremely complex, all of which contribute to the very high number of analyses that must be performed. Feed analysis, with respect to animal nutritional requirements, health, reproduction, and production, calls for a multianalytical approach. In addition to standard methods of chemical analysis, new methods for evaluation of feed composition and functional properties, authenticity, and safety have been developed. Requirements for new analytical methods emphasize performance, sensitivity, reliability, speed, simplified use, low cost for high volume, and routine assays. This review provides an overview of the most used and promising methods for feed analysis. The review is intentionally focused on the following techniques: classical chemical analysis; in situ and in vitro methods; analytical techniques coupled with chemometric tools (NIR and sensors); and cell-based bioassays. This review describes both the potential and limitations of each technique and discusses the challenges that need to be overcome to obtain validated and standardized methods of analysis for a complete and global feed evaluation and characterization.

  3. Standard Test Method for Determining Thermal Neutron Reaction Rates and Thermal Neutron Fluence Rates by Radioactivation Techniques

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2008-01-01

    1.1 The purpose of this test method is to define a general procedure for determining an unknown thermal-neutron fluence rate by neutron activation techniques. It is not practicable to describe completely a technique applicable to the large number of experimental situations that require the measurement of a thermal-neutron fluence rate. Therefore, this method is presented so that the user may adapt to his particular situation the fundamental procedures of the following techniques. 1.1.1 Radiometric counting technique using pure cobalt, pure gold, pure indium, cobalt-aluminum alloy, gold-aluminum alloy, or indium-aluminum alloy. 1.1.2 Standard comparison technique using pure gold, or gold-aluminum alloy, and 1.1.3 Secondary standard comparison techniques using pure indium, indium-aluminum alloy, pure dysprosium, or dysprosium-aluminum alloy. 1.2 The techniques presented are limited to measurements at room temperatures. However, special problems when making thermal-neutron fluence rate measurements in high-...
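
    The radiometric counting technique rests on the standard activation relation between the measured activity of the irradiated monitor and the fluence rate. The sketch below is a minimal illustration assuming a thin gold monitor with made-up mass, activity and timing values; real measurements also require counting-efficiency, self-shielding and cadmium-ratio corrections.

        # Hedged sketch of the basic activation relation behind the radiometric counting
        # technique: phi = A / (N * sigma * (1 - exp(-lambda*t_irr)) * exp(-lambda*t_dec)).
        # Numbers are illustrative for a thin gold foil, not measured values.
        import math

        AVOGADRO = 6.022e23                   # atoms/mol
        SIGMA_AU197 = 98.65e-24               # cm^2, assumed thermal (n,gamma) cross section of 197Au
        HALF_LIFE_AU198 = 2.695 * 24 * 3600   # s

        foil_mass_g = 0.010                   # g of gold
        molar_mass = 196.97                   # g/mol
        n_atoms = foil_mass_g / molar_mass * AVOGADRO

        t_irr = 3600.0                        # irradiation time, s
        t_dec = 1800.0                        # decay time between irradiation and counting, s
        measured_activity_bq = 5.0e4          # 198Au disintegration rate at counting time (Bq)

        lam = math.log(2.0) / HALF_LIFE_AU198
        saturation = 1.0 - math.exp(-lam * t_irr)
        decay = math.exp(-lam * t_dec)

        # Thermal-neutron fluence rate (n cm^-2 s^-1)
        phi = measured_activity_bq / (n_atoms * SIGMA_AU197 * saturation * decay)
        print(f"thermal fluence rate ~ {phi:.3e} n/cm^2/s")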

  4. [Health protection for rural workers: the need to standardize techniques for quantifying dermal exposure to pesticides].

    Science.gov (United States)

    Selmi, Giuliana da Fontoura Rodrigues; Trapé, Angelo Zanaga

    2014-05-01

    Quantification of dermal exposure to pesticides in rural workers, used in risk assessment, can be performed with different techniques such as patches or whole body evaluation. However, the wide variety of methods can jeopardize the process by producing disparate results, depending on the principles in sample collection. A critical review was thus performed on the main techniques for quantifying dermal exposure, calling attention to this issue and the need to establish a single methodology for quantification of dermal exposure in rural workers. Such harmonization of different techniques should help achieve safer and healthier working conditions. Techniques that can provide reliable exposure data are an essential first step towards avoiding harm to workers' health.

  5. Does leaf chemistry differentially affect breakdown in tropical vs temperate streams? Importance of standardized analytical techniques to measure leaf chemistry

    Science.gov (United States)

    Marcelo Ardón; Catherine M. Pringle; Susan L. Eggert

    2009-01-01

    Comparisons of the effects of leaf litter chemistry on leaf breakdown rates in tropical vs temperate streams are hindered by the incompatibility, among studies and across sites, of the analytical methods used to measure leaf chemistry. We used standardized analytical techniques to measure chemistry and breakdown rate of leaves from common riparian tree species at 2 sites, 1...

  6. Portable optical frequency standard based on sealed gas-filled hollow-core fiber using a novel encapsulation technique

    DEFF Research Database (Denmark)

    Triches, Marco; Brusch, Anders; Hald, Jan

    2015-01-01

    A portable stand-alone optical frequency standard based on a gas-filled hollow-core photonic crystal fiber is developed to stabilize a fiber laser to the 13C2H2 P(16) (ν1 + ν3) transition at 1542 nm using saturated absorption. A novel encapsulation technique is developed to permanently seal...

  7. Method development for arsenic analysis by modification in spectrophotometric technique

    Directory of Open Access Journals (Sweden)

    M. A. Tahir

    2012-01-01

    Full Text Available Arsenic is a non-metallic constituent, present naturally in groundwater due to certain minerals and rocks. Arsenic is not geologically uncommon and occurs in natural water as arsenate and arsenite. Additionally, arsenic may occur from industrial discharges or insecticide application. The World Health Organization (WHO) and the Pakistan Standard Quality Control Authority have recommended a permissible limit of 10 ppb for arsenic in drinking water. Arsenic at lower concentrations can be determined in water using high-tech instruments like the atomic absorption spectrometer (hydride generation). Because arsenic concentrations as low as 1 ppb cannot easily be determined with the simple spectrophotometric technique, the spectrophotometric technique using silver diethyldithiocarbamate was modified to achieve better results, down to arsenic concentrations of 1 ppb.

  8. Extracted image analysis: a technique for deciphering mediated portrayals.

    Science.gov (United States)

    Berg, D H; Coutts, L B

    1995-01-01

    A technique for analyzing print media that we have developed as a consequence of our interest in the portrayal of women in menstrual product advertising is reported. The technique, which we call extracted image analysis, involves a unique application of grounded theory and the concomitant heuristic use of the concept of ideal type (Weber, 1958). It provides a means of heuristically conceptualizing the answer to a variant of the "What is going on here?" question asked in analysis of print communication, that is, "Who is being portrayed/addressed here?" Extracted image analysis involves the use of grounded theory to develop ideal typologies. Because the technique re-constructs the ideal types embedded in a communication, it possesses considerable potential as a means of identifying the profiles of members of identifiable groups held by the producers of the directed messages. In addition, the analysis of such portrayals over time would be particularly well suited to extracted image analysis. A number of other possible applications are also suggested.

  9. [Comparison study on polymerase chain reaction (PCR) and standard culture technique in detecting mycobacterium tuberculosis to diagnose of joint tuberculosis].

    Science.gov (United States)

    Sun, Yong-sheng; Wen, Jian-min; Lü, Wei-xin; Lou, Si-quan; Jiao, Chang-geng; Yang, Su-min; Xu, Hai-bin; Duan, Yong-zhuang

    2009-07-01

    To study the role of the PCR technique in the detection of Mycobacterium tuberculosis in samples from joint tuberculosis, and to evaluate the clinical value of PCR in the diagnosis of joint tuberculosis. From June 1993 to August 2001, PCR was used to detect the DNA of Mycobacterium tuberculosis, and standard culture was applied to detect the organism itself. Mycobacterium tuberculosis was detected blindly by each of the two techniques in the samples obtained from 95 patients with joint tuberculosis (55 males and 40 females, the age ranging from 2 to 75 years, with an average of 34 years). The positive rate of Mycobacterium tuberculosis detection was calculated. In the detection of Mycobacterium tuberculosis, the positive rate was 82% (78/95) with the PCR technique and 16% (15/95) with the standard culture technique. There were statistically significant differences between the two groups (chi2 = 67, P < 0.001). The PCR technique is a rapid, simple, sensitive and specific method for detection of Mycobacterium tuberculosis in samples of joint tuberculosis, showing more marked advantages than the standard culture technique. It is valuable in the early rapid diagnosis and differential diagnosis of joint tuberculosis.
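
    The comparison of the two positive-detection rates can be reproduced in outline with a chi-square test on a 2x2 table, as sketched below; the exact statistic depends on details such as whether a continuity correction is applied, so this is an illustration rather than the authors' calculation.

        # Hedged sketch: chi-square comparison of the two positive-detection rates
        # reported above (78/95 for PCR vs 15/95 for standard culture).
        from scipy.stats import chi2_contingency

        #                 positive  negative
        table = [[78, 95 - 78],   # PCR
                 [15, 95 - 15]]   # standard culture

        chi2, p_value, dof, expected = chi2_contingency(table)
        print(f"chi2 = {chi2:.1f}, dof = {dof}, p = {p_value:.2e}")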

  10. Multiple predictor smoothing methods for sensitivity analysis: Description of techniques

    International Nuclear Information System (INIS)

    Storlie, Curtis B.; Helton, Jon C.

    2008-01-01

    The use of multiple predictor smoothing methods in sampling-based sensitivity analyses of complex models is investigated. Specifically, sensitivity analysis procedures based on smoothing methods employing the stepwise application of the following nonparametric regression techniques are described: (i) locally weighted regression (LOESS), (ii) additive models, (iii) projection pursuit regression, and (iv) recursive partitioning regression. Then, in the second and concluding part of this presentation, the indicated procedures are illustrated with both simple test problems and results from a performance assessment for a radioactive waste disposal facility (i.e., the Waste Isolation Pilot Plant). As shown by the example illustrations, the use of smoothing procedures based on nonparametric regression techniques can yield more informative sensitivity analysis results than can be obtained with more traditional sensitivity analysis procedures based on linear regression, rank regression or quadratic regression when nonlinear relationships between model inputs and model predictions are present
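
    A minimal sketch of the smoothing idea, using LOESS (locally weighted regression) only and a made-up three-input test model: smooth the output against each input and rank inputs by the fraction of output variance captured by the smoothed trend. This stands in for, but does not reproduce, the stepwise procedures described in the paper.

        # Hedged sketch of a smoothing-based sensitivity measure: LOESS-smooth the model
        # output against each sampled input and rank inputs by explained variance.
        # The test model y = x1^2 + sin(3*x2) + 0.1*x3 + noise is made up for illustration.
        import numpy as np
        from statsmodels.nonparametric.smoothers_lowess import lowess

        rng = np.random.default_rng(0)
        n = 500
        x = rng.uniform(-1.0, 1.0, size=(n, 3))
        y = x[:, 0] ** 2 + np.sin(3.0 * x[:, 1]) + 0.1 * x[:, 2] + rng.normal(0.0, 0.05, n)

        for j in range(x.shape[1]):
            # lowess returns the smoothed (x, yhat) pairs sorted by x
            smoothed = lowess(y, x[:, j], frac=0.3, return_sorted=True)
            yhat = np.interp(x[:, j], smoothed[:, 0], smoothed[:, 1])
            explained = 1.0 - np.var(y - yhat) / np.var(y)
            print(f"x{j + 1}: output variance explained by LOESS trend = {explained:.2f}")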

  11. Sensitivity analysis technique for application to deterministic models

    International Nuclear Information System (INIS)

    Ishigami, T.; Cazzoli, E.; Khatib-Rahbar, M.; Unwin, S.D.

    1987-01-01

    The characterization of severe accident source terms for light water reactors should include consideration of uncertainties. An important element of any uncertainty analysis is an evaluation of the sensitivity of the output probability distributions reflecting source term uncertainties to assumptions regarding the input probability distributions. Historically, response surface methods (RSMs) were developed to replace physical models with simplified models, using, for example, regression techniques, for extensive calculations. The purpose of this paper is to present a new method for sensitivity analysis that does not utilize RSM, but instead relies directly on the results obtained from the original computer code calculations. The merits of this approach are demonstrated by application of the proposed method to the suppression pool aerosol removal code (SPARC), and the results are compared with those obtained by sensitivity analysis with (a) the code itself, (b) a regression model, and (c) Iman's method

  12. Expert training with standardized operative technique helps establish a successful penile prosthetics program for urologic resident education.

    Science.gov (United States)

    King, Ashley B; Klausner, Adam P; Johnson, Corey M; Moore, Blake W; Wilson, Steven K; Grob, B Mayer

    2011-10-01

    The challenge of resident education in urologic surgery programs is to overcome disparity imparted by diverse patient populations, limited training times, and inequalities in the availability of expert surgical educators. Specifically, in the area of prosthetic urology, only a small proportion of programs have full-time faculty available to train residents in this discipline. To examine whether a new model using yearly training sessions from a recognized expert can establish a successful penile prosthetics program and result in better outcomes, higher case volumes, and willingness to perform more complex surgeries. A recognized expert conducted one to two operative training sessions yearly to teach standardized technique for penile prosthetics to residents. Each session consisted of three to four operative cases performed under the direct supervision of the expert. Retrospective data were collected from all penile prosthetic operations before (February, 2000 to June, 2004: N = 44) and after (July, 2004 to October, 2007: N = 79) implementation of these sessions. Outcomes reviewed included patient age, race, medical comorbidities, operative time, estimated blood loss, type of prosthesis, operative approach, drain usage, length of stay, and complications including revision/explantation rates. Statistical analysis was performed using Student's t-tests, Fisher's tests, and survival curves using the Kaplan-Meier technique (P value ≤ 0.05 to define statistical significance). Patient characteristics were not significantly different pre- vs. post-training. Operative time and estimated blood loss significantly decreased. Inflatable implants increased from 19/44 (43.2%, pre-training) to 69/79 (87.3%, post-training) (P < 0.001). Yearly expert-led training sessions with a standardized operative technique can help establish a successful resident education program in penile prosthetics surgery. © 2011 International Society for Sexual Medicine.

  13. FINE-GRAINED CELLULAR CONCRETE CREEP ANALYSIS TECHNIQUE WITH CONSIDERATION FOR CARBONATION

    Directory of Open Access Journals (Sweden)

    M. A. Gaziev

    2015-01-01

    Full Text Available The article considers the creep and creep deformation analysis technique in fine-grained cellular concrete with consideration for carbonation and assurance requirements for the repairing properties and seismic stability. The procedure for determining the creep of fine-grained cellular concrete is proposed with account of its carbonation by atmospheric carbon dioxide. It has been found theoretically and experimentally that the proposed technique allows obtaining reproducible results and can be recommended for creep determination of fine-grained cellular concretes, including repairing ones, taking into account their carbonation.

  14. Maximum entropy technique in the doublet structure analysis

    International Nuclear Information System (INIS)

    Belashev, B.Z.; Panebrattsev, Yu.A.; Shakhaliev, Eh.I.; Soroko, L.M.

    1998-01-01

    The Maximum Entropy Technique (MENT) for the solution of inverse problems is explained. An effective computer program for the resolution of the nonlinear equation system encountered in the MENT has been developed and tested. The possibilities of the MENT have been demonstrated on the example of the doublet structure analysis of noisy experimental data. A comparison of the MENT results with the results of the Fourier algorithm technique without regularization is presented. The tolerable noise level is equal to 30% for the MENT and only 0.1% for the Fourier algorithm

  15. Practical applications of activation analysis and other nuclear techniques

    International Nuclear Information System (INIS)

    Neutron activation analysis (NAA) is a versatile, sensitive, multielement, usually nondestructive analytical technique used to determine elemental concentrations in a variety of materials. Samples are irradiated with neutrons in a nuclear reactor, removed, and, for the nondestructive technique, the induced radioactivity measured. This measurement of γ rays emitted from specific radionuclides makes possible the quantitative determination of elements present. The method is described, advantages and disadvantages are listed, and a number of examples of its use are given. Two other nuclear methods, particle induced x-ray emission and synchrotron produced x-ray fluorescence, are also briefly discussed

  16. Evaluation of Damping Using Frequency Domain Operational Modal Analysis Techniques

    DEFF Research Database (Denmark)

    Bajric, Anela; Georgakis, Christos T.; Brincker, Rune

    2015-01-01

    Operational Modal Analysis (OMA) techniques provide in most cases reasonably accurate estimates of structural frequencies and mode shapes. In contrast though, they are known to often produce uncertain structural damping estimates, which is mainly due to inherent random and/or bias errors ... separated and closely spaced modes. Finally, the results of the numerical study are presented, in which the error of the structural damping estimates obtained by each OMA technique is shown for a range of damping levels. From this, it is clear that there are notable differences in accuracy between ...

  17. Measurement uncertainty analysis techniques applied to PV performance measurements

    International Nuclear Information System (INIS)

    Wells, C.

    1992-10-01

    The purpose of this presentation is to provide a brief introduction to measurement uncertainty analysis, outline how it is done, and illustrate uncertainty analysis with examples drawn from the PV field, with particular emphasis toward its use in PV performance measurements. The uncertainty information we know and state concerning a PV performance measurement or a module test result determines, to a significant extent, the value and quality of that result. What is measurement uncertainty analysis? It is an outgrowth of what has commonly been called error analysis. But uncertainty analysis, a more recent development, gives greater insight into measurement processes and tests, experiments, or calibration results. Uncertainty analysis gives us an estimate of the interval about a measured value or an experiment's final result within which we believe the true value of that quantity will lie. Why should we take the time to perform an uncertainty analysis? A rigorous measurement uncertainty analysis: Increases the credibility and value of research results; allows comparisons of results from different labs; helps improve experiment design and identifies where changes are needed to achieve stated objectives (through use of the pre-test analysis); plays a significant role in validating measurements and experimental results, and in demonstrating (through the post-test analysis) that valid data have been acquired; reduces the risk of making erroneous decisions; demonstrates that quality assurance and quality control measures have been accomplished; and defines valid data as data having known and documented paths of: origin, including theory; measurements; traceability to measurement standards; computations; and uncertainty analysis of results
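
    A minimal sketch of the underlying arithmetic for a single measured quantity, assuming a small set of independent, hypothetical uncertainty components for a PV module power measurement: combine the components in quadrature and quote an expanded-uncertainty interval.

        # Hedged sketch: combining independent uncertainty components for a PV module
        # power measurement by root-sum-of-squares and quoting an expanded-uncertainty
        # interval (coverage factor k = 2). The component values are hypothetical.
        import math

        measured_pmax_w = 250.0
        relative_components = {          # 1-sigma relative standard uncertainties
            "reference-cell calibration": 0.010,
            "irradiance non-uniformity":  0.005,
            "temperature correction":     0.004,
            "I-V electronics":            0.003,
        }

        combined_rel = math.sqrt(sum(u ** 2 for u in relative_components.values()))
        k = 2.0                          # approx. 95 % coverage for a normal distribution
        expanded = k * combined_rel * measured_pmax_w

        print(f"combined standard uncertainty: {100 * combined_rel:.2f} %")
        print(f"Pmax = {measured_pmax_w:.1f} W +/- {expanded:.1f} W (k = 2)")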

  18. Psychovisual masks and intelligent streaming RTP techniques for the MPEG-4 standard

    Science.gov (United States)

    Mecocci, Alessandro; Falconi, Francesco

    2003-06-01

    . In our implementation we adopted the Visual Brain theory, i.e. the study of what the "psychic eye" can get from a scene. According to this theory, a Psychomask Image Analysis (PIA) module has been developed to extract the visually homogeneous regions of the background. The PIA module produces two complementary masks, one for the visually low-variance zones and one for the highly variable zones; these zones are compressed with different strategies and encoded into two multiplexed streams. From practical experiments it turned out that separate coding is advantageous only if the low-variance zones exceed 50% of the whole background area (due to the overhead given by the need to transmit the zone masks). The SLIC module takes care of deciding the appropriate transmission modality by analyzing the results produced by the PIA module. The main features of this codec are a low bitrate, good image quality and coding speed. The current implementation runs in real time on standard PC platforms, the major limitation being the fixed position of the acquisition sensor. This limitation is due to the difficulty of separating moving objects from the background when the acquisition sensor moves. Our current real-time segmentation module does not produce suitable results if the acquisition sensor moves (only slight oscillatory movements are tolerated). In any case, the system is particularly suitable for telesurveillance applications at low bit-rates, where the camera is usually fixed or alternates among some predetermined positions (our segmentation module is capable of accurately separating moving objects from the static background when the acquisition sensor stops, even if different scenes are seen as a result of the sensor displacements). Moreover, the proposed architecture is general, in the sense that when real-time, robust segmentation systems (capable of separating objects in real time from the background while the sensor itself is moving) become available, they can be
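
    The background-splitting rule described above can be sketched as follows; the synthetic image, block size and variance threshold are made up for illustration and are not the codec's actual parameters.

        # Hedged sketch of the background-splitting idea: compute a block-wise variance
        # map, split the background into low- and high-variance masks, and apply the
        # "separate coding only if the low-variance zone exceeds 50 %" rule.
        import numpy as np

        rng = np.random.default_rng(1)
        background = rng.normal(128.0, 5.0, size=(240, 320))
        background[:, 160:] += rng.normal(0.0, 40.0, size=(240, 160))   # a "busy" right half

        BLOCK = 16
        VAR_THRESHOLD = 200.0

        h, w = background.shape
        low_var_mask = np.zeros((h, w), dtype=bool)
        for r in range(0, h, BLOCK):
            for c in range(0, w, BLOCK):
                block = background[r:r + BLOCK, c:c + BLOCK]
                if block.var() < VAR_THRESHOLD:
                    low_var_mask[r:r + BLOCK, c:c + BLOCK] = True

        low_var_fraction = low_var_mask.mean()
        use_separate_streams = low_var_fraction > 0.5   # mask overhead only pays off above 50 %
        print(f"low-variance fraction of background: {low_var_fraction:.2f}")
        print("separate coding of the two zones" if use_separate_streams
              else "single-stream coding (mask overhead not worthwhile)")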

  19. Nuclear techniques of analysis in diamond synthesis and annealing

    Energy Technology Data Exchange (ETDEWEB)

    Jamieson, D. N.; Prawer, S.; Gonon, P.; Walker, R.; Dooley, S.; Bettiol, A.; Pearce, J. [Melbourne Univ., Parkville, VIC (Australia). School of Physics

    1996-12-31

    Nuclear techniques of analysis have played an important role in the study of synthetic and laser annealed diamond. These measurements have mainly used ion beam analysis with a focused MeV ion beam in a nuclear microprobe system. A variety of techniques have been employed. One of the most important is nuclear elastic scattering, sometimes called non-Rutherford scattering, which has been used to accurately characterise diamond films for thickness and composition. This is possible by the use of a database of measured scattering cross sections. Recently, this work has been extended and nuclear elastic scattering cross sections for both natural boron isotopes have been measured. For radiation damaged diamond, a focused laser annealing scheme has been developed which produces near complete regrowth of MeV phosphorus implanted diamonds. In the laser annealed regions, proton induced x-ray emission has been used to show that 50 % of the P atoms occupy lattice sites. This opens the way to produce n-type diamond for microelectronic device applications. All these analytical applications utilize a focused MeV microbeam which is ideally suited for diamond analysis. This presentation reviews these applications, as well as the technology of nuclear techniques of analysis for diamond with a focused beam. 9 refs., 6 figs.

  20. Development of fault diagnostic technique using reactor noise analysis

    Energy Technology Data Exchange (ETDEWEB)

    Park, Jin Ho; Kim, J. S.; Oh, I. S.; Ryu, J. S.; Joo, Y. S.; Choi, S.; Yoon, D. B

    1999-04-01

    The ultimate goal of this project is to establish an analysis technique to diagnose the integrity of reactor internals using reactor noise. The reactor noise analysis techniques for PWR and CANDU NPPs (Nuclear Power Plants) were established, by which the dynamic characteristics of the reactor internals and SPND instrumentation could be identified, and the noise database corresponding to each plant (both Korean and foreign) was constructed and compared. Also, the changes in the dynamic characteristics of the Ulchin 1 and 2 reactor internals were simulated under presumed fault conditions. Additionally, a portable reactor noise analysis system was developed so that real-time noise analysis could be performed directly at the plant site. The reactor noise analysis techniques developed and the database obtained from the fault simulation can be used to establish a knowledge-based expert system to diagnose the NPP's abnormal conditions, and the portable reactor noise analysis system may be utilized as a substitute for the plant IVMS (Internal Vibration Monitoring System). (author)

  1. Nuclear techniques of analysis in diamond synthesis and annealing

    International Nuclear Information System (INIS)

    Jamieson, D. N.; Prawer, S.; Gonon, P.; Walker, R.; Dooley, S.; Bettiol, A.; Pearce, J.

    1996-01-01

    Nuclear techniques of analysis have played an important role in the study of synthetic and laser annealed diamond. These measurements have mainly used ion beam analysis with a focused MeV ion beam in a nuclear microprobe system. A variety of techniques have been employed. One of the most important is nuclear elastic scattering, sometimes called non-Rutherford scattering, which has been used to accurately characterise diamond films for thickness and composition. This is possible by the use of a database of measured scattering cross sections. Recently, this work has been extended and nuclear elastic scattering cross sections for both natural boron isotopes have been measured. For radiation damaged diamond, a focused laser annealing scheme has been developed which produces near complete regrowth of MeV phosphorus implanted diamonds. In the laser annealed regions, proton induced x-ray emission has been used to show that 50 % of the P atoms occupy lattice sites. This opens the way to produce n-type diamond for microelectronic device applications. All these analytical applications utilize a focused MeV microbeam which is ideally suited for diamond analysis. This presentation reviews these applications, as well as the technology of nuclear techniques of analysis for diamond with a focused beam. 9 refs., 6 figs

  2. Development of fault diagnostic technique using reactor noise analysis

    International Nuclear Information System (INIS)

    Park, Jin Ho; Kim, J. S.; Oh, I. S.; Ryu, J. S.; Joo, Y. S.; Choi, S.; Yoon, D. B.

    1999-04-01

    The ultimate goal of this project is to establish an analysis technique to diagnose the integrity of reactor internals using reactor noise. The reactor noise analysis techniques for PWR and CANDU NPPs (Nuclear Power Plants) were established, by which the dynamic characteristics of the reactor internals and SPND instrumentation could be identified, and the noise database corresponding to each plant (both Korean and foreign) was constructed and compared. Also, the changes in the dynamic characteristics of the Ulchin 1 and 2 reactor internals were simulated under presumed fault conditions. Additionally, a portable reactor noise analysis system was developed so that real-time noise analysis could be performed directly at the plant site. The reactor noise analysis techniques developed and the database obtained from the fault simulation can be used to establish a knowledge-based expert system to diagnose the NPP's abnormal conditions, and the portable reactor noise analysis system may be utilized as a substitute for the plant IVMS (Internal Vibration Monitoring System). (author)

  3. Nuclear microprobe analysis of the standard reference materials

    International Nuclear Information System (INIS)

    Jaksic, M.; Fazinic, S.; Bogdanovic, I.; Tadic, T.

    2002-01-01

    Most of the presently existing Standard Reference Materials (SRMs) for nuclear analytical methods are certified for an analyzed mass of the order of a few hundred mg. The typical mass of a sample analyzed by PIXE or XRF methods is very often below 1 mg. With the development of focused proton or x-ray beams, the masses that can typically be analyzed go down to the μg or even ng level. It is difficult to make biological or environmental SRMs that provide the desired homogeneity at such a small scale. However, the use of fundamental parameter quantitative evaluation procedures (the absolute method) minimizes the need for SRMs. In the PIXE and micro-PIXE setup at our Institute, the fundamental parameter approach is used. For exact calibration of the quantitative analysis procedure just one standard sample is needed. In our case glass standards which showed homogeneity down to the micron scale were used. Of course, it is desirable to use SRMs for quality assurance, and therefore the need for homogeneous materials can be justified even for the micro-PIXE method. In this presentation, a brief overview of the PIXE setup calibration is given, along with some recent results of tests of several SRMs

  4. Model Standards and Techniques for Control of Radon in New Residential Buildings

    Science.gov (United States)

    This document is intended to serve as a model for use in developing and adopting building codes, appendices to codes, or standards specifically applicable to unique local or regional radon control requirements.

  5. New trends in sample preparation techniques for environmental analysis.

    Science.gov (United States)

    Ribeiro, Cláudia; Ribeiro, Ana Rita; Maia, Alexandra S; Gonçalves, Virgínia M F; Tiritan, Maria Elizabeth

    2014-01-01

    Environmental samples include a wide variety of complex matrices, with low concentrations of analytes and presence of several interferences. Sample preparation is a critical step and the main source of uncertainties in the analysis of environmental samples, and it is usually laborious, high cost, time consuming, and polluting. In this context, there is increasing interest in developing faster, cost-effective, and environmentally friendly sample preparation techniques. Recently, new methods have been developed and optimized in order to miniaturize extraction steps, to reduce solvent consumption or become solventless, and to automate systems. This review attempts to present an overview of the fundamentals, procedure, and application of the most recently developed sample preparation techniques for the extraction, cleanup, and concentration of organic pollutants from environmental samples. These techniques include: solid phase microextraction, on-line solid phase extraction, microextraction by packed sorbent, dispersive liquid-liquid microextraction, and QuEChERS (Quick, Easy, Cheap, Effective, Rugged and Safe).

  6. Model order reduction techniques with applications in finite element analysis

    CERN Document Server

    Qu, Zu-Qing

    2004-01-01

    Despite the continued rapid advance in computing speed and memory, the increase in the complexity of models used by engineers persists in outpacing them. Even where there is access to the latest hardware, simulations are often extremely computationally intensive and time-consuming when full-blown models are under consideration. The computational cost involved when dealing with high-order/many-degree-of-freedom models can be offset by adroit computation. In this light, model-reduction methods have become a major goal of simulation and modeling research. Model reduction can also ameliorate problems in the correlation of widely used finite-element analyses and test analysis models produced by excessive system complexity. Model Order Reduction Techniques explains and compares such methods, focusing mainly on recent work in dynamic condensation techniques: - Compares the effectiveness of static, exact, dynamic, SEREP and iterative-dynamic condensation techniques in producing valid reduced-order mo...
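
    As one concrete instance of the condensation techniques the book compares, the sketch below shows static (Guyan) condensation on a small made-up system: slave degrees of freedom are eliminated statically and the stiffness and mass matrices are projected onto the master set.

        # Hedged sketch of static (Guyan) condensation on a made-up 4-DOF system:
        # slave DOFs follow the masters statically, and K and M are projected onto
        # the master DOFs through the Guyan transformation matrix T.
        import numpy as np

        # Full-order stiffness and mass (4 DOFs); DOFs 0,1 are "masters", 2,3 "slaves".
        K = np.array([[ 4.0, -2.0,  0.0,  0.0],
                      [-2.0,  4.0, -2.0,  0.0],
                      [ 0.0, -2.0,  4.0, -2.0],
                      [ 0.0,  0.0, -2.0,  2.0]])
        M = np.diag([1.0, 1.0, 1.0, 0.5])
        masters, slaves = [0, 1], [2, 3]

        Kss = K[np.ix_(slaves, slaves)]
        Ksm = K[np.ix_(slaves, masters)]

        # Guyan transformation: u = T @ u_m, with the DOFs ordered (masters, slaves).
        T = np.vstack([np.eye(len(masters)), -np.linalg.solve(Kss, Ksm)])

        order = masters + slaves
        K_red = T.T @ K[np.ix_(order, order)] @ T
        M_red = T.T @ M[np.ix_(order, order)] @ T
        print("reduced stiffness:\n", K_red)
        print("reduced mass:\n", M_red)

    The reduced matrices preserve the static response at the master DOFs exactly but only approximate the dynamics, which is why the book also treats exact, dynamic, SEREP and iterative-dynamic variants.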

  7. Performance of an iterative two-stage bayesian technique for population pharmacokinetic analysis of rich data sets

    NARCIS (Netherlands)

    Proost, Johannes H.; Eleveld, Douglas J.

    2006-01-01

    Purpose. To test the suitability of an Iterative Two-Stage Bayesian (ITSB) technique for population pharmacokinetic analysis of rich data sets, and to compare ITSB with Standard Two-Stage (STS) analysis and nonlinear Mixed Effect Modeling (MEM). Materials and Methods. Data from a clinical study with

  8. Small area analysis using micro-diffraction techniques

    International Nuclear Information System (INIS)

    Goehner, Raymond P.; Tissot, Ralph G. Jr.; Michael, Joseph R.

    2000-01-01

    An overall trend toward smaller electronic packages and devices makes it increasingly important and difficult to obtain meaningful diffraction information from small areas. X-ray micro-diffraction, electron back-scattered diffraction (EBSD) and Kossel are micro-diffraction techniques used for crystallographic analysis including texture, phase identification and strain measurements. X-ray micro-diffraction primarily is used for phase analysis and residual strain measurements of areas between 10 μm and 100 μm. For areas this small, glass capillary optics are used to produce a usable collimated x-ray beam. These optics are designed to reflect x-rays below the critical angle, therefore allowing for a larger solid acceptance angle at the x-ray source, resulting in brighter, smaller x-ray beams. The determination of residual strain using micro-diffraction techniques is very important to the semiconductor industry. Residual stresses have caused voiding of the interconnect metal which then destroys electrical continuity. Being able to determine the residual stress helps industry to predict failures from the aging effects of interconnects due to this stress voiding. Stress measurements would be impossible using a conventional x-ray diffractometer; however, utilizing a 30 μm glass capillary these small areas are readily accessible for analysis. Kossel produces a wide angle diffraction pattern from fluorescent x-rays generated in the sample by an e-beam in a SEM. This technique can yield very precise lattice parameters for determining strain. Fig. 2 shows a Kossel pattern from a Ni specimen. Phase analysis on small areas is also possible using electron back-scattered diffraction (EBSD) and x-ray micro-diffraction techniques. EBSD has the advantage of allowing the user to observe the area of interest using the excellent imaging capabilities of the SEM. An EDS detector has been

  9. When Is Hub Gene Selection Better than Standard Meta-Analysis?

    Science.gov (United States)

    Langfelder, Peter; Mischel, Paul S.; Horvath, Steve

    2013-01-01

    Since hub nodes have been found to play important roles in many networks, highly connected hub genes are expected to play an important role in biology as well. However, the empirical evidence remains ambiguous. An open question is whether (or when) hub gene selection leads to more meaningful gene lists than a standard statistical analysis based on significance testing when analyzing genomic data sets (e.g., gene expression or DNA methylation data). Here we address this question for the special case when multiple genomic data sets are available. This is of great practical importance since for many research questions multiple data sets are publicly available. In this case, the data analyst can decide between a standard statistical approach (e.g., based on meta-analysis) and a co-expression network analysis approach that selects intramodular hubs in consensus modules. We assess the performance of these two types of approaches according to two criteria. The first criterion evaluates the biological insights gained and is relevant in basic research. The second criterion evaluates the validation success (reproducibility) in independent data sets and often applies in clinical diagnostic or prognostic applications. We compare meta-analysis with consensus network analysis based on weighted correlation network analysis (WGCNA) in three comprehensive and unbiased empirical studies: (1) Finding genes predictive of lung cancer survival, (2) finding methylation markers related to age, and (3) finding mouse genes related to total cholesterol. The results demonstrate that intramodular hub gene status with respect to consensus modules is more useful than a meta-analysis p-value when identifying biologically meaningful gene lists (reflecting criterion 1). However, standard meta-analysis methods perform as well as (if not better than) a consensus network approach in terms of validation success (criterion 2). The article also reports a comparison of meta-analysis techniques applied to
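
    A minimal single-data-set sketch of the hub-selection idea, assuming a toy expression matrix and a WGCNA-style soft-thresholded correlation network; the full consensus-module analysis used in the study involves module detection across multiple data sets, which is not reproduced here.

        # Hedged sketch of hub selection: build a weighted correlation network with
        # soft thresholding (|cor|^beta, as in WGCNA), compute each gene's connectivity,
        # and take the most connected genes as hubs. Toy data, not the study's pipeline.
        import numpy as np

        rng = np.random.default_rng(42)
        n_samples, n_genes, beta = 50, 200, 6
        expr = rng.normal(size=(n_samples, n_genes))
        expr[:, :20] += rng.normal(size=(n_samples, 1))   # 20 co-regulated genes -> a module

        cor = np.corrcoef(expr, rowvar=False)             # gene-by-gene correlation matrix
        adjacency = np.abs(cor) ** beta
        np.fill_diagonal(adjacency, 0.0)

        connectivity = adjacency.sum(axis=1)              # whole-network connectivity per gene
        hub_genes = np.argsort(connectivity)[::-1][:10]   # indices of the 10 strongest hubs
        print("top hub gene indices:", hub_genes)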

  10. When is hub gene selection better than standard meta-analysis?

    Directory of Open Access Journals (Sweden)

    Peter Langfelder

    Full Text Available Since hub nodes have been found to play important roles in many networks, highly connected hub genes are expected to play an important role in biology as well. However, the empirical evidence remains ambiguous. An open question is whether (or when) hub gene selection leads to more meaningful gene lists than a standard statistical analysis based on significance testing when analyzing genomic data sets (e.g., gene expression or DNA methylation data). Here we address this question for the special case when multiple genomic data sets are available. This is of great practical importance since for many research questions multiple data sets are publicly available. In this case, the data analyst can decide between a standard statistical approach (e.g., based on meta-analysis) and a co-expression network analysis approach that selects intramodular hubs in consensus modules. We assess the performance of these two types of approaches according to two criteria. The first criterion evaluates the biological insights gained and is relevant in basic research. The second criterion evaluates the validation success (reproducibility) in independent data sets and often applies in clinical diagnostic or prognostic applications. We compare meta-analysis with consensus network analysis based on weighted correlation network analysis (WGCNA) in three comprehensive and unbiased empirical studies: (1) Finding genes predictive of lung cancer survival, (2) finding methylation markers related to age, and (3) finding mouse genes related to total cholesterol. The results demonstrate that intramodular hub gene status with respect to consensus modules is more useful than a meta-analysis p-value when identifying biologically meaningful gene lists (reflecting criterion 1). However, standard meta-analysis methods perform as well as (if not better than) a consensus network approach in terms of validation success (criterion 2). The article also reports a comparison of meta-analysis techniques

  11. The analysis of gastric function using computational techniques

    CERN Document Server

    Young, P

    2002-01-01

    The work presented in this thesis was carried out at the Magnetic Resonance Centre, Department of Physics and Astronomy, University of Nottingham, between October 1996 and June 2000. This thesis describes the application of computerised techniques to the analysis of gastric function, in relation to Magnetic Resonance Imaging data. The implementation of a computer program enabling the measurement of motility in the lower stomach is described in Chapter 6. This method allowed the dimensional reduction of multi-slice image data sets into a 'Motility Plot', from which the motility parameters - the frequency, velocity and depth of contractions - could be measured. The technique was found to be simple, accurate and involved substantial time savings, when compared to manual analysis. The program was subsequently used in the measurement of motility in three separate studies, described in Chapter 7. In Study 1, four different meal types of varying viscosity and nutrient value were consumed by 12 volunteers. The aim of...

  12. Technique Triangulation for Validation in Directed Content Analysis

    Directory of Open Access Journals (Sweden)

    Áine M. Humble PhD

    2009-09-01

    Full Text Available Division of labor in wedding planning varies for first-time marriages, with three types of couples—traditional, transitional, and egalitarian—identified, but nothing is known about wedding planning for remarrying individuals. Using semistructured interviews, the author interviewed 14 couples in which at least one person had remarried and used directed content analysis to investigate the extent to which the aforementioned typology could be transferred to this different context. In this paper she describes how a triangulation of analytic techniques provided validation for couple classifications and also helped with moving beyond “blind spots” in data analysis. Analytic approaches were the constant comparative technique, rank order comparison, and visual representation of coding, using MAXQDA 2007's tool called TextPortraits.

  13. Contributions to flow techniques and mass spectrometry in water analysis

    OpenAIRE

    Santos, Inês Carvalho dos

    2015-01-01

    In this thesis, the use of different flow systems was exploited along with the use of different detection techniques for the development of simple, robust, and automated analytical procedures. With the purpose to perform in-line sample handling and pretreatment operations, different separation units were used. The main target for these methods was waters samples. The first procedure was based on a sequential injection analysis (SIA) system for carbon speciation (alkalinity, dis...

  14. Analysis of Indian silver coins by EDXRF technique

    International Nuclear Information System (INIS)

    Tripathy, B.B.; Rautray, T.R.; Das, Satya R.; Das, Manas R.; Vijayan, V.

    2009-01-01

    Some Indian silver coins from the period of British rule were analysed by the energy dispersive X-ray fluorescence technique. Eight elements, namely Cr, Fe, Ni, Cu, Zn, As, Ag and Pb, were estimated in this study, which also seems to indicate the fragmentation as well as the impoverishment of the regimes that had produced the studied coins. While Cu and Ag were present as major elements, the other elements were found to be present in minor concentrations. (author)

  15. Ion beam analysis techniques applied to large scale pollution studies

    Energy Technology Data Exchange (ETDEWEB)

    Cohen, D.D.; Bailey, G.; Martin, J.; Garton, D.; Noorman, H.; Stelcer, E.; Johnson, P. [Australian Nuclear Science and Technology Organisation, Lucas Heights, NSW (Australia)

    1993-12-31

    Ion Beam Analysis (IBA) techniques are ideally suited to analyse the thousands of filter papers a year that may originate from a large scale aerosol sampling network. They are fast, multi-elemental and, for the most part, non-destructive, so other analytical methods such as neutron activation and ion chromatography can be performed afterwards. ANSTO, in collaboration with the NSW EPA, Pacific Power and the Universities of NSW and Macquarie, has established a large area fine aerosol sampling network covering nearly 80,000 square kilometres of NSW with 25 fine particle samplers. This network, known as ASP, was funded by the Energy Research and Development Corporation (ERDC) and commenced sampling on 1 July 1991. The cyclone sampler at each site has a 2.5 μm particle diameter cut off and runs for 24 hours every Sunday and Wednesday using one Gillman 25 mm diameter stretched Teflon filter for each day. These filters are ideal targets for ion beam analysis work. Currently ANSTO receives 300 filters per month from this network for analysis using its accelerator based ion beam techniques on the 3 MV Van de Graaff accelerator. One week a month of accelerator time is dedicated to this analysis. Four simultaneous accelerator based IBA techniques are used at ANSTO to analyse for the following 24 elements: H, C, N, O, F, Na, Al, Si, P, S, Cl, K, Ca, Ti, V, Cr, Mn, Fe, Cu, Ni, Co, Zn, Br and Pb. The IBA techniques proved invaluable in identifying sources of fine particles and their spatial and seasonal variations across the large area sampled by the ASP network. 3 figs.

  16. An ASIC Low Power Primer Analysis, Techniques and Specification

    CERN Document Server

    Chadha, Rakesh

    2013-01-01

    This book provides an invaluable primer on the techniques utilized in the design of low power digital semiconductor devices. Readers will benefit from the hands-on approach which starts from the ground up, explaining with basic examples what power is, how it is measured and how it impacts the design process of application-specific integrated circuits (ASICs). The authors use both the Unified Power Format (UPF) and Common Power Format (CPF) to describe in detail the power intent for an ASIC and then guide readers through a variety of architectural and implementation techniques that will help meet the power intent. From analyzing system power consumption, to techniques that can be employed in a low power design, to a detailed description of two alternate standards for capturing the power directives at various phases of the design, this book is filled with information that will give ASIC designers a competitive edge in low-power design. Starts from the ground up and explains what power is, how it is measur...

  17. Meta-analysis of surgical techniques for preventing parotidectomy sequelae.

    Science.gov (United States)

    Curry, Joseph M; King, Nancy; Reiter, David; Fisher, Kyle; Heffelfinger, Ryan N; Pribitkin, Edmund A

    2009-01-01

    To conduct a meta-analysis of the literature on surgical methods for the prevention of Frey syndrome and concave facial deformity after parotidectomy. A PubMed search through February 2008 identified more than 60 English-language studies involving surgical techniques for prevention of these parameters. Analyzed works included 15 retrospective or prospective controlled studies reporting quantitative data for all included participants for 1 or more of the measured parameters in patients who had undergone parotidectomy. Report quality was assessed by the Strength of Recommendation Taxonomy (SORT) score. Data were directly extracted from reports and dichotomized into positive and negative outcomes. The statistical significance was then calculated. The mean SORT score for all studies was 2.34, and the mean SORT score for the analyzed studies was 1.88. Meta-analysis of multiple techniques to prevent symptomatic Frey syndrome, positive starch-iodine test results, and contour deformity favored intervention with a cumulative odds ratio (OR) of 3.88 (95% confidence interval [CI], 2.81-5.34); OR, 3.66 (95% CI, 2.32-5.77); and OR, 5.25 (95% CI, 3.57-7.72), respectively. Meta-analysis of operative techniques to prevent symptomatic Frey syndrome, positive starch-iodine test results, and facial asymmetry suggests that such methods are likely to reduce the incidence of these complications after parotidectomy.
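
    The kind of pooling behind the cumulative ORs quoted above can be illustrated with a fixed-effect inverse-variance combination of per-study odds ratios; the 2x2 counts below are invented solely to show the arithmetic and do not come from the reviewed studies.

        # Hedged sketch: inverse-variance fixed-effect pooling of log odds ratios
        # across studies, with a 95 % confidence interval. Counts are made up.
        import math

        # (events_treated, n_treated, events_control, n_control) per study
        studies = [(4, 50, 15, 48), (6, 62, 20, 60), (3, 40, 11, 42)]

        weights, weighted_log_or = [], []
        for a, n1, c, n2 in studies:
            b, d = n1 - a, n2 - c
            log_or = math.log((a * d) / (b * c))
            var = 1.0 / a + 1.0 / b + 1.0 / c + 1.0 / d    # variance of the log OR
            weights.append(1.0 / var)
            weighted_log_or.append(log_or / var)

        pooled_log_or = sum(weighted_log_or) / sum(weights)
        se = math.sqrt(1.0 / sum(weights))
        lo, hi = (math.exp(pooled_log_or - 1.96 * se), math.exp(pooled_log_or + 1.96 * se))
        print(f"pooled OR = {math.exp(pooled_log_or):.2f} (95% CI {lo:.2f}-{hi:.2f})")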

  18. Multivariate Analysis Techniques for Optimal Vision System Design

    DEFF Research Database (Denmark)

    Sharifzadeh, Sara

    The present thesis considers optimization of the spectral vision systems used for quality inspection of food items. The relationship between food quality, vision based techniques and spectral signature are described. The vision instruments for food analysis as well as datasets of the food items used in this thesis are described. The methodological strategies are outlined including sparse regression and pre-processing based on feature selection and extraction methods, supervised versus unsupervised analysis and linear versus non-linear approaches. One supervised feature selection algorithm ... and simplification of the design of practical vision systems.

  19. Characterization of decommissioned reactor internals: Monte Carlo analysis technique

    International Nuclear Information System (INIS)

    Reid, B.D.; Love, E.F.; Luksic, A.T.

    1993-03-01

    This study discusses computer analysis techniques for determining activation levels of irradiated reactor component hardware to yield data for the Department of Energy's Greater-Than-Class C Low-Level Radioactive Waste Program. The study recommends the Monte Carlo Neutron/Photon (MCNP) computer code as the best analysis tool for this application and compares the technique to direct sampling methodology. To implement the MCNP analysis, a computer model would be developed to reflect the geometry, material composition, and power history of an existing shutdown reactor. MCNP analysis would then be performed using the computer model, and the results would be validated by comparison to laboratory analysis results from samples taken from the shutdown reactor. The report estimates uncertainties for each step of the computational and laboratory analyses; the overall uncertainty of the MCNP results is projected to be ±35%. The primary source of uncertainty is identified as the material composition of the components, and research is suggested to address that uncertainty

  20. Standard practice for examination of welds using the alternating current field measurement technique

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2007-01-01

    1.1 This practice describes procedures to be followed during alternating current field measurement examination of welds for baseline and service-induced surface breaking discontinuities. 1.2 This practice is intended for use on welds in any metallic material. 1.3 This practice does not establish weld acceptance criteria. 1.4 The values stated in either inch-pound units or SI units are to be regarded separately as standard. The values stated in each system might not be exact equivalents; therefore, each system shall be used independently of the other. 1.5 This standard does not purport to address all of the safety concerns, if any, associated with its use. It is the responsibility of the user of this standard to establish appropriate safety and health practices and determine the applicability of regulatory limitations prior to use.

  1. Applying Standard Independent Verification and Validation (IVV) Techniques Within an Agile Framework: Is There a Compatibility Issue?

    Science.gov (United States)

    Dabney, James B.; Arthur, James Douglas

    2017-01-01

    Agile methods have gained wide acceptance over the past several years, to the point that they are now a standard management and execution approach for small-scale software development projects. While conventional Agile methods are not generally applicable to large multi-year and mission-critical systems, Agile hybrids are now being developed (such as SAFe) to exploit the productivity improvements of Agile while retaining the necessary process rigor and coordination needs of these projects. From the perspective of Independent Verification and Validation (IVV), however, the adoption of these hybrid Agile frameworks is becoming somewhat problematic. Hence, we find it prudent to question the compatibility of conventional IVV techniques with (hybrid) Agile practices. This paper documents our investigation of (a) relevant literature, (b) the modification and adoption of Agile frameworks to accommodate the development of large scale, mission critical systems, and (c) the compatibility of standard IVV techniques within hybrid Agile development frameworks. Specific to the latter, we found that the IVV methods employed within a hybrid Agile process can be divided into three groups: (1) early lifecycle IVV techniques that are fully compatible with the hybrid lifecycles, (2) IVV techniques that focus on tracing requirements, test objectives, etc., which are somewhat incompatible but can be tailored with a modest effort, and (3) IVV techniques involving an assessment requiring artifact completeness, which are simply not compatible with hybrid Agile processes, e.g., those that assume complete requirement specification early in the development lifecycle.

  2. Free-field reciprocity calibration of laboratory standard (LS) microphones using a time selective technique

    DEFF Research Database (Denmark)

    Rasmussen, Knud; Barrera Figueroa, Salvador

    2006-01-01

    Although the basic principle of reciprocity calibration of microphones in a free field is simple, the practical problems are complicated due to the low signal-to-noise ratio and the influence of cross talk and reflections from the surroundings. The influence of uncorrelated noise can be reduced by conventional narrow-band filtering and time averaging, while correlated signals like cross talk and reflections can be eliminated by using time-selective postprocessing techniques. The technique used at DPLA overcomes both these problems using a B&K Pulse analyzer in the SSR mode (steady state response) and an FFT-based time-selective technique. The complex electrical transfer impedance is measured in linear frequency steps from a few kHz to about three times the resonance frequency of the microphones. The missing values at low frequencies are estimated from a detailed knowledge of the pressure

  3. Application of Microfluidic Techniques to Pyrochemical Salt Sampling and Analysis

    International Nuclear Information System (INIS)

    Pereira, C.; Launiere, C.; Smith, N.

    2015-01-01

    Microfluidic techniques enable production of micro-samples of molten salt for analysis by at-line and off-line sensors and detectors. These sampling systems are intended for implementation in an electrochemical used fuel treatment facility as part of the material balance and control system. Microfluidics may reduce random statistical error associated with sampling inhomogeneity because a large number of uniform sub-microlitre droplets may be generated and successively analyzed. The approach combines two immiscible fluids in a microchannel under laminar flow conditions to generate slug flows. Because the slug flow regime is characterized by regularly sized and spaced droplets, it is commonly used in low-volume/high-throughput assays of aqueous and organic phases. This scheme is now being applied to high-temperature molten salts in combination with a second fluid that is stable at elevated temperatures. The microchip systems are being tested to determine the channel geometries and absolute and relative phase flow rates required to achieve stable slug flow. Because imaging is difficult at the 500 °C process temperatures, the fluorescence of salt ions under ultraviolet illumination is used to discern flow regimes. As molten chloride melts are optically transparent, UV-visible light spectroscopy is also being explored as a spectroscopic technique for integration with at-line microchannel systems to overcome some of the current challenges to in situ analysis. A second technique that is amenable to droplet analysis is Laser-induced Breakdown Spectroscopy (LIBS). A pneumatic droplet generator is being interfaced with a LIBS system for analysis of molten salts at near-process temperatures. Tests of the pneumatic generator are being run using water and molten salts, and in tandem with off-line analysis of the salt droplets with a LIBS spectrometer. (author)

  4. Application of unsupervised analysis techniques to lung cancer patient data.

    Science.gov (United States)

    Lynch, Chip M; van Berkel, Victor H; Frieboes, Hermann B

    2017-01-01

    This study applies unsupervised machine learning techniques for classification and clustering to a collection of descriptive variables from 10,442 lung cancer patient records in the Surveillance, Epidemiology, and End Results (SEER) program database. The goal is to automatically classify lung cancer patients into groups based on clinically measurable disease-specific variables in order to estimate survival. Variables selected as inputs for machine learning include Number of Primaries, Age, Grade, Tumor Size, Stage, and TNM, which are numeric or can readily be converted to numeric type. Minimal up-front processing of the data enables exploring the out-of-the-box capabilities of established unsupervised learning techniques, with little human intervention through the entire process. The output of the techniques is used to predict survival time, with the efficacy of the prediction representing a proxy for the usefulness of the classification. A basic single variable linear regression against each unsupervised output is applied, and the associated Root Mean Squared Error (RMSE) value is calculated as a metric to compare between the outputs. The results show that self-ordering maps exhibit the best performance, while k-Means performs the best of the simpler classification techniques. Predicting against the full data set, it is found that their respective RMSE values (15.591 for self-ordering maps and 16.193 for k-Means) are comparable to supervised regression techniques, such as Gradient Boosting Machine (RMSE of 15.048). We conclude that unsupervised data analysis techniques may be of use to classify patients by defining the classes as effective proxies for survival prediction.
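
    A minimal sketch of the kind of pipeline described above, using scikit-learn with synthetic stand-ins for the SEER variables (the feature values and survival model below are illustrative assumptions, not the study's data):

```python
# Cluster patients on clinical variables, then regress survival time
# against the cluster label and score the fit with RMSE.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 5))                    # stand-ins for age, grade, size, ...
survival_months = 60 - 10 * X[:, 0] + rng.normal(scale=5, size=1000)

labels = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(X)

# Single-variable linear regression of survival against the cluster label
reg = LinearRegression().fit(labels.reshape(-1, 1), survival_months)
pred = reg.predict(labels.reshape(-1, 1))
rmse = np.sqrt(np.mean((survival_months - pred) ** 2))
print(f"RMSE of cluster-based survival prediction: {rmse:.2f} months")
```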

  5. The Effect of Long-Term Therapeutics, Prophylaxis and Screening Techniques on Aircrew Medical Standards.

    Science.gov (United States)

    1981-03-01

    350. Vogel RA, Kirch DC, LeFree MT, Rainwater JO, Jensen DP, Steele PP. Thallium-201 myocardial perfusion scintigraphy: results of standard and multi... Reference 3. Further Reference 4. Security Classification of Document: AGARD-CP-310, ISBN 92-835-0288-4, UNCLASSIFIED. Originator: Advisory Group for...

  6. Measurement uncertainty analysis techniques applied to PV performance measurements

    Energy Technology Data Exchange (ETDEWEB)

    Wells, C

    1992-10-01

    The purpose of this presentation is to provide a brief introduction to measurement uncertainty analysis, outline how it is done, and illustrate uncertainty analysis with examples drawn from the PV field, with particular emphasis toward its use in PV performance measurements. The uncertainty information we know and state concerning a PV performance measurement or a module test result determines, to a significant extent, the value and quality of that result. What is measurement uncertainty analysis? It is an outgrowth of what has commonly been called error analysis. But uncertainty analysis, a more recent development, gives greater insight into measurement processes and tests, experiments, or calibration results. Uncertainty analysis gives us an estimate of the interval about a measured value or an experiment's final result within which we believe the true value of that quantity will lie. Why should we take the time to perform an uncertainty analysis? A rigorous measurement uncertainty analysis: increases the credibility and value of research results; allows comparisons of results from different labs; helps improve experiment design and identifies where changes are needed to achieve stated objectives (through use of the pre-test analysis); plays a significant role in validating measurements and experimental results, and in demonstrating (through the post-test analysis) that valid data have been acquired; reduces the risk of making erroneous decisions; demonstrates that quality assurance and quality control measures have been accomplished; and defines valid data as data having known and documented paths of origin, including theory; measurements; traceability to measurement standards; computations; and uncertainty analysis of results.
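
    As a hedged illustration of the interval estimate described above (not the report's own procedure), independent standard uncertainty components can be combined by root-sum-square and expanded with a coverage factor; the component values below are made up:

```python
# Combine independent relative standard uncertainties by root-sum-square
# and report an expanded uncertainty interval (coverage factor k = 2).
import math

measured_power_w = 102.4          # hypothetical PV module measurement
components_percent = {            # hypothetical uncertainty budget
    "irradiance_reference": 0.8,
    "temperature_correction": 0.4,
    "data_acquisition": 0.2,
    "spectral_mismatch": 0.5,
}

u_c = math.sqrt(sum(u ** 2 for u in components_percent.values()))
U = 2.0 * u_c                     # expanded uncertainty, k = 2
half_width = measured_power_w * U / 100.0
print(f"combined standard uncertainty = {u_c:.2f} %")
print(f"result: {measured_power_w:.1f} W +/- {half_width:.1f} W (k = 2)")
```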

  8. Admissions Standards and the Use of Key Marketing Techniques by United States' Colleges and Universities.

    Science.gov (United States)

    Goldgehn, Leslie A.

    1989-01-01

    A survey of admissions deans and directors investigated the use and perceived effectiveness of 15 well-known marketing techniques: advertising, advertising research, a marketing plan, market positioning, market segmentation, marketing audit, marketing research, pricing, program and service accessibility, program development, publicity, target…

  9. Standard hazard analysis, critical control point and hotel management

    Directory of Open Access Journals (Sweden)

    Vujačić Vesna

    2017-01-01

    Full Text Available Tourism is a dynamic category which is continuously evolving in the world. The specificities that have to be respected in catering, compared with the food industry, stem from the main differences in food serving procedures, the numerous complex recipes and production technologies, staff fluctuation and old equipment. For an effective and permanent implementation of the HACCP concept, building a serious base is very important; in this case, the base is represented by the people handling the food. This paper presents international ISO standards, the concept of HACCP and the importance of its application in the tourism and hospitality industry. The concept of HACCP is a food safety management system based on the analysis and control of biological, chemical and physical hazards in the entire process, from raw material production, procurement and handling to manufacturing, distribution and consumption of the finished product. The aim of this paper is to present the importance of applying the HACCP concept in tourism and hotel management as a recognizable international standard.

  10. International intercomparison of standards for low collision kerma rates in air by means of low dose TLD techniques

    International Nuclear Information System (INIS)

    Spanne, P.; Carlsson, C.A.; Carlsson, G.A.

    1984-01-01

    An international intercomparison of standards for calibration of radiation protection instruments is described. The intercomparison involved collision kerma rates in air in the range 5.7 to 7.400 μGy·h⁻¹ and was performed using low dose TLD techniques with TL-LiF dosemeters. The dosemeters were specifically tested with regard to their dose rate dependence, but none was found. (author)

  11. Assessing the service quality of Iran military hospitals: Joint Commission International standards and Analytic Hierarchy Process (AHP) technique.

    Science.gov (United States)

    Bahadori, Mohammadkarim; Ravangard, Ramin; Yaghoubi, Maryam; Alimohammadzadeh, Khalil

    2014-01-01

    Military hospitals are responsible for preserving, restoring and improving the health of not only armed forces, but also other people. According to the military organizations' strategy of being a leader and pioneer in all areas, providing quality health services is one of the main goals of military health care organizations. This study aimed to evaluate the service quality of selected military hospitals in Iran based on the Joint Commission International (JCI) standards, to compare these hospitals with each other, and to rank them using the analytic hierarchy process (AHP) technique in 2013. This was a cross-sectional and descriptive study conducted on five military hospitals, selected using the purposive sampling method, in 2013. Required data were collected using checklists of accreditation standards and the nominal group technique. The AHP technique was used for prioritization, and Expert Choice 11.0 was used to analyze the collected data. Among the JCI standards, the standards of access to care and continuity of care (weight = 0.122), quality improvement and patient safety (weight = 0.121) and leadership and management (weight = 0.117) had the greatest importance, respectively. Furthermore, in the overall ranking, BGT (weight = 0.369), IHM (0.238), SAU (0.202), IHK (weight = 0.125) and SAB (weight = 0.066) ranked first to fifth, respectively. AHP is an appropriate technique for measuring the overall performance of hospitals and their quality of services. It is a holistic approach that takes all hospital processes into consideration. The results of the present study can be used to improve hospitals' performance by identifying areas that need focus for quality improvement and by selecting strategies to improve service quality.
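
    The AHP weighting step can be sketched as follows; the pairwise comparison matrix is a made-up example, not the study's expert judgments, and the priority vector is taken from the principal eigenvector as is standard in AHP:

```python
# Derive AHP priority weights from a pairwise comparison matrix and
# check consistency with the consistency ratio (CR).
import numpy as np

# Hypothetical 3x3 reciprocal comparison matrix for three criteria
A = np.array([[1.0, 3.0, 5.0],
              [1/3., 1.0, 2.0],
              [1/5., 1/2., 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
weights = w / w.sum()                      # priority vector

n = A.shape[0]
lambda_max = eigvals.real[k]
ci = (lambda_max - n) / (n - 1)            # consistency index
ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]        # Saaty's random index
print("weights:", np.round(weights, 3), "CR:", round(ci / ri, 3))
```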

  12. Image Analysis Technique for Material Behavior Evaluation in Civil Structures

    Science.gov (United States)

    Moretti, Michele; Rossi, Gianluca

    2017-01-01

    The article presents a hybrid monitoring technique for the measurement of the deformation field. The goal is to obtain information about crack propagation in existing structures, for the purpose of monitoring their state of health. The measurement technique is based on the capture and analysis of a digital image set. Special markers were used on the surface of the structures; these can be removed without damaging existing structures such as historical masonry. The digital image analysis was done using software specifically designed in Matlab to track the markers and determine the evolution of the deformation state. The method can be used on any type of structure but is particularly suitable when it is necessary not to damage the surface of the structure. A series of experiments carried out on masonry walls of the Oliverian Museum (Pesaro, Italy) and Palazzo Silvi (Perugia, Italy) allowed validation of the procedure by comparing its results with those derived from traditional measuring techniques. PMID:28773129
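
    A minimal sketch of marker tracking on a pair of digital images, using OpenCV template matching rather than the authors' Matlab software; the file names and marker position are assumptions:

```python
# Track a surface marker between two images by normalized cross-correlation
# and report its displacement in pixels (a calibration scale factor would
# convert this to physical deformation).
import cv2

ref = cv2.imread("frame_000.png", cv2.IMREAD_GRAYSCALE)   # hypothetical files
cur = cv2.imread("frame_100.png", cv2.IMREAD_GRAYSCALE)

# Marker template cut from the reference image (assumed position and size)
x0, y0, size = 250, 310, 40
template = ref[y0:y0 + size, x0:x0 + size]

res = cv2.matchTemplate(cur, template, cv2.TM_CCOEFF_NORMED)
_, _, _, max_loc = cv2.minMaxLoc(res)

dx, dy = max_loc[0] - x0, max_loc[1] - y0
print(f"marker displacement: dx={dx} px, dy={dy} px")
```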

  13. Analysis techniques for two-dimensional infrared data

    Science.gov (United States)

    Winter, E. M.; Smith, M. C.

    1978-01-01

    In order to evaluate infrared detection and remote sensing systems, it is necessary to know the characteristics of the observational environment. For both scanning and staring sensors, the spatial characteristics of the background may be more of a limitation to the performance of a remote sensor than system noise. This limitation is the so-called spatial clutter limit and may be important for systems design of many earth application and surveillance sensors. The data used in this study is two dimensional radiometric data obtained as part of the continuing NASA remote sensing programs. Typical data sources are the Landsat multi-spectral scanner (1.1 micrometers), the airborne heat capacity mapping radiometer (10.5 - 12.5 micrometers) and various infrared data sets acquired by low altitude aircraft. Techniques used for the statistical analysis of one dimensional infrared data, such as power spectral density (PSD), exceedance statistics, etc. are investigated for two dimensional applicability. Also treated are two dimensional extensions of these techniques (2D PSD, etc.), and special techniques developed for the analysis of 2D data.
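
    One of the two-dimensional extensions mentioned above, the 2D power spectral density, can be computed directly from the FFT of an image; this sketch uses a synthetic array in place of the radiometric data:

```python
# Two-dimensional power spectral density of an image-like background scene.
import numpy as np

rng = np.random.default_rng(1)
scene = rng.normal(size=(256, 256))            # stand-in for radiometric data

window = np.hanning(256)[:, None] * np.hanning(256)[None, :]
F = np.fft.fftshift(np.fft.fft2(scene * window))
psd2d = (np.abs(F) ** 2) / scene.size

# Radially averaged PSD, useful for comparison with 1D clutter statistics
y, x = np.indices(psd2d.shape)
r = np.hypot(x - 128, y - 128).astype(int)
radial_psd = np.bincount(r.ravel(), psd2d.ravel()) / np.bincount(r.ravel())
```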

  14. Gas chromatographic isolation technique for compound-specific radiocarbon analysis

    International Nuclear Information System (INIS)

    Uchida, M.; Kumamoto, Y.; Shibata, Y.; Yoneda, M.; Morita, M.; Kawamura, K.

    2002-01-01

    Full text: We present here a gas chromatographic isolation technique for the compound-specific radiocarbon analysis of biomarkers from marine sediments. The biomarkers of fatty acids, hydrocarbons and sterols were isolated in amounts sufficient for radiocarbon analysis using a preparative capillary gas chromatograph (PCGC) system. The PCGC system used here is composed of an HP 6890 GC with FID, a cooled injection system (CIS, Gerstel, Germany), a zero-dead-volume effluent splitter, and a cryogenic preparative collection device (PFC, Gerstel). For AMS analysis, we need to separate and recover a sufficient quantity of the target individual compounds (>50 μgC). Yields of target compounds from C14 to C40 n-alkanes were sufficient, with approximately 80% recovery for higher molecular weight compounds above C30 n-alkanes. Compound-specific radiocarbon analysis of organic compounds, as well as compound-specific stable isotope analysis, provides valuable information on the origins and carbon cycling in marine systems. Under the above PCGC conditions, we applied compound-specific radiocarbon analysis to marine sediments from the western North Pacific, which showed its potential as a chronology tool for estimating the age of sediments from organic matter in paleoceanographic studies, in areas where amounts of planktonic foraminifera sufficient for radiocarbon analysis by accelerator mass spectrometry (AMS) are difficult to obtain due to dissolution of calcium carbonate. (author)

  15. Quality of Standard Reference Materials for Short Time Activation Analysis

    International Nuclear Information System (INIS)

    Ismail, S.S.; Oberleitner, W.

    2003-01-01

    Some environmental reference materials (CFA-1633b, IAEA-SL-1, SARM-1, BCR-176, Coal-1635, IAEA-SL-3, BCR-146, and SARM-5) were analysed by short-time activation analysis. The results show that these materials can be classified into three groups according to their activities after irradiation. The obtained results were compared in order to create a quality index for the determination of short-lived nuclides at high count rates. It was found that CFA is not a suitable standard for determining very short-lived nuclides (half-lives < 1 min) because the activity it produces is 15-fold higher than that of SL-3. Biological reference materials, such as SRM-1571, SRM-1573, SRM-1575, SRM-1577, IAEA-392, and IAEA-393, were also investigated using a higher counting efficiency system. The quality of this system and its well-type detector for investigating short-lived nuclides is discussed.

  16. Impact during equine locomotion: techniques for measurement and analysis.

    Science.gov (United States)

    Burn, J F; Wilson, A; Nason, G P

    1997-05-01

    Impact is implicated in the development of several types of musculoskeletal injury in the horse. Characterisation of the impact experienced during strenuous exercise is an important first step towards understanding the mechanism of injury. Measurement and analysis of large, short duration impacts is difficult. The measurement system must be able to record transient peaks and high frequencies accurately. The analysis technique must be able to characterise the impact signal in time and frequency. This paper presents a measurement system and analysis technique for the characterisation of large impacts. A piezo-electric accelerometer was securely mounted on the dorsal surface of the horse's hoof. Saddle-mounted charge amplifiers and a 20 m coaxial cable transferred these data to a PC based logging system. Data were downloaded onto a UNIX workstation and analysed using a proprietary statistics package. The values of parameters calculated from the time series data were comparable to those of other authors. A wavelet decomposition showed that the frequency profile of the signal changed with time. While most spectral energy was seen at impact, a significant amount of energy was contained in the signal immediately following impact. Over 99% of this energy was contained in frequencies less than 1250 Hz. The sampling rate and the frequency response of a measurement system for recording impact should be chosen carefully to prevent loss or corruption of data. Time-scale analysis using a wavelet decomposition is a powerful technique which can be used to characterise impact data. The use of contour plots provides a highly visual representation of the time and frequency localisation of power during impact.
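
    A time-scale characterisation similar in spirit to the one described above can be sketched with PyWavelets; the synthetic signal, sampling rate and wavelet choice are assumptions, not the authors' settings:

```python
# Wavelet decomposition of an impact-like acceleration signal and the
# fraction of signal energy in each scale (frequency band).
import numpy as np
import pywt

fs = 10000.0                                     # assumed sampling rate, Hz
t = np.arange(0, 0.1, 1 / fs)
impact = np.exp(-t / 0.005) * np.sin(2 * np.pi * 800 * t)   # synthetic impact
signal = impact + 0.05 * np.random.default_rng(2).normal(size=t.size)

coeffs = pywt.wavedec(signal, "db4", level=5)    # [cA5, cD5, cD4, cD3, cD2, cD1]
energies = np.array([np.sum(c ** 2) for c in coeffs])
for name, frac in zip(["a5", "d5", "d4", "d3", "d2", "d1"],
                      energies / energies.sum()):
    print(f"{name}: {100 * frac:.1f}% of energy")
```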

  17. Colombeau's generalized functions and non-standard analysis

    International Nuclear Information System (INIS)

    Todorov, T.D.

    1987-10-01

    Using some methods of Non-Standard Analysis we modify one of Colombeau's classes of generalized functions. As a result we define a class ε-circumflex of the so-called meta-functions which possesses all good properties of Colombeau's generalized functions, i.e. (i) ε-circumflex is an associative and commutative algebra over the system of the so-called complex meta-numbers C-circumflex; (ii) every meta-function has partial derivatives of any order (which are meta-functions again); (iii) every meta-function is integrable on any compact set of R^n and the integral is a number from C-circumflex; (iv) ε-circumflex contains all tempered distributions S', i.e. S' is contained in ε-circumflex isomorphically with respect to all linear operations (including differentiation). Thus, within the class ε-circumflex the problem of multiplication of the tempered distributions is satisfactorily solved (every two distributions in S' have a well-defined product in ε-circumflex). The crucial point is that C-circumflex is a field, in contrast to the system of Colombeau's generalized numbers C-bar which is a ring only (C-bar is the counterpart of C-circumflex in Colombeau's theory). In this way we simplify and slightly improve the properties of the integral and the notion of "values of the meta-functions", as well as the properties of the whole class ε-circumflex itself, compared with the original Colombeau theory. And, what is maybe more important, we clarify the connection between Non-Standard Analysis and Colombeau's theory of new generalized functions, in the framework of which the problem of multiplication of distributions was recently solved. (author). 14 refs

  18. Dispersion analysis techniques within the space vehicle dynamics simulation program

    Science.gov (United States)

    Snow, L. S.; Kuhn, A. E.

    1975-01-01

    The Space Vehicle Dynamics Simulation (SVDS) program was evaluated as a dispersion analysis tool. The Linear Error Analysis (LEA) post processor was examined in detail and simulation techniques relative to conducting a dispersion analysis using the SVDS were considered. The LEA processor is a tool for correlating trajectory dispersion data developed by simulating 3 sigma uncertainties as single error source cases. The processor combines trajectory and performance deviations by a root-sum-square (RSS) process and develops a covariance matrix for the deviations. Results are used in dispersion analyses for the baseline reference and orbiter flight test missions. As a part of this study, LEA results were verified as follows: (A) hand calculating the RSS data and the elements of the covariance matrix for comparison with the LEA processor computed data; (B) comparing results with previous error analyses. The LEA comparisons and verification are made at main engine cutoff (MECO).
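
    The RSS step described above can be illustrated with a small sketch: each row below stands for the trajectory deviation produced by one 3-sigma error source, the combined dispersion is their root-sum-square, and an outer-product sum gives a covariance matrix. The numbers are illustrative, not SVDS output:

```python
# Root-sum-square combination of single-error-source deviations and the
# corresponding covariance matrix of the trajectory state deviations.
import numpy as np

# Hypothetical 3-sigma deviations of [altitude (m), velocity (m/s)] at MECO,
# one row per error source.
deviations = np.array([[120.0,  1.5],
                       [ 80.0, -0.9],
                       [ 45.0,  0.4],
                       [200.0,  2.2]])

rss = np.sqrt(np.sum(deviations ** 2, axis=0))        # combined 3-sigma deviations
cov = sum(np.outer(d, d) for d in deviations)         # covariance (3-sigma basis)

print("RSS deviations:", rss)
print("covariance matrix:\n", cov)
```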

  19. Fault tree technique: advances in probabilistic and logical analysis

    International Nuclear Information System (INIS)

    Clarotti, C.A.; Amendola, A.; Contini, S.; Squellati, G.

    1982-01-01

    Fault tree reliability analysis is used for assessing the risk associated with systems of increasing complexity (phased mission systems, systems with multistate components, systems with non-monotonic structure functions). Much care must be taken to make sure that the fault tree technique is not used beyond its correct validity range. To this end a critical review of the mathematical foundations of reliability fault tree analysis is carried out. Limitations are highlighted and potential solutions to open problems are suggested. Moreover, an overview is given of the most recent developments in the implementation of integrated software (the SALP-MP, SALP-NOT and SALP-CAFT codes) for the analysis of a wide class of systems.
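
    As a minimal illustration of the probabilistic side of fault tree analysis (not the SALP codes), the top-event probability can be estimated from minimal cut sets; the cut sets and basic-event probabilities below are made up:

```python
# Top-event probability from minimal cut sets: rare-event (sum) approximation
# and the tighter min-cut upper bound 1 - prod(1 - P(cut set)).
from math import prod

basic_events = {"A": 1e-3, "B": 5e-4, "C": 2e-3, "D": 1e-4}   # hypothetical
minimal_cut_sets = [{"A", "B"}, {"C"}, {"B", "D"}]

cut_probs = [prod(basic_events[e] for e in cs) for cs in minimal_cut_sets]
rare_event = sum(cut_probs)
upper_bound = 1 - prod(1 - p for p in cut_probs)

print(f"rare-event approximation: {rare_event:.3e}")
print(f"min-cut upper bound:      {upper_bound:.3e}")
```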

  20. Multi-element analysis of lubricant oil by WDXRF technique using thin-film sample preparation

    International Nuclear Information System (INIS)

    Scapin, M. A.; Salvador, V. L. R.; Lopes, C. D.; Sato, I. M.

    2006-01-01

    The quantitative analysis of chemical elements in matrices like oils or gels represents a challenge for analytical chemists. Classical methods and instrumental techniques such as atomic absorption spectrometry (AAS) and inductively coupled plasma optical emission spectrometry (ICP-OES) require chemical treatments, mainly sample dissolution and degradation processes. The X-ray fluorescence technique allows a direct, multi-element analysis without previous sample treatment. In this work, a sensitive method for the determination of the elements Mg, Al, Si, P, Ca, Ti, V, Cr, Mn, Fe, Ni, Cu, Zn, Mo, Ag, Sn, Ba and Pb in lubricating oil is presented. The wavelength-dispersive X-ray fluorescence (WDXRF) technique was used with a linear regression method and thin-film sample preparation. The validation of the methodology (repeatability and accuracy) was obtained by the analysis of the standard reference material SRM Alpha AESAR lot 703527D, applying the Chauvenet, Cochrane, ANOVA and Z-score statistical tests. The method presents a relative standard deviation lower than 10% for all the elements, except for the Pb determination (RSD Pb 15%). The Z-score values for all the elements were in the range -2 < Z < 2, indicating very good accuracy. (Full text)
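
    The Z-score check used in the validation step can be sketched as follows; the measured and certified values are placeholders, not the SRM data:

```python
# Z-score test of measured element concentrations against certified values:
# |Z| <= 2 is usually taken as satisfactory, larger values flag a problem.
certified = {"Fe": 250.0, "Cu": 40.0, "Zn": 65.0}     # hypothetical, mg/kg
uncert    = {"Fe":  10.0, "Cu":  2.5, "Zn":  3.0}     # hypothetical std. unc.
measured  = {"Fe": 244.0, "Cu": 42.1, "Zn": 63.5}     # hypothetical results

for element in certified:
    z = (measured[element] - certified[element]) / uncert[element]
    verdict = "ok" if abs(z) <= 2 else "check"
    print(f"{element}: Z = {z:+.2f} ({verdict})")
```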

  1. Some problems of calibration technique in charged particle activation analysis

    International Nuclear Information System (INIS)

    Krasnov, N.N.; Zatolokin, B.V.; Konstantinov, I.O.

    1977-01-01

    It is shown that three different approaches to the calibration technique, based on the use of the average cross-section, the equivalent target thickness and the thick target yield, are adequate. Using the concept of thick target yield, a convenient charged particle activation equation is obtained. The possibility of simultaneous determination of two impurities, from which the same isotope is formed, is pointed out. The use of the concept of thick target yield facilitates the derivation of a simple formula for absolute and comparative methods of analysis. The methodical error does not exceed 10%. Calibration and determination of the expected sensitivity based on the thick target yield concept are also very convenient because experimental determination of thick target yield values is a much simpler procedure than obtaining an activation curve or excitation function. (T.G.)

  2. A Simple low-cost device enables four epi-illumination techniques on standard light microscopes.

    Science.gov (United States)

    Ishmukhametov, Robert R; Russell, Aidan N; Wheeler, Richard J; Nord, Ashley L; Berry, Richard M

    2016-02-08

    Back-scattering darkfield (BSDF), epi-fluorescence (EF), interference reflection contrast (IRC), and darkfield surface reflection (DFSR) are advanced but expensive light microscopy techniques with limited availability. Here we show a simple optical design that combines these four techniques in a simple low-cost miniature epi-illuminator, which inserts into the differential interference-contrast (DIC) slider bay of a commercial microscope, with no further additions required. We demonstrate with this device: 1) BSDF-based detection of malarial parasites inside unstained human erythrocytes; 2) EF imaging with and without dichroic components, including detection of DAPI-stained Leishmania parasites without using excitation or emission filters; 3) IRC of black lipid membranes and other thin films; and 4) DFSR of patterned opaque and transparent surfaces. We believe that our design can expand the functionality of commercial bright field microscopes, provide easy field detection of parasites, and be of interest to many users of light microscopy.

  3. Ion beam analysis and spectrometry techniques for Cultural Heritage studies

    International Nuclear Information System (INIS)

    Beck, L.

    2013-01-01

    The implementation of experimental techniques for the characterisation of Cultural Heritage materials has to take into account certain requirements. The complexity of these past materials requires the development of new techniques of examination and analysis, or the transfer of technologies developed for the study of advanced materials. In addition, due to the precious nature of artworks it is also necessary to use non-destructive methods that respect the integrity of the objects. It is for this reason that methods using radiation and/or particles have played an important role in the scientific study of art history and archaeology since their discovery. X-ray and γ-ray spectrometry as well as ion beam analysis (IBA) are analytical tools at the service of Cultural Heritage. This report mainly presents experimental developments for IBA: PIXE, RBS/EBS and NRA. These developments were applied to the study of archaeological composite materials: layered materials or mixtures composed of organic and non-organic phases. Three examples are shown: the evolution of silvering techniques for the production of counterfeit coinage during the Roman Empire and in the 16th century, and the characterization of composites or mixed mineral/organic compounds such as bone and paint. In these last two cases, the combination of techniques gave original results on the proportion of both phases: apatite/collagen in bone, pigment/binder in paintings. Another part of this report is then dedicated to the non-invasive/non-destructive characterization of prehistoric pigments, in situ, for rock art studies in caves and in the laboratory. Finally, the perspectives of this work are presented. (author) [fr

  4. Single Particle Tracking: Analysis Techniques for Live Cell Nanoscopy

    Science.gov (United States)

    Relich, Peter Kristopher, II

    Single molecule experiments are a set of experiments designed specifically to study the properties of individual molecules. It has only been in the last three decades that single molecule experiments have been applied to the life sciences, where they have been successfully implemented in systems biology for probing the behaviors of sub-cellular mechanisms. The advent and growth of super-resolution techniques in single molecule experiments has made the fundamental behaviors of light and the associated nano-probes a necessary concern amongst life scientists wishing to advance the state of human knowledge in biology. This dissertation disseminates some of the practices learned in experimental live cell microscopy. The topic of single particle tracking is addressed here in a format that is designed for the physicist who embarks upon single molecule studies. Specifically, the focus is on the procedures necessary to generate single particle tracking analysis techniques that can be implemented to answer biological questions. These analysis techniques range from designing and testing a particle tracking algorithm to inferring model parameters once an image has been processed. The intellectual contributions of the author include techniques in diffusion estimation, localization filtering, and trajectory association for tracking, which will all be discussed in detail in later chapters. The author of this thesis has also contributed to the software development of automated gain calibration, live cell particle simulations, and various single particle tracking packages. Future work includes further evaluation of this laboratory's single particle tracking software, entropy based approaches towards hypothesis validation, and the uncertainty quantification of gain calibration.
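
    One of the analysis steps mentioned above, diffusion estimation from a single-particle trajectory, is commonly done through the mean squared displacement; this is a generic MSD sketch under simple Brownian assumptions, not the dissertation's estimator:

```python
# Estimate a 2D diffusion coefficient from a trajectory via the mean
# squared displacement: MSD(tau) = 4*D*tau for free Brownian motion.
import numpy as np

dt = 0.05                                   # assumed frame interval, s
D_true = 0.5                                # um^2/s, for the synthetic track
rng = np.random.default_rng(3)
steps = rng.normal(scale=np.sqrt(2 * D_true * dt), size=(2000, 2))
track = np.cumsum(steps, axis=0)            # positions in micrometers

def msd(track, max_lag):
    lags = np.arange(1, max_lag + 1)
    values = [np.mean(np.sum((track[l:] - track[:-l]) ** 2, axis=1)) for l in lags]
    return lags, np.array(values)

lags, m = msd(track, max_lag=20)
slope = np.polyfit(lags * dt, m, 1)[0]      # fit MSD = 4*D*tau
print(f"estimated D = {slope / 4:.3f} um^2/s (true {D_true})")
```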

  5. Comparison of chromosome analysis using cell culture by coverslip technique with flask technique.

    Science.gov (United States)

    Sajapala, Suraphan; Buranawut, Kitti; NiwatArunyakasemsuk, Md

    2014-02-01

    To determine the accuracy rate of chromosome study from amniotic cell culture by the coverslip technique compared with the flask technique, and to compare the culture time, the amount of amniotic cell culture media and the cost of amniotic cell culture. Cross sectional study. Department of Obstetrics and Gynecology, Phramongkutklao Hospital. Subjects: 70 pregnant women who underwent amniocentesis at Phramongkutklao Hospital during November 1, 2007 to February 29, 2008. Amniotic cell culture by flask technique and coverslip technique. Accuracy of amniotic cell culture for chromosome study by the coverslip technique compared with the flask technique. In total, 70 pregnant women underwent amniocentesis, and the amniotic fluid was divided for cell culture by the flask technique and the coverslip technique. Sixty-nine samples gave the same result from both techniques. Only one sample had cell culture failure, in both methods, due to blood contamination. Accuracy of the coverslip technique was 100% compared with the flask technique. Regarding the culture time, the amount of culture media and the cost of amniotic cell culture, the coverslip technique required less than the flask technique. There was statistically significant agreement in chromosome results between the coverslip technique and the flask technique, and the coverslip technique required less time, culture media and cost than the flask technique.

  6. Development of flow injection analysis technique for uranium estimation

    International Nuclear Information System (INIS)

    Paranjape, A.H.; Pandit, S.S.; Shinde, S.S.; Ramanujam, A.; Dhumwad, R.K.

    1991-01-01

    Flow injection analysis is increasingly used as a process control analytical technique in many industries. It involves injection of the sample at a constant rate into a steadily flowing stream of reagent and passing this mixture through a suitable detector. This paper describes the development of such a system for the analysis of uranium (VI) and (IV) and its gross gamma activity. It is amenable to on-line or automated off-line monitoring of uranium and its activity in process streams. The sample injection port is suitable for automated injection of radioactive samples. The performance of the system has been tested for the colorimetric response of U(VI) samples at 410 nm in the range of 35 to 360 mg/ml in nitric acid medium using a Metrohm 662 photometer and a recorder as the detector assembly. The precision of the method is found to be better than +/- 0.5%. This technique, with certain modifications, is used for the analysis of U(VI) in the range 0.1-3 mg/aliquot by the alcoholic thiocyanate procedure within +/- 1.5% precision. Similarly, the precision for the determination of U(IV) in the range 15-120 mg at 650 nm is found to be better than 5%. With a NaI well-type detector in the flow line, the gross gamma counting of the solution under flow is found to be within a precision of +/- 5%. (author). 4 refs., 2 figs., 1 tab

  7. Manual of Standard Operating Procedures for Veterinary Drug Residue Analysis (French Edition)

    International Nuclear Information System (INIS)

    2017-01-01

    Laboratories are crucial to national veterinary drug residue monitoring programmes. However, one of the main challenges laboratories encounter is obtaining access to relevant methods of analysis. Thus, in addition to training, providing technical advice and transferring technology, the Joint FAO/IAEA Division of Nuclear Techniques in Food and Agriculture has resolved to develop clear and practical manuals to support Member State laboratories. The Coordinated Research Project (CRP) on Development of Radiometric and Allied Analytical Methods to Strengthen Residue Control Programs for Antibiotic and Anthelmintic Veterinary Drug Residues has developed a number of analytical methods as standard operating procedures (SOPs), which are now compiled here. This publication contains SOPs on chromatographic and spectrometric techniques, as well as radioimmunoassay and associated screening techniques, for various anthelmintic and antimicrobial veterinary drug residue analysis. Some analytical method validation protocols are also included. The publication is primarily aimed at food and environmental safety laboratories involved in testing veterinary drug residues, including under organized national residue monitoring programmes. It is expected to enhance laboratory capacity building and competence through the use of radiometric and complementary tools and techniques. The publication is also relevant for applied research on residues of veterinary drugs in food and environmental samples

  8. Manual of Standard Operating Procedures for Veterinary Drug Residue Analysis (Spanish Edition)

    International Nuclear Information System (INIS)

    2017-01-01

    Laboratories are crucial to national veterinary drug residue monitoring programmes. However, one of the main challenges laboratories encounter is obtaining access to relevant methods of analysis. Thus, in addition to training, providing technical advice and transferring technology, the Joint FAO/IAEA Division of Nuclear Techniques in Food and Agriculture has resolved to develop clear and practical manuals to support Member State laboratories. The Coordinated Research Project (CRP) on Development of Radiometric and Allied Analytical Methods to Strengthen Residue Control Programs for Antibiotic and Anthelmintic Veterinary Drug Residues has developed a number of analytical methods as standard operating procedures (SOPs), which are now compiled here. This publication contains SOPs on chromatographic and spectrometric techniques, as well as radioimmunoassay and associated screening techniques, for various anthelmintic and antimicrobial veterinary drug residue analysis. Some analytical method validation protocols are also included. The publication is primarily aimed at food and environmental safety laboratories involved in testing veterinary drug residues, including under organized national residue monitoring programmes. It is expected to enhance laboratory capacity building and competence through the use of radiometric and complementary tools and techniques. The publication is also relevant for applied research on residues of veterinary drugs in food and environmental samples

  9. Acceleration of multivariate analysis techniques in TMVA using GPUs

    CERN Document Server

    Hoecker, A; Therhaag, J; Washbrook, A

    2012-01-01

    A feasibility study into the acceleration of multivariate analysis techniques using Graphics Processing Units (GPUs) will be presented. The MLP-based Artificial Neural Network method contained in the TMVA framework has been chosen as a focus for investigation. It was found that the network training time on a GPU was lower than for CPU execution as the complexity of the network was increased. In addition, multiple neural networks can be trained simultaneously on a GPU within the same time taken for single network training on a CPU. This could be potentially leveraged to provide a qualitative performance gain in data classification.

  10. Reduction and analysis techniques for infrared imaging data

    Science.gov (United States)

    Mccaughrean, Mark

    1989-01-01

    Infrared detector arrays are becoming increasingly available to the astronomy community, with a number of array cameras already in use at national observatories, and others under development at many institutions. As the detector technology and imaging instruments grow more sophisticated, more attention is focussed on the business of turning raw data into scientifically significant information. Turning pictures into papers, or equivalently, astronomy into astrophysics, both accurately and efficiently, is discussed. Also discussed are some of the factors that can be considered at each of three major stages; acquisition, reduction, and analysis, concentrating in particular on several of the questions most relevant to the techniques currently applied to near infrared imaging.
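
    A generic sketch of the most basic reduction step for array-camera frames, dark subtraction and flat-fielding, using synthetic arrays; this is common practice for infrared arrays generally and is not a description of any particular pipeline mentioned above:

```python
# Basic reduction of an infrared array frame: subtract a dark frame and
# divide by a normalized flat field to remove pixel-to-pixel response.
import numpy as np

rng = np.random.default_rng(4)
shape = (256, 256)
dark = 100 + rng.normal(scale=2, size=shape)          # synthetic dark frame
flat = 1.0 + 0.05 * rng.normal(size=shape)            # synthetic pixel response
sky = 500.0                                           # synthetic uniform signal
raw = dark + flat * (sky + rng.normal(scale=5, size=shape))

flat_norm = flat / np.median(flat)
reduced = (raw - dark) / flat_norm
print("mean reduced signal:", reduced.mean())         # close to the sky level
```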

  11. Data Analysis Techniques for a Lunar Surface Navigation System Testbed

    Science.gov (United States)

    Chelmins, David; Sands, O. Scott; Swank, Aaron

    2011-01-01

    NASA is interested in finding new methods of surface navigation to allow astronauts to navigate on the lunar surface. In support of the Vision for Space Exploration, the NASA Glenn Research Center developed the Lunar Extra-Vehicular Activity Crewmember Location Determination System and performed testing at the Desert Research and Technology Studies event in 2009. A significant amount of sensor data was recorded during nine tests performed with six test subjects. This paper provides the procedure, formulas, and techniques for data analysis, as well as commentary on applications.

  12. Nonactivation interaction techniques in the analysis of environmental samples

    International Nuclear Information System (INIS)

    Tolgyessy, J.

    1986-01-01

    Nonactivation interaction analytical methods are based on the interaction of nuclear and X-ray radiation with a sample, leading to absorption and backscattering of the radiation, to the ionization of gases, or to the excitation of fluorescent X-rays, but not to the activation of the determined elements. From the point of view of environmental analysis, the most useful nonactivation interaction techniques are X-ray fluorescence with photon or charged particle excitation, ionization of gases by nuclear radiation, elastic scattering of charged particles, and backscattering of beta radiation. The significant advantage of these methods is that they are nondestructive. (author)

  13. Characterization of the neutron sources storage pool of the Neutron Standards Laboratory, using Montecarlo Techniques

    International Nuclear Information System (INIS)

    Campo Blanco, X.

    2015-01-01

    The development of irradiation-damage-resistant materials is one of the most important open fields in the design of experimental facilities and conceptual nuclear fusion power plants. The Neutron Standards Laboratory aims to contribute to this development by allowing the neutron irradiation of materials in its calibration neutron sources storage pool. For this purpose, it is essential to characterize the pool itself in terms of the neutron fluence and spectra due to the calibration neutron sources. In this work, the main features of this facility are presented and the characterization of the storage pool is carried out. Finally, an application of the obtained results to the neutron irradiation of materials is shown.

  14. Sample preparation techniques in trace element analysis by X-ray emission spectroscopy

    International Nuclear Information System (INIS)

    Valkovic, V.

    1983-11-01

    The report, written under a research contract with the IAEA, contains a detailed presentation of the most difficult problem encountered in trace element analysis by X-ray emission spectroscopy methods, namely the sample preparation techniques. The following items are covered. Sampling - with specific consideration of aerosols, water, soil, biological materials, petroleum and its products, storage of samples and their handling. Pretreatment of samples - preconcentration, ashing, solvent extraction, ion exchange and electrodeposition. Sample preparations for PIXE analysis - backings, target uniformity and homogeneity, effects of irradiation, internal standards and specific examples of preparation (aqueous, biological, blood serum and solid samples). Sample preparations for radioactive source or tube excitation - with specific examples (water, liquid and solid samples, soil, geological, plant and tissue samples). Finally, the problem of standards and reference materials, as well as that of interlaboratory comparisons, is discussed

  15. Standardizing risk analysis for the evaluation of oil and gas properties

    International Nuclear Information System (INIS)

    Robinson, J. G.

    1996-01-01

    Notwithstanding the advances made in 3-D seismic, horizontal drilling, and completion techniques, as well as in computer applications, all of which have improved our ability to find and produce new reserves, associated risks, both technical and economic, have also increased. Various efforts to standardize reserve evaluation, and to deal with the growing uncertainty, were discussed. Most of these efforts have failed in the face of great reluctance to change. The objective of this paper was to emphasize the need to incorporate and standardize the application of risk, to propose a revised 'expected value concept' for the economic evaluations of reserves, and to dispel the myth that statistical procedures are difficult, time consuming and expensive to apply. Essential characteristics of the various statistical procedures used in risk assessment, such as the Monte Carlo Simulation, Expected Value Determination and Decision Tree Analysis, were summarized. 8 refs., 9 tabs., 11 figs
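
    To illustrate that such procedures need not be difficult or expensive to apply (using made-up numbers, not the paper's data), a few lines suffice for a Monte Carlo estimate of the expected value of a prospect with uncertain reserves and netback:

```python
# Monte Carlo expected value of a drilling prospect: chance of success times
# the distribution of net value if successful, minus the dry-hole cost.
import numpy as np

rng = np.random.default_rng(5)
n = 100_000
p_success = 0.3                                              # hypothetical
reserves_mmbbl = rng.lognormal(mean=np.log(2.0), sigma=0.6, size=n)
netback_per_bbl = rng.normal(loc=12.0, scale=3.0, size=n)    # $/bbl, hypothetical
dry_hole_cost = 5.0                                          # $MM, hypothetical
success = rng.random(n) < p_success

value = np.where(success,
                 reserves_mmbbl * netback_per_bbl - dry_hole_cost,
                 -dry_hole_cost)
print(f"expected value: {value.mean():.1f} $MM, P90/P10: "
      f"{np.percentile(value, 10):.1f}/{np.percentile(value, 90):.1f} $MM")
```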

  16. Macro elemental analysis of food samples by nuclear analytical technique

    Science.gov (United States)

    Syahfitri, W. Y. N.; Kurniawati, S.; Adventini, N.; Damastuti, E.; Lestiani, D. D.

    2017-06-01

    Energy-dispersive X-ray fluorescence (EDXRF) spectrometry is a non-destructive, rapid, multi-elemental, accurate, and environmentally friendly analysis method compared with other detection methods. Thus, EDXRF spectrometry is applicable for food inspection. The macro elements calcium and potassium constitute important nutrients required by the human body for optimal physiological functions. Therefore, the determination of Ca and K content in various foods needs to be done. The aim of this work is to demonstrate the applicability of EDXRF for food analysis. The analytical performance of non-destructive EDXRF was compared with other analytical techniques, neutron activation analysis and atomic absorption spectrometry. The comparison of methods was performed as a cross-check of the analysis results and to overcome the limitations of the three methods. Analysis results showed that Ca found in food using EDXRF and AAS was not significantly different, with a p-value of 0.9687, whereas the p-value of K between EDXRF and NAA was 0.6575. The correlation between those results was also examined. The Pearson correlations for Ca and K were 0.9871 and 0.9558, respectively. Method validation using SRM NIST 1548a Typical Diet was also applied. The results showed good agreement between methods; therefore the EDXRF method can be used as an alternative method for the determination of Ca and K in food samples.
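
    The method-comparison statistics mentioned above (a paired significance test and a Pearson correlation) can be reproduced with a short script; the paired concentration values below are invented placeholders, not the study's measurements:

```python
# Compare two analytical methods on the same samples: paired t-test for a
# systematic difference and Pearson correlation for agreement in trend.
import numpy as np
from scipy import stats

ca_edxrf = np.array([152., 310., 98., 205., 410., 260.])   # hypothetical values
ca_aas   = np.array([149., 318., 95., 210., 402., 255.])   # hypothetical values

t_stat, p_value = stats.ttest_rel(ca_edxrf, ca_aas)
r, r_p = stats.pearsonr(ca_edxrf, ca_aas)
print(f"paired t-test p = {p_value:.3f}, Pearson r = {r:.4f}")
```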

  17. Envelopment technique and topographic overlays in bite mark analysis.

    Science.gov (United States)

    Djeapragassam, Parimala; Daniel, Mariappan Jonathan; Srinivasan, Subramanian Vasudevan; Ramadoss, Koliyan; Jimsha, Vannathan Kumaran

    2015-01-01

    The aims and objectives of our study were to compare four sequential overlays generated using the envelopment technique and to evaluate the inter- and intraoperator reliability of the overlays obtained by the envelopment technique. Dental stone models were prepared from impressions made from healthy individuals; photographs were taken and computer-assisted overlays were generated. The models were then enveloped in a different-color dental stone. After this, four sequential cuts were made at a thickness of 1 mm each. Each sectional cut was photographed and overlays were generated. Thus, 125 overlays were generated and compared. The scoring was done based on matching accuracy and the data were analyzed. The Kruskal-Wallis one-way analysis of variance (ANOVA) test was used to compare the four sequential overlays and Spearman's rank correlation tests were used to evaluate the inter- and intraoperator reliability of the overlays obtained by the envelopment technique. Through our study, we conclude that the third and fourth cuts were the best among the four cuts, and the inter- and intraoperator reliability was found to be statistically significant at the 5% level, that is, the 95% confidence interval (P < 0.05).
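
    The statistics named above are available in SciPy; this sketch uses invented matching-accuracy scores for the four sequential cuts and two operators, not the study's data:

```python
# Kruskal-Wallis test across the four sequential cuts and Spearman rank
# correlation between two operators' scores.
from scipy import stats

cut1 = [3, 4, 3, 2, 4]          # hypothetical matching-accuracy scores
cut2 = [4, 4, 3, 4, 5]
cut3 = [5, 4, 5, 5, 4]
cut4 = [5, 5, 4, 5, 5]
h_stat, p_kw = stats.kruskal(cut1, cut2, cut3, cut4)

operator_a = [5, 4, 5, 3, 4, 5, 4]
operator_b = [5, 4, 4, 3, 4, 5, 5]
rho, p_sp = stats.spearmanr(operator_a, operator_b)

print(f"Kruskal-Wallis p = {p_kw:.3f}, Spearman rho = {rho:.2f} (p = {p_sp:.3f})")
```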

  18. Measuring caloric response: comparison of different analysis techniques.

    Science.gov (United States)

    Mallinson, A I; Longridge, N S; Pace-Asciak, P; Ngo, R

    2010-01-01

    Electronystagmography (ENG) testing has been supplanted by newer techniques of measuring eye movement with infrared cameras (VNG). Most techniques of quantifying caloric induced nystagmus measure the slow phase velocity in some manner. Although our analysis is carried out by very experienced assessors, some systems have computer algorithms that have been "taught" to locate and quantify maximum responses. We wondered what differences in measurement might show up when measuring calorics using different techniques and systems, the relevance of this being that if there was a change in slow phase velocity between ENG and VNG testing when measuring caloric response, then normative data would have to be changed. There are also some subjective but important aspects of ENG interpretation which comment on the nature of the response (e.g. responses which might be "sporadic" or "scant"). Our experiment compared caloric responses in 100 patients analyzed four different ways. Each caloric was analyzed by our old ENG system, our new VNG system, an inexperienced assessor and the computer algorithm, and data was compared. All four systems made similar measurements but our inexperienced assessor failed to recognize responses as sporadic or scant, and we feel this is a limitation to be kept in mind in the rural setting, as it is an important aspect of assessment in complex patients. Assessment of complex VNGs should be left to an experienced assessor.

  19. Quantitative chemical analysis of lead in canned chillis by spectrophotometric and nuclear techniques

    International Nuclear Information System (INIS)

    Sanchez Paz, L.A.

    1991-01-01

    The objectives of this work are to quantify the lead content in two types of canned chilli from three trademarks, to determine whether it is within the maximum permissible level (2 ppm), to compare two trademarks sold in both glass flask and canned presentations in order to determine the effect of the container on the final lead content, and to make a comparative study of the techniques used in terms of accuracy, linearity and sensitivity. The techniques used were atomic absorption spectrophotometry, plasma emission spectrometry and X-ray fluorescence. The preliminary treatment of the samples was calcination, followed by dissolution of the ashes in acid medium and dilution to a known volume for analysis by atomic absorption and plasma emission. For the analysis by X-ray fluorescence, after dissolving the ashes, the lead is precipitated with PCDA (pyrrolidine carbodithioic acid, ammonium salt) and then filtered; the filter paper is dried and counted directly. The standards are prepared following the same procedure as the samples, using a lead Titrisol solution. For each technique the percent recovery is determined by the addition of a known amount. Calibration curves are plotted for each technique, and all three are found to be linear in the established working range. The percent recovery in the three cases is above ninety-five percent. By means of an analysis of variance it was determined that the lead content in the samples does not exceed 2 ppm, and that the lead content in canned chillis is higher than that in glass containers (1.7 and 0.4 ppm, respectively). The X-ray fluorescence results differ from those attained by the other two techniques because its sensitivity is lower. The most advisable techniques for this kind of analysis are atomic absorption spectrophotometry and plasma emission. (Author)

  20. Microspectrophotometric studies of Romanowsky stained blood cells. I. Subtraction analysis of a standardized procedure.

    Science.gov (United States)

    Galbraith, W; Marshall, P N; Bacus, J W

    1980-08-01

    This paper describes a microspectrophotometric study of blood smears stained by a simple, standardized Romanowsky technique, using only the dyes azure B and eosin. Absorbance spectra are presented for twenty-two classes of cellular object, and for the two dyes in solution, together with tabulations of spectral maxima and suitable wavelengths for use in automated image processing. The colours of objects stained with azure B/eosin are discussed in terms of absorbance spectra. By a spectral subtraction technique, it is shown that the differential colouration of various cell structures may be explained satisfactorily in terms of the varying proportions of only four dye components. These are the monomers and dimers of azure B and eosin. Polymerization was found to occur both in solution and on binding to biopolymers. A similar analysis of a conventional Romanowsky stain would present much greater difficulties, due to the greater number of dye components, which, however, contribute little to the colours observed.
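
    The idea of explaining a measured absorbance spectrum as a mixture of a few dye-component spectra can be sketched as a non-negative least-squares fit; all spectra below are synthetic Gaussian stand-ins with made-up band positions, not the paper's measured curves:

```python
# Decompose a measured absorbance spectrum into non-negative contributions
# of reference component spectra (e.g., dye monomers and dimers).
import numpy as np
from scipy.optimize import nnls

wl = np.linspace(480, 700, 221)                       # wavelength grid, nm
def band(center, width):                              # synthetic component shape
    return np.exp(-0.5 * ((wl - center) / width) ** 2)

components = np.column_stack([band(552, 18),          # dye A monomer (placeholder)
                              band(580, 22),          # dye A dimer   (placeholder)
                              band(525, 12),          # dye B monomer (placeholder)
                              band(545, 15)])         # dye B dimer   (placeholder)

true_coeffs = np.array([0.6, 0.3, 0.8, 0.2])
measured = components @ true_coeffs \
    + 0.01 * np.random.default_rng(6).normal(size=wl.size)

coeffs, residual = nnls(components, measured)
print("fitted contributions:", np.round(coeffs, 2))
```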

  1. Service Interaction Flow Analysis Technique for Service Personalization

    DEFF Research Database (Denmark)

    Korhonen, Olli; Kinnula, Marianne; Syrjanen, Anna-Liisa

    2017-01-01

    Service interaction flows are difficult to capture, analyze, outline, and represent for research and design purposes. We examine how variation of personalized service flows in technology-mediated service interaction can be modeled and analyzed to provide information on how service personalization could support interaction. We have analyzed service interaction cases in the context of a technology-mediated car rental service. With the analysis technique we propose, inspired by the Interaction Analysis method, we were able to capture and model the situational service interaction. Our contribution regarding technology-mediated service interaction design is twofold: first, with the increased understanding of the role of personalization in managing variation in technology-mediated service interaction, our study contributes to designing service management information systems and human-computer interfaces...

  2. Mechanisms of subsidence for induced damage and techniques for analysis

    International Nuclear Information System (INIS)

    Drumm, E.C.; Bennett, R.M.; Kane, W.F.

    1988-01-01

    Structural damage due to mining induced subsidence is a function of the nature of the structure and its position on the subsidence profile. A point on the profile may be in the tensile zone, the compressive zone, or the no-deformation zone at the bottom of the profile. Damage to structures in the tension zone is primarily due to a reduction of support during vertical displacement of the ground surface, and to shear stresses between the soil and structure resulting from horizontal displacements. The damage mechanisms due to tension can be investigated effectively using a two-dimensional plane stress analysis. Structures in the compression zone are subjected to positive moments in the footing and large compressive horizontal stresses in the foundation walls. A plane strain analysis of the foundation wall is utilized to examine compression zone damage mechanisms. The structural aspects affecting each mechanism are identified and potential mitigation techniques are summarized

  3. Sensitivity analysis techniques for models of human behavior.

    Energy Technology Data Exchange (ETDEWEB)

    Bier, Asmeret Brooke

    2010-09-01

    Human and social modeling has emerged as an important research area at Sandia National Laboratories due to its potential to improve national defense-related decision-making in the presence of uncertainty. To learn about which sensitivity analysis techniques are most suitable for models of human behavior, different promising methods were applied to an example model, tested, and compared. The example model simulates cognitive, behavioral, and social processes and interactions, and involves substantial nonlinearity, uncertainty, and variability. Results showed that some sensitivity analysis methods create similar results, and can thus be considered redundant. However, other methods, such as global methods that consider interactions between inputs, can generate insight not gained from traditional methods.
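
    One simple global technique of the kind compared in such studies is the standardized regression coefficient, which ranks inputs by how strongly they drive the output across a Monte Carlo sample; the toy model below is an illustrative stand-in, not the example model described above:

```python
# Standardized regression coefficients (SRC) as a simple global sensitivity
# measure: regress the output on standardized inputs over a Monte Carlo sample.
import numpy as np

rng = np.random.default_rng(8)
n = 5000
X = rng.normal(size=(n, 3))                         # three uncertain inputs
y = 2.0 * X[:, 0] + 0.5 * X[:, 1] ** 2 + 0.1 * X[:, 2] \
    + rng.normal(scale=0.2, size=n)                 # toy model output

Xs = (X - X.mean(axis=0)) / X.std(axis=0)
ys = (y - y.mean()) / y.std()
src, *_ = np.linalg.lstsq(Xs, ys, rcond=None)
print("standardized regression coefficients:", np.round(src, 3))
```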

  4. Slit-scanning technique using standard cell sorter instruments for analyzing and sorting nonacrocentric human chromosomes, including small ones

    NARCIS (Netherlands)

    Rens, W.; van Oven, C. H.; Stap, J.; Jakobs, M. E.; Aten, J. A.

    1994-01-01

    We have investigated the performance of two types of standard flow cell sorter instruments, a System 50 Cytofluorograph and a FACSTar PLUS cell sorter, for the on-line centromeric index (CI) analysis of human chromosomes. To optimize the results, we improved the detection efficiency for centromeres

  5. Spectrography analysis of stainless steel by the point to point technique

    International Nuclear Information System (INIS)

    Bona, A.

    1986-01-01

    A method is presented for the determination of the elements Ni, Cr, Mn, Si, Mo, Nb, Cu, Co and V in stainless steel by emission spectrographic analysis using high voltage spark sources. The 'point-to-point' technique is employed. The experimental parameters were optimized taking into account a compromise between the detection sensitivity and the precision of the measurement. The parameters investigated were the high voltage capacitance, the inductance, the analytical and auxiliary gaps, the pre-burn spark period and the exposure time. The edge shape of the counter electrodes, the type of polishing and the diameter variation of the stainless steel electrodes were evaluated in preliminary assays. In addition, the degradation of the chemical power of the developer was also investigated. Counter electrodes of graphite, copper, aluminium and iron were employed, and the counter electrode itself was used as an internal standard. In the case of graphite counter electrodes the iron lines were employed as the internal standard. The relative errors were the criteria for the evaluation of these experiments. The National Bureau of Standards certified reference stainless steel standards and the Eletrometal Acos Finos S.A. samples (certified by the supplier) were employed for constructing the calibration systems and analytical curves. The best results were obtained using the conventional graphite counter electrodes. The inaccuracy and the imprecision of the proposed method varied from 2% to 15% and from 1% to 9%, respectively. The present technique was compared to other instrumental techniques such as inductively coupled plasma, X-ray fluorescence and neutron activation analysis. The advantages and disadvantages for each case were discussed. (author) [pt

  6. [Applications of spectral analysis technique to monitoring grasshoppers].

    Science.gov (United States)

    Lu, Hui; Han, Jian-guo; Zhang, Lu-da

    2008-12-01

    Grasshopper monitoring is of great significance in protecting the environment and reducing economic loss. However, how to predict grasshoppers accurately and effectively has been a difficult problem for a long time. In the present paper, the importance of forecasting grasshoppers and their habitat is expounded, and the development of monitoring grasshopper populations and the common algorithms of spectral analysis techniques are illustrated. Meanwhile, the traditional methods are compared with the spectral technology. Remote sensing has been applied in monitoring the living, growing and breeding habitats of grasshopper populations, and can be used to develop a forecast model combined with GIS. The NDVI values can be derived from the remote sensing data and used in grasshopper forecasting. Hyperspectral remote sensing techniques, which can be used to monitor grasshoppers more exactly, have advantages in measuring the damage degree and classifying damage areas of grasshoppers, so they can be adopted to monitor the spatial distribution dynamics of rangeland grasshopper populations. Differential smoothing can be used to reflect the relations between the characteristic parameters of hyper-spectra and leaf area index (LAI), and indicate the intensity of grasshopper damage. The technology of near infrared reflectance spectroscopy has been employed in judging grasshopper species, examining species occurrences and monitoring hatching places by measuring the humidity and nutrients of soil, and can be used to investigate and observe grasshoppers in sample research. According to this paper, it is concluded that the spectral analysis technique could be used as a quick and exact tool in monitoring and forecasting the infestation of grasshoppers, and will become an important means in such research for its advantages in determining spatial orientation, information extraction and processing. With the rapid development of spectral analysis methodology, the goal of sustainable monitoring
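
    The NDVI value mentioned above is computed per pixel from red and near-infrared reflectance; a minimal sketch with synthetic arrays (the band values and threshold are placeholders):

```python
# Normalized Difference Vegetation Index: NDVI = (NIR - Red) / (NIR + Red).
import numpy as np

rng = np.random.default_rng(7)
red = rng.uniform(0.05, 0.25, size=(100, 100))     # synthetic red reflectance
nir = rng.uniform(0.20, 0.60, size=(100, 100))     # synthetic NIR reflectance

ndvi = (nir - red) / (nir + red + 1e-12)
sparse_vegetation = ndvi < 0.3                     # example threshold for habitat maps
print("mean NDVI:", ndvi.mean(), "low-vegetation fraction:", sparse_vegetation.mean())
```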

  7. Analysis of minor phase with neutron diffraction technique

    International Nuclear Information System (INIS)

    Engkir Sukirman; Herry Mugirahardjo

    2014-01-01

    The presence of minor phases in a sample has been analyzed with the neutron diffraction technique. In this research, a sample of Fe nanoparticles (FNP) was selected as the object of the case study. The first step was to prepare the FNP sample with the ball milling technique. Hereinafter, the milled sample is referred to as FIC2. The phases formed in FIC2 were analyzed qualitatively and quantitatively using the high resolution neutron diffraction (HRPD) and X-ray diffraction (XRD) techniques. The diffraction data were analyzed by means of the Rietveld method utilizing a computer code, namely FullProf, and by referring to supporting data, namely the particle size and magnetic properties of the materials. The two kinds of supporting data were obtained from PSA (Particle Size Analyzer) and VSM (Vibrating Sample Magnetometer) measurements, respectively. The analysis result shows that the quality of fitting for the neutron diffraction pattern is better than that for the X-ray diffraction pattern. From the HRPD data it was revealed that FIC2 consists of the Fe, γ-Fe2O3 and Fe3O4 phases in amounts of 78.62, 21.37 and 0.01%, respectively. From the XRD data it was found that FIC2 consists of the Fe and γ-Fe2O3 phases in amounts of 99.96 and 0.04%, respectively; the presence of the Fe3O4 phase was not observed. With the neutron diffraction technique, the presence of a minor phase can be determined accurately. (author)
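    For reference, quantitative phase fractions in Rietveld refinements of the kind performed with FullProf are commonly obtained from the refined scale factors via the Hill and Howard relation w_p = S_p(ZMV)_p / Σ_i S_i(ZMV)_i. The sketch below only illustrates that normalization; the scale factors, cell contents and volumes are placeholders, not the values refined in this study:

        # Hill & Howard normalization used in Rietveld quantitative phase analysis.
        # S: refined scale factor, Z: formula units per cell, M: formula mass (g/mol),
        # V: unit-cell volume (A^3). All numbers below are illustrative placeholders.
        phases = {
            "Fe":          (1.0e-3, 2,  55.85,  23.55),
            "gamma-Fe2O3": (2.1e-4, 8, 159.69, 582.0),
            "Fe3O4":       (1.0e-6, 8, 231.53, 592.0),
        }
        szmv = {name: s * z * m * v for name, (s, z, m, v) in phases.items()}
        total = sum(szmv.values())
        for name, value in szmv.items():
            print(f"{name}: {100 * value / total:.2f} wt%")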

  8. SURVEY ON CRIME ANALYSIS AND PREDICTION USING DATA MINING TECHNIQUES

    Directory of Open Access Journals (Sweden)

    H Benjamin Fredrick David

    2017-04-01

    Full Text Available Data Mining is the procedure which includes evaluating and examining large pre-existing databases in order to generate new information which may be essential to the organization. The extraction of new information is predicted using the existing datasets. Many approaches for analysis and prediction in data mining have been developed, but very few efforts have been made in the criminology field, and fewer still have compared the information all these approaches produce. Police stations and other similar criminal justice agencies hold many large databases of information which can be used to predict or analyze criminal movements and criminal activity in society. The criminals can also be predicted based on the crime data. The main aim of this work is to perform a survey on the supervised learning and unsupervised learning techniques that have been applied towards criminal identification. This paper presents a survey on crime analysis and crime prediction using several data mining techniques.
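    Purely to illustrate the two families of techniques the survey covers (a generic sketch with synthetic data; it is not code from any of the surveyed systems, and the features and labels are invented), unsupervised hot-spot clustering and a supervised classifier could look like:

        import numpy as np
        from sklearn.cluster import KMeans
        from sklearn.tree import DecisionTreeClassifier

        rng = np.random.default_rng(0)
        X = rng.random((500, 3))            # hypothetical features: latitude, longitude, hour of day
        y = rng.integers(0, 4, size=500)    # hypothetical crime-category labels

        hotspots = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(X[:, :2])  # unsupervised
        clf = DecisionTreeClassifier(max_depth=4, random_state=0).fit(X, y)               # supervised
        print("incidents per hot spot:", np.bincount(hotspots))
        print("training accuracy:", clf.score(X, y))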

  9. Application of transport phenomena analysis technique to cerebrospinal fluid.

    Science.gov (United States)

    Lam, C H; Hansen, E A; Hall, W A; Hubel, A

    2013-12-01

    The study of hydrocephalus and the modeling of cerebrospinal fluid flow have proceeded in the past using mathematical analysis that was capable of phenomenological prediction but was not well grounded in physiologic parameters. In this paper, the basis of fluid dynamics at the physiologic state is explained starting from established equations of transport phenomena. Then, microscopic and molecular level modeling techniques are described using porous media theory and chemical kinetic theory and applied to cerebrospinal fluid (CSF) dynamics. Using techniques of transport analysis allows the field of cerebrospinal fluid dynamics to approach the level of sophistication of urine and blood transport. Concepts such as intracellular and intercellular pathways, compartmentalization, and tortuosity are associated with quantifiable parameters that are relevant to the anatomy and physiology of cerebrospinal fluid transport. The engineering field of transport phenomena is rich and steeped in architectural, aeronautical, nautical, and more recently biological history. This paper summarizes and reviews the approaches that have been taken in the field of engineering and applies them to CSF flow.
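    As one concrete example of the porous-media formalism referred to above, bulk flow through a permeable medium is often described by Darcy's law, Q = -(kA/μ)(dP/dx). The sketch below uses illustrative parameter values, not physiological CSF data:

        # Darcy's law sketch: volumetric flow Q = -(k * A / mu) * dP/dx.
        # All parameter values are illustrative placeholders.
        k = 1.0e-14      # permeability, m^2
        A = 1.0e-4       # cross-sectional area, m^2
        mu = 0.7e-3      # dynamic viscosity, Pa*s
        dP = -50.0       # pressure difference, Pa (negative gradient drives forward flow)
        dx = 1.0e-2      # path length, m

        Q = -(k * A / mu) * (dP / dx)   # m^3/s
        print(f"volumetric flow: {Q:.3e} m^3/s")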

  10. Hospitals Productivity Measurement Using Data Envelopment Analysis Technique.

    Science.gov (United States)

    Torabipour, Amin; Najarzadeh, Maryam; Arab, Mohammad; Farzianpour, Freshteh; Ghasemzadeh, Roya

    2014-11-01

    This study aimed to measure hospital productivity using the data envelopment analysis (DEA) technique and Malmquist indices. This is a cross-sectional study in which panel data were used over a 4-year period from 2007 to 2010. The research was implemented in 12 teaching and non-teaching hospitals of Ahvaz County. The data envelopment analysis technique and the Malmquist indices with an input-orientation approach were used to analyze the data and estimate productivity. Data were analyzed using the SPSS.18 and DEAP.2 software. Six hospitals (50%) had a value lower than 1, which represents an increase in total productivity, and the other hospitals were non-productive. The average total productivity factor (TPF) was 1.024 for all hospitals, which represents a decrease in efficiency of 2.4% from 2007 to 2010. The average technical, technological, scale and managerial efficiency changes were 0.989, 1.008, 1.028, and 0.996, respectively. There was not a significant difference in mean productivity changes between teaching and non-teaching hospitals (P>0.05) (except in 2009). The productivity rate of the hospitals generally had an increasing trend. However, the total average productivity decreased in the hospitals. Besides, among the several components of total productivity, variation in technological efficiency had the greatest impact on the reduction of the total average productivity.
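    For readers unfamiliar with the DEA step that underlies the Malmquist indices, the input-oriented efficiency score of a single decision-making unit can be obtained from a small linear program. The sketch below is a generic constant-returns (CCR) formulation solved with SciPy on invented data, not the hospital panel analyzed in the study:

        import numpy as np
        from scipy.optimize import linprog

        def dea_ccr_input(X, Y, o):
            """Input-oriented CRS (CCR) DEA efficiency of unit o.
            X: (m inputs x n units), Y: (s outputs x n units)."""
            m, n = X.shape
            s, _ = Y.shape
            c = np.zeros(n + 1)
            c[0] = 1.0                                   # minimize theta
            A_in = np.hstack([-X[:, [o]], X])            # sum_j lam_j * x_ij <= theta * x_io
            A_out = np.hstack([np.zeros((s, 1)), -Y])    # sum_j lam_j * y_rj >= y_ro
            A_ub = np.vstack([A_in, A_out])
            b_ub = np.concatenate([np.zeros(m), -Y[:, o]])
            bounds = [(None, None)] + [(0, None)] * n    # theta free, lambda >= 0
            res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
            return res.x[0]

        # Illustrative data: 2 inputs (beds, staff) and 1 output (admissions) for 4 hospitals
        X = np.array([[100., 120.,  90., 150.],
                      [300., 280., 250., 400.]])
        Y = np.array([[5000., 5200., 4800., 5100.]])
        for o in range(X.shape[1]):
            print(f"hospital {o}: efficiency = {dea_ccr_input(X, Y, o):.3f}")

    The Malmquist index is then formed from ratios of such efficiency scores evaluated against the frontiers of consecutive periods.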

  11. Maintenance Audit through Value Analysis Technique: A Case Study

    Science.gov (United States)

    Carnero, M. C.; Delgado, S.

    2008-11-01

    The increase in competitiveness, technological changes and the increase in the requirements of quality and service have forced a change in the design and application of maintenance, as well as in the way in which it is considered within the managerial strategy. There are numerous maintenance activities that must be developed in a service company; as a result, the maintenance function as a whole often has to be outsourced. Nevertheless, delegating this subject to specialized personnel does not exempt the company from responsibilities, but rather leads to the need for control of each maintenance activity. In order to achieve this control and to evaluate the efficiency and effectiveness of the company, it is essential to carry out an audit that diagnoses the problems that could develop. In this paper a maintenance audit applied to a service company is developed. The methodology applied is based on expert systems. The expert system, by means of rules, uses the SMART weighting technique and value analysis to obtain the weighting between the decision functions and between the alternatives. The expert system applies numerous rules and relations between different variables associated with the specific maintenance functions to obtain the maintenance state by sections and the general maintenance state of the enterprise. The contributions of this paper relate to the development of a maintenance audit in a service enterprise, in which maintenance is not generally considered a strategic subject, and to the integration of decision-making tools such as the SMART weighting technique with value analysis techniques, typical in the design of new products, in the area of rule-based expert systems.
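    To make the SMART weighting step concrete (a generic sketch with hypothetical criteria, weights and scores, not the rule base of the audit described here), raw criterion weights are normalized and combined with the alternatives' scores into a weighted sum:

        # SMART-style weighted scoring: normalize raw criterion weights, then rank
        # alternatives by their weighted sum of scores (0-10 scale).
        # Criteria, weights and scores are hypothetical examples.
        raw_weights = {"preventive maintenance": 90, "spare parts management": 60, "documentation": 30}
        total = sum(raw_weights.values())
        weights = {c: w / total for c, w in raw_weights.items()}

        scores = {
            "section A": {"preventive maintenance": 7, "spare parts management": 5, "documentation": 9},
            "section B": {"preventive maintenance": 4, "spare parts management": 8, "documentation": 6},
        }
        for section, s in scores.items():
            value = sum(weights[c] * s[c] for c in weights)
            print(f"{section}: weighted value = {value:.2f}")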

  12. Robot-assisted repair of tricuspid leaflet prolapse using standard valvuloplasty techniques.

    Science.gov (United States)

    Seder, Christopher W; Suri, Rakesh M; Rehfeldt, Kent; Pislaru, Sorin; Burkhart, Harold M

    2012-11-01

    While minimally invasive approaches are used routinely to correct severe mitral regurgitation due to leaflet prolapse, isolated tricuspid valve prolapse is less frequent and usually addressed via sternotomy. A 34-year-old female presented with exertional dyspnea and severe tricuspid regurgitation due to an unsupported anterior leaflet causing prolapse, a tethered septal leaflet, and a dilated annulus. Herein, the technique of a robot-assisted tricuspid valve repair using established open valvuloplasty principles is described. The robotic repair was performed by the placement of Gore-Tex neochordae from the anterior papillary muscle to the anterior tricuspid leaflet, plication of the anteroseptal and anteroposterior commissures, closure of an anterior leaflet cleft, and the insertion of an annuloplasty band. The patient had an uncomplicated hospital course and was dismissed home on the third postoperative day.

  13. Model-Based Data Integration and Process Standardization Techniques for Fault Management: A Feasibility Study

    Science.gov (United States)

    Haste, Deepak; Ghoshal, Sudipto; Johnson, Stephen B.; Moore, Craig

    2018-01-01

    This paper describes the theory and considerations in the application of model-based techniques to assimilate information from disjoint knowledge sources for performing NASA's Fault Management (FM)-related activities using the TEAMS® toolset. FM consists of the operational mitigation of existing and impending spacecraft failures. NASA's FM directives have both design-phase and operational-phase goals. This paper highlights recent studies by QSI and DST of the capabilities required in the TEAMS® toolset for conducting FM activities with the aim of reducing operating costs, increasing autonomy, and conforming to time schedules. These studies use and extend the analytic capabilities of QSI's TEAMS® toolset to conduct a range of FM activities within a centralized platform.

  14. Standard practice for evaluation of hydrogen uptake, permeation, and transport in metals by an electrochemical technique

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    1997-01-01

    1.1 This practice gives a procedure for the evaluation of hydrogen uptake, permeation, and transport in metals using an electrochemical technique which was developed by Devanathan and Stachurski. While this practice is primarily intended for laboratory use, such measurements have been conducted in field or plant applications. Therefore, with proper adaptations, this practice can also be applied to such situations. 1.2 This practice describes calculation of an effective diffusivity of hydrogen atoms in a metal and for distinguishing reversible and irreversible trapping. 1.3 This practice specifies the method for evaluating hydrogen uptake in metals based on the steady-state hydrogen flux. 1.4 This practice gives guidance on preparation of specimens, control and monitoring of the environmental variables, test procedures, and possible analyses of results. 1.5 This practice can be applied in principle to all metals and alloys which have a high solubility for hydrogen, and for which the hydrogen permeation is ...
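    As a worked illustration of the kind of evaluation this practice covers, the effective diffusivity is commonly derived from the permeation transient with the time-lag relation D_eff = L^2 / (6 t_lag), and the steady-state atomic hydrogen flux from the steady-state current as J_ss = I_ss / (F A). The numbers in the sketch below are invented placeholders, not data from the standard:

        # Time-lag evaluation of a hydrogen permeation transient (illustrative values).
        F = 96485.0      # Faraday constant, C/mol
        L = 1.0e-3       # specimen thickness, m
        A = 1.0e-4       # exposed area, m^2
        I_ss = 5.0e-6    # steady-state permeation current, A
        t_lag = 120.0    # time to reach 63% of the steady-state flux, s

        D_eff = L**2 / (6.0 * t_lag)   # effective diffusivity, m^2/s
        J_ss = I_ss / (F * A)          # steady-state hydrogen flux, mol/(m^2*s)
        print(f"D_eff = {D_eff:.3e} m^2/s, J_ss = {J_ss:.3e} mol/(m^2*s)")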

  15. Pyrite: A blender plugin for visualizing molecular dynamics simulations using industry-standard rendering techniques.

    Science.gov (United States)

    Rajendiran, Nivedita; Durrant, Jacob D

    2018-05-05

    Molecular dynamics (MD) simulations provide critical insights into many biological mechanisms. Programs such as VMD, Chimera, and PyMOL can produce impressive simulation visualizations, but they lack many advanced rendering algorithms common in the film and video-game industries. In contrast, the modeling program Blender includes such algorithms but cannot import MD-simulation data. MD trajectories often require many gigabytes of memory/disk space, complicating Blender import. We present Pyrite, a Blender plugin that overcomes these limitations. Pyrite allows researchers to visualize MD simulations within Blender, with full access to Blender's cutting-edge rendering techniques. We expect Pyrite-generated images to appeal to students and non-specialists alike. A copy of the plugin is available at http://durrantlab.com/pyrite/, released under the terms of the GNU General Public License Version 3. © 2017 Wiley Periodicals, Inc.

  16. Analysis of scintigrams by singular value decomposition (SVD) technique

    Energy Technology Data Exchange (ETDEWEB)

    Savolainen, S.E.; Liewendahl, B.K. (Helsinki Univ. (Finland). Dept. of Physics)

    1994-05-01

    The singular value decomposition (SVD) method is presented as a potential tool for analyzing gamma camera images. Mathematically, image analysis is a study of matrices, as the standard scintigram is a digitized matrix presentation of the recorded photon fluence from the radioactivity of the object. Each matrix element (pixel) consists of a number, which equals the detected counts at the object position. The analysis of images can be reduced to the analysis of the singular values of the matrix decomposition. In the present study the clinical usefulness of SVD was tested by analyzing two different kinds of scintigrams: brain images by single photon emission tomography (SPET), and planar liver and spleen images. It is concluded that SVD can be applied to the analysis of gamma camera images, and that it provides an objective method for interpretation of the clinically relevant information contained in the images. In image filtering, SVD provides results comparable to conventional filtering. In addition, the study of singular values can be used for semiquantitation of radionuclide images, as exemplified by brain SPET studies and liver-spleen planar studies. (author).
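    A minimal sketch of the SVD filtering idea (on a synthetic count matrix, not scintigraphic data): the image is decomposed and then reconstructed from its largest singular values only, discarding the components dominated by noise.

        import numpy as np

        rng = np.random.default_rng(1)
        image = rng.poisson(lam=50, size=(64, 64)).astype(float)   # synthetic count matrix

        U, s, Vt = np.linalg.svd(image, full_matrices=False)
        k = 8                                                      # keep the k largest singular values
        filtered = (U[:, :k] * s[:k]) @ Vt[:k, :]

        print("fraction of energy retained:", np.sum(s[:k]**2) / np.sum(s**2))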

  17. A comparison of approximation techniques for variance-based sensitivity analysis of biochemical reaction systems

    Directory of Open Access Journals (Sweden)

    Goutsias John

    2010-05-01

    Full Text Available Abstract Background Sensitivity analysis is an indispensable tool for the analysis of complex systems. In a recent paper, we have introduced a thermodynamically consistent variance-based sensitivity analysis approach for studying the robustness and fragility properties of biochemical reaction systems under uncertainty in the standard chemical potentials of the activated complexes of the reactions and the standard chemical potentials of the molecular species. In that approach, key sensitivity indices were estimated by Monte Carlo sampling, which is computationally very demanding and impractical for large biochemical reaction systems. Computationally efficient algorithms are needed to make variance-based sensitivity analysis applicable to realistic cellular networks, modeled by biochemical reaction systems that consist of a large number of reactions and molecular species. Results We present four techniques, derivative approximation (DA), polynomial approximation (PA), Gauss-Hermite integration (GHI), and orthonormal Hermite approximation (OHA), for analytically approximating the variance-based sensitivity indices associated with a biochemical reaction system. By using a well-known model of the mitogen-activated protein kinase signaling cascade as a case study, we numerically compare the approximation quality of these techniques against traditional Monte Carlo sampling. Our results indicate that, although DA is computationally the most attractive technique, special care should be exercised when using it for sensitivity analysis, since it may only be accurate at low levels of uncertainty. On the other hand, PA, GHI, and OHA are computationally more demanding than DA but can work well at high levels of uncertainty. GHI results in a slightly better accuracy than PA, but it is more difficult to implement. OHA produces the most accurate approximation results and can be implemented in a straightforward manner. It turns out that the computational cost of the
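    For context, the Monte Carlo baseline against which the four approximations are compared typically estimates first-order variance-based (Sobol-type) indices from paired sample matrices. The generic sketch below uses a toy additive function, not the MAPK cascade model of the paper:

        import numpy as np

        def first_order_sobol(f, d, n=20000, seed=0):
            """Monte Carlo estimate of first-order Sobol indices (Saltelli-style scheme)."""
            rng = np.random.default_rng(seed)
            A = rng.random((n, d))
            B = rng.random((n, d))
            fA, fB = f(A), f(B)
            var = np.var(np.concatenate([fA, fB]))
            S = np.empty(d)
            for i in range(d):
                ABi = A.copy()
                ABi[:, i] = B[:, i]                      # replace column i by the B sample
                S[i] = np.mean(fB * (f(ABi) - fA)) / var
            return S

        # Toy test function: y = x0 + 2*x1 + 0.5*x2 with uniform inputs on [0, 1]
        f = lambda X: X[:, 0] + 2.0 * X[:, 1] + 0.5 * X[:, 2]
        print(first_order_sobol(f, d=3))   # roughly [0.19, 0.76, 0.05]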

  18. Evaluation of energy system analysis techniques for identifying underground facilities

    Energy Technology Data Exchange (ETDEWEB)

    VanKuiken, J.C.; Kavicky, J.A.; Portante, E.C. [and others]

    1996-03-01

    This report describes the results of a study to determine the feasibility and potential usefulness of applying energy system analysis techniques to help detect and characterize underground facilities that could be used for clandestine activities. Four off-the-shelf energy system modeling tools were considered: (1) ENPEP (Energy and Power Evaluation Program) - a total energy system supply/demand model, (2) ICARUS (Investigation of Costs and Reliability in Utility Systems) - an electric utility system dispatching (or production cost and reliability) model, (3) SMN (Spot Market Network) - an aggregate electric power transmission network model, and (4) PECO/LF (Philadelphia Electric Company/Load Flow) - a detailed electricity load flow model. For the purposes of most of this work, underground facilities were assumed to consume about 500 kW to 3 MW of electricity. For some of the work, facilities as large as 10-20 MW were considered. The analysis of each model was conducted in three stages: data evaluation, base-case analysis, and comparative case analysis. For ENPEP and ICARUS, open source data from Pakistan were used for the evaluations. For SMN and PECO/LF, the country data were not readily available, so data for the state of Arizona were used to test the general concept.

  19. Determination of 25 elements in biological standard reference materials by neutron activation analysis

    International Nuclear Information System (INIS)

    Guzzi, G.; Pietra, R.; Sabbioni, E.

    1974-12-01

    The Standard and Certified Reference Materials programme of the JRC includes the determination of trace elements in complex biological samples delivered by the U.S. National Bureau of Standards: Bovine Liver (NBS SRM 1577), Orchard Leaves (NBS SRM 1571) and Tomato Leaves. The study has been performed by the use of neutron activation analysis. Due to the very low concentration of some elements, radiochemical group or elemental separation procedures were necessary. The paper describes the techniques used to analyse 25 elements. Computer-assisted instrumental neutron activation analysis with high resolution Ge(Li) spectrometry was considerably advantageous in the determination of Na, K, Cl, Mn, Fe, Rb and Co and in some cases of Ca, Zn, Cs, Sc, and Cr. For low contents of Ca, Mg, Ni and Si, special chemical separation schemes followed by Cerenkov counting have been developed. Two other separation procedures allowing the determination of As, Cd, Ga, Hg, Mo, Cu, Sr, Se, Ba and P have been set up. The first, simplified, procedure involves the use of high resolution Ge(Li) detectors; the second, more complete, procedure involves a larger number of shorter measurements performed by simpler and more sensitive techniques, such as NaI(Tl) scintillation spectrometry and Cerenkov counting. The results obtained are presented and discussed.

  20. The Trends and Prospects of Health Information Standards: Standardization Analysis and Suggestions

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Chang Soo [Dept. of Radiological Science, College of Health Science, Catholic University of Pusan, Pusan (Korea, Republic of)

    2008-03-15

    The ubiquitous health care system, which is one of the developing solution technologies of IT, BT and NT, could provide new medical environments in the future. Implementing health information systems can be complex, expensive and frustrating. Healthcare professionals seeking to acquire or upgrade systems do not have a convenient, reliable way of specifying a level of adherence to communication standards sufficient to achieve truly efficient interoperability. Great progress has been made in establishing such standards: DICOM, IHE and HL7, notably, are now highly advanced. IHE has defined a common framework to deliver the basic interoperability needed for local and regional health information networks. It has developed a foundational set of standards-based integration profiles for information exchange with three interrelated efforts. HL7 is one of several ANSI-accredited Standards Developing Organizations operating in the healthcare arena. Most SDOs produce standards (protocols) for a particular healthcare domain such as pharmacy, medical devices, imaging or insurance transactions. HL7's domain is clinical and administrative data. HL7 is an international community of healthcare subject matter experts and information scientists collaborating to create standards for the exchange, management and integration of electronic healthcare information. The ASTM specification for the Continuity of Care Record was developed by subcommittee E31.28 on electronic health records, which includes clinicians, provider institutions, administrators, patient advocates, vendors, and the health industry. In this paper, suggestions are made to provide a test bed, demonstration and specification of how standards such as IHE, HL7 and ASTM can be used to provide an integrated environment.

  1. Recanalização da artéria femoral superficial com stents Zilver: técnica padronizada e análise retrospectiva de 3 anos Superficial femoral artery recanalization with Zilver stents: standard technique and 3-year retrospective analysis

    Directory of Open Access Journals (Sweden)

    Marcelo Ferreira

    2006-12-01

    Full Text Available OBJECTIVES: To describe the endovascular recanalization technique of the superficial femoral artery and perform a 3-year retrospective analysis of the technique. METHODS: Retrospective analysis of the patients treated between 2001 and 2004, with the aim of obtaining the patency rates of the recanalizations. The sample consisted of 79 recanalized superficial femoral arteries in 61 patients, exclusively using the described technique and the same self-expanding nitinol stent model (Zilver, COOK). RESULTS: Of the 61 patients, 8% had critical lower limb ischemia and 92% had disabling claudication refractory to clinical treatment. Clinical improvement was observed and reported by the patients in direct relation to the patency of the recanalizations. Statistical analysis showed cumulative assisted primary patency rates of 98, 91 and 84% at 12, 24 and 37 months, respectively. Patency rates, understood as continued flow in the recanalizations, were 96, 93 and 93% at 12, 24 and 37 months, respectively. CONCLUSIONS: We consider the superficial femoral artery recanalization technique a minimally invasive method, with low complication rates and considerable anatomical success and patency rates, which together provide satisfaction and quality of life to patients with peripheral arterial obstructive disease.

  2. BATMAN: Bayesian Technique for Multi-image Analysis

    Science.gov (United States)

    Casado, J.; Ascasibar, Y.; García-Benito, R.; Guidi, G.; Choudhury, O. S.; Bellocchi, E.; Sánchez, S. F.; Díaz, A. I.

    2017-04-01

    This paper describes the Bayesian Technique for Multi-image Analysis (BATMAN), a novel image-segmentation technique based on Bayesian statistics that characterizes any astronomical data set containing spatial information and performs a tessellation based on the measurements and errors provided as input. The algorithm iteratively merges spatial elements as long as they are statistically consistent with carrying the same information (i.e. identical signal within the errors). We illustrate its operation and performance with a set of test cases including both synthetic and real integral-field spectroscopic data. The output segmentations adapt to the underlying spatial structure, regardless of its morphology and/or the statistical properties of the noise. The quality of the recovered signal represents an improvement with respect to the input, especially in regions with low signal-to-noise ratio. However, the algorithm may be sensitive to small-scale random fluctuations, and its performance in the presence of spatial gradients is limited. Due to these effects, errors may be underestimated by as much as a factor of 2. Our analysis reveals that the algorithm prioritizes conservation of all the statistically significant information over noise reduction, and that the precise choice of the input data has a crucial impact on the results. Hence, the philosophy of BaTMAn is not to be used as a 'black box' to improve the signal-to-noise ratio, but as a new approach to characterize spatially resolved data prior to its analysis. The source code is publicly available at http://astro.ft.uam.es/SELGIFS/BaTMAn.
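    A minimal sketch of the kind of statistical-consistency test that underlies such merging (a generic two-sigma criterion on a pair of measurements; BATMAN's actual criterion and threshold are defined in the paper and are not reproduced here):

        import numpy as np

        def consistent(signal_a, err_a, signal_b, err_b, n_sigma=2.0):
            """Treat two measurements as carrying the same information if their
            difference is compatible with zero within the combined errors."""
            return abs(signal_a - signal_b) <= n_sigma * np.hypot(err_a, err_b)

        print(consistent(10.2, 0.5, 10.9, 0.6))   # True: difference within errors
        print(consistent(10.2, 0.5, 13.0, 0.6))   # False: statistically distinct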

  3. Two-loop renormalization in the standard model, part II. Renormalization procedures and computational techniques

    Energy Technology Data Exchange (ETDEWEB)

    Actis, S. [Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany); Passarino, G. [Torino Univ. (Italy). Dipt. di Fisica Teorica; INFN, Sezione di Torino (Italy)

    2006-12-15

    In part I general aspects of the renormalization of a spontaneously broken gauge theory have been introduced. Here, in part II, two-loop renormalization is introduced and discussed within the context of the minimal Standard Model. Therefore, this paper deals with the transition between bare parameters and fields to renormalized ones. The full list of one- and two-loop counterterms is shown and it is proven that, by a suitable extension of the formalism already introduced at the one-loop level, two-point functions suffice in renormalizing the model. The problem of overlapping ultraviolet divergencies is analyzed and it is shown that all counterterms are local and of polynomial nature. The original program of 't Hooft and Veltman is at work. Finite parts are written in a way that allows for a fast and reliable numerical integration with all collinear logarithms extracted analytically. Finite renormalization, the transition between renormalized parameters and physical (pseudo-)observables, are discussed in part III where numerical results, e.g. for the complex poles of the unstable gauge bosons, are shown. An attempt is made to define the running of the electromagnetic coupling constant at the two-loop level. (orig.)

  4. Standard Test Methods for Total Normal Emittance of Surfaces Using Inspection-Meter Techniques

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    1971-01-01

    1.1 These test methods cover determination of the total normal emittance (Note 1) of surfaces by means of portable, inspection-meter instruments. Note 1: Total normal emittance (ε_N) is defined as the ratio of the normal radiance of a specimen to that of a blackbody radiator at the same temperature. The equation relating ε_N to wavelength and spectral normal emittance ε_N(λ) is ε_N = [∫_0^∞ ε_N(λ) L_b(λ, T) dλ] / [∫_0^∞ L_b(λ, T) dλ], where: L_b(λ, T) = Planck's blackbody radiation function = c_1 π^{-1} λ^{-5} (e^{c_2/λT} − 1)^{-1}, c_1 = 3.7415 × 10^{-16} W·m^2, c_2 = 1.4388 × 10^{-2} m·K, T = absolute temperature, K, λ = wavelength, m, ∫_0^∞ L_b(λ, T) dλ = σ π^{-1} T^4, and σ = Stefan-Boltzmann constant = 5.66961 × 10^{-8} W·m^{-2}·K^{-4}. 1.2 These test methods are intended for measurements on large surfaces when rapid measurements must be made and where a nondestructive test is desired. They are particularly useful for production control tests. 1.3 The values stated in SI units are to be regarded as standard. No other units of measu...

  5. Watermarking Techniques Using Least Significant Bit Algorithm for Digital Image Security Standard Solution- Based Android

    Directory of Open Access Journals (Sweden)

    Ari Muzakir

    2017-05-01

    Full Text Available The ease of distributing digital images through the internet has positive and negative sides, especially for owners of the original digital image. The positive side of this ease of rapid distribution is that the owner can deploy digital image files to various site addresses around the world. The downside is that, if there is no copyright mark serving as protector of the image, its ownership can very easily be claimed by other parties. Watermarking is one solution to protect the copyright and establish the ownership of the digital image. With digital image watermarking, the copyright of the resulting digital image is protected through the insertion of additional information such as owner information and the authenticity of the digital image. The least significant bit (LSB) method is one algorithm that is simple and easy to understand. The results of the simulations carried out using an Android smartphone show that the LSB watermark cannot be seen by the naked human eye, meaning there is no significant difference between the original image files and the images into which a watermark has been inserted. The resulting image has dimensions of 640x480 with a bit depth of 32 bits. In addition, black box testing was used to determine the ability of the device (smartphone) to process the image using this application.
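    A language-neutral sketch of the LSB embedding idea described above (shown in Python rather than the Android implementation of the paper; the image and watermark bits are placeholders):

        import numpy as np

        def embed_lsb(pixels, bits):
            """Write watermark bits into the least significant bit of the first pixels."""
            flat = pixels.flatten()
            flat[:len(bits)] = (flat[:len(bits)] & 0xFE) | bits   # clear the LSB, then set it
            return flat.reshape(pixels.shape)

        def extract_lsb(pixels, n_bits):
            return pixels.flatten()[:n_bits] & 1

        # Hypothetical 8-bit grayscale image and a short watermark
        image = np.random.default_rng(0).integers(0, 256, size=(4, 4), dtype=np.uint8)
        mark = np.array([1, 0, 1, 1, 0, 0, 1, 0], dtype=np.uint8)
        stego = embed_lsb(image, mark)
        assert np.array_equal(extract_lsb(stego, len(mark)), mark)
        print(np.abs(stego.astype(int) - image.astype(int)).max())   # at most 1 grey level changes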

  6. Salivary Fluoride level in preschool children after toothbrushing with standard and low fluoride content dentifrice, using the transversal dentifrice application technique: pilot study

    Directory of Open Access Journals (Sweden)

    Fabiana Jandre Melo

    2008-01-01

    Full Text Available Objective: To investigate the salivary fluoride concentration in preschool children after toothbrushing with dentifrices containing standard (1100 ppm F/NaF) and low (500 ppm F/NaF) fluoride concentrations, using the transversal technique of placing the product on the toothbrush. Methods: Eight children of both sexes, ranging in age from 4 years and 9 months to 5 years and 6 months, participated in the study. The experiment was divided into two phases with a one-week interval. In the first stage, the children used the standard concentration dentifrice for one week, and in the second, the low concentration product. Samples were collected at the end of each experimental stage, at the following times: before brushing, immediately afterwards, and after 15, 30 and 45 minutes. The fluoride contents were analyzed by the microdiffusion technique. Statistical analysis was done by analysis of variance (ANOVA) and Student's t-test (p<0.05). Results: The salivary fluoride concentration was significantly higher at all times when the standard concentration product was used. The comparison between the fluoride concentration found before brushing and immediately afterwards showed a 6.8-fold increase with the standard dentifrice (0.19 vs. 1.29 μgF/ml) and, with the low concentration product, an increase of 20.5-fold (0.02 vs. 0.41 μgF/ml). Conclusion: Toothbrushing with both products promoted relevant increases in the salivary fluoride concentration; however, longitudinal studies are necessary to verify the clinical result of this measurement.

  7. TU-EF-BRD-02: Indicators and Technique Analysis

    International Nuclear Information System (INIS)

    Carlone, M.

    2015-01-01

    Research related to quality and safety has been a staple of medical physics academic activities for a long time. From very early on, medical physicists have developed new radiation measurement equipment and analysis techniques, created ever increasingly accurate dose calculation models, and have vastly improved imaging, planning, and delivery techniques. These and other areas of interest have improved the quality and safety of radiotherapy for our patients. With the advent of TG-100, quality and safety is an area that will garner even more research interest in the future. As medical physicists pursue quality and safety research in greater numbers, it is worthwhile to consider what actually constitutes research on quality and safety. For example, should the development of algorithms for real-time EPID-based in-vivo dosimetry be defined as “quality and safety” research? How about the clinical implementation of such as system? Surely the application of failure modes and effects analysis to a clinical process would be considered quality and safety research, but is this type of research that should be included in the medical physics peer-reviewed literature? The answers to such questions are of critical importance to set researchers in a direction that will provide the greatest benefit to our field and the patients we serve. The purpose of this symposium is to consider what constitutes research in the arena of quality and safety and differentiate it from other research directions. The key distinction here is developing the tool itself (e.g. algorithms for EPID dosimetry) vs. studying the impact of the tool with some quantitative metric. Only the latter would I call quality and safety research. Issues of ‘basic’ versus ‘applied’ quality and safety research will be covered as well as how the research results should be structured to provide increasing levels of support that a quality and safety intervention is effective and sustainable. Examples from existing

  8. Characterization of the storage pool of the Neutron Standards Laboratory of CIEMAT, using Monte Carlo techniques

    Energy Technology Data Exchange (ETDEWEB)

    Campo B, X.; Mendez V, R.; Embid S, M. [Centro de Investigaciones Energeticas, Medioambientales y Tecnologicas, Av. Complutense 40, 28040 Madrid (Spain); Vega C, H. R. [Universidad Autonoma de Zacatecas, Unidad Academica de Estudios Nucleares, Cipres No. 10, Fracc. La Penuela, 98060 Zacatecas (Mexico); Sanz G, J., E-mail: xandra.campo@ciemat.es [Universidad Nacional de Educacion a Distancia, Escuela Tecnica Superior de Ingenieros Industriales, C. Juan del Rosal 12, 28040 Madrid (Spain)

    2014-08-15

    The Neutron Standards Laboratory of CIEMAT in Spain is a brand new irradiation facility, with {sup 241}Am-Be (185 GBq) and {sup 252}Cf (5 GBq) calibrated neutron sources which are stored in a water pool with a concrete cover. From this storage place an automated system is able to take the selected source and place it in the irradiation position, 4 m over the ground level and in the geometrical center of the Irradiation Room with 9 m (length) x 7.5 m (width) x 8 m (height). For calibration or irradiation purposes, detectors or materials can be placed on a bench, but it is also possible to use the pool (1.0 m x 1.5 m and more than 1.0 m depth) for long-duration irradiations in thermal neutron fields. For this reason it is essential to characterize the pool itself in terms of neutron spectrum. In this document, the main features of this facility are presented and the characterization of the storage pool in terms of neutron fluence rate and neutron spectrum has been carried out using simulations with the MCNPX-2.7.e code. The MCNPX-2.7.e model has been validated using experimental measurements outside the pool (Berthold LB6411). Inside the pool, the fluence rate decreases and the spectrum is thermalized with distance to the {sup 252}Cf source. This source predominates, and the effect of the {sup 241}Am-Be source on these magnitudes only appears at positions closer than 20 cm from it. (author)

  9. Characterization of the storage pool of the Neutron Standards Laboratory of CIEMAT, using Monte Carlo techniques

    International Nuclear Information System (INIS)

    Campo B, X.; Mendez V, R.; Embid S, M.; Vega C, H. R.; Sanz G, J.

    2014-08-01

    The Neutron Standards Laboratory of CIEMAT in Spain is a brand new irradiation facility, with 241 Am-Be (185 GBq) and 252 Cf (5 GBq) calibrated neutron sources which are stored in a water pool with a concrete cover. From this storage place an automated system is able to take the selected source and place it in the irradiation position, 4 m over the ground level and in the geometrical center of the Irradiation Room with 9 m (length) x 7.5 m (width) x 8 m (height). For calibration or irradiation purposes, detectors or materials can be placed on a bench, but it is also possible to use the pool (1.0 m x 1.5 m and more than 1.0 m depth) for long-duration irradiations in thermal neutron fields. For this reason it is essential to characterize the pool itself in terms of neutron spectrum. In this document, the main features of this facility are presented and the characterization of the storage pool in terms of neutron fluence rate and neutron spectrum has been carried out using simulations with the MCNPX-2.7.e code. The MCNPX-2.7.e model has been validated using experimental measurements outside the pool (Berthold LB6411). Inside the pool, the fluence rate decreases and the spectrum is thermalized with distance to the 252 Cf source. This source predominates, and the effect of the 241 Am-Be source on these magnitudes only appears at positions closer than 20 cm from it. (author)

  10. Standardization of the Fricke gel dosimetry method and tridimensional dose evaluation using the magnetic resonance imaging technique

    International Nuclear Information System (INIS)

    Cavinato, Christianne Cobello

    2009-01-01

    This study standardized the method for obtaining the Fricke gel solution developed at IPEN. The results for different gel qualities used in the preparation of the solutions and the influence of the gelatine concentration on the response of the dosimetric solutions were compared. Type tests such as dose response dependence, minimum and maximum detection limits, and response reproducibility, among others, were carried out using different radiation types and the optical absorption (OA) spectrophotometry and magnetic resonance (MR) techniques. The useful dose ranges for 60Co gamma radiation and 6 MeV photons are 0.4 to 30.0 Gy and 0.5 to 100.0 Gy, using the OA and MR techniques, respectively. A study of ferric ion diffusion in solution was performed to determine the optimum time interval between irradiation and sample evaluation; evaluation up to 2.5 hours after irradiation yields sharp MR images. A spherical simulator consisting of Fricke gel solution prepared with 5% by weight 270 Bloom gelatine (national quality) was developed for three-dimensional dose assessment using the magnetic resonance imaging (MRI) technique. The Fricke gel solution prepared with 270 Bloom gelatine, which, in addition to its low cost, can be easily acquired on the national market, presents satisfactory results in terms of ease of handling, sensitivity, response reproducibility and consistency. The results confirm its applicability in three-dimensional dosimetry using the MRI technique. (author)

  11. The 'Pull' Technique for Removal of Peritoneal Dialysis Catheters: A Call for Re-Evaluation of Practice Standards.

    Science.gov (United States)

    Grieff, Marvin; Mamo, Elizabeth; Scroggins, Gina; Kurchin, Alexander

    2017-01-01

    ♦ BACKGROUND: The most commonly used peritoneal dialysis (PD) catheters have silicone tubing with attached Dacron cuffs. The current standard of care for PD catheter removal is complete surgical dissection, withdrawing both the tubing and the cuffs. The intention is to avoid infection of any residual part of the catheter. We retrospectively analyzed our results with the alternative 'pull' technique, by which the silicone tube is pulled out, leaving the Dacron cuffs within the abdominal wall. This technique never gained popularity due to concern that the retained cuffs would become infected. ♦ METHODS: We reviewed our experience from an 18-month period, between January 2014 and June 2015. There were 46 catheter removals in 40 patients. All the catheters were of the double-cuffed coiled Tenckhoff type (Covidien, Dublin, Ireland). ♦ RESULTS: Of the 46 catheter removals by the 'pull' technique, there was only 1 case of retained cuff infection. ♦ CONCLUSIONS: The 'pull' technique is a safe method for Tenckhoff catheter removal with a low risk of infection. We strongly recommend it as the procedure of choice. Copyright © 2017 International Society for Peritoneal Dialysis.

  12. [The standard implantation of a total hip prosthesis via two incisions (the Yale Technique)].

    Science.gov (United States)

    Kipping, Robert

    2009-09-01

    Implantation of a total hip endoprosthesis with minimal trauma to the soft tissue. The need for visual aids (e.g., navigation or X-rays) during the procedure is frequently avoided. All kinds of coxarthrosis for every age group, for every variation of bone construction, and even in obese patients. Extremely dysplastic hip joints involving the development of a secondary socket and the necessity of reconstruction of the acetabular socket (e.g., in the Harris method). Using a fixed lateral position, a small entry incision is made between the tensor fasciae latae and the sartorius muscles and the prosthesis socket is put into place. Via a second dorsal incision, after stripping the exterior rotators, the prosthesis stem and ball are implanted and the two parts of the prosthesis are attached. Full weight bearing is allowed immediately. A luxation prophylaxis, in the form of a self-developed hip bodice (the so-called Yale bandage), is used until the end of the 4th postoperative week. Discharge from hospital is possible after just a few days. Upon discharge, the patient is sent to a rehabilitation facility, either as a resident or as an outpatient, for approximately 3 weeks. Return to the workplace, with only light physical activity, is possible once the wound has healed completely; this could be as soon as 14 days after the operation. Checkups are made after 4 weeks, 6 months, 1 year and then every year; these checkups include a full examination, X-rays and laboratory tests. Full exposure to sport or heavy manual labor is usually approved after the 6-month checkup. Between October 2004 and April 2006, a total of 221 patients underwent surgery using this new technique (of these, 15 patients underwent two-stage bilateral hip joint replacements). Patients were followed up for a minimum of 12 months and a maximum of 30 months. The Harris Hip Score improved from an average of 45.25 preoperatively to 96.4 postoperatively.

  13. Comparing dynamical systems concepts and techniques for biomechanical analysis

    Directory of Open Access Journals (Sweden)

    Richard E.A. van Emmerik

    2016-03-01

    Full Text Available Traditional biomechanical analyses of human movement are generally derived from linear mathematics. While these methods can be useful in many situations, they do not describe behaviors in human systems that are predominately nonlinear. For this reason, nonlinear analysis methods based on a dynamical systems approach have become more prevalent in recent literature. These analysis techniques have provided new insights into how systems (1) maintain pattern stability, (2) transition into new states, and (3) are governed by short- and long-term (fractal) correlational processes at different spatio-temporal scales. These different aspects of system dynamics are typically investigated using concepts related to variability, stability, complexity, and adaptability. The purpose of this paper is to compare and contrast these different concepts and demonstrate that, although related, these terms represent fundamentally different aspects of system dynamics. In particular, we argue that variability should not uniformly be equated with stability or complexity of movement. In addition, current dynamic stability measures based on nonlinear analysis methods (such as the finite maximal Lyapunov exponent) can reveal local instabilities in movement dynamics, but the degree to which these local instabilities relate to global postural and gait stability and the ability to resist external perturbations remains to be explored. Finally, systematic studies are needed to relate observed reductions in complexity with aging and disease to the adaptive capabilities of the movement system and how complexity changes as a function of different task constraints.
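    For readers unfamiliar with the finite maximal Lyapunov exponent mentioned above, a compact and much simplified Rosenstein-style estimate on a delay-embedded time series is sketched below; the embedding parameters, fit window and test signal are illustrative choices, not values recommended by the paper:

        import numpy as np
        from scipy.spatial.distance import cdist

        def max_lyapunov(x, dim=5, tau=10, fs=100.0, k_fit=30):
            """Simplified Rosenstein-style estimate of the maximal Lyapunov exponent.
            Only a sketch: O(n^2) memory, suitable for short time series."""
            n = len(x) - (dim - 1) * tau
            emb = np.column_stack([x[i * tau: i * tau + n] for i in range(dim)])
            dist = cdist(emb, emb)
            for i in range(n):                           # exclude temporally close neighbours
                dist[i, max(0, i - tau): i + tau + 1] = np.inf
            nn = np.argmin(dist, axis=1)                 # nearest neighbour of each state
            div = []
            for k in range(1, k_fit + 1):                # mean log divergence after k steps
                idx = np.arange(n - k)
                ok = nn[idx] + k < n
                d = np.linalg.norm(emb[idx[ok] + k] - emb[nn[idx[ok]] + k], axis=1)
                div.append(np.mean(np.log(d[d > 0])))
            t = np.arange(1, k_fit + 1) / fs
            return np.polyfit(t, div, 1)[0]              # slope = lambda_max, in 1/s

        # Illustrative use on a noisy sine wave standing in for a gait time series
        t = np.arange(0, 30, 0.01)
        signal = np.sin(2 * np.pi * t) + 0.05 * np.random.default_rng(0).standard_normal(len(t))
        print(max_lyapunov(signal))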

  14. Computational techniques for inelastic analysis and numerical experiments

    International Nuclear Information System (INIS)

    Yamada, Y.

    1977-01-01

    A number of formulations have been proposed for inelastic analysis, particularly for the thermal elastic-plastic creep analysis of nuclear reactor components. In the elastic-plastic regime, which principally concerns time-independent behavior, the numerical techniques based on the finite element method have been well exploited and computations have become routine work. With respect to problems in which the time-dependent behavior is significant, it is desirable to incorporate a procedure which works with the mechanical model formulation as well as with the equation-of-state methods proposed so far. A computer program should also take into account the strain-dependent and/or time-dependent microstructural changes which often occur during the operation of structural components at increasingly high temperatures for long periods of time. Special considerations are crucial if the analysis is to be extended to the large strain regime where geometric nonlinearities predominate. The present paper introduces a rational updated formulation and a computer program under development that take into account the various requisites stated above. (Auth.)

  15. Use of decision analysis techniques to determine Hanford cleanup priorities

    International Nuclear Information System (INIS)

    Fassbender, L.; Gregory, R.; Winterfeldt, D. von; John, R.

    1992-01-01

    In January 1991, the U.S. Department of Energy (DOE) Richland Field Office, Westinghouse Hanford Company, and the Pacific Northwest Laboratory initiated the Hanford Integrated Planning Process (HIPP) to ensure that technically sound and publicly acceptable decisions are made that support the environmental cleanup mission at Hanford. One of the HIPP's key roles is to develop an understanding of the science and technology (S and T) requirements to support the cleanup mission. This includes conducting an annual systematic assessment of the S and T needs at Hanford to support a comprehensive technology development program and a complementary scientific research program. Basic to success is a planning and assessment methodology that is defensible from a technical perspective and acceptable to the various Hanford stakeholders. Decision analysis techniques were used to help identify and prioritize problems and S and T needs at Hanford. The approach used structured elicitations to bring many Hanford stakeholders into the process. Decision analysis, which is based on the axioms and methods of utility and probability theory, is especially useful in problems characterized by uncertainties and multiple objectives. Decision analysis addresses uncertainties by laying out a logical sequence of decisions, events, and consequences and by quantifying event and consequence probabilities on the basis of expert judgments

  16. Techniques of DNA methylation analysis with nutritional applications.

    Science.gov (United States)

    Mansego, Maria L; Milagro, Fermín I; Campión, Javier; Martínez, J Alfredo

    2013-01-01

    Epigenetic mechanisms are likely to play an important role in the regulation of metabolism and body weight through gene-nutrient interactions. This review focuses on methods for analyzing one of the most important epigenetic mechanisms, DNA methylation, from single-nucleotide to global measurement, depending on the study goal and scope. In addition, this study highlights the major principles and methods for DNA methylation analysis with emphasis on nutritional applications. Recent developments concerning epigenetic technologies are showing promising results of DNA methylation levels at a single-base resolution and provide the ability to differentiate between 5-methylcytosine and other nucleotide modifications such as 5-hydroxymethylcytosine. A large number of methods can be used for the analysis of DNA methylation, such as pyrosequencing™, primer extension or real-time PCR methods, and genome-wide DNA methylation profiling from microarray or sequencing-based methods. Researchers should conduct a preliminary analysis focused on the type of validation and information provided by each technique in order to select the best method fitting their nutritional research interests. Copyright © 2013 S. Karger AG, Basel.

  17. Electrodeposition as a sample preparation technique for TXRF analysis

    International Nuclear Information System (INIS)

    Griesel, S.; Reus, U.; Prange, A.

    2000-01-01

    TXRF analysis of trace elements at concentrations in the μg/L range and below in high salt matrices normally requires a number of sample preparation steps that include separation of the salt matrix and preconcentration of the trace elements. A neat approach which allows samples to be prepared straightforwardly in a single step involves the application of electrochemical deposition using the TXRF sample support itself as an electrode. For this work a common three-electrode arrangement (Radiometer Analytical) with a rotating disc electrode as the working electrode, as is frequently employed in voltammetric analysis, has been used. A special electrode tip has been constructed as a holder for the sample carrier, which consists of polished glassy carbon. This material has proven suitable for both its electrical and chemical properties. Measurements of the trace elements were performed using the ATOMIKA 8030C TXRF spectrometer, with the option of variable incident angles. In first experiments an artificial sea water matrix containing various trace elements in the μg/L range has been used. Elements such as Cr, Mn, Fe, Co, Ni, Cu, Zn, Ag, Cd, Hg, and Pb were deposited on glassy carbon carriers. The deposition can be optimized by controlling the potential of the working electrode with respect to the reference electrode. Metal ions with a suitable standard potential are reduced to the metallic state and plated onto the electrode surface. When deposition is finished the sample carrier is demounted, rinsed with ultra-pure water and measured directly. Deposition yields for the elements under investigation are quite similar, and with an appropriate choice of the reference element, quantification can be achieved directly by internal standardization. The influence of parameters such as time, pH value, and trace element concentration on the deposition yield has been examined, and the results will be presented along with reproducibility studies. (author)
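    The internal-standardization step mentioned above generally follows the usual TXRF relation between net line intensities and relative sensitivities, C_x = C_IS (N_x / S_x) / (N_IS / S_IS). The numbers in the sketch below are invented placeholders, not measured data from this work:

        # TXRF internal-standard quantification sketch.
        # N: net peak intensity, S: relative sensitivity; all values are illustrative.
        C_IS = 10.0                      # internal standard concentration, ug/L
        N_IS, S_IS = 12000.0, 1.00
        elements = {"Ni": (3500.0, 0.85), "Cu": (5200.0, 1.10), "Pb": (800.0, 0.60)}

        for name, (N, S) in elements.items():
            conc = C_IS * (N / S) / (N_IS / S_IS)
            print(f"{name}: {conc:.2f} ug/L")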

  18. Novel technique for coal pyrolysis and hydrogenation product analysis

    Energy Technology Data Exchange (ETDEWEB)

    Pfefferle, L.D.; Boyle, J.

    1993-03-15

    A microjet reactor coupled to a VUV photoionization time-of-flight mass spectrometer has been used to obtain species measurements during high temperature pyrolysis and oxidation of a wide range of hydrocarbon compounds ranging from allene and acetylene to cyclohexane, benzene and toluene. Initial work focused on calibration of the technique, optimization of ion collection and detection, and characterization of limitations. Using the optimized technique with 118 nm photoionization, intermediate species profiles were obtained for analysis of the hydrocarbon pyrolysis and oxidation mechanisms. The "soft" ionization, yielding predominantly molecular ions, allowed the study of reaction pathways in these high temperature systems where both sampling and detection challenges are severe. Work has focused on the pyrolysis and oxidative pyrolysis of aliphatic and aromatic hydrocarbon mixtures representative of coal pyrolysis and hydropyrolysis products. The detailed mass spectra obtained during pyrolysis and oxidation of hydrocarbon mixtures are especially important because of the complex nature of the product mixture, even at short residence times and low primary reactant conversions. The combustion community has advanced detailed modeling of pyrolysis and oxidation to the C4 hydrocarbon level, but in general above that size uncertainties in rate constant and thermodynamic data do not allow us to predict a priori the products from mixed hydrocarbon pyrolyses using a detailed chemistry model. For the pyrolysis of mixtures of coal-derived liquid fractions with a large range of compound structures and molecular weights in the hundreds of amu, the modeling challenge is severe. Lumped models are possible from stable product data.

  19. Factor Rotation and Standard Errors in Exploratory Factor Analysis

    Science.gov (United States)

    Zhang, Guangjian; Preacher, Kristopher J.

    2015-01-01

    In this article, we report a surprising phenomenon: Oblique CF-varimax and oblique CF-quartimax rotation produced similar point estimates for rotated factor loadings and factor correlations but different standard error estimates in an empirical example. Influences of factor rotation on asymptotic standard errors are investigated using a numerical…

  20. SHOT PUT O’BRIAN TECHNIQUE, EXTENDING THE ANALYSIS OF TECHNIQUE FROM FOUR TO SIX PHASES WITH THE DESCRIPTION

    Directory of Open Access Journals (Sweden)

    Zlatan Saračević

    2011-09-01

    Full Text Available Due to the complexity of the motion, the shot put technique is described in phases for easier analysis, easier learning of the technique and error correction. The description is complete so that in its execution the transition from phase to phase is not noticed. In the aforementioned and described phases of the O'Brian back-facing shot put technique, a large distance, emptiness and disconnection appear between the initial position phase and the phase of overtaking the device, which in the training methods and technique training in primary and secondary education, as well as for students and athletes who are beginners in the shot put, represents a major problem regarding connecting, training and technique advancement. Therefore, this work is aimed at facilitating the methods of training the shot put technique by extending the analysis from four to six phases, which have been described and include the complete O'Brian technique.

  1. Performance of selected imputation techniques for missing variances in meta-analysis

    Science.gov (United States)

    Idris, N. R. N.; Abdullah, M. H.; Tolos, S. M.

    2013-04-01

    A common method of handling the problem of missing variances in meta-analysis of continuous responses is through imputation. However, the performance of imputation techniques may be influenced by the type of model utilised. In this article, we examine through a simulation study the effects of the techniques for imputation of the missing SDs and the type of model used on the overall meta-analysis estimates. The results suggest that imputation should be adopted to estimate the overall effect size, irrespective of the model used. However, the accuracy of the estimates of the corresponding standard error (SE) is influenced by the imputation technique. For estimates based on the fixed effects model, mean imputation provides better estimates than multiple imputation, while those based on the random effects model respond more robustly to the type of imputation technique. The results showed that although imputation is good at reducing the bias in point estimates, it is more likely to produce coverage probability that is higher than the nominal value.
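    To make the imputation-plus-pooling pipeline concrete, the sketch below applies mean imputation to a missing SD and then pools the studies with fixed-effect inverse-variance weights; the data are invented, each effect's variance is simplified to SD^2/n, and the simulation design of the paper is considerably more elaborate:

        import numpy as np

        # Study-level effects, SDs and sample sizes (illustrative; one SD is missing)
        effects = np.array([0.30, 0.10, 0.45, 0.25])
        sds     = np.array([1.2,  np.nan, 1.0, 1.4])
        n       = np.array([40,   55,     30,  60])

        sds = np.where(np.isnan(sds), np.nanmean(sds), sds)   # mean imputation of the missing SD
        var = sds**2 / n                                      # simplified variance of each effect
        w = 1.0 / var                                         # inverse-variance (fixed-effect) weights
        pooled = np.sum(w * effects) / np.sum(w)
        se = np.sqrt(1.0 / np.sum(w))
        print(f"pooled effect = {pooled:.3f} (SE {se:.3f})")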

  2. Analysis techniques for background rejection at the MAJORANA DEMONSTRATOR

    Energy Technology Data Exchange (ETDEWEB)

    Cuesta, C [University of Washington, Seattle; Abgrall, N. [Lawrence Berkeley National Laboratory (LBNL); Arnquist, I. J. [Pacific Northwest National Laboratory (PNNL); Avignone, III, F. T. [University of South Carolina/Oak Ridge National Laboratory (ORNL); Baldenegro-Barrera, C. X. [Oak Ridge National Laboratory (ORNL); Barabash, A.S. [Institute of Theoretical & Experimental Physics (ITEP), Moscow, Russia; Bertrand, F. E. [Oak Ridge National Laboratory (ORNL); Bradley, A. W. [Lawrence Berkeley National Laboratory (LBNL); Brudanin, V. [Joint Institute for Nuclear Research, Dubna, Russia; Busch, M. [Duke University/TUNL; Buuck, M. [University of Washington, Seattle; Byram, D. [University of South Dakota; Caldwell, A. S. [South Dakota School of Mines and Technology; Chan, Y-D [Lawrence Berkeley National Laboratory (LBNL); Christofferson, C. D. [South Dakota School of Mines and Technology; Detwiler, J. A. [University of Washington, Seattle; Efremenko, Yu. [University of Tennessee, Knoxville (UTK); Ejiri, H. [Osaka University, Japan; Elliott, S. R. [Los Alamos National Laboratory (LANL); Galindo-Uribarri, A. [Oak Ridge National Laboratory (ORNL); Gilliss, T. [Univ. North Carolina-Chapel Hill/Triangle Univ. Nucl. Lab., Durham, NC; Giovanetti, G. K. [University of North Carolina / Triangle Universities Nuclear Lababoratory, Durham; Goett, J [Los Alamos National Laboratory (LANL); Green, M. P. [Oak Ridge National Laboratory (ORNL); Gruszko, J [University of Washington, Seattle; Guinn, I S [University of Washington, Seattle; Guiseppe, V E [University of South Carolina, Columbia; Henning, R. [University of North Carolina / Triangle Universities Nuclear Lababoratory, Durham; Hoppe, E.W. [Pacific Northwest National Laboratory (PNNL); Howard, S. [South Dakota School of Mines and Technology; Howe, M. A. [University of North Carolina / Triangle Universities Nuclear Lababoratory, Durham; Jasinski, B R [University of South Dakota; Keeter, K.J. [Black Hills State University, Spearfish, South Dakota; Kidd, M. F. [Tennessee Technological University (TTU); Konovalov, S.I. [Institute of Theoretical & Experimental Physics (ITEP), Moscow, Russia; Kouzes, R. T. [Pacific Northwest National Laboratory (PNNL); LaFerriere, B. D. [Pacific Northwest National Laboratory (PNNL); Leon, J. [University of Washington, Seattle; MacMullin, J. [University of North Carolina / Triangle Universities Nuclear Lababoratory, Durham; Martin, R. D. [University of South Dakota; Meijer, S. J. [University of North Carolina / Triangle Universities Nuclear Lababoratory, Durham; Mertens, S. [Lawrence Berkeley National Laboratory (LBNL); Orrell, J. L. [Pacific Northwest National Laboratory (PNNL); O' Shaughnessy, C. [Univ. North Carolina-Chapel Hill/Triangle Univ. Nucl. Lab., Durham, NC; Poon, A.W.P. [Lawrence Berkeley National Laboratory (LBNL); Radford, D. C. [Oak Ridge National Laboratory (ORNL); Rager, J. [Univ. North Carolina-Chapel Hill/Triangle Univ. Nucl. Lab., Durham, NC; Rielage, K. [Los Alamos National Laboratory (LANL); Robertson, R.G.H. [University of Washington, Seattle; Romero-Romero, E. [University of Tennessee, Knoxville, (UTK)/Oak Ridge National Lab (ORNL); Shanks, B. [Univ. North Carolina-Chapel Hill/Triangle Univ. Nucl. Lab., Durham, NC; Shirchenko, M. [Joint Institute for Nuclear Research, Dubna, Russia; Snyder, N [University of South Dakota; Suriano, A. M. [South Dakota School of Mines and Technology; Tedeschi, D [University of South Carolina, Columbia; Trimble, J. E. [Univ. North Carolina-Chapel Hill/Triangle Univ. 
Nucl. Lab., Durham, NC; Varner, R. L. [Oak Ridge National Laboratory (ORNL); Vasilyev, S. [Joint Institute for Nuclear Research, Dubna, Russia; Vetter, K. [University of California/Lawrence Berkeley National Laboratory (LBNL); et al.

    2015-01-01

    The MAJORANA Collaboration is constructing the MAJORANA DEMONSTRATOR, an ultra-low background, 40-kg modular HPGe detector array to search for neutrinoless double beta decay in Ge-76. In view of the next generation of tonne-scale Ge-based 0νββ-decay searches that will probe the neutrino mass scale in the inverted-hierarchy region, a major goal of the MAJORANA DEMONSTRATOR is to demonstrate a path forward to achieving a background rate at or below 1 count/tonne/year in the 4 keV region of interest around the Q-value at 2039 keV. The background rejection techniques to be applied to the data include cuts based on data reduction, pulse shape analysis, event coincidences, and time correlations. The Point Contact design of the DEMONSTRATOR's germanium detectors allows for significant reduction of gamma background.

  3. Evaluation of tritium analysis techniques for a continuous tritium monitor

    International Nuclear Information System (INIS)

    Fernandez, S.J.; Girton, R.C.

    1978-04-01

    Present methods for tritium monitoring are evaluated and a program is proposed to modify the existing methods or develop new instrumentation to establish a state-of-the-art monitoring capability for nuclear fuel reprocessing plants. The capabilities, advantages, and disadvantages of the most popular counting and separation techniques are described. The following criteria were used to evaluate present methods: specificity, selectivity, precision, insensitivity to gamma radiation, and economy. A novel approach is explored to continuously separate the tritium from a complex mixture of stack gases. This approach, based on the different permeabilities of the stack gas constituents, is integrated into a complete monitoring system. This monitoring system is designed to perform real-time tritium analysis. A schedule is presented for development and demonstration of the completed system.

  4. New approaches in intelligent image analysis techniques, methodologies and applications

    CERN Document Server

    Nakamatsu, Kazumi

    2016-01-01

    This book presents an Introduction and 11 independent chapters, which are devoted to various new approaches to intelligent image processing and analysis. The book also presents new methods, algorithms and applied systems for intelligent image processing, on the following basic topics: Methods for Hierarchical Image Decomposition; Intelligent Digital Signal Processing and Feature Extraction; Data Clustering and Visualization via Echo State Networks; Clustering of Natural Images in Automatic Image Annotation Systems; Control System for Remote Sensing Image Processing; Tissue Segmentation of MR Brain Images Sequence; Kidney Cysts Segmentation in CT Images; Audio Visual Attention Models in Mobile Robots Navigation; Local Adaptive Image Processing; Learning Techniques for Intelligent Access Control; Resolution Improvement in Acoustic Maps. Each chapter is self-contained with its own references. Some of the chapters are devoted to the theoretical aspects while the others present the practical aspects and the...

  5. Advances in zymography techniques and patents regarding protease analysis.

    Science.gov (United States)

    Wilkesman, Jeff; Kurz, Liliana

    2012-08-01

    Detection of enzymatic activity on gel electrophoresis, namely zymography, is a technique that has received increasing attention in the last 10 years, according to the number of articles published. A growing number of enzymes, mainly proteases, are now routinely detected by zymography. Detailed analytical studies are beginning to be published, and new patents have been developed. This article updates the information covered in our last review, condensing the recent publications dealing with the identification of proteolytic enzymes in electrophoretic gel supports and the variations of the method. The new advances of this method are basically focused on two-dimensional zymography and transfer zymography. Though comparatively fewer patents have been published, they basically coincide in the study of matrix metalloproteases. The field is foreseen to be very productive in the area of zymoproteomics, combining electrophoresis and mass spectrometry for the analysis of proteases.

  6. ANALYSIS OF ANDROID VULNERABILITIES AND MODERN EXPLOITATION TECHNIQUES

    Directory of Open Access Journals (Sweden)

    Himanshu Shewale

    2014-03-01

    Full Text Available Android is an operating system based on the Linux kernel. It is the most widely used and popular operating system among smartphones and portable devices. Its programmable and open nature attracts attackers to take undue advantage. The Android platform allows developers to freely access and modify source code, but at the same time this increases security risks. A user is likely to download and install malicious applications written by software hackers. This paper focuses on understanding and analyzing the vulnerabilities present in the Android platform. We first study the Android architecture and analyze existing threats and security weaknesses. We then identify various techniques to mitigate known vulnerabilities. A detailed analysis helps to identify the existing loopholes and gives strategic direction to make the Android operating system more secure.

  7. Standardization of sample collection, isolation and analysis methods in extracellular vesicle research

    Directory of Open Access Journals (Sweden)

    Kenneth W. Witwer

    2013-05-01

    Full Text Available The emergence of publications on extracellular RNA (exRNA and extracellular vesicles (EV has highlighted the potential of these molecules and vehicles as biomarkers of disease and therapeutic targets. These findings have created a paradigm shift, most prominently in the field of oncology, prompting expanded interest in the field and dedication of funds for EV research. At the same time, understanding of EV subtypes, biogenesis, cargo and mechanisms of shuttling remains incomplete. The techniques that can be harnessed to address the many gaps in our current knowledge were the subject of a special workshop of the International Society for Extracellular Vesicles (ISEV in New York City in October 2012. As part of the “ISEV Research Seminar: Analysis and Function of RNA in Extracellular Vesicles (evRNA”, 6 round-table discussions were held to provide an evidence-based framework for isolation and analysis of EV, purification and analysis of associated RNA molecules, and molecular engineering of EV for therapeutic intervention. This article arises from the discussion of EV isolation and analysis at that meeting. The conclusions of the round table are supplemented with a review of published materials and our experience. Controversies and outstanding questions are identified that may inform future research and funding priorities. While we emphasize the need for standardization of specimen handling, appropriate normative controls, and isolation and analysis techniques to facilitate comparison of results, we also recognize that continual development and evaluation of techniques will be necessary as new knowledge is amassed. On many points, consensus has not yet been achieved and must be built through the reporting of well-controlled experiments.

  8. Standardization of sample collection, isolation and analysis methods in extracellular vesicle research

    Science.gov (United States)

    Witwer, Kenneth W.; Buzás, Edit I.; Bemis, Lynne T.; Bora, Adriana; Lässer, Cecilia; Lötvall, Jan; Nolte-‘t Hoen, Esther N.; Piper, Melissa G.; Sivaraman, Sarada; Skog, Johan; Théry, Clotilde; Wauben, Marca H.; Hochberg, Fred

    2013-01-01

    The emergence of publications on extracellular RNA (exRNA) and extracellular vesicles (EV) has highlighted the potential of these molecules and vehicles as biomarkers of disease and therapeutic targets. These findings have created a paradigm shift, most prominently in the field of oncology, prompting expanded interest in the field and dedication of funds for EV research. At the same time, understanding of EV subtypes, biogenesis, cargo and mechanisms of shuttling remains incomplete. The techniques that can be harnessed to address the many gaps in our current knowledge were the subject of a special workshop of the International Society for Extracellular Vesicles (ISEV) in New York City in October 2012. As part of the “ISEV Research Seminar: Analysis and Function of RNA in Extracellular Vesicles (evRNA)”, 6 round-table discussions were held to provide an evidence-based framework for isolation and analysis of EV, purification and analysis of associated RNA molecules, and molecular engineering of EV for therapeutic intervention. This article arises from the discussion of EV isolation and analysis at that meeting. The conclusions of the round table are supplemented with a review of published materials and our experience. Controversies and outstanding questions are identified that may inform future research and funding priorities. While we emphasize the need for standardization of specimen handling, appropriate normative controls, and isolation and analysis techniques to facilitate comparison of results, we also recognize that continual development and evaluation of techniques will be necessary as new knowledge is amassed. On many points, consensus has not yet been achieved and must be built through the reporting of well-controlled experiments. PMID:24009894

  9. The Heliospheric Cataloguing, Analysis and Techniques Service (HELCATS) project

    Science.gov (United States)

    Barnes, D.; Harrison, R. A.; Davies, J. A.; Perry, C. H.; Moestl, C.; Rouillard, A.; Bothmer, V.; Rodriguez, L.; Eastwood, J. P.; Kilpua, E.; Gallagher, P.; Odstrcil, D.

    2017-12-01

    Understanding solar wind evolution is fundamental to advancing our knowledge of energy and mass transport in the solar system, whilst also being crucial to space weather and its prediction. The advent of truly wide-angle heliospheric imaging has revolutionised the study of solar wind evolution, by enabling direct and continuous observation of both transient and background components of the solar wind as they propagate from the Sun to 1 AU and beyond. The recently completed, EU-funded FP7 Heliospheric Cataloguing, Analysis and Techniques Service (HELCATS) project (1st May 2014 - 30th April 2017) combined European expertise in heliospheric imaging, built up over the last decade in particular through leadership of the Heliospheric Imager (HI) instruments aboard NASA's STEREO mission, with expertise in solar and coronal imaging as well as the interpretation of in-situ and radio diagnostic measurements of solar wind phenomena. HELCATS involved: (1) the cataloguing of transient (coronal mass ejections) and background (stream/corotating interaction regions) solar wind structures observed by the STEREO/HI instruments, including estimates of their kinematic properties based on a variety of modelling techniques; (2) the verification of these kinematic properties through comparison with solar source observations and in-situ measurements at multiple points throughout the heliosphere; (3) the assessment of the potential for initialising numerical models based on the derived kinematic properties of transient and background solar wind components; and (4) the assessment of the complementarity of radio observations (Type II radio bursts and interplanetary scintillation) in the detection and analysis of heliospheric structure in combination with heliospheric imaging observations. In this presentation, we provide an overview of the HELCATS project emphasising, in particular, the principal achievements and legacy of this unprecedented project.

  10. Acoustical Characteristics of Mastication Sounds: Application of Speech Analysis Techniques

    Science.gov (United States)

    Brochetti, Denise

    Food scientists have used acoustical methods to study characteristics of mastication sounds in relation to food texture. However, a model for analysis of the sounds has not been identified, and reliability of the methods has not been reported. Therefore, speech analysis techniques were applied to mastication sounds, and variation in measures of the sounds was examined. To meet these objectives, two experiments were conducted. In the first experiment, a digital sound spectrograph generated waveforms and wideband spectrograms of sounds by 3 adult subjects (1 male, 2 females) for initial chews of food samples differing in hardness and fracturability. Acoustical characteristics were described and compared. For all sounds, formants appeared in the spectrograms, and energy occurred across a 0 to 8000-Hz range of frequencies. Bursts characterized waveforms for peanut, almond, raw carrot, ginger snap, and hard candy. Duration and amplitude of the sounds varied with the subjects. In the second experiment, the spectrograph was used to measure the duration, amplitude, and formants of sounds for the initial 2 chews of cylindrical food samples (raw carrot, teething toast) differing in diameter (1.27, 1.90, 2.54 cm). Six adult subjects (3 males, 3 females) having normal occlusions and temporomandibular joints chewed the samples between the molar teeth and with the mouth open. Ten repetitions per subject were examined for each food sample. Analysis of estimates of variation indicated an inconsistent intrasubject variation in the acoustical measures. Food type and sample diameter also affected the estimates, indicating the variable nature of mastication. Generally, intrasubject variation was greater than intersubject variation. Analysis of ranks of the data indicated that the effect of sample diameter on the acoustical measures was inconsistent and depended on the subject and type of food. If inferences are to be made concerning food texture from acoustical measures of mastication

  11. Multiplex Ligation-Dependent Probe Amplification Technique for Copy Number Analysis on Small Amounts of DNA Material

    DEFF Research Database (Denmark)

    Sørensen, Karina; Andersen, Paal; Larsen, Lars

    2008-01-01

    The multiplex ligation-dependent probe amplification (MLPA) technique is a sensitive technique for relative quantification of up to 50 different nucleic acid sequences in a single reaction, and the technique is routinely used for copy number analysis in various syndromes and diseases. The aim of the study was to exploit the potential of MLPA when the DNA material is limited. The DNA concentration required in standard MLPA analysis is not attainable from dried blood spot samples (DBSS) often used in neonatal screening programs. A novel design of MLPA probes has been developed to permit MLPA analysis on small amounts of DNA. Six patients with congenital adrenal hyperplasia (CAH) were used in this study. DNA was extracted from both whole blood and DBSS and subjected to MLPA analysis using normal and modified probes. Results were analyzed using GeneMarker and manual Excel analysis. A total...

  12. Alternative calibration techniques for counteracting the matrix effects in GC-MS-SPE pesticide residue analysis - a statistical approach.

    Science.gov (United States)

    Rimayi, Cornelius; Odusanya, David; Mtunzi, Fanyana; Tsoka, Shepherd

    2015-01-01

    This paper investigates the efficiency of four different multivariate calibration techniques, namely matrix-matched internal standard (MMIS), matrix-matched external standard (MMES), solvent-only internal standard (SOIS) and solvent-only external standard (SOES), applied to the detection and quantification of 20 organochlorine compounds in high-matrix, low-matrix and blank water samples by Gas Chromatography-Mass Spectrometry (GC-MS) coupled to solid phase extraction (SPE). Further statistical testing, using the Statistical Package for the Social Sciences (SPSS) with MANOVA, t-tests and Levene's F tests, indicates that matrix composition has a more significant effect on the efficiency of the analytical method than the calibration method of choice. Matrix effects are widely described as one of the major sources of errors in GC-MS multiresidue analysis. Descriptive and inferential statistics proved that matrix-matched internal standard calibration was the best approach to use for samples of varying matrix composition, as it produced the most precise mean recovery of 87% across all matrices tested. The use of an internal standard calibration overall produced more precise total recoveries than external standard calibration, with mean values of 77% and 64% respectively. The internal standard calibration technique produced a particularly high overall standard deviation of 38% at the 95% confidence level, indicating that it is less robust than the external standard calibration method, which had an overall standard error of 32% at the 95% confidence level. Overall, matrix-matched external standard calibration proved to be the best calibration approach for analysis of low-matrix samples, which consisted of the real sample matrix, as it had the most precise recovery of 98% compared to other calibration approaches for the low-matrix samples. Copyright © 2014 Elsevier Ltd. All rights reserved.
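
    The distinction the record draws between internal- and external-standard calibration can be illustrated with a short numerical sketch. The concentrations, peak areas and analyte are hypothetical and are not taken from the study; the point is only that the internal-standard approach normalises the analyte response to a co-injected standard before the calibration line is fitted.

```python
import numpy as np

# Hypothetical calibration data for one organochlorine analyte.
conc_std = np.array([5.0, 10.0, 20.0, 50.0, 100.0])             # ng/mL in standards
area_analyte = np.array([1.1e4, 2.3e4, 4.4e4, 1.13e5, 2.20e5])  # GC-MS peak areas
area_istd = np.array([5.2e4, 5.0e4, 5.1e4, 4.9e4, 5.0e4])       # internal standard areas

# External-standard calibration: fit the peak area directly against concentration.
slope_es, intercept_es = np.polyfit(conc_std, area_analyte, 1)

# Internal-standard calibration: fit the analyte/IS area ratio against concentration,
# which compensates for injection-volume and matrix-induced response drift.
ratio = area_analyte / area_istd
slope_is, intercept_is = np.polyfit(conc_std, ratio, 1)

# Quantify a hypothetical unknown extract with both models.
sample_area, sample_istd_area = 6.1e4, 4.6e4
conc_external = (sample_area - intercept_es) / slope_es
conc_internal = (sample_area / sample_istd_area - intercept_is) / slope_is
print(f"external-standard estimate: {conc_external:.1f} ng/mL")
print(f"internal-standard estimate: {conc_internal:.1f} ng/mL")
```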

  13. Comparison of global sensitivity analysis techniques and importance measures in PSA

    International Nuclear Information System (INIS)

    Borgonovo, E.; Apostolakis, G.E.; Tarantola, S.; Saltelli, A.

    2003-01-01

    This paper discusses application and results of global sensitivity analysis techniques to probabilistic safety assessment (PSA) models, and their comparison to importance measures. This comparison allows one to understand whether PSA elements that are important to the risk, as revealed by importance measures, are also important contributors to the model uncertainty, as revealed by global sensitivity analysis. We show that, due to epistemic dependence, uncertainty and global sensitivity analysis of PSA models must be performed at the parameter level. A difficulty arises, since standard codes produce the calculations at the basic event level. We discuss both the indirect comparison through importance measures computed for basic events, and the direct comparison performed using the differential importance measure and the Fussell-Vesely importance at the parameter level. Results are discussed for the large LLOCA sequence of the advanced test reactor PSA
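
    As a minimal illustration of an importance measure computed at the basic-event level, the sketch below estimates the Fussell-Vesely importance of each basic event of a toy two-cut-set fault tree by Monte Carlo. The fault tree, basic events and probabilities are invented for illustration and have no connection to the LLOCA sequence or the advanced test reactor PSA discussed in the record.

```python
import numpy as np

rng = np.random.default_rng(0)
p = {"A": 0.01, "B": 0.05, "C": 0.02}            # basic-event probabilities (illustrative)
cut_sets = [("A", "B"), ("A", "C")]              # minimal cut sets of a toy top event

n = 1_000_000
states = {e: rng.random(n) < q for e, q in p.items()}
cs_hit = [np.all([states[e] for e in cs], axis=0) for cs in cut_sets]
top = np.any(cs_hit, axis=0)
risk = top.mean()

for e in p:
    # Fussell-Vesely importance: fraction of the top-event risk carried by
    # minimal cut sets that contain basic event e.
    hit_with_e = np.any([h for cs, h in zip(cut_sets, cs_hit) if e in cs], axis=0)
    print(f"FV({e}) = {hit_with_e.mean() / risk:.3f}")
```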

  14. Russian Language Development Assessment as a Standardized Technique for Assessing Communicative Function in Children Aged 3–9 Years

    Directory of Open Access Journals (Sweden)

    Prikhoda N.A.,

    2016-10-01

    Full Text Available The article describes the Russian Language Development Assessment, a standardized individual diagnostic tool for children aged from 3 to 9 that helps to assess the following components of a child's communicative function: passive vocabulary, expressive vocabulary, knowledge of semantic constructs with logical, temporal and spatial relations, passive perception and active use of syntactic and morphological features of words in a sentence, active and passive phonological awareness, and active and passive knowledge of syntactic structures and categories. The article provides descriptions of content and diagnostic procedures for all 7 subtests included in the assessment (Passive Vocabulary, Active Vocabulary, Linguistic Operators, Sentence Structure, Word Structure, Phonology, Sentence Repetition). Based on the data collected in the study, which involved 86 first-graders of a Moscow school, the article analyzes the internal consistency and construct validity of each subtest of the technique. It concludes that the Russian Language Development Assessment can be of much use both for diagnostic purposes and for supporting children with ASD, taking into account the lack of standardized tools for language and speech development assessment in Russian and the importance of this measure in general.

  15. A comparative study of standard vs. high definition colonoscopy for adenoma and hyperplastic polyp detection with optimized withdrawal technique.

    Science.gov (United States)

    East, J E; Stavrindis, M; Thomas-Gibson, S; Guenther, T; Tekkis, P P; Saunders, B P

    2008-09-15

    Colonoscopy has a known miss rate for polyps and adenomas. High definition (HD) colonoscopes may allow detection of subtle mucosal change, potentially aiding detection of adenomas and hyperplastic polyps. To compare detection rates between HD and standard definition (SD) colonoscopy. Prospective, cohort study with optimized withdrawal technique (withdrawal time >6 min, antispasmodic, position changes, re-examining flexures and folds). One hundred and thirty patients attending for routine colonoscopy were examined with either SD (n = 72) or HD (n = 58) colonoscopes. Groups were well matched. Sixty per cent of patients had at least one adenoma detected with SD vs. 71% with HD, P = 0.20, relative risk (benefit) 1.32 (95% CI 0.85-2.04). Eighty-eight adenomas (mean +/- standard deviation 1.2 +/- 1.4) were detected using SD vs. 93 (1.6 +/- 1.5) with HD, P = 0.12; however more nonflat, diminutive (9 mm) hyperplastic polyps was 7% (0.09 +/- 0.36). High definition did not lead to a significant increase in adenoma or hyperplastic polyp detection, but may help where comprehensive lesion detection is paramount. High detection rates appear possible with either SD or HD, when using an optimized withdrawal technique.

  16. Analysis of improved criteria for mold growth in ASHRAE standard 160 by comparison with field observations

    Science.gov (United States)

    Samuel V. Glass; Stanley D. Gatland II; Kohta Ueno; Christopher J. Schumacher

    2017-01-01

    ASHRAE Standard 160, Criteria for Moisture-Control Design Analysis in Buildings, was published in 2009. The standard sets criteria for moisture design loads, hygrothermal analysis methods, and satisfactory moisture performance of the building envelope. One of the evaluation criteria specifies conditions necessary to avoid mold growth. The current standard requires that...

  17. Pattern recognition software and techniques for biological image analysis.

    Directory of Open Access Journals (Sweden)

    Lior Shamir

    2010-11-01

    Full Text Available The increasing prevalence of automated image acquisition systems is enabling new types of microscopy experiments that generate large image datasets. However, there is a perceived lack of robust image analysis systems required to process these diverse datasets. Most automated image analysis systems are tailored for specific types of microscopy, contrast methods, probes, and even cell types. This imposes significant constraints on experimental design, limiting their application to the narrow set of imaging methods for which they were designed. One of the approaches to address these limitations is pattern recognition, which was originally developed for remote sensing, and is increasingly being applied to the biology domain. This approach relies on training a computer to recognize patterns in images rather than developing algorithms or tuning parameters for specific image processing tasks. The generality of this approach promises to enable data mining in extensive image repositories, and provide objective and quantitative imaging assays for routine use. Here, we provide a brief overview of the technologies behind pattern recognition and its use in computer vision for biological and biomedical imaging. We list available software tools that can be used by biologists and suggest practical experimental considerations to make the best use of pattern recognition techniques for imaging assays.
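
    A minimal sketch of the workflow described above (compute generic image features, then train a classifier rather than hand-tune an algorithm) is shown below with scikit-learn. The images and the simple intensity-histogram features are synthetic stand-ins invented for this sketch; dedicated pattern-recognition tools extract far richer feature banks.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

def histogram_features(image, bins=32):
    """Very simple, generic features: a normalised intensity histogram."""
    hist, _ = np.histogram(image, bins=bins, range=(0.0, 1.0))
    return hist / hist.sum()

# Synthetic stand-in for two imaging classes (e.g. two phenotypes): the second
# class is systematically brighter, so its histogram differs.
rng = np.random.default_rng(1)
class0 = [rng.random((64, 64)) ** 2 for _ in range(100)]
class1 = [rng.random((64, 64)) for _ in range(100)]
X = np.array([histogram_features(img) for img in class0 + class1])
y = np.array([0] * 100 + [1] * 100)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
print("cross-validated accuracy:", cross_val_score(clf, X, y, cv=5).mean().round(3))
```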

  18. Crystallographic texture analysis of archaeological metals: interpretation of manufacturing techniques

    International Nuclear Information System (INIS)

    Artioli, G.

    2007-01-01

    Neutron probes and high energy X-rays are sources of primary importance for the non-invasive characterization of materials related to cultural heritage. Their employment in the characterization of archaeological metal objects, combined with the recent instrumental and computational developments in the field of crystallographic texture analysis (CTA) from diffraction data proves to be a powerful tool for the interpretation of ancient metal working techniques. Diffraction based CTA, when performed using penetrating probes and adequate detector coverage of reciprocal space, for example using large detector arrays and/or ToF mode, allows simultaneous identification and quantification of crystalline phases, besides the microstructural and textural characterization of the object, and it can be effectively used as a totally non-invasive tool for metallographic analysis. Furthermore, the chemical composition of the object may also be obtained by the simultaneous detection of prompt gamma rays induced by neutron activation, or by the fluorescence signal from high energy X-rays, in order to obtain a large amount of complementary information in a single experiment. The specific application of neutron CTA to the characterization of the manufacturing processes of prehistoric copper axes is discussed in detail. (orig.)

  19. Crystallographic texture analysis of archaeological metals: interpretation of manufacturing techniques

    Science.gov (United States)

    Artioli, G.

    2007-12-01

    Neutron probes and high energy X-rays are sources of primary importance for the non-invasive characterization of materials related to cultural heritage. Their employment in the characterization of archaeological metal objects, combined with the recent instrumental and computational developments in the field of crystallographic texture analysis (CTA) from diffraction data proves to be a powerful tool for the interpretation of ancient metal working techniques. Diffraction based CTA, when performed using penetrating probes and adequate detector coverage of reciprocal space, for example using large detector arrays and/or ToF mode, allows simultaneous identification and quantification of crystalline phases, besides the microstructural and textural characterization of the object, and it can be effectively used as a totally non-invasive tool for metallographic analysis. Furthermore, the chemical composition of the object may also be obtained by the simultaneous detection of prompt gamma rays induced by neutron activation, or by the fluorescence signal from high energy X-rays, in order to obtain a large amount of complementary information in a single experiment. The specific application of neutron CTA to the characterization of the manufacturing processes of prehistoric copper axes is discussed in detail.

  20. Structural reliability analysis based on the cokriging technique

    International Nuclear Information System (INIS)

    Zhao Wei; Wang Wei; Dai Hongzhe; Xue Guofeng

    2010-01-01

    Approximation methods are widely used in structural reliability analysis because they are simple to create and provide explicit functional relationships between the responses and variables instead of the implicit limit state function. Recently, the kriging method, a semi-parametric interpolation technique that can be used for deterministic optimization and structural reliability, has gained popularity. However, to fully exploit the kriging method, especially in high-dimensional problems, a large number of sample points should be generated to fill the design space, and this can be very expensive and even impractical in practical engineering analysis. Therefore, in this paper, a new method, the cokriging method, which is an extension of kriging, is proposed to calculate the structural reliability. The cokriging approximation incorporates secondary information such as the values of the gradients of the function being approximated. This paper explores the use of the cokriging method for structural reliability problems by comparing it with the kriging method on some numerical examples. The results indicate that the cokriging procedure described in this work can generate approximation models that improve the accuracy and efficiency of structural reliability analysis and is a viable alternative to kriging.
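
    For readers unfamiliar with the kriging baseline that cokriging extends, the following sketch interpolates a one-dimensional limit-state function with a Gaussian correlation model and a constant trend. It is a toy illustration written for this summary, not the authors' implementation, and it omits the gradient (secondary) information that distinguishes cokriging.

```python
import numpy as np

def gauss_corr(a, b, theta=10.0):
    """Gaussian correlation R(a, b) = exp(-theta * (a - b)^2)."""
    return np.exp(-theta * (a[:, None] - b[None, :]) ** 2)

# Toy limit-state function g(x); failure would correspond to g(x) < 0.
g = lambda x: x ** 3 - 3 * x + 1.5

# Design-of-experiments sample and responses.
x_doe = np.linspace(-2.0, 2.0, 8)
y_doe = g(x_doe)

# Kriging with a constant trend estimated by generalised least squares.
R = gauss_corr(x_doe, x_doe) + 1e-10 * np.eye(len(x_doe))   # small nugget for stability
ones = np.ones(len(x_doe))
beta = ones @ np.linalg.solve(R, y_doe) / (ones @ np.linalg.solve(R, ones))
weights = np.linalg.solve(R, y_doe - beta * ones)

def predict(x_new):
    r = gauss_corr(np.atleast_1d(x_new), x_doe)
    return beta + r @ weights

x_test = np.linspace(-2.0, 2.0, 5)
print(np.round(predict(x_test), 3))   # surrogate predictions
print(np.round(g(x_test), 3))         # true limit-state values, for comparison
```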

  1. Analysis of Downs syndrome with molecular techniques for future diagnoses

    Directory of Open Access Journals (Sweden)

    May Salem Al-Nbaheen

    2018-03-01

    Full Text Available Down syndrome (DS) is a genetic disorder that arises from trisomy of chromosome 21, in the G-group of the acrocentric region. DS shows non-Mendelian inheritance, as it does not follow Mendel's laws. The disorder in children is identified through clinical symptoms and chromosomal analysis, and to date there are no biochemical or molecular analyses. Presently, whole exome sequencing (WES) has contributed largely to identifying new disease-causing genes and represents a significant breakthrough in the field of human genetics; this technique uses high-throughput sequencing technologies to determine the arrangement of DNA base pairs specifying the protein-coding regions of an individual's genome. Apart from this, next-generation sequencing and whole genome sequencing also contribute to identifying disease markers. From this review, the suggestion is to perform WES in DS children to identify marker regions. Keywords: Downs syndrome, Exome sequencing, Chromosomal analysis, Genes, Genetics

  2. Vehicle Codes and Standards: Overview and Gap Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Blake, C.; Buttner, W.; Rivkin, C.

    2010-02-01

    This report identifies gaps in vehicle codes and standards and recommends ways to fill the gaps, focusing on six alternative fuels: biodiesel, natural gas, electricity, ethanol, hydrogen, and propane.

  3. The analysis of gastric function using computational techniques

    International Nuclear Information System (INIS)

    Young, Paul

    2002-01-01

    The work presented in this thesis was carried out at the Magnetic Resonance Centre, Department of Physics and Astronomy, University of Nottingham, between October 1996 and June 2000. This thesis describes the application of computerised techniques to the analysis of gastric function, in relation to Magnetic Resonance Imaging data. The implementation of a computer program enabling the measurement of motility in the lower stomach is described in Chapter 6. This method allowed the dimensional reduction of multi-slice image data sets into a 'Motility Plot', from which the motility parameters - the frequency, velocity and depth of contractions - could be measured. The technique was found to be simple, accurate and involved substantial time savings, when compared to manual analysis. The program was subsequently used in the measurement of motility in three separate studies, described in Chapter 7. In Study 1, four different meal types of varying viscosity and nutrient value were consumed by 12 volunteers. The aim of the study was (i) to assess the feasibility of using the motility program in a volunteer study and (ii) to determine the effects of the meals on motility. The results showed that the parameters were remarkably consistent between the 4 meals. However, for each meal, velocity and percentage occlusion were found to increase as contractions propagated along the antrum. The first clinical application of the motility program was carried out in Study 2. Motility from three patients was measured, after they had been referred to the Magnetic Resonance Centre with gastric problems. The results showed that one of the patients displayed an irregular motility, compared to the results of the volunteer study. This result had not been observed using other investigative techniques. In Study 3, motility was measured in Low Viscosity and High Viscosity liquid/solid meals, with the solid particulate consisting of agar beads of varying breakdown strength. The results showed that

  4. Machine Learning Techniques for Arterial Pressure Waveform Analysis

    Directory of Open Access Journals (Sweden)

    João Cardoso

    2013-05-01

    Full Text Available The Arterial Pressure Waveform (APW) can provide essential information about arterial wall integrity and arterial stiffness. Most APW analysis frameworks individually process each hemodynamic parameter and do not evaluate inter-dependencies in the overall pulse morphology. The key contribution of this work is the use of machine learning algorithms to deal with vectorized features extracted from APW. With this purpose, we follow a five-step evaluation methodology: (1) a custom-designed, non-invasive, electromechanical device was used in the data collection from 50 subjects; (2) the acquired position and amplitude of onset, Systolic Peak (SP), Point of Inflection (Pi) and Dicrotic Wave (DW) were used for the computation of some morphological attributes; (3) pre-processing work on the datasets was performed in order to reduce the number of input features and increase the model accuracy by selecting the most relevant ones; (4) classification of the dataset was carried out using four different machine learning algorithms: Random Forest, BayesNet (probabilistic), J48 (decision tree) and RIPPER (rule-based induction); and (5) we evaluate the trained models, using the majority-voting system, comparatively to the respective calculated Augmentation Index (AIx). Classification algorithms have proved to be efficient; in particular, Random Forest has shown good accuracy (96.95%) and a high area under the curve (AUC) of the Receiver Operating Characteristic (ROC) curve (0.961). Finally, during validation tests, a correlation between high-risk labels, retrieved from the multi-parametric approach, and positive AIx values was verified. This approach gives allowance for designing new hemodynamic morphology vectors and techniques for multiple APW analysis, thus improving the arterial pulse understanding, especially when compared to traditional single-parameter analysis, where the failure in one parameter measurement component, such as Pi, can jeopardize the whole evaluation.
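
    The classification stage can be sketched with scikit-learn as below: hypothetical morphological features (amplitudes and timings of the onset, SP, Pi and DW points) feed several classifiers whose hard votes are combined, echoing the majority-voting system in the record. GaussianNB and a depth-limited decision tree stand in for BayesNet and J48, RIPPER has no direct scikit-learn equivalent and is omitted, and all data are synthetic.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 50   # subjects, as in the record; the feature values below are synthetic

# Hypothetical morphological features per subject: SP and Pi amplitudes,
# onset-to-SP time, SP-to-DW time, and an amplitude ratio.
X = rng.normal(size=(n, 5))
y = (X[:, 0] - 0.8 * X[:, 1] + 0.3 * rng.normal(size=n) > 0).astype(int)  # toy risk label

clf = VotingClassifier(
    estimators=[
        ("rf", RandomForestClassifier(n_estimators=200, random_state=0)),
        ("nb", GaussianNB()),                                           # stand-in for BayesNet
        ("tree", DecisionTreeClassifier(max_depth=4, random_state=0)),  # stand-in for J48
    ],
    voting="hard",   # majority vote over the individual classifiers
)
print("cross-validated accuracy:", cross_val_score(clf, X, y, cv=5).mean().round(3))
```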

  5. Standard Test Method for Isotopic Analysis of Uranium Hexafluoride by Single-Standard Gas Source Multiple Collector Mass Spectrometer Method

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2011-01-01

    1.1 This test method is applicable to the isotopic analysis of uranium hexafluoride (UF₆) with ²³⁵U concentrations less than or equal to 5 % and ²³⁴U, ²³⁶U concentrations of 0.0002 to 0.1 %. 1.2 This test method may be applicable to the analysis of the entire range of ²³⁵U isotopic compositions providing that adequate Certified Reference Materials (CRMs or traceable standards) are available. 1.3 This standard does not purport to address all of the safety concerns, if any, associated with its use. It is the responsibility of the user of this standard to establish appropriate safety and health practices and determine the applicability of regulatory limitations prior to use.

  6. Meta-analysis and literature review of techniques to achieve hemostasis in endoscopic sinus surgery.

    Science.gov (United States)

    Khosla, Akhil J; Pernas, Francisco G; Maeso, Patricia A

    2013-06-01

    Functional endoscopic sinus surgery (FESS) has been used as the standard of treatment for sinonasal disease in which medical therapy fails to ameliorate the disease. Intraoperative hemostasis is a crucial factor in FESS. Currently, ideal techniques for creating intraoperative hemostasis have yet to be clarified and standardized. We sought to better understand what variables can affect intraoperative blood loss and therefore improve surgical field outcomes. A literature search was conducted using PubMed, OVID, MD Consult, and Micromedex with keywords including: FESS, intraoperative blood loss, hemorrhage, and vasoconstriction. The articles were then evaluated with regard to blood loss, surgical grade, and operative time. Eleven articles were cross-referenced to determine the most statistically significant techniques in 3 main categories: general anesthetics, preoperative steroids, and use of epinephrine. Analysis of the articles indicate that total intravenous anesthesia (TIVA) is statistically more beneficial than balanced anesthesia (BA), providing an average difference in blood loss of 75.3057 mL; the use of preoperative steroids is statistically more beneficial than placebo, with an improved difference in blood loss of 28 mL; and a trend toward hemostasis with the use of local anesthetics at a concentration of 1:200,000. Meta-analysis of 1148 patients concludes that hemostasis during FESS is best conducted using TIVA, preoperative steroids, and topical local anesthetic at a 1:200,000 concentration. © 2012 ARS-AAOA, LLC.

  7. A dynamic mechanical analysis technique for porous media.

    Science.gov (United States)

    Pattison, Adam Jeffry; McGarry, Matthew; Weaver, John B; Paulsen, Keith D

    2015-02-01

    Dynamic mechanical analysis (DMA) is a common way to measure the mechanical properties of materials as functions of frequency. Traditionally, a viscoelastic mechanical model is applied and current DMA techniques fit an analytical approximation to measured dynamic motion data by neglecting inertial forces and adding empirical correction factors to account for transverse boundary displacements. Here, a finite-element (FE) approach to processing DMA data was developed to estimate poroelastic material properties. Frequency-dependent inertial forces, which are significant in soft media and often neglected in DMA, were included in the FE model. The technique applies a constitutive relation to the DMA measurements and exploits a nonlinear inversion to estimate the material properties in the model that best fit the model response to the DMA data. A viscoelastic version of this approach was developed to validate the approach by comparing complex modulus estimates to the direct DMA results. Both analytical and FE poroelastic models were also developed to explore their behavior in the DMA testing environment. All of the models were applied to tofu as a representative soft poroelastic material that is a common phantom in elastography imaging studies. Five samples of three different stiffnesses were tested from 1-14 Hz with rough platens placed on the top and bottom surfaces of the material specimen under test to restrict transverse displacements and promote fluid-solid interaction. The viscoelastic models were identical in the static case, and nearly the same at frequency with inertial forces accounting for some of the discrepancy. The poroelastic analytical method was not sufficient when the relevant physical boundary constraints were applied, whereas the poroelastic FE approach produced high quality estimates of shear modulus and hydraulic conductivity. These results illustrated appropriate shear modulus contrast between tofu samples and yielded a consistent contrast in
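
    The viscoelastic quantity that a conventional DMA reports can be illustrated with a short sketch: given sinusoidal stress and strain at one frequency, the storage modulus, loss modulus and loss tangent follow from the fitted amplitudes and phase lag. This is a textbook reduction added here for clarity, with synthetic signals; it does not reproduce the authors' poroelastic finite-element inversion.

```python
import numpy as np

def complex_modulus(time, stress, strain, freq):
    """Estimate E', E'' and tan(delta) from sinusoidal DMA signals at `freq` (Hz)."""
    w = 2 * np.pi * freq
    basis = np.column_stack([np.cos(w * time), np.sin(w * time), np.ones_like(time)])
    # Least-squares fit of each signal to A*cos(wt) + B*sin(wt) + offset.
    (a_s, b_s, _), *_ = np.linalg.lstsq(basis, stress, rcond=None)
    (a_e, b_e, _), *_ = np.linalg.lstsq(basis, strain, rcond=None)
    stress_amp, stress_ph = np.hypot(a_s, b_s), np.arctan2(-b_s, a_s)
    strain_amp, strain_ph = np.hypot(a_e, b_e), np.arctan2(-b_e, a_e)
    delta = stress_ph - strain_ph            # phase lag of strain behind stress
    e_star = stress_amp / strain_amp
    return e_star * np.cos(delta), e_star * np.sin(delta), np.tan(delta)

# Synthetic 5 Hz test: strain lags stress by 0.2 rad.
t = np.linspace(0, 2, 2000)
stress = 1.0e3 * np.cos(2 * np.pi * 5 * t)
strain = 0.01 * np.cos(2 * np.pi * 5 * t - 0.2)
print(complex_modulus(t, stress, strain, 5.0))   # approx (9.8e4 Pa, 2.0e4 Pa, 0.203)
```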

  8. Data analysis techniques: a tool for cumulative exposure assessment.

    Science.gov (United States)

    Lalloué, Benoît; Monnez, Jean-Marie; Padilla, Cindy; Kihal, Wahida; Zmirou-Navier, Denis; Deguen, Séverine

    2015-01-01

    Everyone is subject to environmental exposures from various sources, with negative health impacts (air, water and soil contamination, noise, etc.) or with positive effects (e.g. green space). Studies considering such complex environmental settings in a global manner are rare. We propose to use statistical factor and cluster analyses to create a composite exposure index with a data-driven approach, with a view to assessing the environmental burden experienced by populations. We illustrate this approach in a large French metropolitan area. The study was carried out in the Great Lyon area (France, 1.2 M inhabitants) at the census Block Group (BG) scale. We used as environmental indicators ambient air NO2 annual concentrations, noise levels and proximity to green spaces, to industrial plants, to polluted sites and to road traffic. They were synthesized using Multiple Factor Analysis (MFA), a data-driven technique without a priori modeling, followed by a Hierarchical Clustering to create BG classes. The first components of the MFA explained, respectively, 30, 14, 11 and 9% of the total variance. Clustering into five classes grouped: (1) a particular type of large BGs without population; (2) BGs of green residential areas, with less negative exposures than average; (3) BGs of residential areas near midtown; (4) BGs close to industries; and (5) midtown urban BGs, with higher negative exposures than average and fewer green spaces. Other numbers of classes were tested in order to assess a variety of clusterings. We present an approach using statistical factor and cluster analysis techniques, which seem to have been overlooked for assessing cumulative exposure in complex environmental settings. Although it cannot be applied directly for risk or health effect assessment, the resulting index can help to identify hot spots of cumulative exposure, to prioritize urban policies or to compare the environmental burden across study areas in an epidemiological framework.
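
    The data-driven pipeline described above, in which several exposure indicators are synthesised into factors and the block groups are then clustered, can be approximated with the short sketch below. Multiple Factor Analysis itself is not available in scikit-learn, so a standardised PCA is substituted as a stand-in, and the indicator values are simulated rather than taken from the study.

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)
# Simulated block-group indicators: NO2, noise, green-space proximity,
# industry proximity, polluted-site proximity, road-traffic proximity.
X = rng.normal(size=(500, 6))

# Standardise and reduce to a few components (stand-in for the MFA step).
scores = PCA(n_components=4).fit_transform(StandardScaler().fit_transform(X))

# Hierarchical (Ward) clustering of the component scores into 5 classes.
Z = linkage(scores, method="ward")
classes = fcluster(Z, t=5, criterion="maxclust")
for c in np.unique(classes):
    print(f"class {c}: {np.sum(classes == c)} block groups")
```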

  9. Two-dimensional Imaging Velocity Interferometry: Technique and Data Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Erskine, D J; Smith, R F; Bolme, C; Celliers, P; Collins, G

    2011-03-23

    We describe the data analysis procedures for an emerging interferometric technique for measuring motion across a two-dimensional image at a moment in time, i.e. a snapshot 2d-VISAR. Velocity interferometers (VISAR) measuring target motion to high precision have been an important diagnostic in shockwave physics for many years. Until recently, this diagnostic has been limited to measuring motion at points or lines across a target. We introduce an emerging interferometric technique for measuring motion across a two-dimensional image, which could be called a snapshot 2d-VISAR. If a sufficiently fast movie camera technology existed, it could be placed behind a traditional VISAR optical system and record a 2d image vs time. But since that technology is not yet available, we use a CCD detector to record a single 2d image, with the pulsed nature of the illumination providing the time resolution. Consequently, since we are using pulsed illumination having a coherence length shorter than the VISAR interferometer delay (~0.1 ns), we must use the white light velocimetry configuration to produce fringes with significant visibility. In this scheme, two interferometers (illuminating, detecting) having nearly identical delays are used in series, with one before the target and one after. This produces fringes with at most 50% visibility, but otherwise has the same fringe shift per target motion as a traditional VISAR. The 2d-VISAR observes a new world of information about shock behavior not readily accessible by traditional point or 1d-VISARs, simultaneously providing both a velocity map and an 'ordinary' snapshot photograph of the target. The 2d-VISAR has been used to observe nonuniformities in NIF related targets (polycrystalline diamond, Be), and in Si and Al.

  10. Confirmatory factors analysis of science teacher leadership in the Thailand world-class standard schools

    Science.gov (United States)

    Thawinkarn, Dawruwan

    2018-01-01

    This research aims to analyze factors of science teacher leadership in the Thailand World-Class Standard Schools. The research instrument was a five-point rating-scale questionnaire with reliability 0.986. The sample group included 500 science teachers from World-Class Standard Schools who had been selected by using the stratified random sampling technique. Factor analysis of science teacher leadership in the Thailand World-Class Standard Schools was conducted using Mplus for Windows. The results of the confirmatory factor analysis revealed that the model significantly correlated with the empirical data. The consistency index values were χ² = 105.655, df = 88, P-value = 0.086, TLI = 0.997, CFI = 0.999, RMSEA = 0.022, and SRMR = 0.019. The factor loadings of science teacher leadership were positive, with statistical significance at the level of 0.01. The values for the six factors were between 0.880 and 0.996. The highest factor loading was the professional learning community, followed by child-centered instruction, participation in development, the role model in teaching, transformational leaders, and self-development, with factor loadings of 0.996, 0.928, 0.911, 0.907, 0.901, and 0.871, respectively. The reliability of each factor was 99.1%, 86.0%, 83.0%, 82.2%, 81.0%, and 75.8%, respectively.

  11. Metabolomic analysis using porcine skin: a pilot study of analytical techniques.

    Science.gov (United States)

    Wu, Julie; Fiehn, Oliver; Armstrong, April W

    2014-06-15

    Metabolic byproducts serve as indicators of the chemical processes and can provide valuable information on pathogenesis by measuring the amplified output. Standardized techniques for metabolome extraction of skin samples serve as a critical foundation to this field but have not been developed. We sought to determine the optimal cell lysage techniques for skin sample preparation and to compare GC-TOF-MS and UHPLC-QTOF-MS for metabolomic analysis. Using porcine skin samples, we pulverized the skin via various combinations of mechanical techniques for cell lysage. After extraction, the samples were subjected to GC-TOF-MS and/or UHPLC-QTOF-MS. Signal intensities from GC-TOF-MS analysis showed that ultrasonication (2.7×10⁷) was most effective for cell lysage when compared to mortar-and-pestle (2.6×10⁷), ball mill followed by ultrasonication (1.6×10⁷), mortar-and-pestle followed by ultrasonication (1.4×10⁷), and homogenization (trial 1: 8.4×10⁶; trial 2: 1.6×10⁷). Due to the similar signal intensities, ultrasonication and mortar-and-pestle were applied to additional samples and subjected to GC-TOF-MS and UHPLC-QTOF-MS. Ultrasonication yielded greater signal intensities than mortar-and-pestle for 92% of detected metabolites following GC-TOF-MS and for 68% of detected metabolites following UHPLC-QTOF-MS. Overall, ultrasonication is the preferred method for efficient cell lysage of skin tissue for both metabolomic platforms. With standardized sample preparation, metabolomic analysis of skin can serve as a powerful tool in elucidating underlying biological processes in dermatological conditions.

  12. Platinum stable isotope analysis of geological standard reference materials by double-spike MC-ICPMS.

    Science.gov (United States)

    Creech, J B; Baker, J A; Handler, M R; Bizzarro, M

    2014-01-10

    We report a method for the chemical purification of Pt from geological materials by ion-exchange chromatography for subsequent Pt stable isotope analysis by multiple-collector inductively coupled plasma mass spectrometry (MC-ICPMS) using a ¹⁹⁶Pt-¹⁹⁸Pt double-spike to correct for instrumental mass bias. Double-spiking of samples was carried out prior to digestion and chemical separation to correct for any mass-dependent fractionation that may occur due to incomplete recovery of Pt. Samples were digested using a NiS fire assay method, which pre-concentrates Pt into a metallic bead that is readily dissolved in acid in preparation for anion-exchange chemistry. Pt was recovered from anion-exchange resin in concentrated HNO₃ acid after elution of matrix elements, including the other platinum group elements (PGE), in dilute HCl and HNO₃ acids. The separation method has been calibrated using a precious metal standard solution doped with a range of synthetic matrices and results in Pt yields of ≥90% with purity of ≥95%. Using this chemical separation technique, we have separated Pt from 11 international geological standard reference materials comprising PGE ores, mantle rocks, igneous rocks and one sample from the Cretaceous-Paleogene boundary layer. Pt concentrations in these samples range from ca. 5 ng g⁻¹ to 4 μg g⁻¹. This analytical method has been shown to have an external reproducibility on δ¹⁹⁸Pt (permil difference in the ¹⁹⁸Pt/¹⁹⁴Pt ratio from the IRMM-010 standard) of ±0.040 (2 sd) on Pt solution standards (Creech et al., 2013, J. Anal. At. Spectrom. 28, 853-865). The reproducibility in natural samples is evaluated by processing multiple replicates of four standard reference materials, and is conservatively taken to be ca. ±0.088 (2 sd). Pt stable isotope data for the full set of reference materials have a range of δ¹⁹⁸Pt values with offsets of up to 0.4‰ from the IRMM-010 standard, which are readily resolved with this technique. These

  13. Automatic Satellite Telemetry Analysis for SSA using Artificial Intelligence Techniques

    Science.gov (United States)

    Stottler, R.; Mao, J.

    In April 2016, General Hyten, commander of Air Force Space Command, announced the Space Enterprise Vision (SEV) (http://www.af.mil/News/Article-Display/Article/719941/hyten-announces-space-enterprise-vision/). The SEV addresses increasing threats to space-related systems. The vision includes an integrated approach across all mission areas (communications, positioning, navigation and timing, missile warning, and weather data) and emphasizes improved access to data across the entire enterprise and the ability to protect space-related assets and capabilities. "The future space enterprise will maintain our nation's ability to deliver critical space effects throughout all phases of conflict," Hyten said. Satellite telemetry is going to become available to a new audience. While that telemetry information should be valuable for achieving Space Situational Awareness (SSA), these new satellite telemetry data consumers will not know how to utilize it. We were tasked with applying AI techniques to build an infrastructure to process satellite telemetry into higher abstraction level symbolic space situational awareness and to initially populate that infrastructure with useful data analysis methods. We are working with two organizations, Montana State University (MSU) and the Air Force Academy, both of whom control satellites and therefore currently analyze satellite telemetry to assess the health and circumstances of their satellites. The design which has resulted from our knowledge elicitation and cognitive task analysis is a hybrid approach which combines symbolic processing techniques of Case-Based Reasoning (CBR) and Behavior Transition Networks (BTNs) with current Machine Learning approaches. BTNs are used to represent the process and associated formulas to check telemetry values against anticipated problems and issues. CBR is used to represent and retrieve BTNs that represent an investigative process that should be applied to the telemetry in certain circumstances
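
    A schematic sketch of the kind of hybrid symbolic approach the record describes (retrieve a stored case for the current situation, then run its associated telemetry checks) is given below. The case library, parameter names and limits are hypothetical and are not drawn from the authors' system; the sketch only illustrates nearest-neighbour case retrieval feeding simple limit checks.

```python
import numpy as np

# Each case: a situation feature vector and a list of (parameter, low, high) limit checks.
cases = {
    "eclipse": (np.array([0.0, 1.0]), [("battery_voltage", 24.0, 34.0),
                                       ("battery_temp_C", -5.0, 30.0)]),
    "sunlit":  (np.array([1.0, 0.0]), [("bus_current_A", 0.5, 6.0),
                                       ("panel_temp_C", -20.0, 80.0)]),
}

def retrieve(situation):
    """Nearest-neighbour case retrieval on the situation features."""
    return min(cases, key=lambda name: np.linalg.norm(cases[name][0] - situation))

def run_checks(case_name, telemetry):
    """Apply the limit checks attached to the retrieved case."""
    for param, lo, hi in cases[case_name][1]:
        value = telemetry[param]
        status = "OK" if lo <= value <= hi else "ALERT"
        print(f"[{case_name}] {param} = {value} -> {status}")

# Hypothetical telemetry frame observed while mostly in eclipse.
run_checks(retrieve(np.array([0.1, 0.9])),
           {"battery_voltage": 22.5, "battery_temp_C": 12.0})
```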

  14. Analysis of Daylighting Requirements within ASHRAE Standard 90.1

    Energy Technology Data Exchange (ETDEWEB)

    Athalye, Rahul A.; Xie, YuLong; Liu, Bing; Rosenberg, Michael I.

    2013-08-01

    Pacific Northwest National Laboratory (PNNL), under the Building Energy Codes Program (BECP) funded by the U.S. Department of Energy (DOE), provides support to the ASHRAE/IES/IESNA Standard 90.1 (Standard 90.1) Standing Standards Project Committee (SSPC 90.1) and its subcommittees. In an effort to provide the ASHRAE SSPC 90.1 with data that will improve the daylighting and fenestration requirements in the Standard, PNNL collaborated with the Heschong Mahone Group (HMG), now part of TRC Solutions. Combining EnergyPlus, a whole-building energy simulation software developed by DOE, with Radiance, a highly accurate illumination modeling software (Ward 1994), the daylighting requirements within Standard 90.1 were analyzed in greater detail. The initial scope of the study was to evaluate the impact of the fraction of window area compared to exterior wall area (window-to-wall ratio (WWR)) on energy consumption when daylighting controls are implemented. This scope was expanded to study the impact of fenestration visible transmittance (VT), electric lighting controls and daylighted area on building energy consumption.

  15. On the use of INAA and ICP-MS techniques for analysis of some cosmetics samples

    International Nuclear Information System (INIS)

    El-Shazly, E.A.A.; Abo zahra, SH.F.; El-Sweify, F.H.; El-Shaht, M.F.

    2004-01-01

    Instrumental neutron activation analysis (INAA) and inductively coupled plasma mass spectrometry (ICP-MS) techniques have been used for the analysis of compact eye shadow cosmetics samples. These techniques were chosen for their simplicity, rapidity and sensitivity for the analysis of such samples. The analyzed samples were collected from the Egyptian market and were of different origins, either imported or locally manufactured. About 21 trace elements have been determined in the analyzed samples. These elements are: Ag, Ba, Cd, Ce, Co, Cr, Cs, Cu, Eu, Fe, Hf, Hg, Ni, Pb, Rb, Sb, Sc, Sn, Ta, Th and Zn. The percentage relative standard deviation (%RSD) has been calculated in each case. The importance of this study is to determine trace elements which may have toxic or irritant effects, since cosmetics products, similar to drugs, come in contact with human skin, and to evaluate the possibility of using INAA for controlling imitations of brand-name products through trace element analysis. The results are discussed in the present paper.

  16. Limited vs extended face-lift techniques: objective analysis of intraoperative results.

    Science.gov (United States)

    Litner, Jason A; Adamson, Peter A

    2006-01-01

    To compare the intraoperative outcomes of superficial musculoaponeurotic system plication, imbrication, and deep-plane rhytidectomy techniques. Thirty-two patients undergoing primary deep-plane rhytidectomy participated. Each hemiface in all patients was submitted sequentially to 3 progressively more extensive lifts, while other variables were standardized. Four major outcome measures were studied, including the extent of skin redundancy and the repositioning of soft tissues along the malar, mandibular, and cervical vectors of lift. The amount of skin excess was measured without tension from the free edge to a point over the intertragal incisure, along a plane overlying the jawline. Using a soft tissue caliper, repositioning was examined by measurement of preintervention and immediate postintervention distances from dependent points to fixed anthropometric reference points. The mean skin excesses were 10.4, 12.8, and 19.4 mm for the plication, imbrication, and deep-plane lifts, respectively. The greatest absolute soft tissue repositioning was noted along the jawline, with the least in the midface. Analysis revealed significant differences from baseline and between lift types for each of the studied techniques in each of the variables tested. These data support the use of the deep-plane rhytidectomy technique to achieve a superior intraoperative lift relative to comparator techniques.

  17. A novel CT acquisition and analysis technique for breathing motion modeling

    International Nuclear Information System (INIS)

    Low, Daniel A; White, Benjamin M; Lee, Percy P; Thomas, David H; Gaudio, Sergio; Jani, Shyam S; Wu, Xiao; Lamb, James M

    2013-01-01

    To report on a novel technique for providing artifact-free quantitative four-dimensional computed tomography (4DCT) image datasets for breathing motion modeling. Commercial clinical 4DCT methods have difficulty managing irregular breathing. The resulting images contain motion-induced artifacts that can distort structures and inaccurately characterize breathing motion. We have developed a novel scanning and analysis method for motion-correlated CT that utilizes standard repeated fast helical acquisitions, a simultaneous breathing surrogate measurement, deformable image registration, and a published breathing motion model. The motion model differs from the CT-measured motion by an average of 0.65 mm, indicating the precision of the motion model. The integral of the divergence of one of the motion model parameters is predicted to be a constant 1.11 and is found in this case to be 1.09, indicating the accuracy of the motion model. The proposed technique shows promise for providing motion-artifact free images at user-selected breathing phases, accurate Hounsfield units, and noise characteristics similar to non-4D CT techniques, at a patient dose similar to or less than current 4DCT techniques. (fast track communication)
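
    The published motion model referenced here is, as commonly formulated, a linear model in which voxel displacement is expressed through the surrogate's tidal volume and airflow (often written x = x0 + αv + βf); the sketch below fits α and β for one voxel by least squares from simulated registration-derived displacements. Both the stated model form and all numbers are assumptions made for illustration, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n_scans = 25                           # repeated fast-helical acquisitions
v = rng.uniform(0.0, 1.0, n_scans)     # surrogate tidal volume (normalised)
f = rng.uniform(-1.0, 1.0, n_scans)    # surrogate airflow

# Simulated registration-derived displacement of one voxel (mm), generated from
# "true" coefficients alpha = 8 mm and beta = 2 mm plus measurement noise.
disp = 8.0 * v + 2.0 * f + rng.normal(0.0, 0.3, n_scans)

# Least-squares fit of the linear model disp = alpha*v + beta*f + x0.
A = np.column_stack([v, f, np.ones(n_scans)])
coef, *_ = np.linalg.lstsq(A, disp, rcond=None)
alpha, beta, x0 = coef
rms = np.std(disp - A @ coef)
print(f"alpha = {alpha:.2f} mm, beta = {beta:.2f} mm, rms residual = {rms:.2f} mm")
```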

  18. Romanian medieval earring analysis by X-ray fluorescence technique

    Energy Technology Data Exchange (ETDEWEB)

    Therese, Laurent; Guillot, Philippe, E-mail: philippe.guillot@univ-jfc.fr [Laboratoire Diagnostics des Plasmas, CUFR J.F.C, Albi (France); Muja, Cristina [Laboratoire Diagnostics des Plasmas, CUFR J.F.C, Albi (France); Faculty of Biology, University of Bucharest (Romania); Vasile Parvan Institute of Archaeology, Bucharest, (Romania)

    2011-07-01

    Full text: Several instrumental techniques of elemental analysis are now used for the characterization of archaeological materials. The combination of archaeological and analytical information can provide significant knowledge on the constituting material origin, heritage authentication and restoration, provenance, migration, social interaction and exchange. Surface mapping techniques such as X-Ray Fluorescence have become a powerful tool for obtaining qualitative and semi-quantitative information about the chemical composition of cultural heritage materials, including metallic archaeological objects. In this study, the material comes from the Middle Age cemetery of Feldioara (Romania). The excavation of the site, located between the evangelical church and the parsonage, led to the discovery of several funeral artifacts in 18 graves among a total of 127 excavated. Even if the inventory was quite poor, some of the objects helped in establishing the chronology. Six anonymous Hungarian denarii (silver coins) were attributed to Geza II (1141-1161) and Stefan III (1162-1172), placing the cemetery in the second half of the XII century. This period was also confirmed by three loop-shaped earrings with the end in 'S' form (one small and two large earrings). The small earring was found during the excavation in grave number 86, while the two others were discovered together in grave number 113. The anthropological study showed that the skeletons excavated from graves 86 and 113 belonged respectively to a child (1 individual, medium level preservation, 9 months ± 3 months) and to an adult (1 individual). In this work, elemental maps were obtained by the X-ray fluorescence (XRF) technique using a Jobin Yvon Horiba XGT-5000 instrument offering detailed elemental images with a spatial resolution of 100 μm. The analysis revealed that the earrings were composed of copper, zinc and tin as major elements. Minor elements were also determined. The comparison between the two

  19. Trends in grazing emission x-ray analysis techniques

    International Nuclear Information System (INIS)

    Grieken, R. van; Tsuji, K.; Injuk, J.

    2000-01-01

    then, the detection limits imposed by the semiconductor industry roadmap can probably not be obtained by tube-excited GEXRF. The perspectives for tube-excited GE-XRF are thus rather poor. Future developments imply the combination of GEXRF with synchrotron radiation excitation. Grazing-emission particle-induced X-ray emission (GE-PIXE) suffers from similar quantification problems for material deposited on a carrier, but it makes PIXE a surface-sensitive technique, while normally the protons penetrate some tens of μm into the sample. Similarly, grazing-emission electron probe micro-analysis (GE-EPMA) allows selective analysis of particles on a flat carrier, offers surface sensitivities in the nm rather than μm range, and yields, in principle, a spatial resolution for chemical analysis similar to the size of the impinging electron beam, rather than of the electron-excited volume. Both GE-PIXE and GE-EPMA need to be explored more fully in the near future. (author)

  20. Standard test method for isotopic analysis of uranium hexafluoride by double standard single-collector gas mass spectrometer method

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2010-01-01

    1.1 This is a quantitative test method applicable to determining the mass percent of uranium isotopes in uranium hexafluoride (UF6) samples with 235U concentrations between 0.1 and 5.0 mass %. 1.2 This test method may be applicable for the entire range of 235U concentrations for which adequate standards are available. 1.3 This test method is for analysis by a gas magnetic sector mass spectrometer with a single collector using interpolation to determine the isotopic concentration of an unknown sample between two characterized UF6 standards. 1.4 This test method is to replace the existing test method currently published in Test Methods C761 and is used in the nuclear fuel cycle for UF6 isotopic analyses. 1.5 The values stated in SI units are to be regarded as standard. No other units of measurement are included in this standard. 1.6 This standard does not purport to address all of the safety concerns, if any, associated with its use. It is the responsibility of the user of this standard to establish appro...

  1. Analysis of Biomechanical Structure and Passing Techniques in Basketball

    Directory of Open Access Journals (Sweden)

    Ricardo E. Izzo

    2011-06-01

    Full Text Available Basketball is a complex sport, which these days has become increasingly linked to its psychophysical aspects rather than to the technical ones. Therefore, it is important to make a thorough study of the passing techniques from the point of view of the type of pass and its biomechanics. From the point of view of the type of passes used, the most common is the two-handed chest pass, with a frequency of 39.9%. This is followed, in terms of frequency, by one-handed passes, the baseball, with 20.9%, by the two-handed over-the-head pass, with 18.2%, and finally, one- or two-handed indirect passes (bounces), with 11.2% and 9.8% respectively. Considering the most used pass in basketball, from the biomechanical point of view, the muscles involved in the correct movement include all the muscles of the upper extremity, as well as the shoulder muscles and the body fixators (abdominals, hip flexors, knee extensors, and dorsal flexors of the foot). The technical and conditional analysis considers the throwing speed, the throw height and the air resistance. In conclusion, the aim of this study is to give some guidelines to improve the mechanical execution of the movements in training, without neglecting the importance of the harmony of the movements themselves.

  2. Elemental analysis of brazing alloy samples by neutron activation technique

    International Nuclear Information System (INIS)

    Eissa, E.A.; Rofail, N.B.; Hassan, A.M.; El-Shershaby, A.; Walley El-Dine, N.

    1996-01-01

    Two brazing alloy samples (CP2 and CP3) have been investigated by the neutron activation analysis (NAA) technique in order to identify and estimate their constituent elements. The pneumatic irradiation rabbit system (PIRS), installed at the first Egyptian research reactor (ETRR-1), was used for short-time irradiation (30 s) with a thermal neutron flux of 1.6 × 10¹¹ n/cm²/s in the reactor reflector, where the thermal to epithermal neutron flux ratio is 106. Long-time irradiation (48 hours) was performed at the reactor core periphery with a thermal neutron flux of 3.34 × 10¹² n/cm²/s, and a thermal to epithermal neutron flux ratio of 79. Activation by epithermal neutrons was taken into account for the (1/v) and resonance neutron absorption in both methods. A hyperpure germanium detection system was used for gamma-ray acquisitions. The concentration values of Al, Cr, Fe, Co, Cu, Zn, Se, Ag and Sb were estimated as percentages of the sample weight and compared with reported values. 1 tab

  3. Analysis of Program Obfuscation Schemes with Variable Encoding Technique

    Science.gov (United States)

    Fukushima, Kazuhide; Kiyomoto, Shinsaku; Tanaka, Toshiaki; Sakurai, Kouichi

    Program analysis techniques have improved steadily over the past several decades, and software obfuscation schemes have come to be used in many commercial programs. A software obfuscation scheme transforms an original program or a binary file into an obfuscated program that is more complicated and difficult to analyze, while preserving its functionality. However, the security of obfuscation schemes has not been properly evaluated. In this paper, we analyze obfuscation schemes in order to clarify the advantages of our scheme, the XOR-encoding scheme. First, we more clearly define five types of attack models that we defined previously, and define quantitative resistance to these attacks. Then, we compare the security, functionality and efficiency of three obfuscation schemes with encoding variables: (1) Sato et al.'s scheme with linear transformation, (2) our previous scheme with affine transformation, and (3) the XOR-encoding scheme. We show that the XOR-encoding scheme is superior with regard to the following two points: (1) the XOR-encoding scheme is more secure against a data-dependency attack and a brute force attack than our previous scheme, and is as secure against an information-collecting attack and an inverse transformation attack as our previous scheme, (2) the XOR-encoding scheme does not restrict the calculable ranges of programs and the loss of efficiency is less than in our previous scheme.
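
    The record compares variable-encoding obfuscation schemes, including an XOR-encoding scheme. As a rough illustration of the general idea of XOR-encoding a program variable (this is not the authors' exact scheme; the mask and function names are hypothetical):

```python
import random

MASK = random.getrandbits(32)  # hypothetical per-program random mask

def encode(value: int) -> int:
    """Store the variable in encoded (obfuscated) form."""
    return value ^ MASK

def decode(encoded: int) -> int:
    """Recover the real value only where it is actually needed."""
    return encoded ^ MASK

# The program keeps `balance` encoded in memory and decodes it on use.
balance = encode(1500)
balance = encode(decode(balance) + 250)   # arithmetic on the hidden value
print(decode(balance))                    # 1750
```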

  4. Seismic margin analysis technique for nuclear power plant structures

    International Nuclear Information System (INIS)

    Seo, Jeong Moon; Choi, In Kil

    2001-04-01

    In general, the Seismic Probabilistic Risk Assessment (SPRA) and the Seismic Margin Assessment (SMA) are used for the evaluation of the realistic seismic capacity of nuclear power plant structures. Seismic PRA is a systematic process to evaluate the seismic safety of a nuclear power plant. In our country, SPRA has been used to perform the probabilistic safety assessment for the earthquake event. SMA is a simple and cost-effective way to quantify the seismic margin of individual structural elements. This study was performed to improve the reliability of SMA results and to confirm the assessment procedure. To achieve this goal, a review of the current status of the techniques and procedures was performed. Two methodologies, CDFM (Conservative Deterministic Failure Margin), sponsored by the NRC, and FA (Fragility Analysis), sponsored by EPRI, were developed for the seismic margin review of NPP structures. The FA method was originally developed for Seismic PRA. The CDFM approach is more amenable to use by experienced design engineers, including utility staff design engineers. In this study, a detailed review of the procedures of the CDFM and FA methodologies was performed

  5. Analysis of Consistency of Printing Blankets using Correlation Technique

    Directory of Open Access Journals (Sweden)

    Lalitha Jayaraman

    2010-01-01

    Full Text Available This paper presents the application of an analytical tool to quantify the material consistency of offset printing blankets. Printing blankets are essentially viscoelastic rubber composites of several laminas. High levels of material consistency are expected from rubber blankets for quality print and for quick recovery from smash encountered during the printing process. The present study aims at objectively determining the consistency of printing blankets at three specific torque levels of tension under two distinct stages: (1) under normal printing conditions and (2) on recovery after smash. The experiment devised exhibits a variation in the tone reproduction properties of each blanket, signifying the levels of inconsistency also in the thickness direction. The correlation technique was employed on ink density variations obtained from the blanket on paper. Both blankets exhibited good consistency over the three torque levels under normal printing conditions. However, on smash, the recovery of the blanket and its consistency was a function of manufacturing and torque levels. This study attempts to provide a new metric for failure analysis of offset printing blankets. It also underscores the need for optimizing the torque for blankets from different manufacturers.

  6. Application of linear and spherical flow analysis techniques to field problems--case studies

    Energy Technology Data Exchange (ETDEWEB)

    Kohlhaas, C.A.; delGiuoice, C.; Abbott, W.A.

    1982-09-01

    Most engineers examine well-test data only with techniques developed for flow in a horizontal cylindrical-radial pattern toward the wellbore. Spherical and linear flow have application in many reservoir situations. Spherical flow has been examined extensively by many authors as an intermediate period between two radial-flow periods for wells which have a short completion interval in thick formations. Linear flow situations develop early in the life of wells which have been fracture-treated: their early linear-flow periods are followed by radial flow. Linear flow may develop late in a well test after a period of early radial flow due to certain configurations of reservoir geometry. Techniques for analyzing spherical and linear flow are summarized here. Data plots which should be prepared and diagnostic features for recognizing and interpreting spherical and linear flow are outlined. These techniques are applied to three example cases to illustrate the methods of analysis and the types of information which can be developed from such analyses and cannot be obtained from the standard Horner-plot analysis.
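
    As a rough illustration of the diagnostic plots such analyses rely on (the flow-regime signatures are standard well-test results, and the drawdown data below are hypothetical, not taken from the paper): linear flow produces a straight line of pressure change versus the square root of time, while spherical flow produces a straight line versus the reciprocal square root of time.

```python
import numpy as np

# Hypothetical drawdown data: time (hours) and pressure change (psi)
t = np.array([0.5, 1.0, 2.0, 4.0, 8.0, 16.0])
dp = np.array([12.0, 17.1, 24.3, 34.5, 48.9, 69.4])

# Linear-flow diagnostic: dp vs sqrt(t) should be a straight line
slope_lin, _ = np.polyfit(np.sqrt(t), dp, 1)

# Spherical-flow diagnostic: dp vs 1/sqrt(t) should be a straight line
slope_sph, _ = np.polyfit(1.0 / np.sqrt(t), dp, 1)

print(f"linear-flow slope:    {slope_lin:.2f} psi/hr^0.5")
print(f"spherical-flow slope: {slope_sph:.2f} psi*hr^0.5")
```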

  7. Study of some environmental problem in egypt using neutron activation analysis techniques

    International Nuclear Information System (INIS)

    El-Karim, A.H.M.G.

    2003-01-01

    This thesis deals with the investigation of the possibility of using the new (second) Egyptian research reactor (ETRR-2) at Inshas (22 MW) for the neutron activation analysis (NAA) of trace elements, particularly in air dust collected from Cairo and some other cities of Egypt. In this context, Chapter 1 gives an introduction to activation methods in general, describing the various techniques used and comparing the methods with other instrumental methods of analysis. As a main classification, the neutron activation methods involve prompt γ-ray NAA and delayed γ-ray NAA; cyclic NAA (repeated activation) is also outlined. The methodology of NAA involves the absolute method, the relative method and the mono-standard (single comparator) method, which lies between the absolute and relative methods.

  8. Demonstration of a 10 V programmable Josephson voltage standard system based on a multi-chip technique

    Energy Technology Data Exchange (ETDEWEB)

    Yamada, T; Sasaki, H; Yamamori, H; Shoji, A [National Institute of Advanced Industrial Science and Technology, 1-1-1 Umezono, Tsukuba 305-8568 (Japan)], E-mail: yamada-takahiro@aist.go.jp

    2008-03-01

    We have demonstrated a programmable Josephson voltage standard (PJVS) operation up to 10.84 V using a multi-chip technique. We combined two PJVS chips fabricated using NbN/(TiNₓ/NbN)₂ junction technology. Each PJVS chip was mounted on a single chip carrier using bonding wire, and the two chip carriers were connected by a simple Cu lead wire, and mounted on a cryocooler. High-precision measurements confirmed flat voltage steps for all 22 cells, with a peak-to-peak variation of 100 nV and wide margins of at least 0.35 mA. We also confirmed the stability of the voltage steps in spite of a temperature and RF frequency variation of ±0.1 K and ±0.1 GHz, respectively.

  9. Acoustic analysis of diphthongs in Standard South African English

    CSIR Research Space (South Africa)

    Martirosian, O

    2008-11-01

    Full Text Available This work evaluates the need for diphthongs in a Standard South African English (SSAE) ASR system by replacing them with selected variants and analysing the system results. We define a systematic process to identify and evaluate replacement options for diphthongs and find...

  10. Gasoline taxes or efficiency standards? A heterogeneous household demand analysis

    International Nuclear Information System (INIS)

    Liu, Weiwei

    2015-01-01

    Using detailed consumer expenditure survey data and a flexible semiparametric dynamic demand model, this paper estimates the price elasticity and fuel efficiency elasticity of gasoline demand at the household level. The goal is to assess the effectiveness of gasoline taxes and vehicle fuel efficiency standards on fuel consumption. The results reveal substantial interaction between vehicle fuel efficiency and the price elasticity of gasoline demand: the improvement of vehicle fuel efficiency leads to lower price elasticity and weakens consumers’ sensitivity to gasoline price changes. The offsetting effect also differs across households due to demographic heterogeneity. These findings imply that when gasoline taxes are in place, tightening efficiency standards will partially offset the strength of taxes on reducing fuel consumption. - Highlights: • Model household gasoline demand using a semiparametric approach. • Estimate heterogeneous price elasticity and fuel efficiency elasticity. • Assess the effectiveness of gasoline taxes and efficiency standards. • Efficiency standards offset the impact of gasoline taxes on fuel consumption. • The offsetting effect differs by household demographics

  11. Suitable pellets standards development for LA-ICPMS analysis of Al2O3 powders

    International Nuclear Information System (INIS)

    Ferraz, Israel Elias; Sousa, Talita Alves de; Silva, Ieda de Souza; Gomide, Ricardo Goncalves; Oliveira, Luis Claudio de

    2013-01-01

    Chemical and physical characterization of aluminium oxides is of special interest for the nuclear industry, despite the arduous chemical digestion process. Therefore, laser ablation inductively coupled plasma mass spectrometry is an attractive method for analysis. However, due to the lack of suitable matrix-matched certified reference materials (CRM) for such powders and ceramic pellets analysis, LA-ICPMS has not yet been fully applied. Furthermore, establishing calibration curves for trace element quantification using external standards raises a significant problem. In this context, the development of suitable standard pellets used to obtain calibration curves for the chemical determination of impurities in aluminium oxide powders by the LA-ICPMS analytical technique was the aim of this work. It was developed using two different analytical strategies: (I) boric acid pressed pellets and (II) lithium tetra-borate melted pellets, both spiked with high purity oxides of Si, Mg, Ca, Na, Fe, Cr and Ni. The analytical strategy (II), which presented the best analytical parameters, was selected, a certified reference material was analyzed and the results compared. The limits of detection, linearity, precision, accuracy and recovery study results are presented and discussed. (author)

  12. Software safety analysis techniques for developing safety critical software in the digital protection system of the LMR

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jang Soo; Cheon, Se Woo; Kim, Chang Hoi; Sim, Yun Sub

    2001-02-01

    This report has described the software safety analysis techniques and the engineering guidelines for developing safety critical software to identify the state of the art in this field and to give the software safety engineer a trail map between the code and standards layer and the design methodology and documents layer. We have surveyed the management aspects of software safety activities during the software lifecycle in order to improve the safety. After identifying the conventional safety analysis techniques for systems, we have surveyed in detail the software safety analysis techniques, software FMEA (Failure Mode and Effects Analysis), software HAZOP (Hazard and Operability Analysis), and software FTA (Fault Tree Analysis). We have also surveyed the state of the art in the software reliability assessment techniques. The most important results from the reliability techniques are not the specific probability numbers generated, but the insights into the risk importance of software features. To defend against potential common-mode failures, high quality, defense-in-depth, and diversity are considered to be key elements in digital I and C system design. To minimize the possibility of CMFs and thus increase the plant reliability, we have provided defense-in-depth and diversity (D-in-D&D) analysis guidelines.

  13. Software safety analysis techniques for developing safety critical software in the digital protection system of the LMR

    International Nuclear Information System (INIS)

    Lee, Jang Soo; Cheon, Se Woo; Kim, Chang Hoi; Sim, Yun Sub

    2001-02-01

    This report has described the software safety analysis techniques and the engineering guidelines for developing safety critical software to identify the state of the art in this field and to give the software safety engineer a trail map between the code and standards layer and the design methodology and documents layer. We have surveyed the management aspects of software safety activities during the software lifecycle in order to improve the safety. After identifying the conventional safety analysis techniques for systems, we have surveyed in detail the software safety analysis techniques, software FMEA (Failure Mode and Effects Analysis), software HAZOP (Hazard and Operability Analysis), and software FTA (Fault Tree Analysis). We have also surveyed the state of the art in the software reliability assessment techniques. The most important results from the reliability techniques are not the specific probability numbers generated, but the insights into the risk importance of software features. To defend against potential common-mode failures, high quality, defense-in-depth, and diversity are considered to be key elements in digital I and C system design. To minimize the possibility of CMFs and thus increase the plant reliability, we have provided defense-in-depth and diversity (D-in-D&D) analysis guidelines

  14. Standardization of RAPD assay for genetic analysis of olive

    African Journals Online (AJOL)

    PRECIOUS

    2009-12-15

    Dec 15, 2009 ... europaea L. Genetic variability and molecular cultivar identification. Genet. Res. Crop Evol. 54(1): 117-128. Mir Ali N, Nabulsi I (2003). Genetic diversity of almond (Prunus dulcis) using RAPD technique, Sci. Hort. 98: 461-471. Nei M (1972). Nei's Original Measures of Genetic Identity and Genetic. Distance.

  15. COMPARISON AND ANALYSIS OF VARIOUS HISTOGRAM EQUALIZATION TECHNIQUES

    OpenAIRE

    MADKI.M.R; RUBINA KHAN

    2012-01-01

    The intensity histogram gives information which can be used for contrast enhancement. Histogram equalization may flatten the histogram over fewer levels than the total number of levels, which can deteriorate the image. This problem can be overcome by various techniques. This paper gives a comparison of the Bi-Histogram Equalization, Recursive Mean Separated Histogram Equalization, Multipeak Histogram Equalization and Brightness Preserving Dynamic Histogram Equalization techniques by using these techniqu...
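
    For context, a minimal sketch of plain global histogram equalization, the baseline that the variants named above modify (the low-contrast test image is synthetic):

```python
import numpy as np

def equalize_histogram(image: np.ndarray, levels: int = 256) -> np.ndarray:
    """Global histogram equalization for a single-channel 8-bit image."""
    hist, _ = np.histogram(image.flatten(), bins=levels, range=(0, levels))
    cdf = hist.cumsum()
    cdf_min = cdf[cdf > 0].min()
    # Map each grey level so the cumulative distribution becomes roughly uniform
    lut = np.round((cdf - cdf_min) / (cdf[-1] - cdf_min) * (levels - 1))
    lut = np.clip(lut, 0, levels - 1).astype(np.uint8)
    return lut[image]

# Synthetic low-contrast image: values squeezed into a narrow band
img = np.random.randint(90, 120, size=(64, 64), dtype=np.uint8)
out = equalize_histogram(img)
print(img.min(), img.max(), "->", out.min(), out.max())
```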

  16. Enhancements and Health-Related Studies of Neutron Activation Analysis Technique

    International Nuclear Information System (INIS)

    Soliman, M.A.M.

    2012-01-01

    The work presented in this thesis covers two major points. The first concerns the establishment of an accurate standardization method with multi-elemental capabilities and a low workload, suitable for NAA standardization at ETRR-2. The second deals with constructing and developing an effective nondestructive technique for the analysis of liquid samples based on NAA using (very) short-lived radionuclides. To achieve the first goal, attention has been directed toward implementation of the k0-method for calculation of element concentrations in the samples. The k0-method of NAA standardization has had considerable success as a method for accurate multi-elemental analysis with a comparably low workload. The k0-method is based on the fact that the unknown sample is irradiated with only one standard element as comparator. To assess the implementation of this method at ETRR-2, careful and complete characterization of the neutron flux parameters in the irradiation positions, as well as the efficiency calibration of the γ-ray spectrometer, must be carried out. The required neutron flux parameters are: the ratio of the thermal to epithermal neutron fluxes (f) and the deviation factor (α) of the epithermal neutron flux from the ideal 1/E law. The work presented in Chapter 4 shows the efficiency calibration curve of the γ-ray spectrometer system, which was obtained using standard radioactive point sources. Moreover, the f and α parameters were determined in some selected irradiation sites using sets of Zr-Au neutron flux monitors. Due to their different locations relative to the reactor core, the available neutron fluxes in the selected irradiation positions differ substantially, so that different irradiation demands can be satisfied. The reference materials coal NIST 1632c and IAEA-Soil 7 were analyzed for data validation and good agreement between the experimental values and the certified values was obtained. The obtained results have revealed that the k0-NAA
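
    For reference, the two flux parameters named above are conventionally written as follows (a standard k0-NAA convention, not quoted from this record):

```latex
f = \frac{\varphi_{\mathrm{th}}}{\varphi_{\mathrm{epi}}},
\qquad
\varphi_{\mathrm{epi}}(E) \propto \frac{1}{E^{\,1+\alpha}}
```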

  17. Late effects of craniospinal irradiation for standard risk medulloblastoma in paediatric patients: A comparison of treatment techniques

    International Nuclear Information System (INIS)

    Leman, J.

    2016-01-01

    Background: Survival rates for standard risk medulloblastoma are favourable, but the craniospinal irradiation (CSI) necessary to eradicate microscopic spread causes life-limiting late effects. Aims: The aim of this paper is to compare CSI techniques in terms of toxicity and quality of life for survivors. Methods and materials: A literature search was conducted using synonyms of ‘medulloblastoma’, ’craniospinal’, ‘radiotherapy’ and ‘side effects’ to highlight 29 papers that would facilitate this discussion. Results and discussion: Intensity modulated radiotherapy (IMRT), tomotherapy and protons all provide CSI which can reduce dose to normal tissue; however, photon methods cannot eliminate exit dose as well as protons can. Research for each technique requires longer term follow up in order to prove that survival rates remain high whilst reducing late effects. Findings/conclusion: Proton therapy is the superior method of CSI in terms of late effects, but more research is needed to evidence this. Until proton therapy is available in the UK, IMRT should be utilised. - Highlights: • Craniospinal irradiation is vital in the treatment of medulloblastoma. • Survivors often suffer long term side effects which reduce quality of life. • Tomotherapy, IMRT and proton therapy reduce late effects by sparing normal tissue. • Proton therapy offers superior dose distribution but further research is necessary. • IMRT should be employed for photon radiotherapy.

  18. Determination analysis of energy conservation standards for distribution transformers

    Energy Technology Data Exchange (ETDEWEB)

    Barnes, P.R.; Van Dyke, J.W.; McConnell, B.W.; Das, S.

    1996-07-01

    This report contains information for US DOE to use in making a determination on proposing energy conservation standards for distribution transformers as required by the Energy Policy Act of 1992. Potential for saving energy with more efficient liquid-immersed and dry-type distribution transformers could be significant because these transformers account for an estimated 140 billion kWh of the annual energy lost in the delivery of electricity. Objective was to determine whether energy conservation standards for distribution transformers would have the potential for significant energy savings, be technically feasible, and be economically justified from a national perspective. It was found that energy conservation for distribution transformers would be technically and economically feasible. Based on the energy conservation options analyzed, 3.6-13.7 quads of energy could be saved from 2000 to 2030.

  19. Reliability Analysis and Standardization of Spacecraft Command Generation Processes

    Science.gov (United States)

    Meshkat, Leila; Grenander, Sven; Evensen, Ken

    2011-01-01

    • In order to reduce commanding errors that are caused by humans, we create an approach and corresponding artifacts for standardizing the command generation process and conducting risk management during the design and assurance of such processes. • The literature review conducted during the standardization process revealed that very few atomic level human activities are associated with even a broad set of missions. • Applicable human reliability metrics for performing these atomic level tasks are available. • The process for building a "Periodic Table" of Command and Control Functions as well as Probabilistic Risk Assessment (PRA) models is demonstrated. • The PRA models are executed using data from human reliability data banks. • The Periodic Table is related to the PRA models via Fault Links.

  20. Comparative Analysis of the Dark Ground Buffy Coat Technique (DG ...

    African Journals Online (AJOL)

    The prevalence of trypanosome infection in 65 cattle reared under an extensive system of management was determined using the dark ground buffy coat (DG) technique and the enzyme-linked immunosorbent assay (ELISA). The DG technique showed that there were 18 positive cases (27.69%) of the total number of animals, made ...

  1. Advanced patch-clamp techniques and single-channel analysis

    NARCIS (Netherlands)

    Biskup, B; Elzenga, JTM; Homann, U; Thiel, G; Wissing, F; Maathuis, FJM

    Much of our knowledge of ion-transport mechanisms in plant cell membranes comes from experiments using voltage-clamp. This technique allows the measurement of ionic currents across the membrane, whilst the voltage is held under experimental control. The patch-clamp technique was developed to study

  2. Coke drums inspection and evaluation using stress and strain analysis techniques

    Energy Technology Data Exchange (ETDEWEB)

    Haraguchi, Marcio Issamu [Tricom Tecnologia e Servicos de Manutencao Industrial Ltda., Piquete, SP (Brazil); Samman, Mahmod [Houston Engineering Solutions, Houston, TX (United States); Tinoco, Ediberto Bastos; Marangone, Fabio de Castro; Silva, Hezio Rosa da; Barcelos, Gustavo de Carvalho [Petroleo Brasileiro S.A. (PETROBRAS), Rio de Janeiro, RJ (Brazil)

    2012-07-01

    Coke drums deform due to a complex combination of mechanical and thermal cyclic stresses. Bulges have progressive behavior and represent the main maintenance problem related to these drums. Bulge failure typically results in through-wall cracks, leaks, and sometimes fires. Such failures generally do not represent a great risk to personnel. Repairs needed to maintain the reliability of these vessels might require extensive interruption to operation, which in turn considerably impacts the profitability of the unit. Therefore the condition, progression and severity of these bulges should be closely monitored. Coke drums can be inspected during turnaround with 3D Laser Scanning and Remote Visual Inspection (RVI) tools, resulting in a detailed dimensional and visual evaluation of the internal surface. A typical project has several goals: inspect the equipment to generate maintenance or inspection recommendations, and compare with previous results and baseline data. Until recently, coke drum structural analysis has traditionally been performed by analyzing Stress Concentration Factors (SCF) through Finite Element Analysis methods; however, this technique has some serious technical and practical limitations. To avoid these shortcomings, the new strain analysis technique PSI (Plastic Strain Index) was developed. This method, which is based on the API 579/ASME FFS standard failure limit, represents the state of the art in coke drum bulging severity assessment and has an excellent correlation with failure history. (author)

  3. Comparison between ultrasound guided technique and digital palpation technique for radial artery cannulation in adult patients: An updated meta-analysis of randomized controlled trials.

    Science.gov (United States)

    Bhattacharjee, Sulagna; Maitra, Souvik; Baidya, Dalim K

    2018-03-22

    Possible advantages and risks associated with ultrasound guided radial artery cannulation in comparison to the digital palpation guided method in adult patients are not fully known. We have compared ultrasound guided radial artery cannulation with the digital palpation technique in this meta-analysis. Meta-analysis of randomized controlled trials. Trials conducted in the operating room, emergency department, and cardiac catheterization laboratory. PubMed and the Cochrane Central Register of Controlled Trials (CENTRAL) were searched (from 1946 to 20th November 2017) to identify prospective randomized controlled trials in adult patients. Two-dimensional ultrasound guided radial artery catheterization versus digital palpation guided radial artery cannulation. Overall cannulation success rate, first attempt success rate, time to cannulation and mean number of attempts to successful cannulation. Odds ratio (OR) and standardized mean difference (SMD) or mean difference (MD) with 95% confidence interval (CI) were calculated for categorical and continuous variables respectively. Data of 1895 patients from 10 studies have been included in this meta-analysis. Overall cannulation success rate was similar between the ultrasound guided technique and digital palpation [OR (95% CI) 2.01 (1.00, 4.06); p = 0.05]. Ultrasound guided radial artery cannulation is associated with a higher first attempt success rate of radial artery cannulation in comparison to digital palpation [OR (95% CI) 2.76 (1.86, 4.10); p < 0.05] compared with the palpation technique. Radial artery cannulation by ultrasound guidance may increase the first attempt success rate but not the overall cannulation success when compared to the digital palpation technique. However, results of this meta-analysis should be interpreted with caution due to the presence of heterogeneity. Copyright © 2018. Published by Elsevier Inc.
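
    A minimal sketch of how an odds ratio and its 95% confidence interval are computed from a 2x2 table, as done for the categorical outcomes above (the counts are hypothetical, not taken from the included trials):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and 95% CI from a 2x2 table:
    a/b = successes/failures in group 1, c/d = successes/failures in group 2."""
    or_ = (a * d) / (b * c)
    se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)   # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se_log_or)
    hi = math.exp(math.log(or_) + z * se_log_or)
    return or_, lo, hi

# Hypothetical first-attempt success/failure counts: ultrasound vs palpation
print(odds_ratio_ci(80, 20, 60, 40))
```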

  4. A hybrid electron and photon IMRT planning technique that lowers normal tissue integral patient dose using standard hardware.

    Science.gov (United States)

    Rosca, Florin

    2012-06-01

    To present a mixed electron and photon IMRT planning technique using electron beams with an energy range of 6-22 MeV and standard hardware that minimizes integral dose to patients for targets as deep as 7.5 cm. Ten brain cases, two lung, a thyroid, an abdominal, and a parotid case were planned using two planning techniques: a photon-only IMRT (IMRT) versus a mixed modality treatment (E+IMRT) that includes an en face electron beam and a photon IMRT portion that ensures uniform target coverage. The electron beam is delivered using a regular cutout placed in an electron cone. The electron energy was chosen to provide a good trade-off between minimizing integral dose and generating a uniform, deliverable plan. The authors chose electron energies that cover the deepest part of the PTV with the 65%-70% isodose line. The normal tissue integral dose, the dose for ring structures around the PTV, and the volumes of the 75%, 50%, and 25% isosurfaces were used to compare the dose distributions generated by the two planning techniques. The normal tissue integral dose was lowered by about 20% by the E+IMRT plans compared to the photon-only IMRT ones for most studied cases. With the exception of lungs, the dose reduction associated with the E+IMRT plans was more pronounced further away from the target. The average dose ratios delivered to the 0-2 cm and the 2-4 cm ring structures for brain patients for the two planning techniques were 89.6% and 70.8%, respectively. The enhanced dose sparing away from the target for the brain patients can also be observed in the ratio of the 75%, 50%, and 25% isodose line volumes for the two techniques, which decreases from 85.5% to 72.6% and further to 65.1%, respectively. For lungs, the lateral electron beams used in the E+IMRT plans were perpendicular to the mostly anterior/posterior photon beams, generating much more conformal plans. The authors proved that even using the existing electron delivery hardware, a mixed electron/photon planning

  5. Application status of on-line nuclear techniques in analysis of coal quality

    International Nuclear Information System (INIS)

    Cai Shaohui

    1993-01-01

    Nuclear techniques are favourable for continuous on-line analysis because they are fast and non-intrusive. They can be used in the adverse circumstances of the coal industry. The paper reviews the application status of on-line nuclear techniques in the analysis of coal quality and the economic benefits derived from such techniques in developed countries

  6. Improving Skill Development: An Exploratory Study Comparing a Philosophical and an Applied Ethical Analysis Technique

    Science.gov (United States)

    Al-Saggaf, Yeslam; Burmeister, Oliver K.

    2012-01-01

    This exploratory study compares and contrasts two types of critical thinking techniques; one is a philosophical and the other an applied ethical analysis technique. The two techniques analyse an ethically challenging situation involving ICT that a recent media article raised to demonstrate their ability to develop the ethical analysis skills of…

  7. Cybersecurity Vulnerability Analysis of the PLC PRIME Standard

    Directory of Open Access Journals (Sweden)

    Miguel Seijo Simó

    2017-01-01

    Full Text Available Security in critical infrastructures such as the power grid is of vital importance. The Smart Grid puts the power grid's classical security approach on the ropes, since it introduces cyberphysical systems where devices, communications, and information systems must be protected. PoweRline Intelligent Metering Evolution (PRIME) is a Narrowband Power-Line Communications (NB-PLC) protocol widely used in the last mile of Advanced Metering Infrastructure (AMI) deployments, playing a key role in the Smart Grid. Therefore, this work aims to unveil the cybersecurity vulnerabilities present in the PRIME standard, proposing solutions and validating and discussing the results obtained.

  8. Development, improvement and calibration of neutronic reaction rate measurements: elaboration of a base of standard techniques; Developpement, amelioration et calibration des mesures de taux de reaction neutroniques: elaboration d`une base de techniques standards

    Energy Technology Data Exchange (ETDEWEB)

    Hudelot, J.P

    1998-06-19

    In order to improve and to validate the neutronic calculation schemes, perfecting integral measurements of neutronic parameters is necessary. This thesis focuses on the conception, improvement and development of neutronic reaction rate measurements, and aims at building a base of standard techniques. Two subjects are discussed. The first deals with direct measurements by fission chambers. A short presentation of the different usual techniques is given. Then, these are applied through the example of doubling time measurements on the EOLE facility during the MISTRAL 1 experimental programme. Two calibration devices for fission chambers were developed: a thermal column located in the central part of the MINERVE facility, and a calibration cell using a pulsed high flux neutron generator and based on discrimination of the energy of the neutrons with a time-of-flight method. This second device will soon allow the mass of fission chambers to be measured with a precision of about 1%. Finally, the necessity of those calibrations will be shown through spectral index measurements in cores MISTRAL 1 (UO₂) and MISTRAL 2 (MOX) of the EOLE facility. In each case, the associated calculation schemes, performed using the Monte Carlo MCNP code with the ENDF-BV library, will be validated. Concerning the second subject, the goal is to develop a method for measuring the modified conversion ratio of ²³⁸U (defined as the ratio of the ²³⁸U capture rate to the total fission rate) by gamma-ray spectrometry of fuel rods. Within the framework of the MISTRAL 1 and MISTRAL 2 programmes, the measurement device, the experimental results and the spectrometer calibration are described. Furthermore, the MCNP calculations of neutron self-shielding and gamma self-absorption are validated. It is finally shown that measurement uncertainties are better than 1%. The extension of this technique to future modified conversion ratio measurements for ²⁴²Pu (on MOX rods) and

  9. Comparative analysis of data mining techniques for business data

    Science.gov (United States)

    Jamil, Jastini Mohd; Shaharanee, Izwan Nizal Mohd

    2014-12-01

    Data mining is the process of employing one or more computer learning techniques to automatically analyze and extract knowledge from data contained within a database. Companies are using this tool to further understand their customers, to design targeted sales and marketing campaigns, to predict what products customers will buy and the frequency of purchase, and to spot trends in customer preferences that can lead to new product development. In this paper, we take a systematic approach to exploring several data mining techniques in business applications. The experimental results reveal that all the data mining techniques accomplish their goals, but each technique has its own characteristics and specifications that demonstrate its accuracy, proficiency and preference.

  10. ANALYSIS OF RELATIONS BETWEEN JUDO TECHNIQUES AND SPECIFIC MOTOR ABILITIES

    Directory of Open Access Journals (Sweden)

    Patrik Drid

    2006-06-01

    Full Text Available Specific physical preparation affects the development of the motor abilities required for the execution of specific movements in judo. When selecting proper specific judo exercises for a target motor ability, it is necessary first to study the structure of specific judo techniques and the activity of the individual muscle groups engaged in executing the technique. On the basis of this, one can understand which muscles are most engaged during the realization of individual techniques, which serves as a standpoint for the selection of a particular complex of specific exercises to produce the highest effects. In addition to the development of particular muscle groups, the means of specific preparation will also affect the development of those motor abilities which are considered indispensable for developing the particular qualities characteristic of judo. This paper analyses the relationship between the field of judo techniques and specific motor abilities.

  11. Techniques for the Statistical Analysis of Observer Data

    National Research Council Canada - National Science Library

    Bennett, John G

    2001-01-01

    .... The two techniques are as follows: (1) fitting logistic curves to the vehicle data, and (2) using the Fisher Exact Test to compare the probability of detection of the two vehicles at each range...
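
    A minimal sketch of the second technique, comparing the detection outcomes of two vehicles at a single range with Fisher's Exact Test (the counts and the use of scipy are illustrative assumptions, not data from the report):

```python
from scipy.stats import fisher_exact

# Hypothetical detections vs misses for two vehicles at one range
vehicle_a = [18, 2]   # detected, missed
vehicle_b = [11, 9]

odds_ratio, p_value = fisher_exact([vehicle_a, vehicle_b])
print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.3f}")
```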

  12. Analysis of neutron-reflectometry data by Monte Carlo technique

    CERN Document Server

    Singh, S

    2002-01-01

    Neutron-reflectometry data is collected in momentum space. The real-space information is extracted by fitting a model for the structure of a thin-film sample. We have attempted a Monte Carlo technique to extract the structure of the thin film. In this technique we change the structural parameters of the thin film by simulated annealing based on the Metropolis algorithm. (orig.)
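
    A minimal sketch of Metropolis-style simulated annealing over model parameters, the general approach described above (the misfit function, step size and cooling schedule are hypothetical placeholders for a real reflectivity model):

```python
import math
import random

def simulated_annealing(chi2, params, step=0.05, t0=1.0, cooling=0.999, iters=2000):
    """Generic Metropolis-style simulated annealing over a parameter vector.
    `chi2` measures the misfit between model reflectivity and data (hypothetical)."""
    current = list(params)
    best, best_cost = list(current), chi2(current)
    cost, temp = best_cost, t0
    for _ in range(iters):
        trial = [p + random.gauss(0.0, step) for p in current]
        trial_cost = chi2(trial)
        # Metropolis acceptance: always accept improvements,
        # accept worse moves with probability exp(-delta / T)
        if trial_cost < cost or random.random() < math.exp(-(trial_cost - cost) / temp):
            current, cost = trial, trial_cost
            if cost < best_cost:
                best, best_cost = list(current), cost
        temp *= cooling  # slowly anneal the temperature
    return best, best_cost

# Toy misfit with a minimum at (thickness=1.2, roughness=0.3) -- hypothetical
toy_chi2 = lambda p: (p[0] - 1.2) ** 2 + (p[1] - 0.3) ** 2
print(simulated_annealing(toy_chi2, [0.0, 0.0]))
```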

  13. A Comparative Analysis of Machine Learning Techniques for Credit Scoring

    OpenAIRE

    Nwulu, Nnamdi; Oroja, Shola; İlkan, Mustafa

    2012-01-01

    Abstract Credit Scoring has become an oft-researched topic in light of the increasing volatility of the global economy and the recent world financial crisis. Amidst the many methods used for credit scoring, machine learning techniques are becoming increasingly popular due to their efficient and accurate nature and relative simplicity. Furthermore, machine learning techniques minimize the risk of human bias and error and maximize speed as they are able to perform computation...

  14. Protein purification and analysis: next generation Western blotting techniques.

    Science.gov (United States)

    Mishra, Manish; Tiwari, Shuchita; Gomes, Aldrin V

    2017-11-01

    Western blotting is one of the most commonly used techniques in molecular biology and proteomics. Since western blotting is a multistep protocol, variations and errors can occur at any step reducing the reliability and reproducibility of this technique. Recent reports suggest that a few key steps, such as the sample preparation method, the amount and source of primary antibody used, as well as the normalization method utilized, are critical for reproducible western blot results. Areas covered: In this review, improvements in different areas of western blotting, including protein transfer and antibody validation, are summarized. The review discusses the most advanced western blotting techniques available and highlights the relationship between next generation western blotting techniques and its clinical relevance. Expert commentary: Over the last decade significant improvements have been made in creating more sensitive, automated, and advanced techniques by optimizing various aspects of the western blot protocol. New methods such as single cell-resolution western blot, capillary electrophoresis, DigiWest, automated microfluid western blotting and microchip electrophoresis have all been developed to reduce potential problems associated with the western blotting technique. Innovative developments in instrumentation and increased sensitivity for western blots offer novel possibilities for increasing the clinical implications of western blot.

  15. An Analysis of the Doodle Tonguing Technique for Trombone and its Application to Performance

    OpenAIRE

    Vizard, Christopher

    2018-01-01

    This exegesis investigates the Doodle Tonguing technique for trombone, and its use in jazz performance. Doodle Tonguing is a modification of the standard tonguing techniques used by brass players and was developed for the trombone by Carl Fontana in the late 1940s. A review of literature showed that Doodle Tonguing was developed in order to allow trombonists to play difficult rapid passages with greater fluidity and evenness than could be achieved with standard tonguing tech...

  16. Refined analysis of piping systems according to nuclear standard regulations

    International Nuclear Information System (INIS)

    Bisconti, N.; Lazzeri, L.; Strona, P.P.

    1975-01-01

    A number of programs have been selected to perform particular analyses, partly coming from available libraries, such as SAP 4 for static and dynamic analysis, and partly written directly, such as TRATE (for thermal analysis), VASTA and VASTB (to perform the analyses required by ASME 3 for pipings of class A and class B), and CFRS (for the calculation of floor response spectra, etc.). All the programs are automatically linked and directed by a general program (SCATCA for class A and SCATCB for class B pipings). The starting point is a list of the fabrication, thermal, geometrical and seismic data. The geometrical data are plotted (to check for possible errors) and fed to SAP for static and dynamic analysis, together with the seismic data and the thermal data (average temperatures) re-elaborated by the TRATE 2 code. The raw data from SAP (weight, thermal, fixed-point displacements, seismic, other dynamic) are collected and reordered and fed to the COMBIN 2 program together with the other data from the thermal analysis (from TRATE 2). From the COMBIN 2 program all the data are listed; each load set to be considered is provided, for each point, with the necessary data (thermal moments, pressure, average temperatures, thermal gradients), and all the data from the seismic, weight, and other dynamic analyses are also provided. All this data is stored in a file and examined by the VASTA code (for class A) or VASTB (for classes B, C) in order to make a decision about the acceptability of the design. Each subprogram may have an independent output in order to check partial results. Details about each program are provided and an example is given, together with a discussion of some particular problems (thermohydraulic set definition, fatigue analysis, etc.)

  17. A new technique for quantitative analysis of hair loss in mice using grayscale analysis.

    Science.gov (United States)

    Ponnapakkam, Tulasi; Katikaneni, Ranjitha; Gulati, Rohan; Gensure, Robert

    2015-03-09

    Alopecia is a common form of hair loss which can occur in many different conditions, including male-pattern hair loss, polycystic ovarian syndrome, and alopecia areata. Alopecia can also occur as a side effect of chemotherapy in cancer patients. In this study, our goal was to develop a consistent and reliable method to quantify hair loss in mice, which will allow investigators to accurately assess and compare new therapeutic approaches for these various forms of alopecia. The method utilizes a standard gel imager to obtain and process images of mice, measuring the light absorption, which occurs in rough proportion to the amount of black (or gray) hair on the mouse. Data that has been quantified in this fashion can then be analyzed using standard statistical techniques (i.e., ANOVA, T-test). This methodology was tested in mouse models of chemotherapy-induced alopecia, alopecia areata and alopecia from waxing. In this report, the detailed protocol is presented for performing these measurements, including validation data from C57BL/6 and C3H/HeJ strains of mice. This new technique offers a number of advantages, including relative simplicity of application, reliance on equipment which is readily available in most research laboratories, and applying an objective, quantitative assessment which is more robust than subjective evaluations. Improvements in quantification of hair growth in mice will improve study of alopecia models and facilitate evaluation of promising new therapies in preclinical studies.
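
    A minimal sketch of the kind of grayscale quantification described, averaging pixel intensity over a region of interest (the use of Pillow/NumPy and the file names are assumptions; this is not the authors' gel-imager pipeline):

```python
import numpy as np
from PIL import Image

def mean_gray_value(path, roi=None):
    """Average grayscale intensity over an optional region of interest.
    For a dark-coated mouse on a light background, lower values indicate more hair."""
    img = np.asarray(Image.open(path).convert("L"), dtype=float)
    if roi is not None:
        r0, r1, c0, c1 = roi          # row/column bounds of the region of interest
        img = img[r0:r1, c0:c1]
    return img.mean()

# Hypothetical before/after images of the same mouse
# print(mean_gray_value("mouse_day0.png"), mean_gray_value("mouse_day21.png"))
```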

  18. DAG expression: high-throughput gene expression analysis of real-time PCR data using standard curves for relative quantification.

    Directory of Open Access Journals (Sweden)

    María Ballester

    Full Text Available BACKGROUND: Real-time quantitative PCR (qPCR) is still the gold-standard technique for gene-expression quantification. Recent technological advances of this method allow for the high-throughput gene-expression analysis, without the limitations of sample space and reagent used. However, non-commercial and user-friendly software for the management and analysis of these data is not available. RESULTS: The recently developed commercial microarrays allow for the drawing of standard curves of multiple assays using the same n-fold diluted samples. Data Analysis Gene (DAG) Expression software has been developed to perform high-throughput gene-expression data analysis using standard curves for relative quantification and one or multiple reference genes for sample normalization. We discuss the application of DAG Expression in the analysis of data from an experiment performed with Fluidigm technology, in which 48 genes and 115 samples were measured. Furthermore, the quality of our analysis was tested and compared with other available methods. CONCLUSIONS: DAG Expression is a freely available software that permits the automated analysis and visualization of high-throughput qPCR. A detailed manual and a demo-experiment are provided within the DAG Expression software at http://www.dagexpression.com/dage.zip.
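
    A minimal sketch of relative quantification from a standard curve with reference-gene normalization, the general approach such software automates (the dilution series, Cq values and reference quantity are hypothetical):

```python
import numpy as np

def standard_curve(log10_quantity, cq):
    """Fit a qPCR standard curve Cq = slope*log10(quantity) + intercept."""
    slope, intercept = np.polyfit(log10_quantity, cq, 1)
    efficiency = 10 ** (-1.0 / slope) - 1.0    # ~1.0 corresponds to 100% efficiency
    return slope, intercept, efficiency

def quantity_from_cq(cq, slope, intercept):
    """Interpolate an unknown sample's relative quantity from its Cq."""
    return 10 ** ((cq - intercept) / slope)

# Hypothetical 10-fold dilution series of a target gene
dil = np.array([0, -1, -2, -3, -4])            # log10 relative quantity
cqs = np.array([18.1, 21.4, 24.8, 28.2, 31.5])
slope, intercept, eff = standard_curve(dil, cqs)

# Normalize the target quantity by a reference-gene quantity (values hypothetical)
target = quantity_from_cq(25.3, slope, intercept)
reference = 0.004
print(f"efficiency={eff:.2f}, normalized expression={target / reference:.3f}")
```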

  19. Drought analysis of Antalya province by standardized precipitation index (SPI

    Directory of Open Access Journals (Sweden)

    Nazmi DİNÇ

    2016-12-01

    Full Text Available Drought is occurring as a result of global warming in our country as well as all over the world, and is defined as a precipitation deficit over a certain time period in which precipitation is lower than normal. It negatively affects all living beings. Many drought indices have been developed to define the severity and characteristics of drought over time and space. In this study, drought characteristics have been evaluated by using the Standardized Precipitation Index (SPI) at meteorological stations located in Alanya, Antalya, Demre, Elmalı, Finike, Gazipaşa, Korkuteli and Manavgat, which have long-term data (1974-2014). According to the 3-, 6-, 12- and 24-month time scales, the trend in SPI values is not decreasing, and the SPI values were found to be between 0.99 (normal) and ~ -0.99 (drought close to normal). It is concluded that drought can occur in summer as well as in winter.
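
    A minimal sketch of how an SPI series is commonly computed: precipitation is aggregated over the chosen time scale, fitted to a gamma distribution, and transformed to a standard normal score (the precipitation series and the simple gamma fit, without separate handling of zero-rainfall months, are illustrative assumptions):

```python
import numpy as np
from scipy import stats

def spi(precip, scale=3):
    """Standardized Precipitation Index: aggregate precipitation over `scale`
    months, fit a gamma distribution, and map to a standard normal score."""
    precip = np.asarray(precip, dtype=float)
    # Rolling sums over the chosen time scale (e.g. 3-, 6-, 12-month SPI)
    agg = np.convolve(precip, np.ones(scale), mode="valid")
    shape, loc, beta = stats.gamma.fit(agg, floc=0)   # location fixed at 0
    cdf = stats.gamma.cdf(agg, shape, loc=loc, scale=beta)
    return stats.norm.ppf(cdf)                        # SPI values: ~N(0, 1)

# Hypothetical monthly precipitation series (mm)
rng = np.random.default_rng(0)
monthly = rng.gamma(2.0, 30.0, size=120)
print(spi(monthly, scale=3)[:5].round(2))
```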

  20. Standard guide for corrosion-related failure analysis

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2000-01-01

    1.1 This guide covers key issues to be considered when examining metallic failures when corrosion is suspected as either a major or minor causative factor. 1.2 Corrosion-related failures could include one or more of the following: change in surface appearance (for example, tarnish, rust, color change), pin hole leak, catastrophic structural failure (for example, collapse, explosive rupture, implosive rupture, cracking), weld failure, loss of electrical continuity, and loss of functionality (for example, seizure, galling, spalling, swelling). 1.3 Issues covered include overall failure site conditions, operating conditions at the time of failure, history of equipment and its operation, corrosion product sampling, environmental sampling, metallurgical and electrochemical factors, morphology (mode) or failure, and by considering the preceding, deducing the cause(s) of corrosion failure. This standard does not purport to address all of the safety concerns, if any, associated with its use. It is the responsibili...

  1. Evaluation of nuclear reactor based activation analysis techniques

    International Nuclear Information System (INIS)

    Obrusnik, I.; Kucera, J.

    1977-09-01

    A survey is presented of the basic types of activation analysis applied in environmental control. Reactor neutron activation analysis is described (including the reactor as a neutron source, sample activation in the reactor, methodology of neutron activation analysis, sample transport into the reactor and sample packaging after irradiation, instrumental activation analysis with radiochemical separation, data measurement and evaluation, sampling and sample preparation). Sources of environmental contamination with trace elements, sampling and sample analysis by neutron activation are described. The analysis is described of soils, waters and biological materials. Methods are shown of evaluating neutron activation analysis results and of their interpretation for purposes of environmental control. (J.B.)

  2. Comparison of standard PCR/cloning to single genome sequencing for analysis of HIV-1 populations.

    Science.gov (United States)

    Jordan, Michael R; Kearney, Mary; Palmer, Sarah; Shao, Wei; Maldarelli, Frank; Coakley, Eoin P; Chappey, Colombe; Wanke, Christine; Coffin, John M

    2010-09-01

    To compare standard PCR/cloning and single genome sequencing (SGS) in their ability to reflect actual intra-patient polymorphism of HIV-1 populations, a total of 530 HIV-1 pro-pol sequences obtained by both sequencing techniques from a set of 17 ART naïve patient specimens was analyzed. For each specimen, 12 and 15 sequences, on average, were characterized by the two techniques. Using phylogenetic analysis, tests for panmixia and entropy, and Bland-Altman plots, no difference in population structure or genetic diversity was shown in 14 of the 17 subjects. Evidence of sampling bias by the presence of subsets of identical sequences was found by either method. Overall, the study shows that neither method was more biased than the other, and providing that an adequate number of PCR templates is analyzed, and that the bulk sequencing captures the diversity of the viral population, either method is likely to provide a similar measure of population diversity. Copyright 2010 Elsevier B.V. All rights reserved.
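
    A minimal sketch of the Bland-Altman comparison used above, giving the bias and 95% limits of agreement between paired diversity estimates (the per-patient values are hypothetical):

```python
import numpy as np

def bland_altman(method_a, method_b):
    """Bland-Altman agreement statistics for paired measurements."""
    a, b = np.asarray(method_a, float), np.asarray(method_b, float)
    diff = a - b
    bias = diff.mean()
    loa = 1.96 * diff.std(ddof=1)        # half-width of the 95% limits of agreement
    return bias, bias - loa, bias + loa

# Hypothetical per-patient diversity (%) from PCR/cloning vs single genome sequencing
cloning = [0.41, 0.55, 0.32, 0.61, 0.48]
sgs     = [0.39, 0.58, 0.30, 0.66, 0.45]
print(bland_altman(cloning, sgs))
```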

  3. Twitter Sentiment Analysis of Movie Reviews using Machine Learning Techniques.

    OpenAIRE

    Akshay Amolik; Niketan Jivane; Mahavir Bhandari; Dr.M.Venkatesan

    2015-01-01

    Sentiment analysis is basically concerned with the analysis of emotions and opinions from text. We can also refer to sentiment analysis as opinion mining. Sentiment analysis finds and justifies the sentiment of a person with respect to a given source of content. Social media contain a huge amount of sentiment data in the form of tweets, blogs, status updates, posts, etc. Sentiment analysis of this largely generated data is very useful for expressing the opinion of the masses. Twitter sentiment a...

  4. Analysis of obsidians and films of silicon carbide by RBS technique

    International Nuclear Information System (INIS)

    Franco S, F.

    1998-01-01

    Motivated by archaeological interest, this work presents the characterization of obsidian samples from different mineral sites in Mexico, and of silicon carbide films, undertaken by an ion beam analysis technique: RBS (Rutherford Backscattering Spectrometry). As part of an intensive investigation of obsidian in Mesoamerica by anthropologists from Mexico's National Institute of Anthropology and History, 818 samples were collected from different volcanic sources in Central Mexico for the purpose of establishing a data bank of element concentrations for each source. Part of this collection was analyzed by neutron activation analysis and most of the important element concentrations were reported. In the first part of this work, the non-destructive IBA technique RBS is used to analyze obsidian samples. The last part is an analysis of thin films of silicon carbide as part of a research program of the Universidad Nacional Autonoma de Mexico and ININ. The application of this technique was carried out at the IF-UNAM, and the analysis was performed at laboratories of the ININ Nuclear Centre facilities. The samples considered in this work were mounted on a sample holder designed for the purpose of exposing each sample to the alpha particle beam. The RBS analysis was carried out with an ET Tandem accelerator at the IF-UNAM. The spectrometry was carried out employing a Si(Li) detector set at 15 degrees with respect to the target normal. The mean projectile energy was 2.00 MeV, and the beam profile was about 4 mm in diameter. As results, elemental concentrations were obtained for a set of samples from ten different sources: Altotonga (Veracruz), Penjamo (Guanajuato), Otumba (Mexico), Zinapecuaro (Michoacan), Ucareo (Michoacan), Tres Cabezas (Puebla), Sierra Navajas (Hidalgo), Zaragoza (Puebla), Guadalupe Victoria (Puebla) and Oyameles (Puebla). The mean values are accompanied by errors expressed as one standard deviation of the mean for each element.
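
    For context, the energy of an elastically backscattered projectile in RBS is set by the kinematic factor K. The Python sketch below evaluates K for 2.00 MeV alpha particles on silicon; the 170-degree scattering angle is an assumed illustrative geometry, not the detector arrangement quoted above.

        import math

        def rbs_kinematic_factor(m_projectile, m_target, theta_deg):
            """Kinematic factor K = E_scattered / E_incident for elastic backscattering."""
            theta = math.radians(theta_deg)
            m1, m2 = m_projectile, m_target
            root = math.sqrt(m2**2 - (m1 * math.sin(theta))**2)
            return ((m1 * math.cos(theta) + root) / (m1 + m2)) ** 2

        # 2.00 MeV alpha particles (m ~ 4 u) backscattered from Si (m ~ 28 u);
        # the 170-degree scattering angle is an illustrative assumption
        energy_mev = 2.00
        k = rbs_kinematic_factor(4.0026, 27.977, 170.0)
        print(f"K = {k:.3f}, backscattered energy ~ {k * energy_mev:.2f} MeV")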

  5. Production of uranium standard samples for spectrographic analysis

    International Nuclear Information System (INIS)

    Neuilly, M.; Leclerc, J.C.

    1969-01-01

    This report describes the conditions of preparation of twelve castings of uranium intended for use as reference samples in spectrographic analysis. Results are given of impurity determinations carried out by several laboratories using different methods, together with the 'probable values' of the concentrations. Samples of these different castings are now available and can be sent to any laboratory which requires them. (authors) [fr

  6. Proposed minimum reporting standards for data analysis in metabolomics

    NARCIS (Netherlands)

    Goodacre, R.; Broadhurst, D.; Smilde, A.K.; Kristal, B.S.; Baker, J.D.; Beger, R.; Bessant, C.; Connor, S.; Capuani, G.; Craig, A.; Ebbels, T.; Kell, D.B.; Manetti, C.; Newton, J.; Paternostro, G.; Somorjai, R.; Sjöström, M.; Trygg, J.; Wulfert, F.

    2007-01-01

    The goal of this group is to define the reporting requirements associated with the statistical analysis (including univariate, multivariate, informatics, machine learning etc.) of metabolite data with respect to other measured/collected experimental data (often called meta-data). These definitions

  7. Tribological analysis of nano clay/epoxy/glass fiber by using Taguchi’s technique

    International Nuclear Information System (INIS)

    Senthil Kumar, M.S.; Mohana Sundara Raju, N.; Sampath, P.S.; Vivek, U.

    2015-01-01

    Highlights: • To study the tribological properties of modified epoxy with and without E-glass fiber. • To analyze the tribological properties of specimens by Taguchi’s technique and ANOVA. • To investigate the surface morphology of test specimens with SEM. - Abstract: In this work, a detailed analysis was performed to study in depth the tribological properties of epoxy loaded with various amounts of nano clay (Cloisite 25A), with and without inclusion of E-glass fiber, using Taguchi’s technique. For this purpose, the test samples were prepared according to the ASTM standard, and the tests were carried out with the assistance of a pin-on-disk machine. An L25 orthogonal array was constructed to evaluate the tribological properties with four control variables: filler content, normal load, sliding velocity and sliding distance, each at several levels. The results indicated that the combination of factors greatly influenced the process for achieving minimum wear and coefficient of friction. Overall, the experimental results showed the least wear and friction coefficient for the fiber-reinforced laminates, while appreciable wear and friction coefficient were noted for the laminates without fiber. The S/N ratio results also exhibited a similar trend. Moreover, ANOVA revealed that fiber inclusion in the laminates has a smaller contribution to the coefficient of friction and wear compared with the laminates without fiber. Finally, the microstructure of the test samples was investigated with the assistance of a Scanning Electron Microscope (SEM) to analyze the surface morphology.
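
    As a brief illustration of the Taguchi evaluation used here, the Python sketch below computes the "smaller-the-better" signal-to-noise ratio that is normally applied to wear and friction responses; the replicate values are hypothetical.

        import numpy as np

        def sn_smaller_the_better(responses):
            """Taguchi signal-to-noise ratio for a 'smaller-the-better' characteristic
            such as wear or coefficient of friction: S/N = -10 * log10(mean(y^2))."""
            y = np.asarray(responses, dtype=float)
            return -10.0 * np.log10(np.mean(y ** 2))

        # Hypothetical replicate wear measurements (mm^3) for two runs of an L25 array
        run_a = [0.012, 0.014, 0.013]
        run_b = [0.031, 0.029, 0.033]
        print(sn_smaller_the_better(run_a))   # higher S/N -> less wear, preferred
        print(sn_smaller_the_better(run_b))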

  8. Use of an internal standard 233U, 236U to improve the accuracy of isotopic uranium analysis by thermal ionization mass spectrometry. Application to isotope dilution analysis

    International Nuclear Information System (INIS)

    Chevalier, C.; Hagemann, R.; Lucas, M.; Devillers, C.

    1982-01-01

    A method using a calibrated mixture of the isotopes 233U and 236U has been developed in order to correct for the isotopic fractionation which limits the accuracy of isotopic analysis by thermal ionization mass spectrometry. The 236/233 internal standard ratio is calibrated against the 235/238 ratios of uranium isotopic standards. To perform the analysis, the unknown sample is mixed with the internal standard; the difference between the true and observed values of the 236/233 ratio allows the determination of a correction factor, which is applied to the measured 235/238 ratio values. Since 1978, 235U abundance measurements on series of samples have been performed using this technique; the data are obtained with an accuracy better than 0.05%. It is intended to apply this method to the precise determination of the 238/233 ratio in the case of uranium concentration measurements by isotope dilution [fr
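
    A minimal sketch of this kind of internal-standard correction, assuming a simple linear mass-fractionation law (the paper's exact law and sign conventions are not given here), might look as follows; all ratio values are hypothetical.

        def fractionation_per_amu(r_obs_236_233, r_true_236_233, delta_mass=3.0):
            """Linear-law fractionation per atomic mass unit deduced from the
            233U/236U internal standard (ratio convention: heavier over lighter)."""
            return (r_obs_236_233 / r_true_236_233 - 1.0) / delta_mass

        def correct_ratio(r_obs, eps_per_amu, delta_mass):
            """Apply the linear-law correction R_true = R_obs / (1 + eps * delta_mass)."""
            return r_obs / (1.0 + eps_per_amu * delta_mass)

        # Hypothetical values for illustration only
        eps = fractionation_per_amu(r_obs_236_233=1.0012, r_true_236_233=1.0000)
        r_235_238_true = correct_ratio(r_obs=0.007265, eps_per_amu=eps, delta_mass=-3.0)
        print(eps, r_235_238_true)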

  9. Performance analysis of two-way DF relay selection techniques

    Directory of Open Access Journals (Sweden)

    Samer Alabed

    2016-09-01

    This work proposes novel bi-directional dual-relay selection techniques based on Alamouti space-time block coding (STBC) using the decode-and-forward (DF) protocol and analyzes their performance. In the proposed techniques, two- and three-phase relaying schemes are used to perform bi-directional communication between the communicating terminals via two selected single-antenna relays that employ the Alamouti STBC in a distributed fashion to achieve diversity and orthogonalization of the channels, and hence improve the reliability of the system and enable the use of a symbol-wise detector. Furthermore, the network coding strategy applied at all relays does not waste power broadcasting data already known at either terminal, resulting in improved overall performance at the terminals. Our simulations confirm the analytical results and show a substantially improved bit error rate (BER) performance of our proposed techniques compared with the current state of the art.
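
    For reference, the distributed scheme relies on the classical Alamouti code; the Python sketch below shows point-to-point Alamouti encoding and receiver combining, which decouples the two symbols. It is a generic illustration under a noiseless flat-fading assumption, not the authors' relay-selection algorithm.

        import numpy as np

        def alamouti_encode(s1, s2):
            """Alamouti space-time block code: rows are time slots, columns are the
            two (possibly distributed) transmit antennas."""
            return np.array([[s1,           s2],
                             [-np.conj(s2), np.conj(s1)]])

        def alamouti_combine(r1, r2, h1, h2):
            """Linear combining over two time slots with quasi-static channels h1, h2;
            yields decoupled symbol estimates scaled by |h1|^2 + |h2|^2."""
            s1_hat = np.conj(h1) * r1 + h2 * np.conj(r2)
            s2_hat = np.conj(h2) * r1 - h1 * np.conj(r2)
            return s1_hat, s2_hat

        # Toy QPSK symbols and random channels, no noise, to show the decoupling
        s1, s2 = (1 + 1j) / np.sqrt(2), (1 - 1j) / np.sqrt(2)
        h1, h2 = (np.random.randn(2) + 1j * np.random.randn(2)) / np.sqrt(2)
        x = alamouti_encode(s1, s2)
        r1 = h1 * x[0, 0] + h2 * x[0, 1]          # received in slot 1
        r2 = h1 * x[1, 0] + h2 * x[1, 1]          # received in slot 2
        est1, est2 = alamouti_combine(r1, r2, h1, h2)
        gain = abs(h1) ** 2 + abs(h2) ** 2
        print(est1 / gain, est2 / gain)           # ~ s1, s2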

  10. Radon remedial techniques in buildings - analysis of French actual cases

    International Nuclear Information System (INIS)

    Dupuis, M.

    2004-01-01

    The IRSN has compiled a collection of solutions from data provided by the various decentralised government services in 31 French departments. Contributors were asked to provide a description of the building, as well as details of measured radon levels, the type of reduction technique adopted and the cost. Illustrative layouts, technical drawings and photographs were also requested, when available. Of the cases recorded, 85% are establishments open to the public (schools (70%), city halls (4%) and combined city halls and school houses (26%)), 11% are houses and 4% industrial buildings. IRSN obtained 27 real cases of remedial techniques used. The data were presented in the form of fact sheets. The primary aim of this exercise was to illustrate each of the radon reduction techniques that can be used in the different building types (with basement, ground bearing slab, crawl space). This investigation not only enabled us to show that combining passive and active techniques reduces the operating cost of the installation, but above all that it considerably improves the efficiency. The passive technique reduces the amount of radon in the building and thus reduces the necessary ventilation rate, which directly affects the cost of operating the installation. For the 27 cases recorded, we noted: (a) the application of 7 passive techniques: sealing of floors and semi-buried walls, together with improved aeration by installing ventilation openings or ventilation strips in the windows. Radon concentrations were reduced on average by a factor of 4.7. No measurement in excess of 400 Bq·m⁻³ (the limit recommended by the French public authorities) was obtained following completion of the works; (b) the application of 15 active techniques: depressurization of the underlying ground, crawl space or basement and/or pressurization of the building. Radon concentrations were reduced on average by a factor of 13.8. Radon concentrations of over 400 Bq·m⁻³ were measured in only 4 cases.

  11. Genetic programming system for building block analysis to enhance data analysis and data mining techniques

    Science.gov (United States)

    Eick, Christoph F.; Sanz, Walter D.; Zhang, Ruijian

    1999-02-01

    Recently, many computerized data mining tools and environments have been proposed for finding interesting patterns in large data collections. These tools employ techniques that originate from research in various areas, such as machine learning, statistical data analysis, and visualization. Each of these techniques makes assumptions concerning the composition of the data collection to be analyzed. If a particular data collection does not meet these assumptions well, the technique usually performs poorly. For example, decision tree tools, such as C4.5, rely on rectangular approximations, which do not perform well if the boundaries between different classes have other shapes, such as a 45-degree line or elliptical shapes. However, if we could find a transformation f of the original attribute space under which class boundaries become more nearly axis-aligned, better rectangular approximations could be obtained. In this paper, we address the problem of finding such transformations f. We describe the features of the tool, WOLS, whose goal is the discovery of ingredients for such transformation functions f, which we call building blocks. The tool employs genetic programming and symbolic regression for this purpose. We also present and discuss the results of case studies, using the building block analysis tool, in the areas of decision tree learning and regression analysis.
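
    A minimal illustration of the underlying idea, assuming a toy two-class problem with a 45-degree boundary: adding a simple building block such as f(x1, x2) = x1 - x2 lets an axis-aligned decision tree capture the boundary with a single split. This Python sketch uses scikit-learn and is not the WOLS tool itself.

        import numpy as np
        from sklearn.tree import DecisionTreeClassifier
        from sklearn.model_selection import cross_val_score

        # Two-class toy data whose boundary is the 45-degree line x2 = x1
        rng = np.random.default_rng(0)
        X = rng.uniform(-1, 1, size=(500, 2))
        y = (X[:, 0] > X[:, 1]).astype(int)

        # Axis-aligned splits struggle with a diagonal boundary...
        tree = DecisionTreeClassifier(max_depth=3, random_state=0)
        print(cross_val_score(tree, X, y, cv=5).mean())

        # ...but a simple building block f(x1, x2) = x1 - x2 makes it a single split
        X_transformed = np.column_stack([X, X[:, 0] - X[:, 1]])
        print(cross_val_score(tree, X_transformed, y, cv=5).mean())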

  12. Geotechnical Analysis of Paleoseismic Shaking Using Liquefaction Features: Part I. Major Updating of Analysis Techniques

    Science.gov (United States)

    Olson, Scott M.; Green, Russell A.; Obermeier, Stephen F.

    2003-01-01

    A new methodology is proposed for the geotechnical analysis of strength of paleoseismic shaking using liquefaction effects. The proposed method provides recommendations for selection of both individual and regionally located test sites, techniques for validation of field data for use in back-analysis, and use of a recently developed energy-based solution to back-calculate paleoearthquake magnitude and strength of shaking. The proposed method allows investigators to assess the influence of post-earthquake density change and aging. The proposed method also describes how the back-calculations from individual sites should be integrated into a regional assessment of paleoseismic parameters.

  13. Current trends in nuclear borehole logging techniques for elemental analysis

    International Nuclear Information System (INIS)

    1988-06-01

    This report is the result of a consultants' meeting organized by the IAEA and held in Ottawa, Canada, 2-6 November 1987 in order to assess the present technical status of nuclear borehole logging techniques, to find out the well established applications and the development trends. It contains a summary report giving a comprehensive overview of the techniques and applications and a collection of research papers describing work done in industrial institutes. A separate abstract was prepared for each of these 9 papers. Refs, figs and tabs

  14. A review on applications of the wavelet transform techniques in spectral analysis

    International Nuclear Information System (INIS)

    Medhat, M.E.; Albdel-hafiez, A.; Hassan, M.F.; Ali, M.A.; Awaad, Z.

    2004-01-01

    Starting in 1989, a new technique known as the wavelet transform (WT) has been applied successfully to the analysis of different types of spectra. The WT offers certain advantages over the Fourier transform for the analysis of signals. A review of the use of this technique across different fields of elemental analysis is presented
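
    As a generic illustration of WT-based spectrum processing (not a method taken from the review itself), the sketch below soft-thresholds wavelet detail coefficients to denoise a synthetic peak spectrum using the PyWavelets package.

        import numpy as np
        import pywt

        def wavelet_denoise(spectrum, wavelet="db4", level=4):
            """Soft-threshold wavelet denoising of a 1-D spectrum (universal threshold)."""
            coeffs = pywt.wavedec(spectrum, wavelet, level=level)
            # Noise estimate from the finest detail coefficients (median absolute deviation)
            sigma = np.median(np.abs(coeffs[-1])) / 0.6745
            threshold = sigma * np.sqrt(2 * np.log(len(spectrum)))
            coeffs[1:] = [pywt.threshold(c, threshold, mode="soft") for c in coeffs[1:]]
            return pywt.waverec(coeffs, wavelet)[: len(spectrum)]

        # Synthetic spectrum: two Gaussian peaks plus noise (illustrative only)
        x = np.arange(1024)
        clean = 200 * np.exp(-((x - 300) ** 2) / 50) + 120 * np.exp(-((x - 700) ** 2) / 80)
        noisy = clean + np.random.default_rng(1).normal(0, 10, x.size)
        print(np.abs(wavelet_denoise(noisy) - clean).mean())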

  15. Determining the Number of Factors in P-Technique Factor Analysis

    Science.gov (United States)

    Lo, Lawrence L.; Molenaar, Peter C. M.; Rovine, Michael

    2017-01-01

    Determining the number of factors is a critical first step in exploratory factor analysis. Although various criteria and methods for determining the number of factors have been evaluated in the usual between-subjects R-technique factor analysis, there is still question of how these methods perform in within-subjects P-technique factor analysis. A…

  16. Novel thermal imaging analysis technique for detecting inflammation in thyroid eye disease.

    Science.gov (United States)

    Di Maria, Costanzo; Allen, John; Dickinson, Jane; Neoh, Christopher; Perros, Petros

    2014-12-01

    The disease phase in thyroid eye disease (TED) is commonly assessed by clinical investigation of cardinal signs of inflammation and using the clinical activity score (CAS). Although CAS is the current gold standard, the clinical assessment would benefit if a more objective tool were available. The aim of this work was to explore the clinical value of a novel thermal imaging analysis technique to objectively quantify the thermal characteristics of the eye and peri-orbital region and determine the disease phase in TED. This was a cross-sectional study comparing consecutive patients with active TED (CAS ≥ 3/7) attending a tertiary center, with a group of consecutive patients with inactive TED (CAS <3). Thermal images were acquired from 30 TED patients, 17 with active disease and 13 with inactive disease. Patients underwent standard ophthalmological clinical assessments and thermal imaging. Five novel thermal eye parameters (TEP) were developed to quantify the thermal characteristics of the eyes in terms of the highest level of inflammation (TEP1), overall level of inflammation (TEP2), right-left asymmetry in the level of inflammation (TEP3), maximum temperature variability across the eyes (TEP4), and right-left asymmetry in the temperature variability (TEP5). All five TEP were increased in active TED. TEP1 gave the largest accuracy (77%) at separating the two groups, with 65% sensitivity and 92% specificity. A statistical model combining all five parameters increased the overall accuracy, compared to using only one parameter, to 93% (94% sensitivity and 92% specificity). All five of the parameters were also found to be increased in patients with chemosis compared to those without. The potential diagnostic value of this novel thermal imaging analysis technique has been demonstrated. Further investigation on a larger group of patients is necessary to confirm these results.
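
    As an illustration of how such a multi-parameter classifier might be evaluated, the sketch below combines five synthetic thermal eye parameters with logistic regression (used here only as an example combiner; the study's actual statistical model is not specified) and reports sensitivity, specificity and accuracy. All data are simulated.

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        def diagnostic_metrics(y_true, y_pred):
            """Sensitivity, specificity and accuracy for a binary active/inactive call."""
            y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
            tp = np.sum((y_true == 1) & (y_pred == 1))
            tn = np.sum((y_true == 0) & (y_pred == 0))
            fp = np.sum((y_true == 0) & (y_pred == 1))
            fn = np.sum((y_true == 1) & (y_pred == 0))
            return tp / (tp + fn), tn / (tn + fp), (tp + tn) / y_true.size

        # Simulated five thermal eye parameters per patient and active/inactive labels
        rng = np.random.default_rng(3)
        tep = rng.normal(size=(30, 5)) + np.r_[np.ones((17, 5)), np.zeros((13, 5))]
        labels = np.r_[np.ones(17, dtype=int), np.zeros(13, dtype=int)]

        model = LogisticRegression().fit(tep, labels)     # combines all five parameters
        print(diagnostic_metrics(labels, model.predict(tep)))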

  17. Technique for evaluation of spatial resolution and microcalcifications in digital and scanned images of a standard breast phantom

    International Nuclear Information System (INIS)

    Santana, Priscila do C.; Gomes, Danielle S.; Oliveira, Marcio A.; Oliveira, Paulo Marcio C. de; Meira-Belo, Luiz C.; Nogueira-Tavares, Maria S.

    2011-01-01

    In this work, an automated methodology to evaluate digital and scanned images of a standard phantom (Phantom Mama) was studied. The Phantom Mama is an important tool for checking the quality of mammography systems. The scanned images were obtained by digitizing the films with a ScanMaker 9800XL at a resolution of 900 dpi. The aim of this work is to test an automatic methodology for evaluating the spatial resolution and the microcalcification groups in Phantom Mama images acquired with the same parameters on the same equipment. To analyze the images we used the ImageJ software (written in Java), which is in the public domain. We used the Fast Fourier Transform technique to evaluate the spatial resolution, and used the ImageJ Subtract Background function with the Light Background and Sliding Paraboloid options to evaluate the five groups of microcalcifications in the breast phantom, in order to assess the viability of using automated methods for both types of images. The methodology was adequate for evaluating the microcalcification groups and the spatial resolution in scanned and digital images, but the Phantom Mama does not provide sufficient parameters to evaluate the spatial resolution in these images. (author)
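
    The paper's exact FFT procedure is not detailed in the abstract; as a generic sketch, spatial resolution is often summarized by the modulation transfer function obtained from the Fourier transform of a line-spread profile, as in the hypothetical Python example below.

        import numpy as np

        def mtf_from_lsf(lsf, pixel_size_mm):
            """Modulation transfer function estimated as the normalized magnitude of
            the Fourier transform of a line-spread function (LSF)."""
            lsf = np.asarray(lsf, dtype=float)
            spectrum = np.abs(np.fft.rfft(lsf))
            freqs = np.fft.rfftfreq(lsf.size, d=pixel_size_mm)   # cycles per mm
            return freqs, spectrum / spectrum[0]

        # Synthetic Gaussian LSF sampled at 900 dpi (pixel ~ 0.0282 mm); blur width is illustrative
        pixel_mm = 25.4 / 900
        x = (np.arange(257) - 128) * pixel_mm
        lsf = np.exp(-x**2 / (2 * 0.05**2))
        freqs, mtf = mtf_from_lsf(lsf, pixel_mm)
        print(freqs[np.argmin(np.abs(mtf - 0.1))], "cycles/mm at 10% MTF")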

  18. Integrated Data Analysis (IDCA) Program - PETN Class 4 Standard

    Energy Technology Data Exchange (ETDEWEB)

    Sandstrom, Mary M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Brown, Geoffrey W. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Preston, Daniel N. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Pollard, Colin J. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Warner, Kirstin F. [Naval Surface Warfare Center (NSWC), Indian Head, MD (United States). Indian Head Division; Sorensen, Daniel N. [Naval Surface Warfare Center (NSWC), Indian Head, MD (United States). Indian Head Division; Remmers, Daniel L. [Naval Surface Warfare Center (NSWC), Indian Head, MD (United States). Indian Head Division; Shelley, Timothy J. [Air Force Research Lab. (AFRL), Tyndall AFB, FL (United States); Reyes, Jose A. [Applied Research Associates, Tyndall AFB, FL (United States); Phillips, Jason J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hsu, Peter C. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Reynolds, John G. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2012-08-01

    The Integrated Data Collection Analysis (IDCA) program is conducting a proficiency study for Small-Scale Safety and Thermal (SSST) testing of homemade explosives (HMEs). Described here are the results for impact, friction, electrostatic discharge, and differential scanning calorimetry analysis of PETN Class 4. The PETN was found to have: 1) an impact sensitivity (DH50) range of 6 to 12 cm, 2) a BAM friction sensitivity (F50) range of 7 to 11 kg and a TIL (0/10) of 3.7 to 7.2 kg, 3) an ABL friction sensitivity threshold of 5 or less psig at 8 fps, 4) an ABL ESD sensitivity threshold of 0.031 to 0.326 J/g, and 5) a thermal response consisting of an endothermic feature with Tmin ≈ 141 °C and an exothermic feature with Tmax ≈ 205 °C.

  19. C. F. Braun. Standard turbine island design, safety analysis report

    International Nuclear Information System (INIS)

    1974-01-01

    A standard turbine island used with a BWR is described. It consists of the turbine-generator; steam system; condensate storage, cleanup, and transfer systems; control and instrumentation; water treatment plant; make-up demineralizer; potable and waste water systems; and a compressed air system. The turbine-generator is a tandem-compound nuclear-type turbine with one double-flow high-pressure section and a six-flow low-pressure section in three double-flow low-pressure casings. The turbine is directly connected to an 1800 rpm synchronous a-c generator. A combined moisture separator and two-stage reheater is provided. The main steam system delivers the steam generated in the BWR to the main turbine stop valves. The condensate system maintains proper water inventory. Protective features prevent loss of the system due to electrical failure of a component and isolate faults to ensure continuity of power supply from alternate sources. (U.S.)

  20. Integrated Data Collection Analysis (IDCA) Program - Statistical Analysis of RDX Standard Data Sets

    Energy Technology Data Exchange (ETDEWEB)

    Sandstrom, Mary M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Brown, Geoffrey W. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Preston, Daniel N. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Pollard, Colin J. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Warner, Kirstin F. [Naval Surface Warfare Center (NSWC), Indian Head, MD (United States). Indian Head Division; Sorensen, Daniel N. [Naval Surface Warfare Center (NSWC), Indian Head, MD (United States). Indian Head Division; Remmers, Daniel L. [Naval Surface Warfare Center (NSWC), Indian Head, MD (United States). Indian Head Division; Phillips, Jason J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Shelley, Timothy J. [Air Force Research Lab. (AFRL), Tyndall AFB, FL (United States); Reyes, Jose A. [Applied Research Associates, Tyndall AFB, FL (United States); Hsu, Peter C. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Reynolds, John G. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-10-30

    The Integrated Data Collection Analysis (IDCA) program is conducting a Proficiency Test for Small-Scale Safety and Thermal (SSST) testing of homemade explosives (HMEs). Described here are statistical analyses of the results for impact, friction, electrostatic discharge, and differential scanning calorimetry analysis of the RDX Type II Class 5 standard. The material was tested as a well-characterized standard several times during the proficiency study to assess differences among participants and the range of results that may arise for well-behaved explosive materials. The analyses show that there are detectable differences among the results from IDCA participants. While these differences are statistically significant, most of them can be disregarded for comparison purposes when assessing the potential variability that arises as laboratories attempt to measure identical samples using methods assumed to be nominally the same. The results presented in this report include the average sensitivity results for the IDCA participants and the ranges of values obtained. The ranges represent variation about the mean test values of between 26% and 42%. The magnitude of this variation is attributed to differences in operator, method, and environment, as well as the use of different instruments that are also of varying age. The results appear to be a good representation of the broader safety testing community, given the range of methods, instruments, and environments included in the IDCA Proficiency Test.