WorldWideScience

Sample records for bkg igex analysis

  1. Two factors influencing dose reconstruction in low dose range: the variability of BKG intensity on one individual and water content

    International Nuclear Information System (INIS)

    Zhang, Tengda; Zhang, Wenyi; Zhao, Zhixin; Zhang, Haiying; Ruan, Shuzhou; Jiao, Ling

    2016-01-01

    A fast and accurate retrospective dosimetry method for triage is very important in radiation accidents. Electron paramagnetic resonance (EPR) fingernail dosimetry is a promising way to estimate radiation dose. This article presents two factors influencing dose reconstruction in the low dose range: the variability of background signal (BKG) intensity within one individual, and water content. A comparison of the EPR spectra of dried and humidified fingernail samples shows that a dehydration step is necessary before EPR measurement in order to eliminate the deviation caused by water content. Moreover, contrary to what researchers previously assumed, the BKG intensities of nails from different fingers are not the same: the difference between the maximum and minimum BKG intensities of one individual can reach 55.89%. The variability of BKG intensity among individuals is likewise large enough to affect precise dose reconstruction. Water within fingernails and the instability of the BKG are thus two causes of inaccuracy in radiation dose reconstruction at low dose levels. (authors)

  2. Guide on reflectivity data analysis

    International Nuclear Information System (INIS)

    Lee, Jeong Soo; Ku, Ja Seung; Seong, Baek Seok; Lee, Chang Hee; Hong, Kwang Pyo; Choi, Byung Hoon

    2004-09-01

    This report describes the reduction and fitting of neutron reflectivity data with the NIST programs REFLRED and REFLFIT. Because the details of data reduction, such as BKG subtraction, footprint correction, and data normalization, are described, it will be useful to users who have no experience in this field. In addition, the reflectivity and BKG of a d-PS thin film were measured with the HANARO neutron reflectometer, and the structure of the d-PS thin film was analyzed with REFLRED and REFLFIT. Since the structure of the thin film, i.e. its thickness, roughness and SLD, was obtained in this work, the feasibility of data analysis with REFLRED and REFLFIT was demonstrated.
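
    As a concrete illustration of the reduction steps named above (BKG subtraction, footprint correction, monitor normalization), a minimal toy sketch in Python follows. It is not the NIST REFLRED implementation; the function name, the simple sin(theta)*L/w footprint model, and all numbers are illustrative assumptions.

    ```python
    import numpy as np

    def reduce_reflectivity(q, counts, bkg_counts, monitor,
                            beam_width_mm, sample_len_mm, wavelength_A):
        # 1. Background subtraction: off-specular BKG measured alongside the peak.
        net = counts - bkg_counts
        # 2. Footprint correction: below the geometric angle the beam overfills
        #    the sample, so only a fraction sin(theta)*L/w of it hits the surface.
        theta = np.arcsin(q * wavelength_A / (4.0 * np.pi))  # q = 4*pi*sin(theta)/lambda
        frac = np.clip(np.sin(theta) * sample_len_mm / beam_width_mm, None, 1.0)
        net = net / frac
        # 3. Normalize to the incident-beam monitor to get reflectivity.
        return net / monitor

    # Toy d-PS-like points (all numbers illustrative):
    q = np.array([0.01, 0.02, 0.05])  # 1/Angstrom
    R = reduce_reflectivity(q, np.array([9000.0, 2500.0, 60.0]),
                            np.array([50.0, 40.0, 30.0]), monitor=1e4,
                            beam_width_mm=0.5, sample_len_mm=50.0, wavelength_A=4.75)
    print(R)
    ```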

  3. Greenhouse gas reduction in the Unilever Sustainable Living Plan; Halving GHG emissions: the goal of Unilever's sustainability plan

    Energy Technology Data Exchange (ETDEWEB)

    Van Gerwen, R.J.M. [Refrigeration and HVAC Unilever Engineering Services, Aberdeen, Scotland (United Kingdom)]

    2011-09-15

    The Unilever Sustainable Living Plan is intended to deliver three significant outcomes by 2020: (1) help more than a billion people take action to improve their health and well-being; (2) decouple growth from the environmental impact of Unilever's activities, achieving absolute reductions across the product life cycle, with the goal of halving the environmental footprint of the making and use of its products; and (3) enhance the livelihoods of hundreds of thousands of people in the supply chain. [Translated from Dutch] Unilever launched its Sustainable Living Plan at the end of 2010. The plan aims to double the company's growth while halving water use, waste, and greenhouse gas (BKG) emissions over the life cycle of its products. Manufacturing sites account for only three percent of the greenhouse gas footprint, but they are the part for which Unilever is directly responsible. A comprehensive implementation plan has therefore been developed for the carbon footprint of manufacturing, including concepts for new 'green' factories and production lines, improvements to production processes, the use of renewable energy, and improvements to existing factories. For refrigeration and for limiting greenhouse gas emissions, ten improvement measures have been identified and their savings potential has been quantified.

  4. New approach for simplified and automated measurement of left ventricular ejection fraction by ECG gated blood pool scintigraphy

    Energy Technology Data Exchange (ETDEWEB)

    Inagaki, Suetsugu; Adachi, Haruhiko; Sugihara, Hiroki; Katsume, Hiroshi; Ijichi, Hamao; Okamoto, Kunio; Hosoba, Minoru

    1984-12-01

    Background (BKG) correction is important but remains debatable in the measurement of left ventricular ejection fraction (LVEF) with ECG gated blood pool scintigraphy. We devised a new simplified BKG processing method (fixed BKG method) that requires no BKG region-of-interest (ROI) assignment, and assessed its accuracy and reproducibility in 25 patients with various heart diseases and 5 normal subjects by comparison with LVEF obtained by contrast left ventriculography (LVG-EF). Four additional protocols for LVEF measurement with BKG-ROI assignment were also assessed for reference. LVEF calculated using a fixed BKG ratio of 0.64 (i.e., BKG count rates taken as 64% of the end-diastolic count rates of the LV) with a fixed LV-ROI correlated best with LVG-EF (r = 0.936, p < 0.001) and agreed most closely with it (fixed BKG ratio method EF: 61.1 ± 20.1%, LVG-EF: 61.2 ± 20.4% (mean ± SD)) among the protocols tested. The general applicability of the fixed value of 0.64 was tested against various diseases, body sizes, and end-diastolic volumes determined by LVG, and the results were little influenced by these factors. Furthermore, the fixed BKG method produced lower inter- and intra-observer variability than the other protocols requiring BKG-ROI assignment, probably because of its simplified processing. In conclusion, the fixed BKG ratio method simplifies the measurement of LVEF and is suitable for automated processing and single-probe systems.
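
    A worked sketch of the fixed-BKG arithmetic may help: with the background taken as 0.64 of the end-diastolic LV count rate, the usual background-corrected formula LVEF = (ED - ES)/(ED - BKG) reduces to (ED - ES)/(0.36*ED). The Python below is a minimal illustration under that assumption; the count rates are hypothetical, not data from the study.

    ```python
    def lvef_fixed_bkg(ed_counts, es_counts, bkg_ratio=0.64):
        """LVEF with the fixed-BKG method: BKG = bkg_ratio * ED, so
        LVEF = (ED - ES) / (ED - BKG) = (ED - ES) / ((1 - bkg_ratio) * ED)."""
        bkg = bkg_ratio * ed_counts
        return (ed_counts - es_counts) / (ed_counts - bkg)

    # Hypothetical count rates from a fixed LV ROI:
    ed, es = 12000.0, 9400.0
    print(f"LVEF = {100 * lvef_fixed_bkg(ed, es):.1f}%")  # 2600/4320 ~ 60.2%
    ```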

  5. Biodistribution of Boron compounds in an experimental model of liver metastases for Boron Neutron Capture Therapy (BNCT) Studies

    International Nuclear Information System (INIS)

    Garabalino, Marcela A.; Monti Hughes, Andrea; Molinari, Ana J.; Heber, Elisa M.; Pozzi, Emiliano C.C.; Itoiz, Maria E.; Trivillin, Veronica A.; Schwint, Amanda E.; Nievas, Susana; Aromando, Romina F.

    2009-01-01

    Boron Neutron Capture Therapy (BNCT) is a binary treatment modality that involves the selective accumulation of 10B carriers in tumors followed by irradiation with thermal or epithermal neutrons. The high linear energy transfer alpha particles and recoiling 7Li nuclei emitted during the capture of a thermal neutron by a 10B nucleus have a short range and a high biological effectiveness. Thus, BNCT would potentially target neoplastic tissue selectively. In previous studies we demonstrated the therapeutic efficacy of different BNCT protocols in an experimental model of oral cancer. More recently we performed experimental studies in normal rat liver that evidenced the feasibility of treating liver metastases employing a novel BNCT protocol proposed by JEC based on ex-situ treatment and partial liver auto-transplant. The aim of the present study was to perform biodistribution studies with different boron compounds and different administration protocols to determine the protocols that would be therapeutically useful in 'in vivo' BNCT studies at the RA-3 Nuclear Reactor in an experimental model of liver metastases in rats. Materials and Methods. A total of 70 BDIX rats (Charles River Lab., MA, USA) were inoculated in the liver with syngeneic colon cancer cells DH/DK12/TRb (ECACC, UK) to induce the development of subcapsular metastatic nodules. Fifteen days post-inoculation the animals were used for biodistribution studies. A total of 11 protocols were evaluated employing the boron compounds boronophenylalanine (BPA) and GB-10 (Na2 10B10H10), alone or combined, employing different doses and administration routes. Tumor, normal tissue and blood samples were processed for boron measurement by ICP-OES. Results. Several protocols proved potentially useful for BNCT studies in terms of absolute boron concentration in tumor and preferential uptake of boron by tumor tissue, i.e. BPA 15.5 mg 10B/kg iv + GB-10 50 mg 10B/kg iv; BPA 46.5 mg 10B/kg ip; BPA 46.5 mg 10B/kg ip

  6. Attention deficit hyperactivity disorder: binding of [99mTc]TRODAT-1 to the dopamine transporter before and after methylphenidate treatment

    International Nuclear Information System (INIS)

    Dresel, S.; LaFougere, C.; Brinkbaeumer, K.; Hahn, K.; Tatsch, K.; Krause, J.; Krause, K.-H.; Kung, H.F.

    2000-01-01

    Involvement of the dopaminergic system has been suggested in patients suffering from attention deficit hyperactivity disorder (ADHD) since the symptoms can be successfully treated with methylphenidate, a potent blocker of the dopamine transporter (DAT). This study reports the findings on the status of the DAT in adults with ADHD before and after commencement of treatment with methylphenidate, as measured using [99mTc]TRODAT-1. Seventeen patients (seven males, ten females, aged 21-64 years, mean 38 years) were examined before and after the initiation of methylphenidate treatment (3 x 5 mg/day). All subjects were injected with 800 MBq [99mTc]TRODAT-1 and imaged 3 h p.i. Single-photon emission tomography (SPET) scans were acquired using a triple-headed gamma camera. For semi-quantitative evaluation of the DAT, transverse slices corrected for attenuation were used to calculate specific binding in the striatum, with the cerebellum used as background [(STR-BKG)/BKG]. Data were compared with an age-matched control group. It was found that untreated patients presented with a significantly increased specific binding of [99mTc]TRODAT-1 to the DAT as compared with normal controls [(STR-BKG)/BKG: 1.43±0.18 vs 1.22±0.06, P<0.001]. Under treatment with methylphenidate, specific binding decreased significantly in all patients [(STR-BKG)/BKG: 1.00±0.14, P<0.001]. Our findings suggest that the number of DAT binding sites is higher in drug-naive patients suffering from ADHD than in normal controls. The decrease in available DAT binding sites under treatment with methylphenidate correlates well with the improvement in clinical symptoms. The data of this study help to elucidate the complex dysregulation of the dopaminergic neurotransmitter system in patients suffering from ADHD and the effect of treatment with psychoactive drugs. (orig.)
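
    The semi-quantitative index used here is a simple ratio of ROI count densities. The minimal Python sketch below computes (STR - BKG)/BKG; the ROI values are hypothetical numbers chosen only to reproduce the order of magnitude of the reported group means.

    ```python
    import numpy as np

    def specific_binding(str_counts, bkg_counts):
        """Semi-quantitative DAT binding: (STR - BKG) / BKG, with STR and BKG the
        mean counts per pixel in striatal and cerebellar (background) ROIs."""
        str_counts = np.asarray(str_counts, dtype=float)
        bkg_counts = np.asarray(bkg_counts, dtype=float)
        return (str_counts - bkg_counts) / bkg_counts

    # Hypothetical ROI means at roughly the reported group levels:
    print(specific_binding(2.43, 1.0))  # ~1.43, untreated ADHD level
    print(specific_binding(2.22, 1.0))  # ~1.22, control level
    ```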

  7. Attention deficit hyperactivity disorder: binding of [99mTc]TRODAT-1 to the dopamine transporter before and after methylphenidate treatment

    Energy Technology Data Exchange (ETDEWEB)

    Dresel, S; LaFougere, C; Brinkbaeumer, K; Hahn, K; Tatsch, K [Dept. of Nuclear Medicine, Univ. of Munich (Germany); Krause, J; Krause, K-H [Inst. for Psychiatry and Psychotherapy, Ottobrunn (Germany); Friedrich Baur Inst., Univ. of Munich (Germany); Kung, H F [Dept. of Radiology, Univ. of Pennsylvania (United States)

    2000-10-01

    Involvement of the dopaminergic system has been suggested in patients suffering from attention deficit hyperactivity disorder (ADHD) since the symptoms can be successfully treated with methylphenidate, a potent blocker of the dopamine transporter (DAT). This study reports the findings on the status of the DAT in adults with ADHD before and after commencement of treatment with methylphenidate, as measured using [99mTc]TRODAT-1. Seventeen patients (seven males, ten females, aged 21-64 years, mean 38 years) were examined before and after the initiation of methylphenidate treatment (3 x 5 mg/day). All subjects were injected with 800 MBq [99mTc]TRODAT-1 and imaged 3 h p.i. Single-photon emission tomography (SPET) scans were acquired using a triple-headed gamma camera. For semi-quantitative evaluation of the DAT, transverse slices corrected for attenuation were used to calculate specific binding in the striatum, with the cerebellum used as background [(STR-BKG)/BKG]. Data were compared with an age-matched control group. It was found that untreated patients presented with a significantly increased specific binding of [99mTc]TRODAT-1 to the DAT as compared with normal controls [(STR-BKG)/BKG: 1.43±0.18 vs 1.22±0.06, P<0.001]. Under treatment with methylphenidate, specific binding decreased significantly in all patients [(STR-BKG)/BKG: 1.00±0.14, P<0.001]. Our findings suggest that the number of DAT binding sites is higher in drug-naive patients suffering from ADHD than in normal controls. The decrease in available DAT binding sites under treatment with methylphenidate correlates well with the improvement in clinical symptoms. The data of this study help to elucidate the complex dysregulation of the dopaminergic neurotransmitter system in patients suffering from ADHD and the effect of treatment with psychoactive drugs. (orig.)

  8. Assimilation of wind speed and direction observations: results from real observation experiments

    Directory of Open Access Journals (Sweden)

    Feng Gao

    2015-06-01

    The assimilation of wind observations in the form of speed and direction (asm_sd) by the Weather Research and Forecasting Model Data Assimilation System (WRFDA) was performed using real data and employing a series of cycling assimilation experiments for a 2-week period, as a follow-up to an idealised post hoc assimilation experiment. The satellite-derived Atmospheric Motion Vectors (AMV) and the surface dataset of the Meteorological Assimilation Data Ingest System (MADIS) were assimilated. This new method takes into account the observation errors of both wind speed (spd) and direction (dir), and WRFDA background quality control (BKG-QC) influences the choice of wind observations, due to data conversions between (u, v) and (spd, dir). The impacts of BKG-QC, as well as of the new method, on the wind analysis were analysed separately. Because the dir observational errors produced by different platforms are not known or tuned well in WRFDA, a practical method, which uses similar assimilation weights in comparative trials, was employed to estimate the spd and dir observation errors. The asm_sd produces positive impacts on analyses and short-range forecasts of spd and dir, with smaller root-mean-square errors than the u,v-based system. The bias of the spd analysis decreases by 54.8%. These improvements result partly from BKG-QC screening of spd and dir observations in a direct way, but mainly from the independent impact of spd (dir) data assimilation on the spd (dir) analysis, which is the primary distinction from the standard WRFDA method. The potential impacts of asm_sd on precipitation forecasts were also evaluated. Results demonstrate that asm_sd is able to indirectly improve precipitation forecasts by improving the prediction accuracy of key wind-related factors leading to precipitation (e.g. warm moist advection and frontogenesis).
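
    The conversions between (u, v) and (spd, dir) that BKG-QC has to handle are the standard meteorological ones (direction is the bearing the wind blows from, clockwise from north). The sketch below is a generic Python illustration of that transform, not WRFDA code.

    ```python
    import numpy as np

    def uv_to_spddir(u, v):
        """Wind components -> speed and meteorological direction (degrees)."""
        spd = np.hypot(u, v)
        direc = np.degrees(np.arctan2(-u, -v)) % 360.0  # bearing the wind comes FROM
        return spd, direc

    def spddir_to_uv(spd, direc):
        """Speed and meteorological direction -> wind components."""
        rad = np.radians(direc)
        return -spd * np.sin(rad), -spd * np.cos(rad)

    # Round trip for a 10 m/s westerly (blowing from 270 degrees):
    u, v = spddir_to_uv(10.0, 270.0)
    print(u, v)                # ~ (10, 0): flow toward the east
    print(uv_to_spddir(u, v))  # ~ (10.0, 270.0)
    ```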

  9. IVS Combination Center at BKG - Robust Outlier Detection and Weighting Strategies

    Science.gov (United States)

    Bachmann, S.; Lösler, M.

    2012-12-01

    Outlier detection plays an important role within the IVS combination. Even though the original data are the same for all contributing Analysis Centers (ACs), the analyzed data show differences due to the characteristics of the analysis software. The treatment of outliers is thus a fine line between preserving data heterogeneity and eliminating real outliers. Robust outlier detection based on the Least Median of Squares (LMS) is used within the IVS combination. This method allows reliable outlier detection with a small number of input parameters. A similar problem arises for the weighting of the individual solutions within the combination process. Variance component estimation (VCE) is used to control the weighting factor for each AC. The Operator-Software-Impact (OSI) method takes into account that the analyzed data are strongly influenced by the software and the responsible operator, and makes the VCE more sensitive to the diverse input data. This method has already been applied in GNSS data analysis as well as in the analysis of troposphere data. The benefit of an OSI realization within the VLBI combination and its potential for weighting factor determination had not been investigated before.
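
    To make the LMS idea concrete, the sketch below flags outliers in a one-dimensional series by minimizing the median of the squared residuals and deriving a robust scale from that minimum. It is a generic textbook-style illustration in Python, not the IVS combination software; the threshold and the data are assumptions.

    ```python
    import numpy as np

    def lms_outliers(x, threshold=2.5):
        """Flag outliers using a Least Median of Squares location estimate.
        LMS tolerates up to ~50% contamination, which is what makes it robust."""
        x = np.asarray(x, dtype=float)
        med_sq = [np.median((x - c) ** 2) for c in x]  # brute-force LMS search
        center = x[int(np.argmin(med_sq))]
        scale = 1.4826 * np.sqrt(min(med_sq))  # 1.4826: Gaussian consistency factor
        return np.abs(x - center) > threshold * scale

    series = np.array([10.1, 9.8, 10.0, 10.3, 9.9, 14.7, 10.2])
    print(lms_outliers(series))  # only the 14.7 entry is flagged
    ```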

  10. Calculating exclusion limits for weakly interacting massive particle direct detection experiments without background subtraction

    International Nuclear Information System (INIS)

    Green, Anne M.

    2002-01-01

    Competitive limits on the weakly interacting massive particle (WIMP) spin-independent scattering cross section are currently being produced by 76Ge detectors originally designed to search for neutrinoless double beta decay, such as the Heidelberg-Moscow and IGEX experiments. In the absence of background subtraction, limits on the WIMP interaction cross section are set by calculating the upper confidence limit on the theoretical event rate, given the observed event rate. The standard analysis technique involves calculating the 90% upper confidence limit on the number of events in each bin, and excluding any set of parameters (WIMP mass and cross section) which produces a theoretical event rate for any bin which exceeds the 90% upper confidence limit on the event rate for that bin. We show that, if there is more than one energy bin, this produces exclusion limits that are actually at a lower degree of confidence than 90%, and are hence erroneously tight. We formulate criteria which produce true 90% confidence exclusion limits in these circumstances, including calculating the individual bin confidence limit for which the overall probability that no bins exceed this confidence limit is 90%, and calculating the 90% minimum confidence limit on the number of bins which exceed their individual bin 90% confidence limits. We then compare the limits on the WIMP cross section produced by these criteria with those found using the standard technique, using data from the Heidelberg-Moscow and IGEX experiments.
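
    The corrected criterion can be illustrated numerically: with N independent bins each tested at per-bin confidence c, the probability that no bin fluctuates above its limit is c^N, and demanding c^N = 0.90 gives c = 0.90^(1/N). The Python sketch below (assuming SciPy, Poisson statistics, and bin independence, a simplification of the paper's treatment) shows how the per-bin upper limit on counts grows, so that the combined exclusion is a true 90% limit rather than an erroneously tight one.

    ```python
    from scipy import stats

    def per_bin_confidence(n_bins, overall_cl=0.90):
        """Per-bin CL c such that P(no bin exceeds its limit) = c**n_bins = overall_cl."""
        return overall_cl ** (1.0 / n_bins)

    def poisson_upper_limit(n_observed, cl):
        """Classical Poisson upper limit via the chi-square duality:
        mu_up = 0.5 * chi2.ppf(cl, 2 * (n_observed + 1))."""
        return 0.5 * stats.chi2.ppf(cl, 2 * (n_observed + 1))

    n_bins = 10
    c = per_bin_confidence(n_bins)       # ~0.9895 instead of 0.90
    print(c, poisson_upper_limit(5, c))  # larger allowed count than at 90%,
                                         # i.e. a less tight (correct) exclusion
    ```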

  11. BEGe detectors in GERDA Phase I - performance, physics analysis and surface events

    Energy Technology Data Exchange (ETDEWEB)

    Lazzaro, Andrea [Physik-Department E15, Technische Universitaet Muenchen (Germany); Collaboration: GERDA-Collaboration

    2014-07-01

    Phase I of the Gerda experiment, which concluded its data taking in summer 2013, was based on coaxial HPGe detectors already used in the IGEX and HdM experiments. In the upcoming Phase II, customized Broad Energy Germanium (BEGe) detectors will provide the major contribution to the total exposure. The first set of BEGe detectors has been deployed in Gerda since June 2012. The data collected in Phase I show the performance achieved in terms of spectroscopy and pulse shape discrimination (PSD). In particular, the strongest background source, 42K beta decay in the liquid argon surrounding the detectors, has been effectively rejected: the signals due to beta decay on the detector surface are characterized by a longer charge collection time. This talk focuses on this key feature of the BEGe PSD.
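
    The surface-event signature described here, slower charge collection, can be illustrated with a toy rise-time estimator. The sketch below is not GERDA analysis code: the pulses are synthetic, the 10%-90% rise time is just one common discriminating quantity (BEGe PSD in practice uses the A/E parameter), and the time constants are arbitrary.

    ```python
    import numpy as np

    def rise_time(trace, dt_ns, lo=0.1, hi=0.9):
        """10%-90% rise time (ns) of a monotonically rising charge pulse."""
        trace = np.asarray(trace, dtype=float)
        amp = trace.max()
        t = np.arange(len(trace)) * dt_ns
        t_lo = np.interp(lo * amp, trace, t)  # interpolate threshold crossings
        t_hi = np.interp(hi * amp, trace, t)
        return t_hi - t_lo

    # Toy pulses: a fast bulk event vs. a slow surface event.
    t = np.arange(0, 2000, 10.0)  # ns
    bulk = 1.0 - np.exp(-t / 150.0)
    surface = 1.0 - np.exp(-t / 600.0)
    print(rise_time(bulk, 10.0), rise_time(surface, 10.0))  # surface is slower
    ```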

  12. Study of 99mTc-TRODAT-1 Imaging of the Human Brain in Childhood Autism by Single Photon Emission Computed Tomography

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    Objective: To evaluate the application value of 99mTc-2β-[N,N'-bis(2-mercaptoethyl)ethylenediamino]methyl,3β-(4-chlorophenyl)tropane (TRODAT-1) dopamine transporter (DAT) SPECT imaging in childhood autism, and to offer an academic foundation for the etiology, mechanism and clinical therapy of autism. Methods: Ten autistic children and ten healthy controls were examined with 99mTc-TRODAT-1 DAT SPECT imaging. Striatal specific uptake of 99mTc-TRODAT-1 was calculated with region-of-interest analysis according to the ratio between striatum and cerebellum [(STR-BKG)/BKG]. Results: There was no difference in the semiquantitative dopamine transporter measure between the bilateral striata in autistic children (P = 0.562) or in normal controls (P = 0.573); dopamine transporter levels in the brains of patients with autism were significantly higher than those in normal controls (P = 0.017). Conclusion: The dopaminergic nervous system is dysfunctional in the brains of children with autism, and DAT 99mTc-TRODAT-1 SPECT brain imaging will help the imaging diagnosis of childhood autism.

  13. [Study of dopamine transporter imaging on the brain of children with autism].

    Science.gov (United States)

    Sun, Xiaomian; Yue, Jing; Zheng, Chongxun

    2008-04-01

    This study was conducted to evaluate the applicability of 99mTc-2beta-[N,N'-bis(2-mercaptoethyl)ethylenediamino]methyl,3beta-(4-chlorophenyl)tropane (TRODAT-1) dopamine transporter (DAT) SPECT imaging in children with autism, and thus to provide an academic basis for the etiology, mechanism and clinical therapy of autism. Ten autistic children and ten healthy controls were examined with 99mTc-TRODAT-1 DAT SPECT imaging. Striatal specific uptake of 99mTc-TRODAT-1 was calculated with region-of-interest analysis according to the ratios between striatum and cerebellum [(STR-BKG)/BKG]. There was no statistically significant difference in the semiquantitative dopamine transporter measure between the bilateral striata of autistic children (P=0.562), or between those of normal controls (P=0.573); dopamine transporter levels in the brains of patients with autism were significantly increased as compared with those of normal controls (P=0.017). The dopaminergic nervous system is dysfunctional in the brains of children with autism, and DAT 99mTc-TRODAT-1 SPECT brain imaging will help the imaging diagnosis of childhood autism.

  14. Heme synthesis in normal mouse liver and mouse liver tumors

    International Nuclear Information System (INIS)

    Stout, D.L.; Becker, F.F.

    1990-01-01

    Hepatic cancers from mice and rats demonstrate decreased levels of delta-aminolevulinic acid synthase, the rate-limiting enzyme in the heme synthetic pathway, and increased heme oxygenase, the heme-catabolizing enzyme. These findings suggest that diminution of P-450, b5, and catalase in these lesions may result from a heme supply that is limited by decreased heme synthesis and increased heme catabolism. Heme synthesis was measured in mouse liver tumors (MLT) and adjacent tumor-free lobes (BKG) by administering the radiolabeled heme precursors 55FeCl3 and [2-14C]glycine and subsequently extracting the heme for determination of specific activity. Despite reduced delta-aminolevulinic acid synthase activity in MLT, both tissues incorporated [2-14C]glycine into heme at similar rates. At early time points, heme extracted from MLT contained less 55Fe than that from BKG. This was attributed to the findings that MLT took up 55Fe at a slower rate than BKG and had larger iron stores than BKG. The amount of heme per milligram of protein was also similar in both tissues. These findings militate against the hypothesis that diminished hemoprotein levels in MLT result from limited availability of heme. It is probable, therefore, that decreased hemoprotein levels in hepatic tumors are linked to a general program of dedifferentiation associated with the cancer phenotype. Diminution of hemoprotein in MLT may result in a relatively increased intracellular heme pool. delta-Aminolevulinic acid synthase and heme oxygenase are, respectively, negatively and positively regulated by heme. Thus, their alteration in MLT may be due to the regulatory influences of the heme pool.

  15. Search for Neutrinoless Double-Beta Decay

    OpenAIRE

    Tornow, Werner

    2014-01-01

    After the pioneering work of the Heidelberg-Moscow (HDM) and International Germanium Experiment (IGEX) groups, the second round of neutrinoless double-beta decay searches currently underway has improved or will improve the lifetime limits of double-beta decay candidates by a factor of two to three, reaching in the near future the T1/2 = 3 × 10^25 yr level. This talk will focus on the large-scale experiments GERDA, EXO-200, and KamLAND-Zen, which have already reported lower half-life...
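
    For orientation, the half-life limits such experiments quote follow from simple counting: T1/2 > ln(2) * N * eps * t / N_up for N candidate nuclei observed for live time t with efficiency eps and at most N_up signal counts. The Python sketch below evaluates this with illustrative numbers; it is not the published analysis of any of the experiments named.

    ```python
    import numpy as np

    N_A = 6.02214076e23  # Avogadro's number, 1/mol

    def t_half_limit(exposure_kg_yr, enrichment, efficiency, n_up, molar_mass_g=76.0):
        """Lower limit on T1/2 from a counting experiment:
        T1/2 > ln(2) * (nuclei * live time) * efficiency / N_up."""
        nuclei_yr = exposure_kg_yr * 1000.0 / molar_mass_g * N_A * enrichment
        return np.log(2.0) * nuclei_yr * efficiency / n_up

    # E.g. ~20 kg*yr of 87%-enriched 76Ge, 60% efficiency, at most 3 signal counts:
    print(f"T1/2 > {t_half_limit(20.0, 0.87, 0.60, 3.0):.2e} yr")  # ~2e25 yr
    ```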

  16. In vivo tumor angiogenesis imaging with site-specific labeled 99mTc-HYNIC-VEGF

    International Nuclear Information System (INIS)

    Blankenberg, Francis G.; Backer, Marina V.; Patel, Vimalkumar; Backer, Joseph M.; Levashova, Zoia

    2006-01-01

    We recently developed a cysteine-containing peptide tag (C-tag) that allows for site-specific modification of C-tag-containing fusion proteins with a bifunctional chelator, HYNIC (hydrazine nicotinamide)-maleimide. We then constructed and expressed C-tagged vascular endothelial growth factor (VEGF) and labeled it with HYNIC. We wished to test 99mTc-HYNIC-C-tagged VEGF (99mTc-HYNIC-VEGF) for the imaging of tumor vasculature before and after antiangiogenic (low continuous dosing, metronomic) and tumoricidal (high-dose) cyclophosphamide treatment. HYNIC-maleimide was reacted with the two thiol groups of C-tagged VEGF without any effect on biologic activity in vitro. 99mTc-HYNIC-VEGF was prepared using tin/tricine as an exchange reagent, and injected via the tail vein (200-300 μCi, 1-2 μg protein) followed by microSPECT imaging 1 h later. Sequencing analysis of HYNIC-containing peptides obtained after digestion confirmed the site-specific labeling of the two accessible thiol groups of C-tagged VEGF. Tumor vascularity was easily visualized with 99mTc/VEGF in Balb/c mice with 4T1 murine mammary carcinoma 10 days after implantation into the left axillary fat pad in controls (12.3±5.0 tumor/bkg, n=27), along with its decrease following treatment with high (150 mg/kg q.o.d. x 4; 1.14±0.48 tumor/bkg, n=9) or low (25 mg/kg q.d. x 7; 1.03±0.18 tumor/bkg, n=9) dose cyclophosphamide. Binding specificity was confirmed by observing a 75% decrease in tumor uptake of 99mTc/biotin-inactivated VEGF, as compared with 99mTc-HYNIC-VEGF. 99mTc can be loaded onto C-tagged VEGF in a site-specific fashion without reducing its bioactivity. 99mTc-HYNIC-VEGF can be rapidly prepared for the imaging of tumor vasculature and its response to different types of chemotherapy. (orig.)

  17. Biodistribution study with combined administration of BPA and BSH for BNCT in the hamster cheek pouch oral cancer model

    International Nuclear Information System (INIS)

    Garabalino, M A; Heber, E M; Monti Hughes, A; Pozzi, E C C; Molinari, A J; Nigg, D W; Bauer, W; Trivillin, V A; Schwint, A E

    2012-01-01

    We previously proved the therapeutic potential of the chemically non-selective boron compound decahydrodecaborate (GB-10) as a stand-alone boron carrier for BNCT in the hamster cheek pouch oral cancer model, with no toxic effects in normal or precancerous tissue. Although GB-10 is not taken up selectively by oral tumor tissue, selective tumor lethality would result from selective aberrant tumor blood vessel damage. Furthermore, BNCT efficacy was enhanced when GB-10 and boronophenylalanine (BPA) were administered jointly. The fact that sodium mercaptoundecahydro-closo-dodecaborate (BSH) is being investigated clinically as a stand-alone boron agent for BNCT of brain tumors and in combination with BPA for recurrent head and neck malignancies makes it a particularly interesting boron compound to explore. Based on the working hypothesis that BSH would conceivably behave similarly to GB-10 in oral cancer, we previously performed biodistribution studies with BSH alone in the hamster cheek pouch oral cancer model. The aim of the present study was to perform biodistribution studies of BSH + BPA administered jointly in the hamster cheek pouch oral cancer model as a starting point to contribute to the knowledge of (BSH+BPA)-BNCT radiobiology and optimize therapeutic efficacy. The right cheek pouch of Syrian hamsters was subjected to topical administration of a carcinogen twice a week for 12 weeks. Once the exophytic tumors, i.e. squamous cell carcinomas, had developed, the animals were used for biodistribution studies with BSH + BPA. Three administration protocols with different proportions of each of the compounds were assessed: 1. BSH, 50 mg 10B/kg, iv + BPA, 15.5 mg 10B/kg, ip; 2. BSH, 34.5 mg 10B/kg, iv + BPA, 31 mg 10B/kg, ip; 3. BSH, 20 mg 10B/kg, iv + BPA, 46.5 mg 10B/kg, ip. Groups of animals were euthanized 4 h after the administration of BSH and 3 h after the administration of BPA. Samples of blood, tumor, precancerous and normal pouch and other tissues with

  18. In vivo effects of olanzapine on striatal dopamine D2/D3 receptor binding in schizophrenic patients: an iodine-123 iodobenzamide single-photon emission tomography study

    Energy Technology Data Exchange (ETDEWEB)

    Dresel, S.; Rossmueller, B.; Hahn, K.; Tatsch, K. (Department of Nuclear Medicine, University of Munich (Germany)); Mager, T.; Meisenzahl, E.; Moeller, H.J. (Department of Psychiatry, University of Munich (Germany))

    1999-08-01

    Olanzapine is a new atypical antipsychotic agent that belongs to the same chemical class as clozapine. The pharmacological efficacy of olanzapine (in contrast to that of risperidone) has been shown to be comparable to that of clozapine, but olanzapine has the advantage of producing a less pronounced bone marrow depressing effect than clozapine. The specific aims of this study were (a) to assess dopamine D2/D3 receptor availability in patients treated with olanzapine by means of iodine-123 iodobenzamide ([123I]IBZM) single-photon emission tomography (SPET), (b) to compare the results with findings of [123I]IBZM SPET in patients under treatment with risperidone and (c) to correlate the results with the occurrence of extrapyramidal side-effects (EPMS). Brain SPET scans were performed in 20 schizophrenic patients (DSM III R) 2 h after i.v. administration of 185 MBq [123I]IBZM. Images were acquired using a triple-head gamma camera (Picker Prism 3000 XP). For semiquantitative evaluation of D2/D3 receptor binding, transverse slices corrected for attenuation were used to calculate specific uptake values [STR-BKG]/BKG (STR=striatum; BKG=background). The mean daily dose of olanzapine ranged from 0.05 to 0.6 mg/kg body weight. Dopamine D2/D3 receptor binding was reduced in all patients treated with olanzapine. Specific IBZM binding [STR-BKG]/BKG ranged from 0.13 to 0.61 (normal controls >0.95). The decreased D2/D3 receptor availability revealed an exponential dose-response relationship (r=-0.85, P<0.001). The slope of the curve was similar to that of risperidone and considerably higher than that of clozapine as compared with the results of a previously published study. EPMS were observed in only one patient, presenting with the lowest D2/D3 availability. The frequency of EPMS induced by olanzapine (5%) was considerably lower than the frequency under risperidone treatment (40%). Our findings

  19. In vivo effects of olanzapine on striatal dopamine D2/D3 receptor binding in schizophrenic patients: an iodine-123 iodobenzamide single-photon emission tomography study

    Energy Technology Data Exchange (ETDEWEB)

    Dresel, S.; Rossmueller, B.; Hahn, K.; Tatsch, K. [Department of Nuclear Medicine, University of Munich (Germany); Mager, T.; Meisenzahl, E.; Moeller, H.J. [Department of Psychiatry, University of Munich (Germany)

    1999-08-01

    Olanzapine is a new atypical antipsychotic agent that belongs to the same chemical class as clozapine. The pharmacological efficacy of olanzapine (in contrast to that of risperidone) has been shown to be comparable to that of clozapine, but olanzapine has the advantage of producing a less pronounced bone marrow depressing effect than clozapine. The specific aims of this study were (a) to assess dopamine D2/D3 receptor availability in patients treated with olanzapine by means of iodine-123 iodobenzamide ([123I]IBZM) single-photon emission tomography (SPET), (b) to compare the results with findings of [123I]IBZM SPET in patients under treatment with risperidone and (c) to correlate the results with the occurrence of extrapyramidal side-effects (EPMS). Brain SPET scans were performed in 20 schizophrenic patients (DSM III R) 2 h after i.v. administration of 185 MBq [123I]IBZM. Images were acquired using a triple-head gamma camera (Picker Prism 3000 XP). For semiquantitative evaluation of D2/D3 receptor binding, transverse slices corrected for attenuation were used to calculate specific uptake values [STR-BKG]/BKG (STR=striatum; BKG=background). The mean daily dose of olanzapine ranged from 0.05 to 0.6 mg/kg body weight. Dopamine D2/D3 receptor binding was reduced in all patients treated with olanzapine. Specific IBZM binding [STR-BKG]/BKG ranged from 0.13 to 0.61 (normal controls >0.95). The decreased D2/D3 receptor availability revealed an exponential dose-response relationship (r=-0.85, P<0.001). The slope of the curve was similar to that of risperidone and considerably higher than that of clozapine as compared with the results of a previously published study. EPMS were observed in only one patient, presenting with the lowest D2/D3 availability. The frequency of EPMS induced by olanzapine (5%) was considerably lower than the frequency under risperidone treatment (40%). Our findings

  20. The Bonn Astro/Geo Correlator

    Science.gov (United States)

    Bernhart, Simone; Alef, Walter; Bertarini, Alessandra; La Porta, Laura; Muskens, Arno; Rottmann, Helge; Roy, Alan

    2013-01-01

    The Bonn Distributed FX (DiFX) correlator is a software correlator operated jointly by the Max-Planck-Institut für Radioastronomie (MPIfR), the Institut für Geodäsie und Geoinformation der Universität Bonn (IGG), and the Bundesamt für Kartographie und Geodäsie (BKG) in Frankfurt.

  1. Beyond quantum mechanics? Hunting the 'impossible' atoms (Pauli Exclusion Principle violation and spontaneous collapse of the wave function at test)

    CERN Document Server

    Piscicchia, K; Bartalucci, S; Bassi, A; Bertolucci, S; Berucci, C; Bragadireanu, A M; Cargnelli, M; Clozza, A; De Paolis, L; Di Matteo, S; Donadi, S; d'Uffizi, A; Egger, J-P; Guaraldo, C; Iliescu, M; Ishiwatari, T; Laubenstein, M; Marton, J; Milotti, E; Pietreanu, D; Ponta, T; Sbardella, E; Scordo, A; Shi, H; Sirghi, D L; Sirghi, F; Sperandio, L; Vazquez Doce, O; Zmeskal, J

    2015-01-01

    The development of mathematically complete and consistent models solving the so-called "measurement problem" has strongly renewed the interest of the scientific community in the foundations of quantum mechanics. Among these, the Dynamical Reduction Models possess the unique characteristic of being experimentally testable. In the first part of the paper an upper limit on the reduction rate parameter of such models is obtained, based on the analysis of the X-ray spectrum emitted by an isolated slab of germanium and measured by the IGEX experiment. The second part of the paper is devoted to presenting the results of the VIP (Violation of the Pauli exclusion principle) experiment and to describing its recent upgrade. The VIP experiment established a limit on the probability that the Pauli Exclusion Principle (PEP) is violated by electrons, using the very clean method of searching for PEP-forbidden atomic transitions in copper.

  2. Measurement of the W boson helicity fractions in t anti-t events at 8 TeV in the lepton+jets channel with the ATLAS detector

    Energy Technology Data Exchange (ETDEWEB)

    Kareem, Mohammad Jawad

    2017-04-20

    Precise measurements of the properties of the top quark allow for testing the Standard Model (SM) and can be used to constrain new physics models. In the SM the top quark is predicted to decay almost exclusively to a W boson and a b-quark, which motivates studying the Wtb vertex structure at high precision and in detail. This thesis presents a measurement of the W boson helicity fractions in top quark decays with t anti-t events in the lepton+jets final state, using proton-proton collisions at a centre-of-mass energy of √s = 8 TeV recorded in 2012 with the ATLAS detector at the LHC. The data sample corresponds to an integrated luminosity of 20.2 fb^-1. The angular distributions of two different analysers, the charged lepton and the down-type quark, in the W boson rest frame are used to measure the helicity fractions. The most precise measurement is obtained from the leptonic analyser and events which contain at least two b-tagged jets. The results, F0 = 0.709 ± 0.012 (stat.+bkg. norm.) +0.015/-0.014 (syst.), FL = 0.299 ± 0.008 (stat.+bkg. norm.) +0.013/-0.012 (syst.), and FR = -0.008 ± 0.006 (stat.+bkg. norm.) ± 0.012 (syst.), which stand for the longitudinal, left- and right-handed W boson helicity fractions respectively, are obtained by performing a combined fit of the electron+jets and muon+jets channels to data. The measured helicity fractions are consistent with the Standard Model prediction. As the polarisation state of the W boson in top quark decays is sensitive to the Wtb vertex structure, limits on anomalous Wtb couplings are set.
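
    The measurement rests on the standard helicity decomposition of the analyser angle theta* in the W boson rest frame. The sketch below simply evaluates that normalized angular distribution at the measured fractions; it is a plain illustration of the formula, not the thesis' fit code, and the SM comparison numbers are approximate.

    ```python
    import numpy as np

    def w_helicity_pdf(cos_theta, f0, fl, fr):
        """(1/Gamma) dGamma/dcos(theta*) =
        (3/4)(1-c^2) F0 + (3/8)(1-c)^2 FL + (3/8)(1+c)^2 FR, with F0+FL+FR = 1."""
        c = np.asarray(cos_theta, dtype=float)
        return (0.75 * (1.0 - c**2) * f0
                + 0.375 * (1.0 - c)**2 * fl
                + 0.375 * (1.0 + c)**2 * fr)

    c = np.linspace(-1.0, 1.0, 5)
    print(w_helicity_pdf(c, 0.709, 0.299, -0.008))  # measured central values
    print(w_helicity_pdf(c, 0.687, 0.311, 0.002))   # approximate SM expectation
    ```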

  3. Wt-ko_trxB1_Filtered bkg correc transf-norm

    OpenAIRE

    Serrano, L.M.; Molenaar, Douwe; Teusink, Bas; de Vos, Willem; Smid, Eddy

    2007-01-01

    In this experiment we analyzed the impact of the disruption of trxB1 in Lactobacillus plantarum at the transcriptome level. Furthermore, we studied the effect of 3.5 mM peroxide on both Lactobacillus plantarum wild type (strain WCFS1) and a trxB1 mutant (strain NZ7608).

  4. Gerda: A new 76Ge Double Beta Decay Experiment at Gran Sasso

    International Nuclear Information System (INIS)

    Simgen, Hardy

    2005-01-01

    In the new 76Ge double beta decay experiment Gerda [I. Abt et al., arXiv hep-ex/0404039; Gerda proposal, to be submitted to the Gran Sasso scientific committee], bare diodes of enriched 76Ge will be operated in highly pure liquid nitrogen or argon. The goal is to reduce the background around Qββ = 2039 keV below 10^-3 counts/(kg·keV·y). With presently available diodes from the Igex and HdM experiments, the current evidence for neutrinoless double beta decay [H.-V. Klapdor-Kleingrothaus, et al., Mod. Phys. Lett. A16 (2001) 2409ff] can unambiguously be checked within one year of measurement.

  5. Range verification for eye proton therapy based on proton-induced x-ray emissions from implanted metal markers

    International Nuclear Information System (INIS)

    La Rosa, Vanessa; Royle, Gary; Gibson, Adam; Kacperek, Andrzej

    2014-01-01

    Metal fiducial markers are often implanted on the back of the eye before proton therapy to improve target localization and reduce patient setup errors. We aim to detect characteristic x-ray emissions from metal targets during proton therapy to verify the treatment range accuracy. Initially gold was chosen for its biocompatibility properties. Proton-induced x-ray emissions (PIXE) from a 15 mm diameter gold marker were detected at different penetration depths of a 59 MeV proton beam at the CATANA proton facility at INFN-LNS (Italy). The Monte Carlo code Geant4 was used to reproduce the experiment and to investigate the effect of different marker sizes and materials, and the response to both mono-energetic and fully modulated beams. The intensity of the emitted x-rays decreases with decreasing proton energy and thus decreases with depth. If we assume the range to be the depth at which the dose is reduced to 10% of its maximum value and we define the residual range as the distance between the marker and the range of the beam, then the minimum residual range which can be detected with 95% confidence level is the depth at which the PIXE peak is equal to 1.96 σbkg, where σbkg is the standard deviation of the background noise. With our system and experimental setup this value is 3 mm, when 20 GyE are delivered to a gold marker of 15 mm diameter. Results from silver are more promising. Even when a 5 mm diameter silver marker is placed at a depth equal to the range, the PIXE peak is 2.1 σbkg. Although these quantitative results depend on the experimental setup used in this research study, they demonstrate that real-time analysis of the PIXE emitted by fiducial metal markers can be used to derive the beam range. Further analyses are needed to demonstrate the feasibility of the technique in a clinical setup. (paper)
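
    The 1.96 σbkg criterion is a one-sided ~95% counting test against background fluctuations. The Python sketch below illustrates it on a synthetic spectrum; the Poisson background and the peak counts are assumptions, not data from the study.

    ```python
    import numpy as np

    def pixe_peak_detectable(net_peak_counts, bkg_window_counts, z=1.96):
        """Detectable at ~95% confidence if the net peak exceeds z * sigma_bkg,
        with sigma_bkg estimated from a line-free region of the spectrum."""
        sigma_bkg = np.std(np.asarray(bkg_window_counts, dtype=float))
        return net_peak_counts > z * sigma_bkg, sigma_bkg

    rng = np.random.default_rng(1)
    bkg = rng.poisson(100, size=200)        # line-free spectral window
    print(pixe_peak_detectable(25.0, bkg))  # net peak of 25 counts vs sigma ~ 10
    ```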

  6. Range verification for eye proton therapy based on proton-induced x-ray emissions from implanted metal markers

    Science.gov (United States)

    La Rosa, Vanessa; Kacperek, Andrzej; Royle, Gary; Gibson, Adam

    2014-06-01

    Metal fiducial markers are often implanted on the back of the eye before proton therapy to improve target localization and reduce patient setup errors. We aim to detect characteristic x-ray emissions from metal targets during proton therapy to verify the treatment range accuracy. Initially gold was chosen for its biocompatibility properties. Proton-induced x-ray emissions (PIXE) from a 15 mm diameter gold marker were detected at different penetration depths of a 59 MeV proton beam at the CATANA proton facility at INFN-LNS (Italy). The Monte Carlo code Geant4 was used to reproduce the experiment and to investigate the effect of different marker sizes and materials, and the response to both mono-energetic and fully modulated beams. The intensity of the emitted x-rays decreases with decreasing proton energy and thus decreases with depth. If we assume the range to be the depth at which the dose is reduced to 10% of its maximum value and we define the residual range as the distance between the marker and the range of the beam, then the minimum residual range which can be detected with 95% confidence level is the depth at which the PIXE peak is equal to 1.96 σbkg, where σbkg is the standard deviation of the background noise. With our system and experimental setup this value is 3 mm, when 20 GyE are delivered to a gold marker of 15 mm diameter. Results from silver are more promising. Even when a 5 mm diameter silver marker is placed at a depth equal to the range, the PIXE peak is 2.1 σbkg. Although these quantitative results depend on the experimental setup used in this research study, they demonstrate that real-time analysis of the PIXE emitted by fiducial metal markers can be used to derive the beam range. Further analyses are needed to demonstrate the feasibility of the technique in a clinical setup.

  7. GERDA: Results and perspectives

    Science.gov (United States)

    Cattadori, Carla Maria; GERDA Collaboration

    2015-08-01

    From November 2011 to May 2013, GERDA searched for the 0νββ and 2νββ decays of 76Ge, operating bare in a liquid argon bath Ge detectors enriched up to ~87% in 76Ge (enrGe), for a total mass of ~18 kg of enrGe. A total exposure of 21.6 kg·yr of enrGe was collected, and the existing claim [H. V. Klapdor-Kleingrothaus et al., Phys. Lett. B 586 (2004) 198] of 0νββ evidence was scrutinized. GERDA did not observe any peak at Qββ or in its immediate surroundings; the limit T1/2(0ν) > 2.1 × 10^25 yr (90% C.L.) is derived [GERDA collaboration: M. Agostini et al., Phys. Rev. Lett. 111 (2013) 122503]. When combining the GERDA limit with those of the past HdM [HdM collaboration: H. V. Klapdor-Kleingrothaus et al., Eur. Phys. J. A12 (2001) 147] and Igex [Igex Collaboration: C. E. Aalseth et al., Phys. Rev. D 65 (2002) 092007] experiments, a lower limit of 3.0 × 10^25 yr (90% C.L.) on T1/2(0ν) is achieved. The background index (BI) at Qββ (~2039 keV) is ~2.0 × 10^-2 cts/(keV·kg·yr) and ~1.0 × 10^-2 cts/(keV·kg·yr), before and after the pulse shape cuts respectively. Thanks to the low background, the 2νββ decay dominates the energy spectrum below 1800 keV: T1/2(2ν) = (1.84 +0.14/-0.10) × 10^21 yr was derived from a first data set corresponding to 5.1 kg·yr exposure [GERDA collaboration: M. Agostini et al., J. Phys. G 40 (2013) 035110]. The ongoing experimental program, to double the exposed mass by adding new enrGe detectors with improved pulse shape discrimination features and to implement the liquid argon scintillation light readout, is outlined.

  8. Dopamine transporter density by [99mTc]-TRODAT-1 SPECT and neurocognitive performance - a preliminary pilot study

    International Nuclear Information System (INIS)

    Shih, Ming C; Amaro, Edson Jr; de Souza, Sayuri E

    2006-01-01

    Introduction: Cognitive deficits are associated with functional impairment as well as with poor quality of life in Parkinson's disease (PD). The main brain alteration in PD is a progressive loss of nigrostriatal dopamine neurons. Both animal and human studies have demonstrated that the basal ganglia dopamine system is important for cognitive and motor functioning. Dopamine transporter (DAT) PET and SPECT radiotracers have been successfully used to estimate dopamine neuronal loss in humans in vivo. Objectives: The present study aimed to examine the relationship between cognitive impairment and striatal dopamine neuronal loss, as estimated by [99mTc]-TRODAT-1 SPECT, in PD patients. Methods: Fifteen PD patients were scanned with [99mTc]-TRODAT-1 SPECT (dual-head SPECT Hawkeye, GE). SPECT images were reconstructed with FBP and a Butterworth filter (0.40 c/px, order 10). Regions of interest (ROIs) were the striatum (STR) and the occipital lobe (BKG, nonspecific binding), delineated on 3-mm transaxial slices and analyzed according to the formula BP = ([STR-BKG]/BKG), where BP is the striatal DAT binding potential. Neurocognitive tests, including the Rey Auditory Verbal Learning Test (RAVLT), Wisconsin Card Sorting Test (WCST), Raven's Progressive Matrices, Digit Span and Tavis 3, were applied to all subjects by trained neuropsychologists. Results: Striatal DAT binding potential (BP) was negatively correlated with RAVLT tests 4 (R=0.57, p≤0.05) and 5 (R=0.57, p≤0.05), which evaluate verbal learning. Striatal DAT BP was also negatively correlated with the WCST learning item (R=0.54, p≤0.05) and with the Tavis 3 items action error (R=0.52, p≤0.05) and number of correct responses (R=0.47, p≤0.05). Conclusions: Although preliminary, the present findings suggest that striatal DAT loss is associated with poorer performance on verbal learning and cognitive flexibility tasks. These results are in line with a previous study that examined healthy volunteers and found a relationship between

  9. Effect of dietary boron on growth performance, calcium and phosphorus metabolism, and bone mechanical properties in growing barrows.

    Science.gov (United States)

    Armstrong, T A; Spears, J W

    2001-12-01

    An experiment was conducted to evaluate the effects of dietary boron (B) on growth performance, bone mechanical properties, and calcium (Ca) and phosphorus (P) metabolism in pigs. Thirty-six barrows were weaned at approximately 21 d of age and randomly assigned to receive one of three dietary treatments. Treatments consisted of 1) a low-B basal diet (control), 2) basal + 5 mg B/kg diet, and 3) basal + 15 mg B/kg diet. Boron was supplemented as sodium borate. Barrows remained on their respective experimental diets throughout the nursery (35 d) and growing (30 d) phases of production. Blood samples were obtained from each barrow at the end of each phase. Following the 30-d growing period, eight barrows per treatment were transferred to stainless steel metabolism crates. Barrows had an adjustment period of 7 d, followed by a 7-d total collection of urine and feces. All barrows were fed at 90% of the previous ad libitum grower intake of the control animals during the adjustment and collection periods. At the end of the 7-d collection period, barrows were killed and femurs and fibulas were harvested for the assessment of bone mechanical properties. During the nursery phase, ADG and ADFI were increased (P < 0.05) by dietary B. These data indicate that B supplementation in pigs can increase growth and bone strength without greatly affecting Ca and P metabolism.

  10. The effects of dietary boric acid and borax supplementation on lipid peroxidation, antioxidant activity, and DNA damage in rats.

    Science.gov (United States)

    Ince, Sinan; Kucukkurt, Ismail; Cigerci, Ibrahim Hakki; Fatih Fidan, A; Eryavuz, Abdullah

    2010-07-01

    The aims of this study were to clarify the effects of high dietary supplementation with boric acid and borax, called boron (B) compounds, on lipid peroxidation (LPO), antioxidant activity, some vitamin levels, and DNA damage in rats. Thirty Sprague Dawley male rats were divided into three equal groups: the animals in the first group (control) were fed a standard rodent diet containing 6.4 mg B/kg, and the animals in the experimental groups were fed a standard rodent diet supplemented with a supra-nutritional amount of boric acid or borax (100 mg B/kg) throughout the experimental period of 28 days. The B compounds decreased malondialdehyde (MDA), DNA damage, and the protein carbonyl content (PCO) level in blood, the glutathione (GSH) concentration in the liver, and Cu-Zn superoxide dismutase (SOD) and catalase (CAT) activity in the kidney. The B compounds increased the GSH concentration in blood and the vitamin C level in plasma. Consequently, our results demonstrate that B supplementation (100 mg/kg) in the diet decreases LPO and enhances the antioxidant defense mechanism and vitamin status. Apart from serum vitamin A and liver GSH concentration, there were no differences in oxidant/antioxidant balance or biochemical parameters between the two boron compounds used in this study.

  11. New design and facilities for the International Database for Absolute Gravity Measurements (AGrav): A support for the Establishment of a new Global Absolute Gravity Reference System

    Science.gov (United States)

    Wziontek, Hartmut; Falk, Reinhard; Bonvalot, Sylvain; Rülke, Axel

    2017-04-01

    After about 10 years of successful joint operation by BGI and BKG, the International Database for Absolute Gravity Measurements "AGrav" (see references hereafter) underwent a major revision. The outdated web interface was replaced by a responsive, high-level web application framework based on Python and built on top of Pyramid. Functionality was added, such as interactive time-series plots and a report generator, and the interactive map-based station overview was updated completely, now comprising clustering and the classification of stations. Furthermore, the database backend was migrated to PostgreSQL for better support of the application framework and long-term availability. As comparisons of absolute gravimeters (AGs) become essential to realize a precise and uniform gravity standard, the database was extended to document the results of such comparisons on the international and regional level, including those performed at monitoring stations equipped with superconducting gravimeters (SGs). By this it will be possible to link different AGs and to trace their equivalence back to the key comparisons under the auspices of the International Committee for Weights and Measures (CIPM) as the best metrological realization of the absolute gravity standard. In this way the new AGrav database accommodates the demands of the new Global Absolute Gravity Reference System as recommended by IAG Resolution No. 2 adopted in Prague in 2015. The new database will be presented with focus on the new user interface and new functionality, calling on all institutions involved in absolute gravimetry to participate and contribute their information to build up the most complete picture of high-precision absolute gravimetry and improve its visibility. A Digital Object Identifier (DOI) will be provided by BGI to contributors to give better traceability and facilitate the referencing of their gravity surveys. Links and references: BGI mirror site: http://bgi.obs-mip.fr/data-products/Gravity-Databases/Absolute-Gravity-data/ BKG mirror site: http://agrav.bkg

  12. Real Time Monitoring of GPS-IGU orbits and clocks as a tool to disseminate corrections to GPS-Broadcast Ephemerides

    Science.gov (United States)

    Thaler, G.; Opitz, M.; Weber, R.

    2009-04-01

    Nowadays RTIGS and NTRIP have become standards for real-time GNSS based positioning applications. The IGS (International GNSS Service) Real-Time Working Group disseminates via Internet (RTIGS) raw observation data of a subset of stations of the IGS network. These observation data can be used to establish real-time integrity monitoring of the IGS predicted orbits (Ultra-Rapid (IGU) orbits) and clocks, according to the recommendations of the IGS Workshop 2004 in Bern, and in a further step correction terms for improving the accuracy of the GPS broadcast ephemerides can be calculated. The Institute of Geodesy and Geophysics of TU Vienna develops, in cooperation with the IGS Real-Time Working Group, the software "RTR-Control", which currently provides real-time integrity monitoring of predicted IGU satellite clock corrections to GPS time. The real-time orbit calculation and monitoring of the predicted IGU satellite orbits is currently in a testing phase and will become operational in the near future. A kinematic model and calculated ranges to the satellites are combined in a Kalman filter approach. Currently the most recent GPS satellite clock corrections are published in real time via Internet. A 24-hour clock RINEX file and the IGU SP3 files modified for the associated clock corrections are stored on the ftp server of the institute. To perform the task of calculating corrections to the broadcast ephemerides, three programs are used: BNC (BKG Ntrip Client) and BNS (BKG Ntrip State Space Server) from BKG (Bundesamt für Kartographie und Geodäsie), as well as RTR-Control. BNC receives the GPS broadcast ephemerides from the Ntrip caster and forwards them to BNS. RTR-Control calculates the satellite clocks, and in future also the satellite orbits, and forwards them in SP3 format to BNS. BNS calculates the correction terms to the broadcast ephemerides and delivers them in RTCM 3.x format (proprietary message 4056) back to the Ntrip caster. Subsequently

  13. Application of a unique test design to determine the chronic toxicity of boron to the aquatic worm Lumbriculus variegatus and fatmucket mussel Lampsilis siliquoidea.

    Science.gov (United States)

    Hall, Scott; Lockwood, Rick; Harrass, Michael C

    2014-01-01

    The chronic (21- and 28-day) toxicity of boron was determined for two freshwater benthic macroinvertebrates: the fatmucket mussel Lampsilis siliquoidea and the aquatic worm Lumbriculus variegatus. The rapid depletion of boric acid from spiked sediments in tests using flow-through overlying waters was addressed by constant addition of boric acid to the overlying water at concentrations matching those of the targeted porewater exposures. This proved highly successful in maintaining constant whole-sediment and sediment porewater boron concentrations. Sublethal boron 25% inhibition concentration (IC25) values based on porewater concentrations were 25.9 mg B/L (L. variegatus) and 38.5 mg B/L (L. siliquoidea), indicating similar test organism sensitivity. Expressed as dry whole-sediment values, the respective L. variegatus and L. siliquoidea sublethal (growth) IC25 values for whole-sediment exposures were 235.5 mg B/kg sediment dry weight (dw) and 310.6 mg B/kg dw. The worm lethality-based endpoints indicated greater sensitivity than the sublethal endpoints, bringing into question the validity of a "lethality" endpoint for L. variegatus given its fragmentation mode of reproduction. For comparison, water-only mussel exposures were tested, resulting in an IC25 value of 34.6 mg B/L, which was within 20% of the porewater value. This suggests that the primary route of boron exposure was through the aqueous phase. The results of this study indicate that for test materials that are readily water soluble, standard sediment test designs may be unsuitable, but water-only exposures can provide toxicological data representative of sediment tests.
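
    The IC25 endpoint used here is the concentration producing a 25% reduction in the sublethal response (growth) relative to the control. A bare-bones linear-interpolation estimate, in the spirit of the common ICPIN approach, is sketched below in Python; the concentration-response numbers are hypothetical, and the study's actual statistics are likely more elaborate (smoothing, bootstrap confidence intervals).

    ```python
    import numpy as np

    def icp(concs, responses, p=25.0):
        """Linear-interpolation ICp: concentration giving a p% drop vs. control."""
        concs = np.asarray(concs, dtype=float)
        resp = np.asarray(responses, dtype=float)
        target = resp[0] * (1.0 - p / 100.0)      # control response minus p%
        for i in range(len(resp) - 1):
            if resp[i] >= target >= resp[i + 1]:  # bracketing interval found
                frac = (resp[i] - target) / (resp[i] - resp[i + 1])
                return concs[i] + frac * (concs[i + 1] - concs[i])
        return np.nan                             # no p% effect within the series

    # Hypothetical porewater series (mg B/L) and mean worm growth (mg):
    print(icp([0, 5, 12, 25, 50, 100], [4.0, 3.9, 3.6, 3.1, 2.2, 1.1]))  # ~27.8
    ```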

  14. Plant macroremains from an early Neolithic site in eastern Kuyavia, central Poland

    Directory of Open Access Journals (Sweden)

    Mueller-Bieniek, Aldona

    2016-06-01

    The study examined plant remains from the Smólsk 2/10 site, situated on the border of two different landscapes and preserving traces of Neolithic occupation from several cultures: Early Linear Pottery culture (LBK, ca 5300-5200 cal. BC to ca 5000 cal. BC), Stroke Band Pottery culture (SBP, ca 4700-4400 cal. BC), the Brześć Kujawski group of the Lengyel culture (BKG, ca 4500-4000/3900 cal. BC), and Funnel Beaker culture (TRB, ca 3950-3380 BC), as well as some features of the Lusatian culture (Hallstatt C, ca 970-790 cal. BC).

  15. Real-Time Tropospheric Delay Estimation using IGS Products

    Science.gov (United States)

    Stürze, Andrea; Liu, Sha; Söhne, Wolfgang

    2014-05-01

    The Federal Agency for Cartography and Geodesy (BKG) has routinely provided zenith tropospheric delay (ZTD) parameters for assimilation into numerical weather models for more than 10 years. Up to now, the results flowing into the EUREF Permanent Network (EPN) or E-GVAP (EUMETNET EIG GNSS water vapour programme) analysis have been based on batch processing of GPS+GLONASS observations in differential network mode. For the recently started COST Action ES1206 on "Advanced Global Navigation Satellite Systems tropospheric products for monitoring severe weather events and climate" (GNSS4SWEC), however, rapid updates in the analysis of the atmospheric state for nowcasting applications require changing the processing strategy towards real-time. In the RTCM SC104 (Radio Technical Commission for Maritime Services, Special Committee 104) a format combining the advantages of Precise Point Positioning (PPP) and Real-Time Kinematic (RTK) is under development. The so-called State Space Representation approach defines corrections which are transferred in real-time to the user, e.g. via NTRIP (Network Transport of RTCM via Internet Protocol). Meanwhile, messages for precise orbits, satellite clocks and code biases compatible with the basic PPP mode using IGS products are defined. Consequently, the IGS Real-Time Service (RTS) was launched in 2013 in order to extend the well-known precise orbit and clock products by a real-time component. Further messages, e.g. with respect to ionosphere or phase biases, are foreseen. Depending on the level of refinement, different accuracies up to the RTK level shall be reachable. In co-operation between BKG and the Technical University of Darmstadt, the real-time software GEMon (GREF EUREF Monitoring) is under development. GEMon is able to process GPS and GLONASS observation and RTS product data streams in PPP mode. Furthermore, several state-of-the-art troposphere models, for example based on numerical weather prediction data, are implemented.
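    For context, the ZTD parameter discussed above is related to the delay along the line of sight through mapping functions; the following is a minimal Python sketch using the Saastamoinen zenith hydrostatic delay and a crude 1/sin(e) mapping (real analyses use refined mapping functions such as VMF or GMF, and this sketch makes no claim about GEMon's internals):

```python
import math

def zhd_saastamoinen(pressure_hpa, lat_rad, height_m):
    """Zenith hydrostatic delay (m) from surface pressure (Saastamoinen model)."""
    return 0.0022768 * pressure_hpa / (
        1.0 - 0.00266 * math.cos(2.0 * lat_rad) - 0.28e-6 * height_m)

def slant_delay(zhd, zwd, elev_rad):
    """Slant tropospheric delay using a simple 1/sin(e) mapping function,
    adequate only for high elevations; refined mappings treat the hydrostatic
    and wet parts separately."""
    m = 1.0 / math.sin(elev_rad)
    return (zhd + zwd) * m

# Example: 1013.25 hPa at 50 deg latitude, 100 m height, invented ZWD of 0.15 m
zhd = zhd_saastamoinen(1013.25, math.radians(50.0), 100.0)
print("ZHD:", zhd, "slant delay at 30 deg:", slant_delay(zhd, 0.15, math.radians(30.0)))
```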

  16. Reconstruction actions in order to achieve and increase the designed nominal heat capacity of a hot-water block boiler with two flame pipes

    International Nuclear Information System (INIS)

    Ninevski, Gjorgji; Sekovanikj, Ivica; Kirovski, Hristo

    2000-01-01

    All companies engaged in some kind of useful energy production must address two very important points: first, maximum utilization of the design capacity of energy plants, given that the specific investment for designing and erecting new plants requires considerable financial capital; and second, the energy must be produced economically, i.e. with the highest possible efficiency and the lowest impact of the combustion process on the environment. This paper presents the chronology of reconstruction activities on a block hot-water boiler with two flame tubes, type BKG 200 with 16.28 MWth (made by TPK Zagreb), in order to achieve and increase its thermal capacity by approximately 11% (Authors)

  17. Study of Inclusive and Semi-Inclusive Production of eta{prime} in B Decays

    Energy Technology Data Exchange (ETDEWEB)

    Hicheur, Adlene

    2001-09-24

    We report a measurement of the rate for B {yields} {eta}{prime} X{sub s} transitions where the {eta}{prime} meson has center-of-mass momentum in the range 2.0 to 2.7 GeV/c and X{sub s} represents a system comprising a kaon and up to four pions. Our study is based on 22.2 million B{bar B} pairs collected at the {Upsilon}(4S) with the BABAR detector at the Stanford Linear Accelerator Center. We find {Beta}(B {yields} {eta}{prime}X{sub s}) = (6.8{sub -1.0}{sup +0.7}(stat) {+-} 1.0(syst){sub -0.5}{sup +0.0}(bkg)) x 10{sup -4} assuming that the signal is due to b {yields} sg* transitions.

  18. An assessment of boric acid and borax using the IEHR Evaluative Process for Assessing Human Developmental and Reproductive Toxicity of Agents. Expert Scientific Committee.

    Science.gov (United States)

    Moore, J A

    1997-01-01

    drinking water not exceed 0.6 mg B/L [0.06 mM B] over a lifetime of exposure. Dietary exposure to boron for an adult typically ranges from 0.25 to 3.1 mg B/d with an average of 1.5 mg B/d. The high end of the exposure range, 3.1 mg B/d, was selected by the Expert Committee as the best estimate of exposure. It should be noted that a diet high in fruits, vegetables, grains, legumes, and other foodstuffs with high boron contents may lead to daily exposures as high as 10 mg B/d from diet alone. Some body-building supplements contain boron at levels ranging from 1.5 to 10 mg B, with a median of 4 mg B. Use of supplements containing the median concentration of boron could equal the daily intake an individual receives from diet and drinking water combined. Adults in the U.S. at the high end of the food exposure range may typically ingest up to 3.5 mg B/d, or a daily dose of 0.005 mmol B/kg b.wt., through exposure from diet (3.1 mg B/d) and drinking water (0.4 mg B/d). Individuals who also use body-building supplements may have a total daily boron intake of 7.5 mg B, resulting in a daily dose of 0.01 mmol B/kg b.wt./d. Occupational exposure to boron is mainly through inhalation of borate-containing dust during mining and manufacturing processes. Current occupational exposures to boron are reported to result in a daily dose of < 0.0001 to 0.2 mmol B/kg b.wt./d. The current U.S. OSHA permissible exposure limit (PEL) for sodium tetraborates is 10 mg/m3, and the California Occupational Safety and Health Administration PEL is 5 mg/m3. An exposure of 5 mg B/m3 translates to approximately 0.01 mmol B/kg b.wt./d which, coincidentally, is the same as the exposure level associated with combined municipal drinking water, diet, and body-building supplement consumption. Infants may receive exposures to boric acid when it is used as a household insecticide for cockroach control.

  19. Dietary boron does not affect tooth strength, micro-hardness, and density, but affects tooth mineral composition and alveolar bone mineral density in rabbits fed a high-energy diet.

    Science.gov (United States)

    Hakki, Sema S; Malkoc, Siddik; Dundar, Niyazi; Kayis, Seyit Ali; Hakki, Erdogan E; Hamurcu, Mehmet; Baspinar, Nuri; Basoglu, Abdullah; Nielsen, Forrest H; Götz, Werner

    2015-01-01

    The objective of this study was to determine whether dietary boron (B) affects the strength, density and mineral composition of teeth and the mineral density of alveolar bone in rabbits with apparent obesity induced by a high-energy diet. Sixty female, 8-month-old New Zealand rabbits were randomly assigned for 7 months to five groups as follows: (1) control 1, fed alfalfa hay only (5.91 MJ/kg and 57.5 mg B/kg); (2) control 2, high-energy diet (11.76 MJ/kg and 3.88 mg B/kg); (3) B10, high-energy diet + 10 mg B gavage/kg body weight/96 h; (4) B30, high-energy diet + 30 mg B gavage/kg body weight/96 h; (5) B50, high-energy diet + 50 mg B gavage/kg body weight/96 h. The maxillary incisor teeth of the rabbits were evaluated for compression strength, mineral composition, and micro-hardness. Enamel, dentin, cementum and pulp tissue were examined histologically. Mineral densities of the incisor teeth and surrounding alveolar bone were determined using micro-CT. Compared to controls, the different boron treatments did not significantly affect the compression strength and micro-hardness of the teeth, although the B content of teeth increased in a dose-dependent manner. Compared to control 1, B50 teeth had decreased phosphorus (P) concentrations. Histological examination revealed that tooth structure (shape and thickness of the enamel, dentin, cementum and pulp) was similar in the B-treated and control rabbits. Micro-CT evaluation revealed greater alveolar bone mineral density in the B10 and B30 groups than in controls; alveolar bone density of the B50 group was not different from that of the controls. Although the B treatments did not affect tooth structure, strength, mineral density and micro-hardness, increasing B intake altered the mineral composition of teeth and, in moderate amounts, had beneficial effects on the surrounding alveolar bone.

  20. Pulse shape analysis for the GERDA experiment to set a new limit on the half-life of 0νββ decay of {sup 76}Ge

    Energy Technology Data Exchange (ETDEWEB)

    Wagner, Victoria Elisabeth

    2017-01-25

    The GERDA experiment searches for neutrinoless double beta (0νββ) decay of {sup 76}Ge using high purity germanium (HPGe) detectors operated in liquid argon (LAr). The aim is to explore half-lives of the order of 10{sup 26} yr. GERDA therefore relies on improved active background reduction techniques such as pulse shape discrimination (PSD), in which the time structure of the germanium signals is analyzed to discriminate signal-like from background-like events. Two types of HPGe detectors are operated: semi-coaxial detectors previously used in the Heidelberg-Moscow and IGEX experiments, and new Broad Energy Germanium (BEGe) detectors, which feature improved energy resolution and enhanced PSD. In Phase I of the experiment, five enriched BEGe detectors were used for the first time in the search for 0νββ decay. A PSD cut based on a single parameter, the ratio of the maximum current amplitude to the energy (A/E), is applied: 83% of the background events in a 232 keV region around Q{sub ββ} are rejected at a high signal efficiency of (92.1 ± 1.9)%. The achieved background index (BI) is (5.4{sup +4.1}{sub -3.4}) . 10{sup -3} (counts)/(keV.kg.yr), an improvement by a factor of 10 compared to previous germanium-based 0νββ experiments. Phase II of the experiment includes a major upgrade: for further background rejection, the LAr cryostat is instrumented to detect argon scintillation light, and an additional 25 BEGe detectors are installed. After PSD and LAr veto, a BI of (0.7{sup +1.3}{sub -0.5}) . 10{sup -3} (counts)/(keV.kg.yr) is achieved, the best BI reached in 0νββ experiments so far. A frequentist statistical analysis is performed on the combined data collected in GERDA Phase I and the first Phase II release. A new limit on the half-life of 0νββ decay of {sup 76}Ge is set at T{sup 0ν}{sub 1/2} > 5.3.10{sup 25} yr at 90% C.L., with a median sensitivity of T{sup 0ν}{sub 1/2} > 4.0.10{sup 25} yr at 90% C.L.
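    The A/E parameter described above can be computed from a digitized charge pulse in a few lines; below is a minimal numpy sketch, where the smoothing window, the toy pulse and the normalization are illustrative, not GERDA's actual signal processing:

```python
import numpy as np

def a_over_e(charge_pulse, smooth=5):
    """Compute the A/E pulse-shape parameter from a digitized charge pulse.
    A = maximum current amplitude (derivative of the charge signal),
    E = amplitude of the charge pulse. Window size is illustrative."""
    current = np.diff(charge_pulse)
    kernel = np.ones(smooth) / smooth            # simple moving-average smoothing
    current = np.convolve(current, kernel, mode="same")
    A = current.max()
    E = charge_pulse.max() - charge_pulse[0]     # baseline-subtracted amplitude
    return A / E

# Toy single-site-like pulse: a smoothed step (sigmoid)
t = np.linspace(-5, 5, 200)
pulse = 1.0 / (1.0 + np.exp(-t))
print("A/E of toy pulse:", a_over_e(pulse))
# Multi-site events spread the charge over time, lowering A/E below the
# single-site band, which is what the cut exploits.
```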

  1. Melting properties and Lintnerisation of potato starch with different degrees of phosphorylation

    DEFF Research Database (Denmark)

    Wischmann, Bente; Adler-Nissen, Jens

    2002-01-01

    Lintner dextrins were prepared from size-fractionated potato starch granules from two potato varieties (90BKG22 and Lady Rosetta) that contain a high or low natural content of esterified phosphate, respectively. The time course of hydrolysis showed the typical two-phase kinetics, with a maximal … degree of hydrolysis of between 74% and 81% after 30 days of hydrolysis, except for the fraction of smallest granules of the low phosphorylated variety (low P), which was hydrolysed to 98%. The relative amount of retained glucose-6-P in the Lintner dextrins was 18.6% for the low P variety and 46 … peak became low and broad during the time course of hydrolysis, with a rise in enthalpy change, indicating a strong dependency on the amorphous region of the granules. After annealing, the same fractions showed the typical rise in gelatinisation temperature and narrowing of the gelatinisation peak …

  2. Characterisation of GERDA Phase-I detectors in liquid argon

    Energy Technology Data Exchange (ETDEWEB)

    Barnabe Heider, Marik; Schoenert, Stefan [Max-Planck-Institut fuer Kernphysik (Germany); Gusev, Konstantin [Russian Research Center, Kurchatov Institute (Russian Federation); Joint Institute for Nuclear Research (Russian Federation)

    2009-07-01

    GERDA will search for neutrinoless double beta decay in {sup 76}Ge by submerging bare enriched HPGe detectors in liquid argon. In GERDA Phase-I, reprocessed enriched-Ge detectors, which were previously operated by the Heidelberg-Moscow and IGEX collaborations, and reprocessed natural-Ge detectors from Genius-TF will be redeployed. We have tested the operation and performance of bare HPGe detectors in liquid nitrogen and in liquid argon over more than three years with three non-enriched p-type prototype detectors. The detector handling and mounting procedures have been defined, and the Phase-I detector technology, the low-mass assembly and the long-term stability in liquid argon have been tested successfully. The Phase-I detectors were reprocessed by Canberra Semiconductor NV, Olen, according to their standard technology but without the evaporation of a passivation layer. After reprocessing, the detectors were mounted in their low-mass holders and characterised in liquid argon. The leakage current, the counting characteristics and the efficiency of the detectors were measured. The testing of the detectors was carried out in the liquid argon test stand of the GERDA underground Detector Laboratory (GDL) at LNGS. The detectors are now stored underground under vacuum until their operation in GERDA.

  3. Effects of maize seed treated with zinc and boron sources on germination and vigour

    Directory of Open Access Journals (Sweden)

    N.D. Ribeiro

    1994-12-01

    An experiment was conducted at the Departamento de Fitotecnia of the Universidade Federal de Santa Maria, RS, Brazil, from March to November 1992, to study the effects of treating maize seeds with zinc and boron sources, with or without pesticides, on germination and vigour during storage. The experimental design was a completely randomized 3x2x6 factorial with four replications. The treatments comprised evaluations at three storage periods (immediately after treatment, and four and eight months later), seeds with or without pesticide treatment, and six zinc and boron sources (control, Zn-Biocrop, B-Biocrop, Organic-B, Zn-Biocrop + B-Biocrop and Zn-Biocrop + Organic-B), at single doses of 2.50 g Zn and 0.15 g B/kg of seeds. The results show that the Zn-Biocrop treatment did not impair germination and vigour over the eight-month storage period, whereas the boron treatments (B-Biocrop and Organic-B) reduced germination and vigour.

  4. Observation of High Momentum {ital {eta}}{sup {prime}} Production in {ital B} Decays

    Energy Technology Data Exchange (ETDEWEB)

    Browder, T.E.; Li, Y.; Rodriguez, J.L. [University of Hawaii at Manoa, Honolulu, Hawaii 96822 (United States); Bergfeld, T.; Eisenstein, B.I.; Ernst, J.; Gladding, G.E.; Gollin, G.D.; Hans, R.M.; Johnson, E.; Karliner, I.; Marsh, M.A.; Palmer, M.; Selen, M.; Thaler, J.J. [University of Illinois, Urbana-Champaign, Illinois 61801 (United States); Edwards, K.W. [the Institute of Particle Physics, Montreal, Quebec (Canada); Bellerive, A.; Janicek, R.; MacFarlane, D.B.; Patel, P.M. [the Institute of Particle Physics, Montreal, Quebec (Canada); Sadoff, A.J. [Ithaca College, Ithaca, New York 14850 (United States); Ammar, R.; Baringer, P.; Bean, A.; Besson, D.; Coppage, D.; Darling, C.; Davis, R.; Kotov, S.; Kravchenko, I.; Kwak, N.; Zhou, L. [University of Kansas, Lawrence, Kansas 66045 (United States); Anderson, S.; Kubota, Y.; Lee, S.J.; ONeill, J.J.; Poling, R.; Riehle, T.; Smith, A. [University of Minnesota, Minneapolis, Minnesota 55455 (United States); Alam, M.S.; Athar, S.B.; Ling, Z.; Mahmood, A.H.; Timm, S.; Wappler, F. [State University of New York at Albany, Albany, New York 12222 (United States); Anastassov, A.; Duboscq, J.E.; Fujino, D.; Gan, K.K.; Hart, T.; Honscheid, K.; Kagan, H.; Kass, R.; Lee, J.; Spencer, M.B.; Sung, M.; Undrus, A.; Wolf, A.; Zoeller, M.M. [The Ohio State University, Columbus, Ohio 43210 (United States); Nemati, B.; Richichi, S.J.; Ross, W.R.; Severini, H.; Skubic, P. [University of Oklahoma, Norman, Oklahoma 73019 (United States); Bishai, M.; Fast, J.; Hinson, J.W.; Menon, N.; Miller, D.H.; Shibata, E.I.; Shipsey, I.P.; Yurko, M. [Purdue University, West Lafayette, Indiana 47907 (United States); Glenn, S.; Kwon, Y.; Lyon, A.L.; Roberts, S.; Thorndike, E.H. [University of Rochester, Rochester, New York 14627 (United States); Jessop, C.P.; Lingel, K.; Marsiske, H.; Perl, M.L.; Savinov, V.; Ugolini, D.; Zhou, X.; and others

    1998-08-01

    We report the first observation of B{r_arrow}{eta}{sup {prime}}X transitions with high momentum {eta}{sup {prime}} mesons. We observe 39.0{plus_minus}11.6 B decay events with 2.0{lt}p{sub {eta}{sup {prime}}}{lt}2.7 GeV/c , the high momentum region where background from b{r_arrow}c processes is suppressed. We discuss the physical interpretation of the signal, including the possibility that it is due to b{r_arrow}sg{sup {asterisk}} transitions. Given that interpretation, we find B(B{r_arrow}{eta}{sup {prime}}X{sub s} )=[6.2{plus_minus}1.6(stat){plus_minus} 1.3(syst){sup +0.0}{sub {minus}1.5} (bkg)]{times}10{sup {minus}4} for 2.0{lt}p{sub {eta}{sup {prime}}}{lt}2.7 GeV/c . {copyright} {ital 1998} {ital The American Physical Society }

  5. Design, simulation and construction of the GERDA-muon veto; Design, Simulation und Aufbau des GERDA-Myonvetos

    Energy Technology Data Exchange (ETDEWEB)

    Knapp, Markus Alexander

    2009-10-09

    The GERmanium Detector Array (Gerda) is an experiment searching for the neutrinoless double beta decay of {sup 76}Ge. This very rare, weakly interacting process is predicted to occur if the neutrino has a mass and is a Majorana particle, i.e. its own antiparticle. Although double beta decay with the emission of two neutrinos has been found in several nuclei, at this moment only a part of the Heidelberg-Moscow Collaboration claims to have observed the neutrinoless double beta decay. The best limit for the half-life currently is T{sub 1/2} > 1.2.10{sup 25} y. Gerda will expose about 15 kg.y of enriched germanium detectors from the Heidelberg-Moscow and IGEX crystals in phase I. In this phase, it will be able to test the claim within one year, owing to a background reduced by a factor of 10. In phase II about 100 kg.y of data will be accumulated, leading to T{sub 1/2} > 2.10{sup 26} y thanks to an additional background reduction by a factor of 10. For high sensitivity at these half-lives, it is necessary to detect the corresponding rare events, so background reduction to a rate of 10{sup -3} (counts)/(keV.kg.year) is of utmost importance. Different background identification methods, like pulse shape analysis or a muon veto, will therefore be used. In this work, the development of the Cherenkov muon veto detectors is presented. First design studies are shown, including extensive Monte-Carlo simulations. These simulations were also used to optimize the trigger conditions of the data acquisition, so as to detect all muons that cause an energy deposition in the germanium detectors. Finally, the on-site construction at the Laboratori Nazionali del Gran Sasso in Italy is described. (orig.)

  6. Biological distribution of reactor produced 18F-FDG. Local experience

    International Nuclear Information System (INIS)

    Sierralta, M.P.; Massardo, T.; Gil, M.C.; Chandia, M.; Godoy, N.; Troncoso, F.; Jofre, M.J.

    2002-01-01

    Introduction: Quality control through an animal model that describes the biodistribution of a substance is fundamental prior to its use in human beings. For the evaluation of myocardial viability after recent myocardial infarction, the use of reactor-produced 18F-FDG (a radiotracer usually obtained from a cyclotron) is proposed; its production had never before been attempted in our country. The aim of the study was to compare the specific activities found in the different tissues after injection of this reactor-produced radiopharmaceutical with those obtained by other authors with cyclotron 18F-FDG. Materials: WISTAR female white mice, mean weight 25.28 +/- 1.09 g (range 23.8-26.9 g), kept under standard conditions, were used. 1.22 MBq (33 µCi) of 18F-FDG were injected into a lateral tail vein. Previously anaesthetised with chloroform, the animals were sacrificed by jugular section at 5, 30 and 60 minutes post injection. Blood and organs (liver, lungs, heart, brain, urine plus bladder, kidneys, femur, muscle and quivers) were removed, placed in vials, weighed, and finally taken to a Packard Minaxi γ Auto-gamma 5000 series counter to obtain the counts per minute (cpm); the empty vials had been weighed beforehand. At the same time, STANDARD (STD) cpm (3 dilutions) and BACKGROUND (BKG) cpm were collected. We calculated 1) the mean BKG cpm, 2) the mean STD cpm, corrected by decay factor and dilution, and 3) the cpm of each tissue, corrected by decay factor, divided by the corresponding dilution cpm and multiplied by 100 to obtain the injected activity percentage (IA%). Finally, the IA% was divided by the tissue weight to obtain the specific activity (SA). The mean and standard deviation for each tissue at the 3 intervals were calculated. Results: The uptake distribution at 30 and 60 minutes was similar between reactor- and cyclotron-produced 18F-FDG, with significantly higher SA in heart and brain with respect to the other organs.

  7. Biological distribution of reactor produced 18F-FDG. Local experience

    Energy Technology Data Exchange (ETDEWEB)

    Sierralta, M P [University of Chile Clinical Hospital Nuclear Medicine Centre, Santiago (Chile); Military Hospital Nuclear Medicine Department, Santiago (Chile); Massardo, T [University of Chile Clinical Hospital Nuclear Medicine Centre, Santiago (Chile); Gil, M C [CGM Nuclear, Santiago (Chile); Chandia, M; Godoy, N; Troncoso, F [CCHEN, CEN La Reina, Santiago (Chile); Jofre, M J [Military Hospital Nuclear Medicine Department, Santiago (Chile)

    2002-09-01

    Introduction: Quality control through an animal model that describes the biodistribution of a substance is fundamental prior to its use in human beings. For the evaluation of myocardial viability after recent myocardial infarction, the use of reactor-produced 18F-FDG (a radiotracer usually obtained from a cyclotron) is proposed; its production had never before been attempted in our country. The aim of the study was to compare the specific activities found in the different tissues after injection of this reactor-produced radiopharmaceutical with those obtained by other authors with cyclotron 18F-FDG. Materials: WISTAR female white mice, mean weight 25.28 +/- 1.09 g (range 23.8-26.9 g), kept under standard conditions, were used. 1.22 MBq (33 µCi) of 18F-FDG were injected into a lateral tail vein. Previously anaesthetised with chloroform, the animals were sacrificed by jugular section at 5, 30 and 60 minutes post injection. Blood and organs (liver, lungs, heart, brain, urine plus bladder, kidneys, femur, muscle and quivers) were removed, placed in vials, weighed, and finally taken to a Packard Minaxi {gamma} Auto-gamma 5000 series counter to obtain the counts per minute (cpm); the empty vials had been weighed beforehand. At the same time, STANDARD (STD) cpm (3 dilutions) and BACKGROUND (BKG) cpm were collected. We calculated 1) the mean BKG cpm, 2) the mean STD cpm, corrected by decay factor and dilution, and 3) the cpm of each tissue, corrected by decay factor, divided by the corresponding dilution cpm and multiplied by 100 to obtain the injected activity percentage (IA%). Finally, the IA% was divided by the tissue weight to obtain the specific activity (SA). The mean and standard deviation for each tissue at the 3 intervals were calculated. Results: The uptake distribution at 30 and 60 minutes was similar between reactor- and cyclotron-produced 18F-FDG, with significantly higher SA in heart and brain with respect to the other organs.
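    The IA% and SA bookkeeping described in both versions of this record reduces to decay correction, normalization to the diluted standard, and division by tissue weight; below is a minimal sketch of that arithmetic with invented count rates (only the 18F half-life of about 109.8 min is a physical constant), not the authors' exact worksheet:

```python
import math

def specific_activity(tissue_cpm, std_cpm, bkg_cpm, decay_min,
                      half_life_min, tissue_weight_g, dilution=100.0):
    """Injected-activity percentage (IA%) and specific activity (SA, %/g)
    following the procedure sketched in the abstract; all numbers illustrative."""
    decay = math.exp(-math.log(2.0) * decay_min / half_life_min)
    net_tissue = (tissue_cpm - bkg_cpm) / decay   # decay-corrected net counts
    net_std = (std_cpm - bkg_cpm) / dilution      # standard scaled by dilution
    ia_percent = net_tissue / net_std * 100.0
    return ia_percent, ia_percent / tissue_weight_g

# Invented values: tissue 5200 cpm, standard 48000 cpm, background 200 cpm,
# counted 30 min post injection (18F half-life 109.8 min), tissue mass 0.12 g
print(specific_activity(5200.0, 48000.0, 200.0, 30.0, 109.8, 0.12))
```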

  8. Direct detection of non-baryonic dark matter

    International Nuclear Information System (INIS)

    Nollez, G.

    2003-01-01

    Baryonic matter, which constitutes stars and galaxies, amounts to a few percent of the mass of the universe, in agreement with the theory of big-bang nucleosynthesis. Most of the matter in the universe (approximately 85%) is therefore non-baryonic and dark. One of the most favoured hypotheses is that this non-baryonic dark matter is constituted by a new type, still undiscovered, of elementary weakly interacting massive particles (WIMPs). These hypothetical particles would be thermal relics of the big-bang era, during which they were created. A rich spectrum of new elementary particles is predicted by supersymmetry, the lightest of which is the neutralino. If the dark matter halo of our Milky Way is made of neutralinos, their detection in terrestrial detectors should be possible. Neutralinos couple to matter through the electroweak interaction, which implies that the detection rate is extraordinarily low. About 10 experiments in the world are dedicated to the search for WIMPs. A first group of experiments (HDMS, IGEX, DAMA and Zeplin) uses 'classical' detectors of nuclear physics, germanium semiconductor diodes or NaI scintillators. A second group (CDMS, Edelweiss) gathers cryogenic phonon-ionisation experiments, and a third group (CRESST, Rosebud) is based on cryogenic phonon-light experiments. So far no WIMP has been clearly detected; the direct-detection story is obviously not concluded, and most future experiments aim to reach a sensitivity of 10⁻⁴⁴ cm². (A.C.)

  9. Possible application of boron neutron capture therapy to canine osteosarcoma

    International Nuclear Information System (INIS)

    Takeuchi, Akira

    1985-01-01

    The possibility of successful treatment of canine osteosarcoma by boron neutron capture therapy (BNCT) was demonstrated, based upon an uptake study of the boron compound and an experimental treatment by BNCT. In the uptake study following intravenous administration of Na₂B₁₂H₁₁SH, a satisfactorily high boron concentration, with some variation between tumors, is likely to be obtained 12 hours after administration, together with significantly lower boron levels in blood and bone. Based upon these results, the osteosarcoma of a mongrel dog was successfully treated by BNCT. The tumor received approximately 3800 rads with a single neutron irradiation (approximately 1.4 x 10¹³ n/cm²) about 12 hours after intravenous infusion of Na₂B₁₂H₁₁SH of 96% enriched ¹⁰B at a dose of 50 mg ¹⁰B/kg. Clinical and radiographical improvements were remarkable, and no neoplastic cell was found in any part of the original neoplastic lesion or its surrounding tissue at autopsy after 30 days. (author)

  10. Measurement of environmental radiation using medical scintillation detector in well counter system

    Energy Technology Data Exchange (ETDEWEB)

    Lyu, Kwang Yeul; Park, Yeon Joon; Kim, Min Jeong; Ham, Eun Hye; Yoon, Ji Yeol; Kim, Hyun in; Min, Jung Hwan; Park, Hoon Hee [Dept. of Radiological Technology, Shingu College, Sungnam (Korea, Republic of)

    2015-12-15

    After the Fukushima nuclear accident in 2011, public concern about radiation increased rapidly. If people knew how much radiation they were being exposed to, it could help them avoid it and understand what radiation actually is, thereby reducing anxiety about radiation contamination. In this study, we measured radioactivity with a 'Captus-3000' thyroid uptake measurement system in a well counter detector configuration. The materials measured were briquette, shiitake, pollock, button-type battery, alkaline battery, topsoil, asphalt, gasoline, milk powder, pine, basalt stone, pencil lead, wasabi, coarse salt, tuna (can), cigar and beer; we categorized these samples into land resources, water resources, foodstuff and etc. (beer, classified as a water resource, was categorized into foodstuff). We also selected the linear standard radiation source ¹³⁷Cs to measure the sensitivity of the well counter detector, and recorded counts per minute (cpm) to establish the sensitivity of the thyroid uptake system's well counter detector. We then compared the cpm of each material and converted the results to Bq/kg units. There were some limitations with the measurement equipment, because it has less sensitivity than professional equipment such as a high-purity germanium radiation detector; moreover, our choice of materials was limited. As a result, there were clear differences among the materials' spectra, so the results meaningfully showed how much radiation each material emitted. To compare each material's cpm with the BKG, we overlaid their spectra; by doing so, we were able to detect differences among the spectra at specific peak sections. Lastly, button-type battery, alkaline battery, briquette, asphalt and topsoil showed high values. The materials were classified into Group A, emitting higher radiation, and Group B, emitting lower radiation.

  11. Measurement of environmental radiation using medical scintillation detector in well counter system

    International Nuclear Information System (INIS)

    Lyu, Kwang Yeul; Park, Yeon Joon; Kim, Min Jeong; Ham, Eun Hye; Yoon, Ji Yeol; Kim, Hyun in; Min, Jung Hwan; Park, Hoon Hee

    2015-01-01

    After the Fukushima nuclear accident in 2011, public concern about radiation increased rapidly. If people knew how much radiation they were being exposed to, it could help them avoid it and understand what radiation actually is, thereby reducing anxiety about radiation contamination. In this study, we measured radioactivity with a 'Captus-3000' thyroid uptake measurement system in a well counter detector configuration. The materials measured were briquette, shiitake, pollock, button-type battery, alkaline battery, topsoil, asphalt, gasoline, milk powder, pine, basalt stone, pencil lead, wasabi, coarse salt, tuna (can), cigar and beer; we categorized these samples into land resources, water resources, foodstuff and etc. (beer, classified as a water resource, was categorized into foodstuff). We also selected the linear standard radiation source ¹³⁷Cs to measure the sensitivity of the well counter detector, and recorded counts per minute (cpm) to establish the sensitivity of the thyroid uptake system's well counter detector. We then compared the cpm of each material and converted the results to Bq/kg units. There were some limitations with the measurement equipment, because it has less sensitivity than professional equipment such as a high-purity germanium radiation detector; moreover, our choice of materials was limited. As a result, there were clear differences among the materials' spectra, so the results meaningfully showed how much radiation each material emitted. To compare each material's cpm with the BKG, we overlaid their spectra; by doing so, we were able to detect differences among the spectra at specific peak sections. Lastly, button-type battery, alkaline battery, briquette, asphalt and topsoil showed high values. The materials were classified into Group A, emitting higher radiation, and Group B, emitting lower radiation.
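    Converting the net count rate to Bq/kg, as done in both versions of this record, requires only the counting efficiency derived from the ¹³⁷Cs standard; below is a minimal sketch with invented values:

```python
def activity_bq_per_kg(sample_cpm, bkg_cpm, efficiency, sample_mass_kg):
    """Net count rate -> activity concentration.
    efficiency = detected counts per decay, from the 137Cs standard;
    dividing by 60 converts counts per minute to counts per second (Bq = 1/s)."""
    net_cps = (sample_cpm - bkg_cpm) / 60.0
    return net_cps / efficiency / sample_mass_kg

# Invented values: 320 cpm sample, 45 cpm background, 12% efficiency, 0.5 kg sample
print(activity_bq_per_kg(320.0, 45.0, 0.12, 0.5))  # ~76 Bq/kg
```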

  12. Status of the GERDA Experiment at the Laboratori Nazionali del Gran Sasso

    Directory of Open Access Journals (Sweden)

    R. Brugnera

    2013-01-01

    The Germanium Detector Array (Gerda) is a low background experiment at the Laboratori Nazionali del Gran Sasso (LNGS) of the INFN, designed to search for the rare neutrinoless double beta decay (0νββ) of ⁷⁶Ge. In its first phase, high purity germanium diodes inherited from the former Heidelberg-Moscow and Igex experiments are operated "bare" and immersed in liquid argon, with an overall background environment of 10⁻² cts/(keV·kg·yr), a factor of ten better than its predecessors. Measurements of two-neutrino double beta decay (2νββ), giving T1/2(2ν) = (1.88±0.10)×10²¹ yr, and the recently published background model and pulse shape performance of the detectors are discussed in the paper. A new result on 0νββ has recently been published, with a half-life limit of T1/2(0ν) > 2.1×10²⁵ yr (90% C.L.). A second phase of the experiment is scheduled to start during the year 2014, after a major upgrade shutdown. Thanks to the increased detector mass with newly designed diodes and to the introduction of liquid argon instrumentation techniques, the experiment aims to reduce the expected background further, to about 10⁻³ cts/(keV·kg·yr), and to improve the 0νββ sensitivity to about T1/2(0ν) > 1.5×10²⁶ yr (90% C.L.).

  13. Use of NTRIP for Optimizing the Decoding Algorithm for Real-Time Data Streams

    Directory of Open Access Journals (Sweden)

    Zhanke He

    2014-10-01

    As a network transmission protocol, Networked Transport of RTCM via Internet Protocol (NTRIP) is widely used in GPS and Global Orbiting Navigational Satellite System (GLONASS) augmentation systems, such as the Continuous Operational Reference System (CORS), the Wide Area Augmentation System (WAAS) and Satellite Based Augmentation Systems (SBAS). With the deployment of the BeiDou Navigation Satellite System (BDS) to serve the Asia-Pacific region, there are increasing needs for ground monitoring of the BeiDou Navigation Satellite System and for the development of high-precision real-time BeiDou products. This paper aims to optimize the decoding algorithm for NTRIP Client data streams and the user authentication strategies of the NTRIP Caster based on NTRIP. The proposed method greatly enhances handling efficiency and significantly reduces data transmission delay compared with the Federal Agency for Cartography and Geodesy (BKG) NTRIP. Meanwhile, a transcoding method is proposed to facilitate data transformation from the BINary EXchange (BINEX) format to the RTCM format. The transformation scheme thus solves the problem of handling real-time data streams from Trimble receivers in the BeiDou Navigation Satellite System indigenously developed by China.

  14. Use of NTRIP for optimizing the decoding algorithm for real-time data streams.

    Science.gov (United States)

    He, Zhanke; Tang, Wenda; Yang, Xuhai; Wang, Liming; Liu, Jihua

    2014-10-10

    As a network transmission protocol, Networked Transport of RTCM via Internet Protocol (NTRIP) is widely used in GPS and Global Orbiting Navigational Satellite System (GLONASS) augmentation systems, such as the Continuous Operational Reference System (CORS), the Wide Area Augmentation System (WAAS) and Satellite Based Augmentation Systems (SBAS). With the deployment of the BeiDou Navigation Satellite System (BDS) to serve the Asia-Pacific region, there are increasing needs for ground monitoring of the BeiDou Navigation Satellite System and for the development of high-precision real-time BeiDou products. This paper aims to optimize the decoding algorithm for NTRIP Client data streams and the user authentication strategies of the NTRIP Caster based on NTRIP. The proposed method greatly enhances handling efficiency and significantly reduces data transmission delay compared with the Federal Agency for Cartography and Geodesy (BKG) NTRIP. Meanwhile, a transcoding method is proposed to facilitate data transformation from the BINary EXchange (BINEX) format to the RTCM format. The transformation scheme thus solves the problem of handling real-time data streams from Trimble receivers in the BeiDou Navigation Satellite System indigenously developed by China.
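    Both versions of this record rest on the NTRIP exchange, which begins with a plain HTTP-style request to the caster; below is a minimal NTRIP 1.0 client sketch in Python, with host, mountpoint and credentials as placeholders (the paper's actual decoding and authentication optimizations are not reproduced):

```python
import base64
import socket

def ntrip_request(host, port, mountpoint, user, password):
    """Open an NTRIP 1.0 connection and return the socket once the caster
    answers 'ICY 200 OK'. Host, mountpoint and credentials are placeholders."""
    auth = base64.b64encode(f"{user}:{password}".encode()).decode()
    req = (f"GET /{mountpoint} HTTP/1.0\r\n"
           f"User-Agent: NTRIP SimpleClient/1.0\r\n"
           f"Authorization: Basic {auth}\r\n\r\n")
    s = socket.create_connection((host, port), timeout=10)
    s.sendall(req.encode())
    if b"ICY 200 OK" not in s.recv(1024):
        s.close()
        raise ConnectionError("caster refused the request")
    return s    # subsequent reads on the socket yield the raw RTCM byte stream

# Example call (placeholder caster, typically port 2101):
# stream = ntrip_request("caster.example.org", 2101, "MOUNT", "user", "pass")
```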

  15. Therapeutic effect of ursolic acid in experimental visceral leishmaniasis

    Directory of Open Access Journals (Sweden)

    Jéssica A. Jesus

    2017-04-01

    Leishmaniasis is an important neglected tropical disease, affecting more than 12 million people worldwide. The available treatments are not well tolerated and present diverse side effects in patients, justifying the search for new therapeutic compounds. In the present study, the therapeutic potential and toxicity of ursolic acid (UA), isolated from the leaves of Baccharis uncinella C. DC. (Asteraceae), were evaluated in experimental visceral leishmaniasis. To evaluate the therapeutic potential of UA, hamsters infected with L. (L.) infantum were treated daily for 15 days with 1.0 or 2.0 mg UA/kg body weight, or with 5.0 mg amphotericin B/kg body weight, by intraperitoneal route. Fifteen days after the last dose, the parasitism of the spleen and liver was estimated and the main histopathological alterations were recorded. The proliferation of splenic mononuclear cells was evaluated, and IFN-γ, IL-4, and IL-10 gene expression was analyzed in spleen fragments. The toxicity of UA and amphotericin B was evaluated in healthy golden hamsters by histological analysis and biochemical parameters. Animals treated with UA had fewer parasites in the spleen and liver than the infected control group, and they also showed preservation of the white and red pulps, which correlated with a high rate of proliferation of splenic mononuclear cells, IFN-γ mRNA and iNOS production. Moreover, animals treated with UA did not present alterations in the levels of AST, ALT, creatinine and urea. Taken together, these findings indicate that UA is an interesting natural compound that should be considered for the development of prototype drugs against visceral leishmaniasis.

  16. Performance and stability tests of bare high purity germanium detectors in liquid argon for the GERDA experiment

    Energy Technology Data Exchange (ETDEWEB)

    Barnabe Heider, Marik

    2009-05-27

    GERDA will search for neutrinoless double beta decay of {sup 76}Ge using a novel approach of bare germanium detectors in liquid argon (LAr). Enriched germanium detectors from the previous Heidelberg-Moscow and IGEX experiments have been reprocessed and will be deployed in GERDA Phase-I. At the center of this thesis project is the study of the performance of bare germanium detectors in cryogenic liquids. The same detector performance as in vacuum cryostats (2.2 keV FWHM at 1.3 MeV) was achieved in cryogenic liquids with a new low-mass detector assembly and contacts. One major result is the discovery of a radiation-induced leakage current (LC) increase when operating bare detectors with standard passivation layers in LAr. Charge collection and build-up on the passivation layer were identified as the origin of the LC increase. It was found that diodes without passivation do not exhibit this feature. Three months of stable operation in LAr at {proportional_to}5 pA LC under periodic gamma irradiation demonstrated the suitability of the modified detector design. Based on these results, all Phase-I detectors were reprocessed without a passivation layer and subsequently successfully characterized in LAr in the GERDA underground Detector Laboratory. The mass loss during the reprocessing was {proportional_to}300 g out of 17.9 kg and the exposure above ground {proportional_to}5 days. This results in a negligible cosmogenic background increase of {proportional_to}5.10{sup -4} cts/(keV.kg.y) at the {sup 76}Ge Q{sub {beta}{beta}} for {sup 60}Co and {sup 68}Ge. (orig.)

  17. Precision Measurement of Neutrino Oscillation Parameters with KamLAND

    Energy Technology Data Exchange (ETDEWEB)

    O'Donnell, Thomas [Univ. of California, Berkeley, CA (United States)

    2011-12-01

    This dissertation describes a measurement of the neutrino oscillation parameters Δm²₂₁ and θ₁₂, and constraints on θ₁₃, based on a study of reactor antineutrinos at a baseline of ~180 km with the KamLAND detector. The data presented here were collected between April 2002 and November 2009, and amount to a total exposure of (2.64 ± 0.07) × 10³² proton-years. For this exposure we expect 2140 ± 74(syst) antineutrino candidates from reactors, assuming standard-model neutrino behavior, and 350 ± 88(syst) candidates from background. The number observed is 1614. The ratio of background-subtracted candidates observed to expected is (N_obs − N_bkg)/N_exp = 0.59 ± 0.02(stat) ± 0.045(syst), which confirms reactor neutrino disappearance at greater than 5σ significance. Interpreting this deficit as being due to neutrino oscillation, the best-fit oscillation parameters from a three-flavor analysis are Δm²₂₁ = 7.60 +0.20/−0.19 × 10⁻⁵ eV², θ₁₂ = 32.5 ± 2.9 degrees and sin²θ₁₃ = 0.025 ± 0.035; the 95% confidence-level upper limit on sin²θ₁₃ is sin²θ₁₃ < 0.083. Assuming CPT invariance, a combined analysis of KamLAND and solar neutrino data yields best-fit values Δm²₂₁ = 7.60 ± 0.20 × 10⁻⁵ eV², θ₁₂ = 33.5 +1.0/−1.1 degrees, and sin²θ₁₃ = 0.013 ± 0.028, or sin²θ₁₃ < 0.06 at the 95% confidence level.
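    The quoted deficit is conventionally interpreted through the two-flavor survival probability P = 1 − sin²(2θ)·sin²(1.27·Δm²·L/E); below is a short numerical check in Python using the best-fit parameters above at one representative reactor-antineutrino energy (the measured 0.59 ratio involves averaging over the reactor spectrum and baselines, which this sketch omits):

```python
import math

def p_survival(dm2_ev2, theta_deg, L_km, E_GeV):
    """Two-flavor electron-antineutrino survival probability:
    P = 1 - sin^2(2*theta) * sin^2(1.27 * dm2[eV^2] * L[km] / E[GeV])."""
    return 1.0 - math.sin(2.0 * math.radians(theta_deg)) ** 2 \
               * math.sin(1.27 * dm2_ev2 * L_km / E_GeV) ** 2

# Best-fit values from the abstract, 180 km baseline, 4 MeV antineutrino
print(p_survival(7.6e-5, 32.5, 180.0, 0.004))
```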

  18. Application of gamma ray spectrometry and atomic absorption spectrometry for monitoring some radionuclides and heavy metals in sediments from the sudanese red sea coast

    International Nuclear Information System (INIS)

    Idris, A. M.; Eltayeb, M. A.

    2004-01-01

    A total of 31 surface sediment samples were collected from Port Sudan harbour, Sawakin harbour and the fringing reef area located along the Sudanese coast of the Red Sea. The sampling was designed to provide good spatial coverage, taking into account human activity in Port Sudan harbour and the fringing reef area. The bulk samples were analyzed for some natural and anthropogenic radionuclides (²²⁶Ra, ²²⁸Ra, ⁴⁰K, ¹³⁷Cs) using direct gamma-ray spectrometry. Concentrations of some heavy metals were determined in five fractions with grain sizes of 1.000-0.500 mm, 0.500-0.250 mm, 0.250-0.125 mm, 0.125-0.063 mm and less than 0.063 mm. The fractionation was performed using the dry sieving method. A total of 155 sub-samples (fractions) were digested by the wet digestion method and analyzed for Mn, Fe, Ni, Cu, Zn, and Pb using flame atomic absorption spectrometry. Quality assurance of the obtained data was achieved through the analysis of certified reference materials. The radioactivity concentration ranges of ²²⁶Ra, ²²⁸Ra and ⁴⁰K are 2.5-25.1 Bq/kg, 2.1-13.1 Bq/kg and 21.6-429 Bq/kg, respectively. For ¹³⁷Cs, the highest value is 8.3 Bq/kg, while most samples were below the detection limits of the system. The concentration ranges of Mn, Fe, Ni, Cu, Zn, and Pb are 53.3-819 mg/kg, 1.4-51 mg/g, 8-131 mg/kg, 9.5-113 mg/kg, 18.4-142 mg/kg, and 4.0-26.6 mg/kg, respectively. The granulometric normalization shows that some samples were subjected to anthropogenic activities. This finding was reinforced by results obtained from enrichment factor calculations and multivariate statistical analysis, namely principal component analysis (PCA); the PCA also indicates that the silt/clay fraction (<0.063 mm) is the dominant carrier of the anthropogenic input. From the viewpoint of mineralogical composition, the cluster analysis distributed the samples into two clusters.
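    The enrichment factor calculation mentioned above compares each metal, normalized to a conservative reference element such as Fe, against background values; a minimal sketch with invented background concentrations follows:

```python
def enrichment_factor(metal_sample, fe_sample, metal_background, fe_background):
    """EF = (M/Fe)_sample / (M/Fe)_background.
    EF values well above 1 suggest an anthropogenic contribution;
    the choice of reference element and background values is study-specific."""
    return (metal_sample / fe_sample) / (metal_background / fe_background)

# Invented values: Pb 26.6 mg/kg with Fe 30 mg/g in the sample,
# versus background Pb 10 mg/kg with Fe 40 mg/g (all converted to mg/kg)
print(enrichment_factor(26.6, 30_000.0, 10.0, 40_000.0))  # EF ~ 3.5
```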

  19. Small Mammal Sampling in Mortandad and Los Alamos Canyons, 2005

    International Nuclear Information System (INIS)

    Kathy Bennett; Sherri Sherwood; Rhonda Robinson

    2006-01-01

    As part of an ongoing ecological field investigation at Los Alamos National Laboratory, a study was conducted that compared measured contaminant concentrations in sediment with population parameters for small mammals in the Mortandad Canyon watershed. Mortandad Canyon and its tributary canyons have received contaminants from multiple solid waste management units and areas of concern since the establishment of the Laboratory in the 1940s. The study included three reaches within Effluent and Mortandad canyons (E-1W, M-2W, and M-3) that covered a spread of concentrations of metals and radionuclides and included locations where polychlorinated biphenyls and perchlorate had been detected. A reference location, reach LA-BKG in upper Los Alamos Canyon, was also included in the study for comparison purposes. The small mammal study was initiated to assess whether potential adverse effects were evident in Mortandad Canyon due to the presence of contaminants, designated as contaminants of potential ecological concern, in the terrestrial media. Study sites, including the reference site, were sampled in late July/early August. Species diversity and the mean daily capture rate were highest for the E-1W reach and lowest for the reference site. Species composition among the three reaches in Mortandad was similar, with very little overlap with the reference canyon. Differences in species composition and diversity were most likely due to differences in habitat. Sex ratios, body weights, and reproductive status of small mammals were also evaluated, although small sample sizes of some species at some sites limited the analysis. Ratios of males to females by species at each site (n = 5) were tested using a chi-square analysis; no differences were detected. Where there was sufficient sample size, body weights of adult small mammals were compared between sites; no differences in body weights were found. Reproductive status of species appears to be similar across sites.

  20. Small Mammal Sampling in Mortandad and Los Alamos Canyons, 2005

    Energy Technology Data Exchange (ETDEWEB)

    Bennett, Kathy; Sherwood, Sherri; Robinson, Rhonda

    2006-08-15

    As part of an ongoing ecological field investigation at Los Alamos National Laboratory, a study was conducted that compared measured contaminant concentrations in sediment with population parameters for small mammals in the Mortandad Canyon watershed. Mortandad Canyon and its tributary canyons have received contaminants from multiple solid waste management units and areas of concern since the establishment of the Laboratory in the 1940s. The study included three reaches within Effluent and Mortandad canyons (E-1W, M-2W, and M-3) that covered a spread of concentrations of metals and radionuclides and included locations where polychlorinated biphenyls and perchlorate had been detected. A reference location, reach LA-BKG in upper Los Alamos Canyon, was also included in the study for comparison purposes. The small mammal study was initiated to assess whether potential adverse effects were evident in Mortandad Canyon due to the presence of contaminants, designated as contaminants of potential ecological concern, in the terrestrial media. Study sites, including the reference site, were sampled in late July/early August. Species diversity and the mean daily capture rate were highest for the E-1W reach and lowest for the reference site. Species composition among the three reaches in Mortandad was similar, with very little overlap with the reference canyon. Differences in species composition and diversity were most likely due to differences in habitat. Sex ratios, body weights, and reproductive status of small mammals were also evaluated, although small sample sizes of some species at some sites limited the analysis. Ratios of males to females by species at each site (n = 5) were tested using a chi-square analysis; no differences were detected. Where there was sufficient sample size, body weights of adult small mammals were compared between sites; no differences in body weights were found. Reproductive status of species appears to be similar across sites.
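    The sex-ratio test described in both versions of this record is a standard one-sample chi-square against an expected 1:1 ratio; below is a minimal sketch with scipy, using invented capture counts:

```python
from scipy.stats import chisquare

# Invented capture counts for one species at one site: 12 males, 8 females.
# Null hypothesis: a 1:1 sex ratio, i.e. expected counts of 10 and 10.
observed = [12, 8]
expected = [10, 10]
stat, p_value = chisquare(f_obs=observed, f_exp=expected)
print(f"chi-square = {stat:.2f}, p = {p_value:.3f}")  # large p: no evidence against 1:1
```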

  1. Copernicus - Practice of Daily Life in a National Mapping Agency?

    Science.gov (United States)

    Wiatr, T.; Suresh, G.; Gehrke, R.; Hovenbitzer, M.

    2016-06-01

    Copernicus is a European system for Earth observation and monitoring. It consists of a set of Earth observation satellites and in-situ sensors that provide geo-information which is used, through a set of Copernicus services, for applications related to the environment and global security. The main services of the Copernicus programme address six thematic areas: land, marine, atmosphere, climate change, emergency management and security. In Germany, there is a national team of Copernicus service coordinators, who are responsible for the national development of the Copernicus services and for providing user-specific information about the Copernicus processes. These coordinators are the contact points for all programmes and services concerning their respective Copernicus theme. To publicise information about Copernicus, national conferences and workshops are organised, and many people are involved in planning the continuous process of bringing this information to public authorities, research institutes and commercial companies. The Federal Agency for Cartography and Geodesy (Bundesamt für Kartographie und Geodäsie, BKG) is one such organisation and is mainly responsible for the national land monitoring service of Copernicus. To make use of the freely available data from the Copernicus programme, the Federal Agency for Cartography and Geodesy is currently developing new applications and projects in the field of remote sensing and land monitoring. These projects can serve other public authorities as examples of how to use Copernicus data and services for their individual demands and requirements. Copernicus data and services are not yet commonly used in the daily routine of the national mapping agencies, but they soon will be.

  2. Intracellular targeting of mercaptoundecahydrododecaborate (BSH) to malignant glioma by transferrin-PEG liposomes for boron neutron capture therapy (BNCT)

    International Nuclear Information System (INIS)

    Doi, Atsushi; Miyatake, Shin-ichi; Iida, Kyouko

    2006-01-01

    Malignant glioma is one of the most difficult tumors to control with conventional therapies. In our institute, we use boron neutron capture therapy (BNCT) as an adjuvant radiation therapy after surgical resection. This therapy requires the selective delivery of a high concentration of ¹⁰B to malignant tumor tissue. In this study, we focused on a tumor-targeting ¹⁰B delivery system (BDS) for BNCT that uses a transferrin-conjugated polyethylene-glycol liposome encapsulating BSH (TF-PEG liposome-BSH), and compared the tumor ¹⁰B uptake among BSH, PEG liposome-BSH and TF-PEG liposome-BSH. In vitro, we analyzed the ¹⁰B concentration of cultured human U87Δ glioma cells incubated in medium containing 20 μg ¹⁰B/ml derived from each BDS by inductively coupled plasma atomic emission spectrometry (ICP-AES). In vivo, human U87Δ glioma-bearing nude mice were administered each BDS (35 mg ¹⁰B/kg) intravenously, and we analyzed the ¹⁰B concentration of tumor, normal brain and blood by ICP-AES. The TF-PEG liposome-BSH showed a higher absolute ¹⁰B concentration than the other BDSs. Moreover, TF-PEG liposome-BSH decreased the ¹⁰B concentration in blood and normal tissue while maintaining a high ¹⁰B concentration in tumor tissue for a couple of days, demonstrating selective delivery of a high concentration of ¹⁰B to malignant tumor tissue. The TF-PEG liposome-BSH is thus a more potent BDS for BNCT than BSH and PEG liposome-BSH, achieving a high absolute ¹⁰B concentration and good contrast between tumor and normal tissue. (author)

  3. IVS contribution to the next ITRF

    Science.gov (United States)

    Bachmann, Sabine; Messerschmitt, Linda; Thaller, Daniela

    2015-04-01

    Generating the contribution of the International VLBI Service (IVS) to the next ITRF (ITRF2013 or ITRF2014) was the main task of the IVS Combination Center at the Federal Agency for Cartography and Geodesy (BKG, Germany) in 2014. Starting with ITRF2005, the IVS contribution to the ITRF has been an intra-technique combined solution using multiple individual contributions from different institutions. For the upcoming ITRF, ten international institutions submitted data files for a combined solution. The data files contain 24-hour VLBI sessions from the late 1970s until the end of 2014 in SINEX file format, holding datum-free normal equations with station coordinates and Earth Orientation Parameters (EOP). All contributions have to meet the IVS standards for ITRF contributions in order to guarantee a consistent combined solution. In the course of generating the intra-technique combined solution, station coordinate time series for each station as well as a Terrestrial Reference Frame based on the contributed VLBI data (VTRF) were generated and analyzed. Preliminary results using data until the end of 2013 show a scale factor of -0.47 ppb resulting from a 7-parameter Helmert transformation of the VTRF w.r.t. ITRF2008, which is comparable to the scale factor determined in the preceding ITRF generation. An internal comparison of the EOPs between the combined solution and the individual contributions, as well as external comparisons of the EOP series, were carried out to ensure consistent quality of the EOPs. The data analyses, the combination procedure and results of the combined solution for station coordinates and EOP will be presented.
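    The scale estimate above comes from a 7-parameter Helmert transformation (three translations, one scale, three rotations); below is a minimal small-angle sketch in Python, where the sign convention of the rotation matrix is an assumption, since conventions differ between services:

```python
import numpy as np

def helmert7(xyz, t, scale_ppb, rot_mas):
    """Apply a 7-parameter (small-angle) Helmert transformation.
    xyz: (n, 3) coordinates in metres; t: translations (m);
    scale_ppb: scale in parts per billion; rot_mas: rotations in milliarcseconds.
    Sign convention of R is an assumption (conventions vary)."""
    s = scale_ppb * 1e-9
    rx, ry, rz = np.radians(np.asarray(rot_mas) / 3.6e6)  # mas -> radians
    R = np.array([[0.0, -rz,  ry],
                  [ rz, 0.0, -rx],
                  [-ry,  rx, 0.0]])
    xyz = np.asarray(xyz, dtype=float)
    return xyz + np.asarray(t) + s * xyz + xyz @ R.T

# Example: apply a pure -0.47 ppb scale change to one point on the Earth's surface
# (a -0.47 ppb scale shifts a geocentric position by roughly 3 mm)
print(helmert7([[4075580.0, 931855.0, 4801568.0]], [0, 0, 0], -0.47, [0, 0, 0]))
```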

  4. Combining the Observations from Different GNSS (Invited)

    Science.gov (United States)

    Dach, R.; Lutz, S.; Schaer, S.; Bock, H.; Jäggi, A.; Meindl, M.; Ostini, L.; Thaller, D.; Steinbach, A.; Beutler, G.; Steigenberger, P.

    2009-12-01

    For a very long time GPS clearly dominated the use of GNSS measurements for scientific purposes. This picture is changing: we are moving from a GPS-only to a multi-GNSS world. This is reflected, e.g., by the change in the meaning of the abbreviation IGS in March 2005 from International GPS Service to International GNSS Service. The current situation can be described as follows: GPS has the leading role among the GNSS because it has provided a very stable satellite constellation over many years. Some of the currently active GPS satellites are nearly 15 years old; these old satellites are expected to be decommissioned within the next few years. On the other hand, due to the increasing number of active GLONASS satellites and the improved density of multi-GNSS tracking stations in the IGS network, the quality of the GLONASS orbits has improved drastically in recent years. The European Galileo system is under development: currently two test satellites (GIOVE-A and GIOVE-B) are in orbit, and the IOV (in-orbit validation) phase will start soon. The first test satellites for the Chinese Compass system are also in space. For maximum benefit, the observations of these GNSS will in future be processed in a combined multi-GNSS analysis. CODE (Center for Orbit Determination in Europe) is a joint venture between the Astronomical Institute of the University of Bern (AIUB, Bern, Switzerland), the Federal Office of Topography (swisstopo, Wabern, Switzerland), the Federal Agency for Cartography and Geodesy (BKG, Frankfurt am Main, Germany), and the Institut für Astronomische und Physikalische Geodäsie of the Technische Universität München (IAPG/TUM, Munich, Germany). It acts as one of the global analysis centers of the IGS and in May 2003 started a rigorously combined processing of GPS and GLONASS measurements for the final, rapid, and even ultra-rapid product lines. All contributions from CODE to the IGS are in fact multi-GNSS products; the only exception is the satellite and receiver clocks.

  5. Legionella in industrial cooling towers: monitoring and control strategies.

    Science.gov (United States)

    Carducci, A; Verani, M; Battistini, R

    2010-01-01

    Legionella contamination of industrial cooling towers has been identified as the cause of sporadic cases and outbreaks of legionellosis among people living nearby. To evaluate and control Legionella contamination in industrial cooling tower water, microbiological monitoring was carried out to determine the effectiveness of the following disinfection treatments: (i) a continuous chlorine concentration of 0.01 ppm with monthly chlorine shock dosing (5 ppm) on a single cooling tower; (ii) a continuous chlorine concentration of 0.4 ppm with monthly shocks of the biocide P3 FERROCID 8580 (BKG Water Solution) on seven towers. Legionella spp. and total bacterial count (TBC) were determined 3 days before and after each shock dose. Both strategies demonstrated that when chlorine was maintained at low levels, the Legionella count grew to levels above 10⁴ CFU l⁻¹ while the TBC remained above 10⁸ CFU l⁻¹. Chlorine shock dosing was able to eliminate bacterial contamination, but only for 10-15 days. Biocide shock dosing was also insufficient to control the problem when the disinfectant was administered at only one point in the plant and at a concentration of only 30 ppm. On the other hand, when a biocide concentration of 30 or 50 ppm was distributed across a number of points, depending on the plant hydrodynamics, Legionella counts decreased significantly and often remained below the warning limit. Moreover, the contamination of water entering the plant and the presence of sediment were also important factors for Legionella growth. For effective decontamination of outdoor industrial cooling towers, disinfectants should be distributed in a targeted way, taking into account the possible sources of contamination. The data from this research made it possible to modify the disinfection procedure so as to better reduce water and aerosol contamination and, consequently, the exposure risk.

  6. Decision analysis multicriteria analysis

    International Nuclear Information System (INIS)

    Lombard, J.

    1986-09-01

    The ALARA procedure covers a wide range of decisions, from the simplest to the most complex. For the simplest ones, engineering judgement is generally enough and the use of a decision-aiding technique is therefore not necessary. For some decisions, the available protection options may be compared on the basis of two or a few criteria (or attributes) (protection cost, collective dose, ...), and rather simple decision-aiding techniques, like Cost Effectiveness Analysis or Cost Benefit Analysis, are quite sufficient. For the more complex decisions, involving numerous criteria, large uncertainties or qualitative judgement, the use of these techniques, even extended cost-benefit analysis, is not recommended, and appropriate techniques like multi-attribute decision-aiding techniques are more relevant. There are many such techniques and it is not possible to present all of them. Therefore only two broad categories of multi-attribute decision-aiding techniques are presented here: decision analysis and outranking analysis
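
    To make the multi-attribute idea concrete, here is a minimal weighted-sum sketch in Python -- one simple member of the multi-attribute family, not one of the specific techniques reviewed in this report -- scoring hypothetical protection options on cost and collective dose:

        # Weighted-sum multi-attribute scoring (hypothetical options and weights).
        # Both criteria are "lower is better", so values are normalized so that
        # the best option on a criterion scores 1 and the worst scores 0.
        options = {
            "A": {"cost": 120.0, "dose": 40.0},
            "B": {"cost": 200.0, "dose": 15.0},
            "C": {"cost": 80.0,  "dose": 70.0},
        }
        weights = {"cost": 0.4, "dose": 0.6}   # decision maker's trade-off

        def normalized(criterion):
            vals = {name: opt[criterion] for name, opt in options.items()}
            lo, hi = min(vals.values()), max(vals.values())
            return {name: (hi - v) / (hi - lo) for name, v in vals.items()}

        norm = {c: normalized(c) for c in weights}
        scores = {name: sum(w * norm[c][name] for c, w in weights.items())
                  for name in options}
        print(scores, "-> preferred:", max(scores, key=scores.get))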

  7. Shape analysis in medical image analysis

    CERN Document Server

    Tavares, João

    2014-01-01

    This book contains thirteen contributions from invited experts of international recognition addressing important issues in shape analysis in medical image analysis, including techniques for image segmentation, registration, modelling and classification, and applications in biology, as well as in cardiac, brain, spine, chest, lung and clinical practice. This volume treats topics such as anatomic and functional shape representation and matching; shape-based medical image segmentation; shape registration; statistical shape analysis; shape deformation; shape-based abnormality detection; shape tracking and longitudinal shape analysis; machine learning for shape modeling and analysis; shape-based computer-aided diagnosis; shape-based medical navigation; and benchmarking and validation of shape representation, analysis and modeling algorithms. This work will be of interest to researchers, students, and manufacturers in the fields of artificial intelligence, bioengineering, biomechanics, computational mechanics, computationa...

  8. Theoretical numerical analysis a functional analysis framework

    CERN Document Server

    Atkinson, Kendall

    2005-01-01

    This textbook prepares graduate students for research in numerical analysis/computational mathematics by giving them a mathematical framework embedded in functional analysis and focused on numerical analysis. This helps the student move rapidly into a research program. The text covers basic results of functional analysis, approximation theory, Fourier analysis and wavelets, iteration methods for nonlinear equations, finite difference methods, Sobolev spaces and weak formulations of boundary value problems, finite element methods, elliptic variational inequalities and their numerical solu

  9. An example of multidimensional analysis: Discriminant analysis

    International Nuclear Information System (INIS)

    Lutz, P.

    1990-01-01

    Among the approaches to multidimensional data analysis, lectures on discriminant analysis, covering both theoretical and practical aspects, are presented. The discrimination problem, the analysis steps and the discrimination categories are stressed. Examples are given of descriptive historical analysis, discrimination for decision making, and the demonstration and separation of the top quark. In the linear discriminant analysis the following subjects are discussed: Huyghens' theorem, projection, the discriminant variable, geometrical interpretation, the case g=2, the classification method, and the separation of top events. Criteria allowing relevant results to be obtained are included [fr]
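
    For the g=2 case mentioned in the record, the following sketch computes a Fisher linear discriminant on synthetic two-class data -- within-class scatter, discriminant direction, and a midpoint classification rule. The data and the threshold rule are illustrative, not taken from the lectures:

        import numpy as np

        rng = np.random.default_rng(0)
        X1 = rng.normal([0.0, 0.0], 1.0, size=(100, 2))   # class 1 samples
        X2 = rng.normal([3.0, 2.0], 1.0, size=(100, 2))   # class 2 samples

        m1, m2 = X1.mean(axis=0), X2.mean(axis=0)
        Sw = np.cov(X1, rowvar=False) + np.cov(X2, rowvar=False)  # within-class scatter
        w = np.linalg.solve(Sw, m2 - m1)       # discriminant direction Sw^-1 (m2 - m1)
        threshold = w @ (m1 + m2) / 2.0        # midpoint of the projected class means

        x_new = np.array([2.5, 1.5])
        print("class", 2 if w @ x_new > threshold else 1)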

  10. Energy-Water Modeling and Analysis | Energy Analysis | NREL

    Science.gov (United States)

    NREL's energy-water modeling and analysis examines energy sector vulnerabilities arising from various factors, including water. Related pages: Renewable Electricity Generation (ReEDS Model Analysis); U.S. Energy Sector Vulnerabilities to Climate Change and Extreme Weather. Example projects: Renewable Electricity Futures Study

  11. Instrumental analysis

    International Nuclear Information System (INIS)

    Jae, Myeong Gi; Lee, Won Seong; Kim, Ha Hyeok

    1989-02-01

    This book gives a description of electronic engineering, covering circuit elements and devices, circuit analysis and digital logic circuits; electrochemical methods such as conductometry, potentiometry and current measurement; spectrochemical analysis, covering electromagnetic radiation, optical components, absorption spectroscopy, X-ray analysis and atomic absorption spectrometry, with references; chromatography, including gas chromatography and liquid chromatography; and automated analysis, covering the evaluation of automated analysis control systems and automated analysis systems, with references.

  12. Instrumental analysis

    Energy Technology Data Exchange (ETDEWEB)

    Jae, Myeong Gi; Lee, Won Seong; Kim, Ha Hyeok

    1989-02-15

    This book gives a description of electronic engineering, covering circuit elements and devices, circuit analysis and digital logic circuits; electrochemical methods such as conductometry, potentiometry and current measurement; spectrochemical analysis, covering electromagnetic radiation, optical components, absorption spectroscopy, X-ray analysis and atomic absorption spectrometry, with references; chromatography, including gas chromatography and liquid chromatography; and automated analysis, covering the evaluation of automated analysis control systems and automated analysis systems, with references.

  13. Analysis of Project Finance | Energy Analysis | NREL

    Science.gov (United States)

    NREL analysis helps potential renewable energy developers and investors gain insights into the complex world of project finance. Renewable energy project finance is complex, requiring knowledge of federal tax credits, state-level incentives, renewable

  14. Performance analysis

    International Nuclear Information System (INIS)

    2008-05-01

    This book introduces the energy and resource technology development business through performance analysis, covering the business division and definition, an analysis of the current state of support, the substance of the national basic plan for energy and resource technology development, the selection of analysis indices, the performance analysis results by index, the results of the performance survey, and the analysis and appraisal of the energy and resource technology development business in 2007.

  15. Trial Sequential Analysis in systematic reviews with meta-analysis

    Directory of Open Access Journals (Sweden)

    Jørn Wetterslev

    2017-03-01

    Full Text Available Abstract Background Most meta-analyses in systematic reviews, including Cochrane ones, do not have sufficient statistical power to detect or refute even large intervention effects. This is why a meta-analysis ought to be regarded as an interim analysis on its way towards a required information size. The results of meta-analyses should relate the total number of randomised participants to the estimated required meta-analytic information size, accounting for statistical diversity. When the number of participants and the corresponding number of trials in a meta-analysis are insufficient, the use of the traditional 95% confidence interval or the 5% statistical significance threshold will lead to too many false positive conclusions (type I errors) and too many false negative conclusions (type II errors). Methods We developed a methodology for interpreting meta-analysis results, using generally accepted, valid evidence on how to adjust thresholds for significance in randomised clinical trials when the required sample size has not been reached. Results The Lan-DeMets trial sequential monitoring boundaries in Trial Sequential Analysis offer adjusted confidence intervals and restricted thresholds for statistical significance when the diversity-adjusted required information size and the corresponding number of required trials for the meta-analysis have not been reached. Trial Sequential Analysis provides a frequentist approach to control both type I and type II errors. We define the required information size and the corresponding number of required trials in a meta-analysis, and the diversity (D²) measure of heterogeneity. We explain the reasons for using Trial Sequential Analysis of meta-analysis when the actual information size fails to reach the required information size. We present examples drawn from traditional meta-analyses using unadjusted naïve 95% confidence intervals and 5% thresholds for statistical significance. Spurious conclusions in
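
    To illustrate the boundary idea, the sketch below evaluates an O'Brien-Fleming-type alpha-spending function at a few information fractions, together with the classical z-boundary approximation. Exact Lan-DeMets boundaries, as used in Trial Sequential Analysis software, additionally require recursive numerical integration over the joint distribution of the interim statistics, so this is a simplified sketch only:

        from scipy.stats import norm

        alpha = 0.05
        z_half = norm.ppf(1 - alpha / 2)            # 1.96 for two-sided alpha = 0.05

        for t in (0.25, 0.50, 0.75, 1.00):          # fraction of required information size
            spent = 2 * (1 - norm.cdf(z_half / t ** 0.5))   # cumulative alpha spent by t
            boundary = z_half / t ** 0.5                    # approximate OBF z-boundary
            print(f"t={t:.2f}  alpha_spent={spent:.4f}  z_boundary={boundary:.2f}")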

  16. Instrumental analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Seung Jae; Seo, Seong Gyu

    1995-03-15

    This textbook deals with instrumental analysis and consists of nine chapters: an introduction to analytical chemistry, the process of analysis and the types and forms of analysis; electrochemistry, covering basic theory, potentiometry and conductometry; electromagnetic radiation and optical components, with an introduction and applications; ultraviolet and visible spectrophotometry; atomic absorption spectrophotometry, with an introduction, flame emission spectrometry and plasma emission spectrometry; and further chapters on infrared spectrophotometry, X-ray spectrophotometry and mass spectrometry, chromatography, and other instrumental methods such as radiochemistry.

  17. Instrumental analysis

    International Nuclear Information System (INIS)

    Kim, Seung Jae; Seo, Seong Gyu

    1995-03-01

    This textbook deals with instrumental analysis and consists of nine chapters: an introduction to analytical chemistry, the process of analysis and the types and forms of analysis; electrochemistry, covering basic theory, potentiometry and conductometry; electromagnetic radiation and optical components, with an introduction and applications; ultraviolet and visible spectrophotometry; atomic absorption spectrophotometry, with an introduction, flame emission spectrometry and plasma emission spectrometry; and further chapters on infrared spectrophotometry, X-ray spectrophotometry and mass spectrometry, chromatography, and other instrumental methods such as radiochemistry.

  18. Improving multi-GNSS ultra-rapid orbit determination for real-time precise point positioning

    Science.gov (United States)

    Li, Xingxing; Chen, Xinghan; Ge, Maorong; Schuh, Harald

    2018-03-01

    Currently, with the rapid development of multi-constellation Global Navigation Satellite Systems (GNSS), real-time positioning and navigation are undergoing dramatic changes, with the potential for better performance. Providing more precise and reliable ultra-rapid orbits is critical for multi-GNSS real-time positioning, especially for the three emerging constellations Beidou, Galileo and QZSS, which are still under construction. In this contribution, we present a five-system precise orbit determination (POD) strategy to fully exploit the GPS + GLONASS + BDS + Galileo + QZSS observations from the CDDIS + IGN + BKG archives for the realization of an hourly five-constellation ultra-rapid orbit update. After adopting the optimized 2-day POD solution (updated every hour), the predicted orbit accuracy is noticeably improved for all five satellite systems in comparison to the conventional 1-day POD solution (updated every 3 h). The orbit accuracy for the BDS IGSO satellites improves by about 80, 45 and 50% in the radial, cross and along directions, respectively, while the corresponding improvement for the BDS MEO satellites reaches about 50, 20 and 50% in the three directions. Furthermore, multi-GNSS real-time precise point positioning (PPP) ambiguity resolution has been performed using the improved precise satellite orbits. Numerous results indicate that combined GPS + BDS + GLONASS + Galileo (GCRE) kinematic PPP ambiguity resolution (AR) solutions achieve the shortest time to first fix (TTFF) and the highest positioning accuracy in all coordinate components. With the addition of BDS, GLONASS and Galileo observations to GPS-only processing, the GCRE PPP AR solution achieves the shortest average TTFF of 11 min with a 7° cutoff elevation, while the TTFF of the GPS-only, GR, GE and GC PPP AR solutions is 28, 15, 20 and 17 min, respectively. As the cutoff elevation increases, the reliability and accuracy of GPS-only PPP AR solutions

  19. Functional analysis

    CERN Document Server

    Kantorovich, L V

    1982-01-01

    Functional Analysis examines trends in functional analysis as a mathematical discipline and the ever-increasing role played by its techniques in applications. The theory of topological vector spaces is emphasized, along with the applications of functional analysis to applied analysis. Some topics of functional analysis connected with applications to mathematical economics and control theory are also discussed. Comprised of 18 chapters, this book begins with an introduction to the elements of the theory of topological spaces, the theory of metric spaces, and the theory of abstract measure space

  20. Activation analysis in food analysis. Pt. 9

    International Nuclear Information System (INIS)

    Szabo, S.A.

    1992-01-01

    An overview is presented of the application of activation analysis (AA) techniques to food analysis, as reflected at a recent international conference titled Activation Analysis and its Applications. The most popular analytical techniques include instrumental neutron AA (INAA or NAA), radiochemical NAA (RNAA), X-ray fluorescence analysis and mass spectrometry. Data are presented for the multielemental NAA of instant soups, for the elemental composition of drinking water in Iraq, for the Na, K and Mn contents of various Indian rices, for As, Hg, Sb and Se determination in various seafoods, for daily microelement intake in China, and for the elemental composition of Chinese teas. Expected development trends in AA are outlined. (R.P.) 24 refs.; 8 tabs

  1. Cross-impacts analysis development and energy policy analysis applications

    Energy Technology Data Exchange (ETDEWEB)

    Roop, J.M.; Scheer, R.M.; Stacey, G.S.

    1986-12-01

    The purpose of this report is to describe the cross-impact analysis process and the microcomputer software developed for the Office of Policy, Planning, and Analysis (PPA) of DOE. First introduced in 1968, cross-impact analysis is a technique that produces scenarios of future conditions and possibilities. Cross-impact analysis has several unique attributes that make it a tool worth examining, especially in the current climate when the outlook for the economy and several of the key energy markets is uncertain. Cross-impact analysis complements the econometric, engineering, systems dynamics, or trend approaches already in use at DOE. Cross-impact analysis produces self-consistent scenarios in the broadest sense and can include interaction between the economy, technology, society and the environment. Energy policy analyses that couple broad scenarios of the future with detailed forecasting can produce more powerful results than scenario analysis or forecasts can produce alone.

  2. DTI analysis methods : Voxel-based analysis

    NARCIS (Netherlands)

    Van Hecke, Wim; Leemans, Alexander; Emsell, Louise

    2016-01-01

    Voxel-based analysis (VBA) of diffusion tensor imaging (DTI) data permits the investigation of voxel-wise differences or changes in DTI metrics in every voxel of a brain dataset. It is applied primarily in the exploratory analysis of hypothesized group-level alterations in DTI parameters, as it does

  3. Image analysis

    International Nuclear Information System (INIS)

    Berman, M.; Bischof, L.M.; Breen, E.J.; Peden, G.M.

    1994-01-01

    This paper provides an overview of modern image analysis techniques pertinent to materials science. The usual approach in image analysis contains two basic steps: first, the image is segmented into its constituent components (e.g. individual grains), and second, measurement and quantitative analysis are performed. Usually, the segmentation part of the process is the harder of the two. Consequently, much of the paper concentrates on this aspect, reviewing both fundamental segmentation tools (commonly found in commercial image analysis packages) and more advanced segmentation tools. There is also a review of the most widely used quantitative analysis methods for measuring the size, shape and spatial arrangements of objects. Many of the segmentation and analysis methods are demonstrated using complex real-world examples. Finally, there is a discussion of hardware and software issues. 42 refs., 17 figs
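
    As a plain illustration of the two-step pipeline described above (segmentation first, then measurement), this Python sketch thresholds a smoothed synthetic image, labels the connected components, and measures their areas. The image, threshold and SciPy calls are illustrative choices, not the paper's toolchain:

        import numpy as np
        from scipy import ndimage

        rng = np.random.default_rng(1)
        img = rng.random((128, 128))                 # noisy background
        img[20:50, 20:50] += 1.0                     # two bright "grains"
        img[70:110, 60:100] += 1.0
        img = ndimage.gaussian_filter(img, sigma=2)  # denoise before segmenting

        mask = img > 1.0                             # step 1: segmentation by threshold
        labels, n = ndimage.label(mask)              # individual connected objects
        areas = ndimage.sum(mask, labels, index=range(1, n + 1))  # step 2: measurement
        print(n, "objects; areas in pixels:", areas)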

  4. Common pitfalls in statistical analysis: Linear regression analysis

    Directory of Open Access Journals (Sweden)

    Rakesh Aggarwal

    2017-01-01

    Full Text Available In a previous article in this series, we explained correlation analysis which describes the strength of relationship between two continuous variables. In this article, we deal with linear regression analysis which predicts the value of one continuous variable from another. We also discuss the assumptions and pitfalls associated with this analysis.
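
    As a minimal companion to this record, the sketch below fits an ordinary least-squares line to made-up data and reports the slope, intercept and R-squared; the data are purely illustrative:

        import numpy as np

        x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
        y = np.array([2.1, 3.9, 6.2, 8.1, 9.8, 12.2])

        slope, intercept = np.polyfit(x, y, deg=1)   # least-squares line
        y_hat = slope * x + intercept
        r2 = 1 - ((y - y_hat) ** 2).sum() / ((y - y.mean()) ** 2).sum()
        print(f"y = {slope:.2f} x + {intercept:.2f},  R^2 = {r2:.3f}")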

  5. CSF analysis

    Science.gov (United States)

    Cerebrospinal fluid analysis ... Analysis of CSF can help detect certain conditions and diseases. All of the following can be, but ... An abnormal CSF analysis result may be due to many different causes, ... Encephalitis (such as West Nile and Eastern Equine) Hepatic ...

  6. Semen analysis

    Science.gov (United States)

    Semen analysis measures the amount and quality of a man's semen and sperm. Semen is ...

  7. Models of Economic Analysis

    OpenAIRE

    Adrian Ioana; Tiberiu Socaciu

    2013-01-01

    The article presents specific aspects of management and models for economic analysis. Thus, we present the main types of economic analysis: statistical analysis, dynamic analysis, static analysis, mathematical analysis, psychological analysis. Also we present the main object of the analysis: the technological activity analysis of a company, the analysis of the production costs, the economic activity analysis of a company, the analysis of equipment, the analysis of labor productivity, the anal...

  8. Limestone rocks analysis by X-ray fluorescence analysis

    International Nuclear Information System (INIS)

    Izquierdo M, G.; Ponce R, R.; Vazquez J, J.

    1996-01-01

    At the request of a private company, a fast and accurate method was established for the analysis of the major elements in limestone rocks, based essentially on X-ray fluorescence analysis (XRF). As complementary analyses, chlorides were determined by ion chromatography and sodium by atomic absorption. Loss on ignition and alpha quartz were determined gravimetrically. (Author)

  9. Sensitivity analysis and related analysis : A survey of statistical techniques

    NARCIS (Netherlands)

    Kleijnen, J.P.C.

    1995-01-01

    This paper reviews the state of the art in five related types of analysis, namely (i) sensitivity or what-if analysis, (ii) uncertainty or risk analysis, (iii) screening, (iv) validation, and (v) optimization. The main question is: when should which type of analysis be applied; which statistical

  10. Emergy-Based Regional Socio-Economic Metabolism Analysis: An Application of Data Envelopment Analysis and Decomposition Analysis

    OpenAIRE

    Zilong Zhang; Xingpeng Chen; Peter Heck

    2014-01-01

    Integrated analysis on socio-economic metabolism could provide a basis for understanding and optimizing regional sustainability. The paper conducted socio-economic metabolism analysis by means of the emergy accounting method coupled with data envelopment analysis and decomposition analysis techniques to assess the sustainability of Qingyang city and its eight sub-region system, as well as to identify the major driving factors of performance change during 2000–2007, to serve as the basis for f...

  11. failure analysis of a uav flight control system using markov analysis

    African Journals Online (AJOL)

    eobe

    2016-01-01

    Jan 1, 2016 ... Fault Tree Analysis (FTA), Dependence Diagram Analysis (DDA) and Markov Analysis (MA) are the most widely-used methods of probabilistic safety and reliability analysis for airborne systems [1]. Fault tree analysis is a backward failure-searching ... [4] Christopher Dabrowski and Fern Hunt, Markov Chain.

  12. Incidents analysis

    International Nuclear Information System (INIS)

    Francois, P.

    1996-01-01

    We undertook a study programme at the end of 1991. To start with, we performed some exploratory studies aimed at learning some preliminary lessons about this type of analysis: assessment of the interest of probabilistic incident analysis; the possibility of using PSA scenarios; and the skills and resources required. At the same time, EPN created a working group whose assignment was to define a new approach for the analysis of incidents at NPPs. This working group gave thought to both aspects of operating feedback that EPN wished to improve: analysis of significant incidents, and analysis of potential consequences. We took part in the work of this group and, for the second aspect, we proposed a method based on an adaptation of the event-tree method in order to establish a link between existing PSA models and actual incidents. Since PSAs provide an exhaustive database of accident scenarios applicable to the two most common types of units in France, they are obviously of interest for this sort of analysis. With this method we performed some incident analyses, and at the same time explored some methods employed abroad, particularly ASP (Accident Sequence Precursor, a method used by the NRC). Early in 1994 EDF began a systematic analysis programme; its first, transient phase will set up methods and an organizational structure. 7 figs

  13. Incidents analysis

    Energy Technology Data Exchange (ETDEWEB)

    Francois, P

    1997-12-31

    We undertook a study programme at the end of 1991. To start with, we performed some exploratory studies aimed at learning some preliminary lessons about this type of analysis: assessment of the interest of probabilistic incident analysis; the possibility of using PSA scenarios; and the skills and resources required. At the same time, EPN created a working group whose assignment was to define a new approach for the analysis of incidents at NPPs. This working group gave thought to both aspects of operating feedback that EPN wished to improve: analysis of significant incidents, and analysis of potential consequences. We took part in the work of this group and, for the second aspect, we proposed a method based on an adaptation of the event-tree method in order to establish a link between existing PSA models and actual incidents. Since PSAs provide an exhaustive database of accident scenarios applicable to the two most common types of units in France, they are obviously of interest for this sort of analysis. With this method we performed some incident analyses, and at the same time explored some methods employed abroad, particularly ASP (Accident Sequence Precursor, a method used by the NRC). Early in 1994 EDF began a systematic analysis programme; its first, transient phase will set up methods and an organizational structure. 7 figs.

  14. RELIABILITY ANALYSIS OF BENDING ...

    African Journals Online (AJOL)

    eobe

    Reliability analysis of the safety levels of the criteria slabs has been ... It was also noted [2] that if the risk level (β < 3.1), the ... reliability analysis. A study [6] has shown that all geometric variables ... Germany, 1988. 12. Hasofer, A. M. and ...

  15. Safety analysis fundamentals

    International Nuclear Information System (INIS)

    Wright, A.C.D.

    2002-01-01

    This paper discusses the safety analysis fundamentals in reactor design. The study includes safety analysis performed to show that the consequences of postulated accidents are acceptable. Safety analysis is also used to set the design of special safety systems, and it includes design-assist analysis to support conceptual design. Safety analysis is necessary for licensing a reactor, maintaining an operating license, and supporting changes in plant operations

  16. Visual physics analysis-from desktop to physics analysis at your fingertips

    International Nuclear Information System (INIS)

    Bretz, H-P; Erdmann, M; Fischer, R; Hinzmann, A; Klingebiel, D; Komm, M; Lingemann, J; Rieger, M; Müller, G; Steggemann, J; Winchen, T

    2012-01-01

    Visual Physics Analysis (VISPA) is an analysis environment with applications in high energy and astroparticle physics. Based on a data-flow-driven paradigm, it allows users to combine graphical steering with self-written C++ and Python modules. This contribution presents new concepts integrated in VISPA: layers, convenient analysis execution, and web-based physics analysis. While convenient execution offers full flexibility to vary settings for the execution phase of an analysis, layers allow users to create different views of the analysis already during its design phase. Thus, one application of layers is to define different stages of an analysis (e.g. event selection and statistical analysis). There are, however, other use cases, such as independently optimizing settings for different types of input data in order to guide all data through the same analysis flow. The new execution feature makes job submission to local clusters as well as the LHC Computing Grid possible directly from VISPA. Web-based physics analysis is realized in the VISPA-Web project, which represents a whole new way to design and execute analyses via a standard web browser.

  17. Experimental modal analysis

    Energy Technology Data Exchange (ETDEWEB)

    Ibsen, Lars Bo; Liingaard, M.

    2006-12-15

    This technical report concerns the basic theory and principles for experimental modal analysis. The sections within the report are: Output-only modal analysis software, general digital analysis, basics of structural dynamics and modal analysis and system identification. (au)

  18. Analysis of Precision of Activation Analysis Method

    DEFF Research Database (Denmark)

    Heydorn, Kaj; Nørgaard, K.

    1973-01-01

    The precision of an activation-analysis method prescribes the estimation of the precision of a single analytical result. The adequacy of these estimates to account for the observed variation between duplicate results from the analysis of different samples and materials, is tested by the statistic T...

  19. Urban energy consumption: Different insights from energy flow analysis, input–output analysis and ecological network analysis

    International Nuclear Information System (INIS)

    Chen, Shaoqing; Chen, Bin

    2015-01-01

    Highlights: • Urban energy consumption was assessed from three different perspectives. • A new concept called controlled energy was developed from network analysis. • Embodied energy and controlled energy consumption of Beijing were compared. • The integration of all three perspectives will elucidate sustainable energy use. - Abstract: Energy consumption has always been a central issue for sustainable urban assessment and planning. Different forms of energy analysis can provide various insights for energy policy making. This paper brought together three approaches for energy consumption accounting, i.e., energy flow analysis (EFA), input–output analysis (IOA) and ecological network analysis (ENA), and compared their different perspectives and the policy implications for urban energy use. Beijing was used to exemplify the different energy analysis processes, and the 42 economic sectors of the city were aggregated into seven components. It was determined that EFA quantifies both the primary and final energy consumption of the urban components by tracking the different types of fuel used by the urban economy. IOA accounts for the embodied energy consumption (direct and indirect) used to produce goods and services in the city, whereas the control analysis of ENA quantifies the specific embodied energy that is regulated by the activities within the city’s boundary. The network control analysis can also be applied to determining which economic sectors drive the energy consumption and to what extent these sectors are dependent on each other for energy. So-called “controlled energy” is a new concept that adds to the analysis of urban energy consumption, indicating the adjustable energy consumed by sectors. The integration of insights from all three accounting perspectives furthers our understanding of sustainable energy use in cities
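
    To make the IOA step concrete, here is a minimal sketch of how embodied (direct plus indirect) energy intensities are commonly derived from the Leontief inverse; the three-sector coefficients and energy figures are invented for illustration and are not the Beijing data analysed in the paper:

        import numpy as np

        A = np.array([[0.10, 0.05, 0.02],     # technical coefficients: input from
                      [0.20, 0.15, 0.10],     # sector i per unit output of sector j
                      [0.05, 0.10, 0.08]])
        e_direct = np.array([5.0, 2.0, 1.0])  # direct energy use per unit output

        L = np.linalg.inv(np.eye(3) - A)      # Leontief inverse (I - A)^-1
        e_embodied = e_direct @ L             # total (direct + indirect) intensities

        final_demand = np.array([100.0, 80.0, 150.0])
        print("embodied intensities:", e_embodied)
        print("embodied energy of final demand:", e_embodied @ final_demand)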

  20. CADDIS Volume 4. Data Analysis: Exploratory Data Analysis

    Science.gov (United States)

    Intro to exploratory data analysis. Overview of variable distributions, scatter plots, correlation analysis, GIS datasets. Use of conditional probability to examine stressor levels and impairment. Exploring correlations among multiple stressors.

  1. Social network analysis applied to team sports analysis

    CERN Document Server

    Clemente, Filipe Manuel; Mendes, Rui Sousa

    2016-01-01

    Explaining how graph theory and social network analysis can be applied to team sports analysis, this book presents useful approaches, models and methods that can be used to characterise the overall properties of team networks and identify the prominence of each team player. Exploring the different network metrics that can be utilised in sports analysis, their possible applications and their variance from situation to situation, the respective chapters present an array of illustrative case studies. Identifying the general concepts of social network analysis and network centrality metrics, readers are shown how to generate a methodological protocol for data collection. As such, the book provides a valuable resource for students of the sport sciences, sports engineering, applied computation and the social sciences.
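
    As a small illustration of the network centrality metrics discussed in the book, this sketch builds a toy directed passing network and computes two standard centralities with the networkx library; the players and pass counts are hypothetical, and this is not code from the book:

        import networkx as nx

        passes = [("GK", "DF", 12), ("DF", "MF", 30), ("MF", "FW", 18),
                  ("DF", "FW", 5), ("MF", "GK", 7), ("FW", "MF", 10)]
        G = nx.DiGraph()
        G.add_weighted_edges_from(passes)      # weight = number of passes

        print("degree centrality:", nx.degree_centrality(G))
        # Unweighted betweenness; with weights one would first convert pass
        # counts to distances (e.g. 1/weight), since networkx treats weights as costs.
        print("betweenness centrality:", nx.betweenness_centrality(G))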

  2. Qualitative Content Analysis

    OpenAIRE

    Philipp Mayring

    2000-01-01

    The article describes an approach to systematic, rule-guided qualitative text analysis, which tries to preserve some methodological strengths of quantitative content analysis and widen them into a concept of qualitative procedure. First, the development of content analysis is delineated and the basic principles are explained (units of analysis, step models, working with categories, validity and reliability). Then the central procedures of qualitative content analysis, inductive development of ca...

  3. Quantitative analysis chemistry

    International Nuclear Information System (INIS)

    Ko, Wansuk; Lee, Choongyoung; Jun, Kwangsik; Hwang, Taeksung

    1995-02-01

    This book is about quantitative analytical chemistry. It is divided into ten chapters, which deal with the basic concepts of matter and the meaning of analytical chemistry, together with SI units; chemical equilibrium; basic preparations for quantitative analysis; an introduction to volumetric analysis; acid-base titration, with an outline and example experiments; chelate titration; oxidation-reduction titration, with an introduction, titration curves and diazotization titration; precipitation titration; electrometric titration; and quantitative analysis.

  4. Information security risk analysis

    CERN Document Server

    Peltier, Thomas R

    2001-01-01

    Effective Risk Analysis; Qualitative Risk Analysis; Value Analysis; Other Qualitative Methods; Facilitated Risk Analysis Process (FRAP); Other Uses of Qualitative Risk Analysis; Case Study; Appendix A: Questionnaire; Appendix B: Facilitated Risk Analysis Process Forms; Appendix C: Business Impact Analysis Forms; Appendix D: Sample of Report; Appendix E: Threat Definitions; Appendix F: Other Risk Analysis Opinions; Index

  5. Standardizing Activation Analysis: New Software for Photon Activation Analysis

    Science.gov (United States)

    Sun, Z. J.; Wells, D.; Segebade, C.; Green, J.

    2011-06-01

    Photon Activation Analysis (PAA) of environmental, archaeological and industrial samples requires extensive data analysis that is susceptible to error. For the purpose of saving time and manpower and minimizing error, a computer program was designed, built and implemented using SQL, Access 2007 and asp.net technology to automate this process. Based on the peak information of the spectrum, and assisted by its PAA library, the program automatically identifies elements in the samples and calculates their concentrations and respective uncertainties. The software can also be operated in browser/server mode, which makes it possible to use it anywhere the internet is accessible. By switching the underlying nuclide library and the related formulae, the new software can easily be expanded to neutron activation analysis (NAA), charged particle activation analysis (CPAA) or proton-induced X-ray emission (PIXE). Implementation of this would standardize the analysis of nuclear activation data. Results from this software were compared to standard PAA analysis with excellent agreement. With minimal input from the user, the software has proven to be fast, user-friendly and reliable.

  6. Standardizing Activation Analysis: New Software for Photon Activation Analysis

    International Nuclear Information System (INIS)

    Sun, Z. J.; Wells, D.; Green, J.; Segebade, C.

    2011-01-01

    Photon Activation Analysis (PAA) of environmental, archaeological and industrial samples requires extensive data analysis that is susceptible to error. For the purpose of saving time and manpower and minimizing error, a computer program was designed, built and implemented using SQL, Access 2007 and asp.net technology to automate this process. Based on the peak information of the spectrum, and assisted by its PAA library, the program automatically identifies elements in the samples and calculates their concentrations and respective uncertainties. The software can also be operated in browser/server mode, which makes it possible to use it anywhere the internet is accessible. By switching the underlying nuclide library and the related formulae, the new software can easily be expanded to neutron activation analysis (NAA), charged particle activation analysis (CPAA) or proton-induced X-ray emission (PIXE). Implementation of this would standardize the analysis of nuclear activation data. Results from this software were compared to standard PAA analysis with excellent agreement. With minimal input from the user, the software has proven to be fast, user-friendly and reliable.

  7. Radioactivation analysis

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1959-07-15

    Radioactivation analysis is the technique of analysing the constituents of a very small sample of matter by making the sample artificially radioactive. The first stage is to make the sample radioactive by artificial means, e.g. by subjecting it to neutron bombardment. Once the sample has been activated, or made radioactive, the next task is to analyze the radiations given off by the sample. This analysis indicates the nature and quantities of the various elements present in the sample, because the radiation from a particular radioisotope is characteristic of that isotope. In 1959 a symposium on 'Radioactivation Analysis' was organized in Vienna by the IAEA and the Joint Commission on Applied Radioactivity (ICSU). It was pointed out that there are certain factors creating uncertainties, and it was elaborated how to overcome them. Attention was drawn to the fact that radioactivation analysis had proven a powerful tool for tackling fundamental problems in geo- and cosmochemistry, and a review was given of the recent work in this field. Because of its extreme sensitivity, radioactivation analysis had been principally employed for trace detection, and its most extensive use has been in the control of semiconductors and very pure metals. An account of the experience gained in the USA was given, where radioactivation analysis was being used by many investigators in various scientific fields as a practical and useful tool for elemental analyses. Much of this work had been concerned with determining sub-microgramme and microgramme concentrations of many different elements in samples of biological materials, drugs, fertilizers, fine chemicals, foods, fuels, glass, ceramic materials, metals, minerals, paints, petroleum products, resinous materials, soils, toxicants, water and other materials. In addition to these studies, radioactivation analysis had been used by other investigators to determine isotopic ratios of the stable isotopes of some of the elements. Another paper dealt with radioactivation

  8. Radioactivation analysis

    International Nuclear Information System (INIS)

    1959-01-01

    Radioactivation analysis is the technique of analysing the constituents of a very small sample of matter by making the sample artificially radioactive. The first stage is to make the sample radioactive by artificial means, e.g. by subjecting it to neutron bombardment. Once the sample has been activated, or made radioactive, the next task is to analyze the radiations given off by the sample. This analysis indicates the nature and quantities of the various elements present in the sample, because the radiation from a particular radioisotope is characteristic of that isotope. In 1959 a symposium on 'Radioactivation Analysis' was organized in Vienna by the IAEA and the Joint Commission on Applied Radioactivity (ICSU). It was pointed out that there are certain factors creating uncertainties, and it was elaborated how to overcome them. Attention was drawn to the fact that radioactivation analysis had proven a powerful tool for tackling fundamental problems in geo- and cosmochemistry, and a review was given of the recent work in this field. Because of its extreme sensitivity, radioactivation analysis had been principally employed for trace detection, and its most extensive use has been in the control of semiconductors and very pure metals. An account of the experience gained in the USA was given, where radioactivation analysis was being used by many investigators in various scientific fields as a practical and useful tool for elemental analyses. Much of this work had been concerned with determining sub-microgramme and microgramme concentrations of many different elements in samples of biological materials, drugs, fertilizers, fine chemicals, foods, fuels, glass, ceramic materials, metals, minerals, paints, petroleum products, resinous materials, soils, toxicants, water and other materials. In addition to these studies, radioactivation analysis had been used by other investigators to determine isotopic ratios of the stable isotopes of some of the elements. Another paper dealt with radioactivation

  9. Uranium Isotopic Analysis with the FRAM Isotopic Analysis Code

    International Nuclear Information System (INIS)

    Vo, D.T.; Sampson, T.E.

    1999-01-01

    FRAM is the acronym for Fixed-Energy Response-Function Analysis with Multiple efficiency. This software was developed at Los Alamos National Laboratory originally for plutonium isotopic analysis. Later, it was adapted for uranium isotopic analysis in addition to plutonium. It is a code based on a self-calibration using several gamma-ray peaks for determining the isotopic ratios. The versatile-parameter database structure governs all facets of the data analysis. User editing of the parameter sets allows great flexibility in handling data with different isotopic distributions, interfering isotopes, and different acquisition parameters such as energy calibration and detector type

  10. Real analysis a comprehensive course in analysis, part 1

    CERN Document Server

    Simon, Barry

    2015-01-01

    A Comprehensive Course in Analysis by Poincaré Prize winner Barry Simon is a five-volume set that can serve as a graduate-level analysis textbook with a lot of additional bonus information, including hundreds of problems and numerous notes that extend the text and provide important historical background. Depth and breadth of exposition make this set a valuable reference source for almost all areas of classical analysis. Part 1 is devoted to real analysis. From one point of view, it presents the infinitesimal calculus of the twentieth century with the ultimate integral calculus (measure theory)

  11. Text analysis methods, text analysis apparatuses, and articles of manufacture

    Science.gov (United States)

    Whitney, Paul D; Willse, Alan R; Lopresti, Charles A; White, Amanda M

    2014-10-28

    Text analysis methods, text analysis apparatuses, and articles of manufacture are described according to some aspects. In one aspect, a text analysis method includes accessing information indicative of data content of a collection of text comprising a plurality of different topics, using a computing device, analyzing the information indicative of the data content, and using results of the analysis, identifying a presence of a new topic in the collection of text.

  12. NCEP SST Analysis

    Science.gov (United States)

    MMAB SST (RTG_SST_HR) analysis. For a regional map, click the desired area in the global SST analysis and anomaly maps

  13. Analysis of Heat Transfer

    International Nuclear Information System (INIS)

    2003-08-01

    This book deals with the analysis of heat transfer, including nonlinear analysis examples, radiation heat transfer, analysis of heat transfer in ANSYS, verification of analysis results, analysis of transient heat transfer with automatic time stepping and open control, analysis of heat transfer using arrangements in ANSYS, thermal contact resistance, coupled-field analysis such as thermal-structural interaction, cases of coupled-field analysis, and phase change.

  14. Combining Conversation Analysis and Nexus Analysis to explore hospital practices

    DEFF Research Database (Denmark)

    Paasch, Bettina Sletten

    ... ethnographic observations, interviews, photos and documents were obtained. Inspired by the analytical manoeuvre of zooming in and zooming out proposed by Nicolini (Nicolini, 2009; Nicolini, 2013), the present study uses Conversation Analysis (Sacks, Schegloff, & Jefferson, 1974) and Embodied Interaction ... of interaction. In the interviews conducted, nurses report that mobile work phones disturb interactions with patients when they ring; however, analysing the recorded interactions with tools from Conversation Analysis and Embodied Interaction Analysis shows how nurses demonstrate sophisticated awareness ... interrelationships influencing it. The present study thus showcases how Conversation Analysis and Nexus Analysis can be combined to achieve a multi-layered perspective on interactions between nurses, patients and mobile work phones.

  15. http Log Analysis

    DEFF Research Database (Denmark)

    Bøving, Kristian Billeskov; Simonsen, Jesper

    2004-01-01

    This article documents how log analysis can inform qualitative studies concerning the usage of web-based information systems (WIS). No prior research has used http log files as data to study collaboration between multiple users in organisational settings. We investigate how to perform http log analysis; what http log analysis says about the nature of collaborative WIS use; and how results from http log analysis may support other data collection methods such as surveys, interviews, and observation. The analysis of log files initially lends itself to research designs which serve to test hypotheses using a quantitative methodology. We show that http log analysis can also be valuable in qualitative research such as case studies. The results from http log analysis can be triangulated with other data sources and for example serve as a means of supporting the interpretation of interview data...
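
    A minimal sketch of the quantitative side of http log analysis: parsing a server log in Common Log Format and counting requests per (authenticated) user, the kind of summary that can then be triangulated with interviews and observation. The file name, regex and aggregation are illustrative assumptions, not the article's actual procedure:

        import re
        from collections import Counter

        # host ident authuser [date] "method path protocol" status ...
        LOG_LINE = re.compile(r'^(\S+) \S+ (\S+) \[([^\]]+)\] "(\S+) (\S+)[^"]*" (\d{3})')

        counts = Counter()
        with open("access.log") as fh:          # illustrative file name
            for line in fh:
                m = LOG_LINE.match(line)
                if m:
                    host, user, ts, method, path, status = m.groups()
                    counts[user if user != "-" else host] += 1

        for who, n in counts.most_common(10):
            print(who, n)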

  16. Importance-performance analysis based SWOT analysis

    OpenAIRE

    Phadermrod, Boonyarat; Crowder, Richard M.; Wills, Gary B.

    2016-01-01

    SWOT analysis, a commonly used tool for strategic planning, is traditionally a form of brainstorming. Hence, it has been criticised that it is likely to hold subjective views of the individuals who participate in a brainstorming session and that SWOT factors are not prioritized by their significance thus it may result in an improper strategic action. While most studies of SWOT analysis have only focused on solving these shortcomings separately, this study offers an approach to diminish both s...

  17. Neutron activation analysis of high-purity iron in comparison with chemical analysis

    International Nuclear Information System (INIS)

    Kinomura, Atsushi; Horino, Yuji; Takaki, Seiichi; Abiko, Kenji

    2000-01-01

    Neutron activation analysis of iron samples of three different purity levels has been performed and compared with chemical analysis for 30 metallic and metalloid impurity elements. The concentration of As, Cl, Cu, Sb and V detected by neutron activation analysis was mostly in agreement with that obtained by chemical analysis. The sensitivity limits of neutron activation analysis of three kinds of iron samples were calculated and found to be reasonable compared with measured values or detection limits of chemical analysis; however, most of them were above the detection limits of chemical analysis. Graphite-shielded irradiation to suppress fast neutron reactions was effective for Mn analysis without decreasing sensitivity to the other impurity elements. (author)

  18. Sensitivity analysis for matched pair analysis of binary data: From worst case to average case analysis.

    Science.gov (United States)

    Hasegawa, Raiden; Small, Dylan

    2017-12-01

    In matched observational studies where treatment assignment is not randomized, sensitivity analysis helps investigators determine how sensitive their estimated treatment effect is to some unmeasured confounder. The standard approach calibrates the sensitivity analysis according to the worst case bias in a pair. This approach will result in a conservative sensitivity analysis if the worst case bias does not hold in every pair. In this paper, we show that for binary data, the standard approach can be calibrated in terms of the average bias in a pair rather than worst case bias. When the worst case bias and average bias differ, the average bias interpretation results in a less conservative sensitivity analysis and more power. In many studies, the average case calibration may also carry a more natural interpretation than the worst case calibration and may also allow researchers to incorporate additional data to establish an empirical basis with which to calibrate a sensitivity analysis. We illustrate this with a study of the effects of cellphone use on the incidence of automobile accidents. Finally, we extend the average case calibration to the sensitivity analysis of confidence intervals for attributable effects. © 2017, The International Biometric Society.

  19. HAZARD ANALYSIS SOFTWARE

    International Nuclear Information System (INIS)

    Sommer, S; Tinh Tran, T.

    2008-01-01

    Washington Safety Management Solutions, LLC developed web-based software to improve the efficiency and consistency of hazard identification and analysis, control selection and classification, and to standardize analysis reporting at Savannah River Site. In the new nuclear age, information technology provides methods to improve the efficiency of the documented safety analysis development process which includes hazard analysis activities. This software provides a web interface that interacts with a relational database to support analysis, record data, and to ensure reporting consistency. A team of subject matter experts participated in a series of meetings to review the associated processes and procedures for requirements and standard practices. Through these meetings, a set of software requirements were developed and compiled into a requirements traceability matrix from which software could be developed. The software was tested to ensure compliance with the requirements. Training was provided to the hazard analysis leads. Hazard analysis teams using the software have verified its operability. The software has been classified as NQA-1, Level D, as it supports the analysis team but does not perform the analysis. The software can be transported to other sites with alternate risk schemes. The software is being used to support the development of 14 hazard analyses. User responses have been positive with a number of suggestions for improvement which are being incorporated as time permits. The software has enforced a uniform implementation of the site procedures. The software has significantly improved the efficiency and standardization of the hazard analysis process

  20. Mathematical analysis fundamentals

    CERN Document Server

    Bashirov, Agamirza

    2014-01-01

    The author's goal is a rigorous presentation of the fundamentals of analysis, starting from an elementary level and moving to advanced coursework. The curricula of all mathematics (pure or applied) and physics programs include a compulsory course in mathematical analysis. This book can serve as the main textbook for such (one-semester) courses. The book can also serve as additional reading for courses such as real analysis, functional analysis, harmonic analysis etc. For non-math-major students requiring math beyond calculus, this is a more friendly approach than many math-centric o

  1. XML-based analysis interface for particle physics data analysis

    International Nuclear Information System (INIS)

    Hu Jifeng; Lu Xiaorui; Zhang Yangheng

    2011-01-01

    The letter focuses on an XML-based interface and its framework for particle physics data analysis. The interface uses a concise XML syntax to describe the basic tasks in data analysis -- event selection, kinematic fitting, particle identification, etc. -- and a basic processing logic: the next step goes on if and only if this step succeeds. The framework can perform an analysis without compiling, by loading the XML interface file, setting parameters at run-time and running dynamically. An analysis coded in XML instead of C++, easy to understand and use, effectively reduces the workload and enables users to carry out their analyses quickly. The framework has been developed on the BESⅢ offline software system (BOSS) with object-oriented C++ programming. The functions required by the regular tasks and the basic processing logic are implemented either as standard modules or are inherited from the modules in BOSS. The interface and its framework have been tested in performing physics analysis. (authors)

  2. Survival analysis using S analysis of time-to-event data

    CERN Document Server

    Tableman, Mara

    2003-01-01

    Survival Analysis Using S: Analysis of Time-to-Event Data is designed as a text for a one-semester or one-quarter course in survival analysis for upper-level or graduate students in statistics, biostatistics, and epidemiology. Prerequisites are a standard pre-calculus first course in probability and statistics, and a course in applied linear regression models. No prior knowledge of S or R is assumed. A wide choice of exercises is included, some intended for more advanced students with a first course in mathematical statistics. The authors emphasize parametric log-linear models, while also detailing nonparametric procedures along with model building and data diagnostics. Medical and public health researchers will find the discussion of cut point analysis with bootstrap validation, competing risks and the cumulative incidence estimator, and the analysis of left-truncated and right-censored data invaluable. The bootstrap procedure checks robustness of cut point analysis and determines cut point(s). In a chapter ...
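
    As a small, self-contained illustration of the nonparametric side of survival analysis, the sketch below computes a Kaplan-Meier survival curve by hand for made-up right-censored data (event = 1, censored = 0). The book itself works in S/R, so this Python version is only an analogue:

        import numpy as np

        times  = np.array([2, 3, 4, 4, 5, 6, 6, 8, 9, 12])
        events = np.array([1, 1, 1, 0, 1, 0, 1, 1, 1, 0])

        surv = 1.0
        for t in np.unique(times):
            n = int((times >= t).sum())                    # at risk just before t
            d = int(((times == t) & (events == 1)).sum())  # events at t
            if d:
                surv *= 1 - d / n                          # KM product-limit step
                print(f"t = {t}: S(t) = {surv:.3f}")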

  3. IAEA Review for Gap Analysis of Safety Analysis Capability

    International Nuclear Information System (INIS)

    Basic, Ivica; Kim, Manwoong; Huges, Peter; Lim, B-K; D'Auria, Francesco; Louis, Vidard Michael

    2014-01-01

    The IAEA Asian Nuclear Safety Network (ANSN) was launched in 2002 in the framework of the Extra Budgetary Programme (EBP) on the Safety of Nuclear Installations in the South East Asia, Pacific and Far East Countries. The main objective is to strengthen and expand a human and advanced Information Technology (IT) network to pool, analyse and share nuclear safety knowledge and practical experience for peaceful uses in this region. Under the ANSN framework, a technical group on Safety Analysis (SATG) was established in 2004, aimed at the following areas of safety analysis: · to provide a forum for an exchange of experience in the area of safety analysis, · to maintain and improve knowledge of safety analysis methods, · to enhance the utilization of computer codes, · to pool and analyse issues related to the safety analysis of research reactors, and · to facilitate mutual interests in safety analysis among member countries. A sustainable and successful nuclear energy programme requires a strong technical infrastructure, including a workforce made up of highly specialized and well-educated professionals. A significant portion of this technical capacity must be dedicated to safety - especially to safety analysis - as only then can it serve as the basis for making the right decisions during the planning, licensing, construction and operation of new nuclear facilities. In this regard, the IAEA has provided ANSN member countries with comprehensive training opportunities for capacity building in safety analysis. Nevertheless, the SATG recognizes that it is difficult to achieve harmonization in this area among all member countries because of their different competency levels. Therefore, it is necessary to quickly identify the most obvious gaps in safety analysis capability and then to use existing resources to begin to fill those gaps. The goal of this Expert Mission (EM) for the gap-finding service is to facilitate

  4. Hydroeconomic analysis

    DEFF Research Database (Denmark)

    Bauer-Gottwein, Peter; Riegels, Niels; Pulido-Velazquez, Manuel

    2017-01-01

    Hydroeconomic analysis and modeling provides a consistent and quantitative framework to assess the links between water resources systems and economic activities related to water use, simultaneously modeling water supply and water demand. It supports water managers and decision makers in assessing trade-offs between different water uses, different geographic regions, and various economic sectors, and between the present and the future. Hydroeconomic analysis provides consistent economic performance criteria for infrastructure development and institutional reform in water policies and management organizations. This chapter presents an introduction to hydroeconomic analysis and modeling, and reviews the state of the art in the field. We review available economic water-valuation techniques and summarize the main types of decision problems encountered in hydroeconomic analysis. Popular solution strategies
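
    To illustrate the kind of trade-off assessment hydroeconomic models perform, here is a toy water-allocation problem solved as a linear program with SciPy; the sectors, benefit coefficients, demand ceilings and supply figure are invented for illustration:

        from scipy.optimize import linprog

        benefits = [0.9, 0.6, 0.3]            # net benefit per m3: urban, agric., industry
        supply = 100.0                        # total available water
        caps = [(0, 40), (0, 80), (0, 50)]    # per-sector demand ceilings

        # linprog minimizes, so negate the benefits to maximize total net benefit
        res = linprog(c=[-b for b in benefits],
                      A_ub=[[1, 1, 1]], b_ub=[supply],
                      bounds=caps, method="highs")
        print("allocation:", res.x, " total benefit:", -res.fun)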

  5. Sensitivity analysis

    Science.gov (United States)

    Sensitivity analysis determines the effectiveness of antibiotics against microorganisms (germs) ...

  6. The potential for meta-analysis to support decision analysis in ecology.

    Science.gov (United States)

    Mengersen, Kerrie; MacNeil, M Aaron; Caley, M Julian

    2015-06-01

    Meta-analysis and decision analysis are underpinned by well-developed methods that are commonly applied to a variety of problems and disciplines. While these two fields have been closely linked in some disciplines such as medicine, comparatively little attention has been paid to the potential benefits of linking them in ecology, despite reasonable expectations that benefits would be derived from doing so. Meta-analysis combines information from multiple studies to provide more accurate parameter estimates and to reduce the uncertainty surrounding them. Decision analysis involves selecting among alternative choices using statistical information that helps to shed light on the uncertainties involved. By linking meta-analysis to decision analysis, improved decisions can be made, with quantification of the costs and benefits of alternate decisions supported by a greater density of information. Here, we briefly review concepts of both meta-analysis and decision analysis, illustrating the natural linkage between them and the benefits from explicitly linking one to the other. We discuss some examples in which this linkage has been exploited in the medical arena and how improvements in precision and reduction of structural uncertainty inherent in a meta-analysis can provide substantive improvements to decision analysis outcomes by reducing uncertainty in expected loss and maximising information from across studies. We then argue that these significant benefits could be translated to ecology, in particular to the problem of making optimal ecological decisions in the face of uncertainty. Copyright © 2013 John Wiley & Sons, Ltd.
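
    As a concrete example of how meta-analysis sharpens the inputs to a decision analysis, the sketch below pools hypothetical study estimates with fixed-effect inverse-variance weighting, yielding a pooled effect with a narrower confidence interval than any single study provides:

        import numpy as np

        effects = np.array([0.30, 0.10, 0.25, 0.45])   # per-study effect estimates
        se      = np.array([0.15, 0.20, 0.10, 0.25])   # their standard errors

        w = 1 / se**2                                  # inverse-variance weights
        pooled = (w * effects).sum() / w.sum()
        pooled_se = np.sqrt(1 / w.sum())
        lo, hi = pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se
        print(f"pooled effect = {pooled:.3f}, 95% CI = ({lo:.3f}, {hi:.3f})")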

  7. Bayesian Mediation Analysis

    OpenAIRE

    Yuan, Ying; MacKinnon, David P.

    2009-01-01

    This article proposes Bayesian analysis of mediation effects. Compared to conventional frequentist mediation analysis, the Bayesian approach has several advantages. First, it allows researchers to incorporate prior information into the mediation analysis, thus potentially improving the efficiency of estimates. Second, under the Bayesian mediation analysis, inference is straightforward and exact, which makes it appealing for studies with small samples. Third, the Bayesian approach is conceptua...

  8. Economic and Financial Analysis Tools | Energy Analysis | NREL

    Science.gov (United States)

    Economic and Financial Analysis Tools. … Job and Economic Development Impact (JEDI) Model: use these easy-to-use, spreadsheet-based tools to analyze the economic impacts of constructing and operating power generation and biofuel plants …

  9. Computational Music Analysis

    DEFF Research Database (Denmark)

    This book provides an in-depth introduction and overview of current research in computational music analysis. Its seventeen chapters, written by leading researchers, collectively represent the diversity as well as the technical and philosophical sophistication of the work being done today … on well-established theories in music theory and analysis, such as Forte's pitch-class set theory, Schenkerian analysis, the methods of semiotic analysis developed by Ruwet and Nattiez, and Lerdahl and Jackendoff's Generative Theory of Tonal Music. The book is divided into six parts, covering … music analysis, the book provides an invaluable resource for researchers, teachers and students in music theory and analysis, computer science, music information retrieval and related disciplines. It also provides a state-of-the-art reference for practitioners in the music technology industry.

  10. System analysis and design

    International Nuclear Information System (INIS)

    Son, Seung Hui

    2004-02-01

    This book deals with information technology and business process, information system architecture, methods of system development, plan on system development like problem analysis and feasibility analysis, cases for system development, comprehension of analysis of users demands, analysis of users demands using traditional analysis, users demands analysis using integrated information system architecture, system design using integrated information system architecture, system implementation, and system maintenance.

  11. Semen Analysis Test

    Science.gov (United States)

    Also known as: sperm analysis, sperm count, seminal fluid analysis. Formal name: semen analysis. … Viscosity: consistency or thickness of the semen. Sperm count: total number of sperm. Sperm concentration (density): number …

  12. Qualitative Content Analysis

    Directory of Open Access Journals (Sweden)

    Satu Elo

    2014-02-01

    Qualitative content analysis is commonly used for analyzing qualitative data. However, few articles have examined the trustworthiness of its use in nursing science studies. The trustworthiness of qualitative content analysis is often presented by using terms such as credibility, dependability, confirmability, transferability, and authenticity. This article focuses on trustworthiness based on a review of previous studies, our own experiences, and methodological textbooks. Trustworthiness was described for the main qualitative content analysis phases from data collection to reporting of the results. We concluded that it is important to scrutinize the trustworthiness of every phase of the analysis process, including the preparation, organization, and reporting of results. Together, these phases should give a reader a clear indication of the overall trustworthiness of the study. Based on our findings, we compiled a checklist for researchers attempting to improve the trustworthiness of a content analysis study. The discussion in this article helps to clarify how content analysis should be reported in a valid and understandable manner, which would be of particular benefit to reviewers of scientific articles. Furthermore, we note that it is often difficult to evaluate the trustworthiness of qualitative content analysis studies because of inadequate descriptions of the data collection method and/or of the analysis.

  13. Real analysis

    CERN Document Server

    McShane, Edward James

    2013-01-01

    This text surveys practical elements of real function theory, general topology, and functional analysis. Discusses the maximality principle, the notion of convergence, the Lebesgue-Stieltjes integral, function spaces and harmonic analysis. Includes exercises. 1959 edition.

  14. Content analysis and thematic analysis: Implications for conducting a qualitative descriptive study.

    Science.gov (United States)

    Vaismoradi, Mojtaba; Turunen, Hannele; Bondas, Terese

    2013-09-01

    Qualitative content analysis and thematic analysis are two commonly used approaches in the data analysis of nursing research, but the boundaries between the two have not been clearly specified. In other words, they are used interchangeably and it can be difficult for researchers to choose between them. In this respect, this paper describes and discusses the boundaries between qualitative content analysis and thematic analysis and presents implications to improve the consistency between the purpose of related studies and the method of data analysis. This is a discussion paper, comprising an analytical overview and discussion of the definitions, aims, philosophical background, data gathering, and analysis of content analysis and thematic analysis, and addressing their methodological subtleties. It is concluded that in spite of many similarities between the approaches, including cutting across data and searching for patterns and themes, their main difference lies in the opportunity for quantification of data. That is, measuring the frequency of different categories and themes is possible in content analysis, with caution, as a proxy for significance. © 2013 Wiley Publishing Asia Pty Ltd.
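
    The quantification identified above as the distinguishing feature of content analysis, counting category frequencies as a cautious proxy for salience, amounts to something as simple as the following toy example; the coded categories are invented.

    ```python
    # Category frequencies from a toy set of coded transcript segments.
    from collections import Counter

    codes = ["coping", "support", "coping", "stigma", "support", "coping"]
    freq = Counter(codes)
    total = sum(freq.values())
    for category, n in freq.most_common():
        print(f"{category}: {n} ({100 * n / total:.0f}%)")
    ```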

  15. Numerical analysis

    CERN Document Server

    Khabaza, I M

    1960-01-01

    Numerical Analysis is an elementary introduction to numerical analysis, its applications, limitations, and pitfalls. Methods suitable for digital computers are emphasized, but some desk computations are also described. Topics covered range from the use of digital computers in numerical work to errors in computations using desk machines, finite difference methods, and the numerical solution of ordinary differential equations. The book comprises eight chapters and begins with an overview of the importance of digital computers in numerical analysis, followed by a discussion of errors in computation …

  16. Recursive analysis

    CERN Document Server

    Goodstein, R L

    2010-01-01

    Recursive analysis develops natural number computations into a framework appropriate for real numbers. This text is based upon primary recursive arithmetic and presents a unique combination of classical analysis and intuitional analysis. Written by a master in the field, it is suitable for graduate students of mathematics and computer science and can be read without a detailed knowledge of recursive arithmetic.Introductory chapters on recursive convergence and recursive and relative continuity are succeeded by explorations of recursive and relative differentiability, the relative integral, and

  17. Gap Analysis: Application to Earned Value Analysis

    OpenAIRE

    Langford, Gary O.; Franck, Raymond (Chip)

    2008-01-01

    Sponsored Report (for the Acquisition Research Program). Earned Value is regarded as a useful tool to monitor commercial and defense system acquisitions. This paper applies the theoretical foundations and systematics of Gap Analysis to improve Earned Value Management. As currently implemented, Earned Value inaccurately reports a higher value for the work performed. This preliminary research indicates that Earned Value calculations can be corrected. Value Analysis, properly defined and enacted, …

  18. CONTENT ANALYSIS, DISCOURSE ANALYSIS, AND CONVERSATION ANALYSIS: PRELIMINARY STUDY ON CONCEPTUAL AND THEORETICAL METHODOLOGICAL DIFFERENCES

    Directory of Open Access Journals (Sweden)

    Anderson Tiago Peixoto Gonçalves

    2016-08-01

    This theoretical essay aims to reflect on three models of text interpretation used in qualitative research whose concepts and methodologies are often confused: Content Analysis, Discourse Analysis, and Conversation Analysis. After presenting the concepts, the essay proposes a preliminary discussion of the conceptual and theoretical methodological differences perceived between them. A review of the literature was performed to support this discussion. The review shows that the models differ in the type of strategy used in the treatment of texts, the type of approach, and the appropriate theoretical position.

  19. Moyer's method of mixed dentition analysis: a meta-analysis ...

    African Journals Online (AJOL)

    The applicability to other ethnic groups of tables derived from the data Moyer used has … This implies that Moyer's method of prediction may have population variations. … Key Words: meta-analysis, mixed dentition analysis, Moyer's method

  20. Bayesian Mediation Analysis

    Science.gov (United States)

    Yuan, Ying; MacKinnon, David P.

    2009-01-01

    In this article, we propose Bayesian analysis of mediation effects. Compared with conventional frequentist mediation analysis, the Bayesian approach has several advantages. First, it allows researchers to incorporate prior information into the mediation analysis, thus potentially improving the efficiency of estimates. Second, under the Bayesian…
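
    This record and the earlier OpenAIRE version describe the same article, so a single illustration suffices. The indirect (mediated) effect is the product of the X-to-mediator path a and the mediator-to-outcome path b, and its Bayesian summary can be approximated by Monte Carlo. The normal "posteriors" below are placeholders for whatever a fitted model would actually produce, not the authors' estimation procedure.

    ```python
    # Monte Carlo summary of the indirect effect a*b (assumed normal posteriors).
    import numpy as np

    rng = np.random.default_rng(0)
    a = rng.normal(0.40, 0.10, 50_000)   # draws for path X -> M (assumed posterior)
    b = rng.normal(0.25, 0.08, 50_000)   # draws for path M -> Y given X (assumed)
    ab = a * b                           # indirect (mediated) effect draws
    lo, hi = np.percentile(ab, [2.5, 97.5])
    print(f"indirect effect ~ {ab.mean():.3f}, 95% credible interval [{lo:.3f}, {hi:.3f}]")
    ```

    Because a*b is a product of two uncertain quantities, its distribution is skewed; summarising it by percentiles of posterior draws, as here, is exactly the kind of "straightforward and exact" inference the abstract highlights.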

  1. Applied longitudinal analysis

    CERN Document Server

    Fitzmaurice, Garrett M; Ware, James H

    2012-01-01

    Praise for the First Edition "". . . [this book] should be on the shelf of everyone interested in . . . longitudinal data analysis.""-Journal of the American Statistical Association   Features newly developed topics and applications of the analysis of longitudinal data Applied Longitudinal Analysis, Second Edition presents modern methods for analyzing data from longitudinal studies and now features the latest state-of-the-art techniques. The book emphasizes practical, rather than theoretical, aspects of methods for the analysis of diverse types of lo

  2. Computational movement analysis

    CERN Document Server

    Laube, Patrick

    2014-01-01

    This SpringerBrief discusses the characteristics of spatiotemporal movement data, including uncertainty and scale. It investigates three core aspects of Computational Movement Analysis: Conceptual modeling of movement and movement spaces, spatiotemporal analysis methods aiming at a better understanding of movement processes (with a focus on data mining for movement patterns), and using decentralized spatial computing methods in movement analysis. The author presents Computational Movement Analysis as an interdisciplinary umbrella for analyzing movement processes with methods from a range of fi

  3. Trial Sequential Analysis in systematic reviews with meta-analysis

    DEFF Research Database (Denmark)

    Wetterslev, Jørn; Jakobsen, Janus Christian; Gluud, Christian

    2017-01-01

    BACKGROUND: Most meta-analyses in systematic reviews, including Cochrane ones, do not have sufficient statistical power to detect or refute even large intervention effects. This is why a meta-analysis ought to be regarded as an interim analysis on its way towards a required information size...... from traditional meta-analyses using unadjusted naïve 95% confidence intervals and 5% thresholds for statistical significance. Spurious conclusions in systematic reviews with traditional meta-analyses can be reduced using Trial Sequential Analysis. Several empirical studies have demonstrated...

  4. On-line analysis of reactor noise using time-series analysis

    International Nuclear Information System (INIS)

    McGevna, V.G.

    1981-10-01

    A method to allow use of time-series analysis for on-line noise analysis has been developed. On-line analysis of noise in nuclear power reactors has been limited primarily to spectral analysis and related frequency-domain techniques. Time-series analysis has many distinct advantages over spectral analysis in the automated processing of reactor noise. However, fitting an autoregressive-moving average (ARMA) model to time-series data involves non-linear least-squares estimation. Unless a high-speed, general-purpose computer is available, the calculations become too time consuming for on-line applications. To eliminate this problem, a special-purpose algorithm was developed for fitting ARMA models. While it is based on a combination of steepest descent and Taylor series linearization, properties of the ARMA model are used so that the auto- and cross-correlation functions can be used to eliminate the need for estimating derivatives. The number of calculations per iteration varies linearly …
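
    The pure autoregressive special case shows how autocorrelations can replace derivative estimation, the trick the abstract alludes to: the Yule-Walker equations determine the AR coefficients directly from the autocovariance sequence. The following sketch is illustrative only and is not the report's ARMA algorithm.

    ```python
    # AR(p) fit from autocovariances via the Yule-Walker equations R a = r.
    import numpy as np
    from scipy.linalg import solve_toeplitz

    def ar_yule_walker(x, p):
        x = np.asarray(x, float) - np.mean(x)
        r = np.correlate(x, x, mode="full")[len(x) - 1:] / len(x)  # autocovariances, lags 0..N-1
        return solve_toeplitz(r[:p], r[1:p + 1])                   # AR coefficients

    rng = np.random.default_rng(1)
    noise = rng.normal(size=5000)
    signal = np.zeros(5000)
    for t in range(2, 5000):                   # simulate AR(2) with coefficients 0.6, -0.3
        signal[t] = 0.6 * signal[t-1] - 0.3 * signal[t-2] + noise[t]
    print(ar_yule_walker(signal, 2))           # recovers approximately [0.6, -0.3]
    ```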

  5. The Use of Object-Oriented Analysis Methods in Surety Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Craft, Richard L.; Funkhouser, Donald R.; Wyss, Gregory D.

    1999-05-01

    Object-oriented analysis methods have been used in the computer science arena for a number of years to model the behavior of computer-based systems. This report documents how such methods can be applied to surety analysis. By embodying the causality and behavior of a system in a common object-oriented analysis model, surety analysts can make the assumptions that underlie their models explicit and thus better communicate with system designers. Furthermore, given minor extensions to traditional object-oriented analysis methods, it is possible to automatically derive a wide variety of traditional risk and reliability analysis methods from a single common object model. Automatic model extraction helps ensure consistency among analyses and enables the surety analyst to examine a system from a wider variety of viewpoints in a shorter period of time. Thus it provides a deeper understanding of a system's behaviors and surety requirements. This report documents the underlying philosophy behind the common object model representation, the methods by which such common object models can be constructed, and the rules required to interrogate the common object model for derivation of traditional risk and reliability analysis models. The methodology is demonstrated in an extensive example problem.

  6. Circuit analysis for dummies

    CERN Document Server

    Santiago, John

    2013-01-01

    Circuits overloaded from electric circuit analysis? Many universities require that students pursuing a degree in electrical or computer engineering take an Electric Circuit Analysis course to determine who will "make the cut" and continue in the degree program. Circuit Analysis For Dummies will help these students to better understand electric circuit analysis by presenting the information in an effective and straightforward manner. Circuit Analysis For Dummies gives you clear-cut information about the topics covered in an electric circuit analysis course to help …

  7. Hazard Analysis Database Report

    CERN Document Server

    Grams, W H

    2000-01-01

    The Hazard Analysis Database was developed in conjunction with the hazard analysis activities conducted in accordance with DOE-STD-3009-94, Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports, for HNF-SD-WM-SAR-067, Tank Farms Final Safety Analysis Report (FSAR). The FSAR is part of the approved Authorization Basis (AB) for the River Protection Project (RPP). This document describes, identifies, and defines the contents and structure of the Tank Farms FSAR Hazard Analysis Database and documents the configuration control changes made to the database. The Hazard Analysis Database contains the collection of information generated during the initial hazard evaluations and the subsequent hazard and accident analysis activities. The Hazard Analysis Database supports the preparation of Chapters 3, 4, and 5 of the Tank Farms FSAR and the Unreviewed Safety Question (USQ) process and consists of two major, interrelated data sets: (1) Hazard Analysis Database: Data from t…

  8. Descriptive data analysis.

    Science.gov (United States)

    Thompson, Cheryl Bagley

    2009-01-01

    This 13th article of the Basics of Research series is first in a short series on statistical analysis. These articles will discuss creating your statistical analysis plan, levels of measurement, descriptive statistics, probability theory, inferential statistics, and general considerations for interpretation of the results of a statistical analysis.

  9. International Market Analysis

    DEFF Research Database (Denmark)

    Sørensen, Olav Jull

    2009-01-01

    The review presents the book International Market Analysis: Theories and Methods, written by John Kuiada, professor at the Centre of International Business, Department of Business Studies, Aalborg University. The book is refreshingly new in its way of looking at a classical problem. It looks at market … analysis from the point of view of ways of thinking about markets. Furthermore, the book includes the concept of learning in the analysis of markets and how the way we understand business reality influences our choice of methodology for market analysis …

  10. Improvement of Binary Analysis Components in Automated Malware Analysis Framework

    Science.gov (United States)

    2017-02-21

    AFRL-AFOSR-JP-TR-2017-0018, Improvement of Binary Analysis Components in Automated Malware Analysis Framework, Keiji Takeda, Keio University. Final report covering 26 May 2015 to 25 Nov 2016. The work improves the binary analysis components of a framework that analyzes malicious software (malware) with minimum human interaction; the system autonomously analyzes malware samples by analyzing the malware binary program …

  11. Analysis of mineral phases in coal utilizing factor analysis

    International Nuclear Information System (INIS)

    Roscoe, B.A.; Hopke, P.K.

    1982-01-01

    The mineral phase inclusions of coal are discussed. The contributions of these to a coal sample are determined using several techniques. Neutron activation analysis in conjunction with coal washability studies has produced some information on the general trends of elemental variation in the mineral phases. These results have been enhanced by the use of various statistical techniques. Target transformation factor analysis (TTFA) is specifically discussed and shown to be able to produce elemental profiles of the mineral phases in coal. A data set consisting of physically fractionated coal samples was generated. These samples were analyzed by neutron activation analysis and their elemental concentrations examined using TTFA. Information concerning the mineral phases in coal can thus be acquired from factor analysis even with limited data. Additional data may permit the resolution of additional mineral phases as well as refinement of those already identified.
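
    As an illustration of the general factor-analytic step (ordinary factor analysis rather than target transformation factor analysis itself), latent "phase" profiles can be extracted from a samples-by-elements concentration matrix. The data below are synthetic and the two-phase structure is assumed for the example.

    ```python
    # Synthetic samples-by-elements matrix built from two latent "phase" profiles.
    import numpy as np
    from sklearn.decomposition import FactorAnalysis

    rng = np.random.default_rng(0)
    profiles = rng.random((2, 8))                   # 2 latent phases x 8 elements
    loadings = rng.random((40, 2))                  # 40 samples x phase contributions
    X = loadings @ profiles + rng.normal(0, 0.01, (40, 8))

    fa = FactorAnalysis(n_components=2).fit(X)
    print(fa.components_.shape)                     # (2, 8): recovered elemental profiles
    ```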

  12. Meta-analysis with R

    CERN Document Server

    Schwarzer, Guido; Rücker, Gerta

    2015-01-01

    This book provides a comprehensive introduction to performing meta-analysis using the statistical software R. It is intended for quantitative researchers and students in the medical and social sciences who wish to learn how to perform meta-analysis with R. As such, the book introduces the key concepts and models used in meta-analysis. It also includes chapters on the following advanced topics: publication bias and small study effects; missing data; multivariate meta-analysis, network meta-analysis; and meta-analysis of diagnostic studies.

  13. Cluster analysis for applications

    CERN Document Server

    Anderberg, Michael R

    1973-01-01

    Cluster Analysis for Applications deals with methods and various applications of cluster analysis. Topics covered range from variables and scales to measures of association among variables and among data units. Conceptual problems in cluster analysis are discussed, along with hierarchical and non-hierarchical clustering methods. The necessary elements of data analysis, statistics, cluster analysis, and computer implementation are integrated vertically to cover the complete path from raw data to a finished analysis.Comprised of 10 chapters, this book begins with an introduction to the subject o

  14. Job Analysis

    OpenAIRE

    Bravená, Helena

    2009-01-01

    This bachelor thesis deals with the importance of job analysis for personnel activities in a company. The aim of this work is to find the most suitable method of job analysis in a particular enterprise and then to create descriptions and specifications for each job.

  15. Visual physics analysis VISPA

    International Nuclear Information System (INIS)

    Actis, Oxana; Brodski, Michael; Erdmann, Martin; Fischer, Robert; Hinzmann, Andreas; Mueller, Gero; Muenzer, Thomas; Plum, Matthias; Steggemann, Jan; Winchen, Tobias; Klimkovich, Tatsiana

    2010-01-01

    VISPA is a development environment for high energy physics analyses which enables physicists to combine graphical and textual work. A physics analysis cycle consists of prototyping, performing, and verifying the analysis. The main feature of VISPA is a multipurpose window for visual steering of analysis steps, creation of analysis templates, and browsing physics event data at different steps of an analysis. VISPA follows an experiment-independent approach and incorporates various tools for steering and controlling required in a typical analysis. Connection to different frameworks of high energy physics experiments is achieved by using different types of interfaces. We present the look-and-feel for an example physics analysis at the LHC and explain the underlying software concepts of VISPA.

  16. Analysis of the interaction between experimental and applied behavior analysis.

    Science.gov (United States)

    Virues-Ortega, Javier; Hurtado-Parrado, Camilo; Cox, Alison D; Pear, Joseph J

    2014-01-01

    To study the influences between basic and applied research in behavior analysis, we analyzed the coauthorship interactions of authors who published in JABA and JEAB from 1980 to 2010. We paid particular attention to authors who published in both JABA and JEAB (dual authors) as potential agents of cross-field interactions. We present a comprehensive analysis of dual authors' coauthorship interactions using social networks methodology and key word analysis. The number of dual authors more than doubled (26 to 67) and their productivity tripled (7% to 26% of JABA and JEAB articles) between 1980 and 2010. Dual authors stood out in terms of number of collaborators, number of publications, and ability to interact with multiple groups within the field. The steady increase in JEAB and JABA interactions through coauthors and the increasing range of topics covered by dual authors provide a basis for optimism regarding the progressive integration of basic and applied behavior analysis. © Society for the Experimental Analysis of Behavior.
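
    The network side of such a study can be sketched with a toy coauthorship graph, ranking authors by degree centrality; the author labels and edges below are invented, not data from the article.

    ```python
    # Toy coauthorship graph; nodes are authors, edges are joint papers (invented).
    import networkx as nx

    G = nx.Graph()
    G.add_edges_from([
        ("Author A", "Author B"), ("Author A", "Author C"),
        ("Author B", "Author C"), ("Author C", "Author D"),
        ("Author D", "Author E"),
    ])
    centrality = nx.degree_centrality(G)       # normalised degree per author
    ranked = sorted(centrality.items(), key=lambda kv: -kv[1])
    print(ranked[:2])                          # best-connected bridging-author candidates
    ```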

  17. K Basin Hazard Analysis

    Energy Technology Data Exchange (ETDEWEB)

    PECH, S.H.

    2000-08-23

    This report describes the methodology used in conducting the K Basins Hazard Analysis, which provides the foundation for the K Basins Final Safety Analysis Report. This hazard analysis was performed in accordance with guidance provided by DOE-STD-3009-94, Preparation Guide for U. S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports and implements the requirements of DOE Order 5480.23, Nuclear Safety Analysis Report.

  18. K Basin Hazard Analysis

    International Nuclear Information System (INIS)

    PECH, S.H.

    2000-01-01

    This report describes the methodology used in conducting the K Basins Hazard Analysis, which provides the foundation for the K Basins Final Safety Analysis Report. This hazard analysis was performed in accordance with guidance provided by DOE-STD-3009-94, Preparation Guide for U. S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports and implements the requirements of DOE Order 5480.23, Nuclear Safety Analysis Report

  19. Energy analysis handbook. CAC document 214. [Combining process analysis with input-output analysis

    Energy Technology Data Exchange (ETDEWEB)

    Bullard, C. W.; Penner, P. S.; Pilati, D. A.

    1976-10-01

    Methods are presented for calculating the energy required, directly and indirectly, to produce all types of goods and services. Procedures for combining process analysis with input-output analysis are described. This enables the analyst to focus data acquisition cost-effectively, and to achieve a specified degree of accuracy in the results. The report presents sample calculations and provides the tables and charts needed to perform most energy cost calculations, including the cost of systems for producing or conserving energy.

  20. Hypothesis analysis methods, hypothesis analysis devices, and articles of manufacture

    Science.gov (United States)

    Sanfilippo, Antonio P [Richland, WA; Cowell, Andrew J [Kennewick, WA; Gregory, Michelle L [Richland, WA; Baddeley, Robert L [Richland, WA; Paulson, Patrick R [Pasco, WA; Tratz, Stephen C [Richland, WA; Hohimer, Ryan E [West Richland, WA

    2012-03-20

    Hypothesis analysis methods, hypothesis analysis devices, and articles of manufacture are described according to some aspects. In one aspect, a hypothesis analysis method includes providing a hypothesis, providing an indicator which at least one of supports and refutes the hypothesis, using the indicator, associating evidence with the hypothesis, weighting the association of the evidence with the hypothesis, and using the weighting, providing information regarding the accuracy of the hypothesis.

  1. Analysis I

    CERN Document Server

    Tao, Terence

    2016-01-01

    This is part one of a two-volume book on real analysis and is intended for senior undergraduate students of mathematics who have already been exposed to calculus. The emphasis is on rigour and foundations of analysis. Beginning with the construction of the number systems and set theory, the book discusses the basics of analysis (limits, series, continuity, differentiation, Riemann integration), through to power series, several variable calculus and Fourier analysis, and then finally the Lebesgue integral. These are almost entirely set in the concrete setting of the real line and Euclidean spaces, although there is some material on abstract metric and topological spaces. The book also has appendices on mathematical logic and the decimal system. The entire text (omitting some less central topics) can be taught in two quarters of 25–30 lectures each. The course material is deeply intertwined with the exercises, as it is intended that the student actively learn the material (and practice thinking and writing ri...

  2. Analysis II

    CERN Document Server

    Tao, Terence

    2016-01-01

    This is part two of a two-volume book on real analysis and is intended for senior undergraduate students of mathematics who have already been exposed to calculus. The emphasis is on rigour and foundations of analysis. Beginning with the construction of the number systems and set theory, the book discusses the basics of analysis (limits, series, continuity, differentiation, Riemann integration), through to power series, several variable calculus and Fourier analysis, and then finally the Lebesgue integral. These are almost entirely set in the concrete setting of the real line and Euclidean spaces, although there is some material on abstract metric and topological spaces. The book also has appendices on mathematical logic and the decimal system. The entire text (omitting some less central topics) can be taught in two quarters of 25–30 lectures each. The course material is deeply intertwined with the exercises, as it is intended that the student actively learn the material (and practice thinking and writing ri...

  3. CMS analysis operations

    International Nuclear Information System (INIS)

    Andreeva, J; Maier, G; Spiga, D; Calloni, M; Colling, D; Fanzago, F; D'Hondt, J; Maes, J; Van Mulders, P; Villella, I; Klem, J; Letts, J; Padhi, S; Sarkar, S

    2010-01-01

    During normal data taking CMS expects to support potentially as many as 2000 analysis users. Since the beginning of 2008 there have been more than 800 individuals who submitted a remote analysis job to the CMS computing infrastructure. The bulk of these users will be supported at the over 40 CMS Tier-2 centres. Supporting a globally distributed community of users on a globally distributed set of computing clusters is a task that requires reconsidering the normal methods of user support for Analysis Operations. In 2008 CMS formed an Analysis Support Task Force in preparation for large-scale physics analysis activities. The charge of the task force was to evaluate the available support tools, the user support techniques, and the direct feedback of users with the goal of improving the success rate and user experience when utilizing the distributed computing environment. The task force determined the tools needed to assess and reduce the number of non-zero exit code applications submitted through the grid interfaces and worked with the CMS experiment dashboard developers to obtain the necessary information to quickly and proactively identify issues with user jobs and data sets hosted at various sites. Results of the analysis group surveys were compiled. Reference platforms for testing and debugging problems were established in various geographic regions. The task force also assessed the resources needed to make the transition to a permanent Analysis Operations task. In this presentation the results of the task force will be discussed as well as the CMS Analysis Operations plans for the start of data taking.

  4. Harmonic analysis a comprehensive course in analysis, part 3

    CERN Document Server

    Simon, Barry

    2015-01-01

    A Comprehensive Course in Analysis by Poincaré Prize winner Barry Simon is a five-volume set that can serve as a graduate-level analysis textbook with a lot of additional bonus information, including hundreds of problems and numerous notes that extend the text and provide important historical background. Depth and breadth of exposition make this set a valuable reference source for almost all areas of classical analysis. Part 3 returns to the themes of Part 1 by discussing pointwise limits (going beyond the usual focus on the Hardy-Littlewood maximal function by including ergodic theorems and m

  5. Cost benefit analysis cost effectiveness analysis

    International Nuclear Information System (INIS)

    Lombard, J.

    1986-09-01

    The purpose of the ALARA procedure is to compare various protection options in order to determine the best compromise between the cost of protection and the residual risk. The use of decision-aiding techniques is valuable as an aid to such selection procedures. The purpose of this study is to introduce two rather simple and well-known decision-aiding techniques: cost-effectiveness analysis and cost-benefit analysis. These two techniques are relevant to the great majority of ALARA decisions that call for a quantitative technique. The study is based on a hypothetical case of 10 protection options, to which four methods are applied.
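
    A minimal sketch of the comparison being described, under the common ALARA cost-benefit formulation in which each option's total detriment is its protection cost plus the monetised residual dose; all option data and the monetary value per man-sievert are invented, not taken from the study's 10-option case.

    ```python
    # Pick the option minimising total detriment = protection cost + alpha * dose.
    costs = [0.0, 10.0, 25.0, 60.0]     # protection cost X per option (kEUR, assumed)
    doses = [4.0, 2.5, 1.5, 1.0]        # residual collective dose S (man-Sv, assumed)
    alpha = 20.0                        # monetary value per man-Sv (kEUR, assumed)

    totals = [x + alpha * s for x, s in zip(costs, doses)]
    best = min(range(len(totals)), key=totals.__getitem__)
    print(f"preferred option: {best}, total detriment: {totals[best]:.1f} kEUR")
    ```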

  6. Basic complex analysis a comprehensive course in analysis, part 2a

    CERN Document Server

    Simon, Barry

    2015-01-01

    A Comprehensive Course in Analysis by Poincaré Prize winner Barry Simon is a five-volume set that can serve as a graduate-level analysis textbook with a lot of additional bonus information, including hundreds of problems and numerous notes that extend the text and provide important historical background. Depth and breadth of exposition make this set a valuable reference source for almost all areas of classical analysis. Part 2A is devoted to basic complex analysis. It interweaves three analytic threads associated with Cauchy, Riemann, and Weierstrass, respectively. Cauchy's view focuses on th

  7. pyAudioAnalysis: An Open-Source Python Library for Audio Signal Analysis.

    Science.gov (United States)

    Giannakopoulos, Theodoros

    2015-01-01

    Audio information plays a rather important role in the increasing digital content that is available today, resulting in a need for methodologies that automatically analyze such content: audio event recognition for home automations and surveillance systems, speech recognition, music information retrieval, multimodal analysis (e.g. audio-visual analysis of online videos for content-based recommendation), etc. This paper presents pyAudioAnalysis, an open-source Python library that provides a wide range of audio analysis procedures including: feature extraction, classification of audio signals, supervised and unsupervised segmentation and content visualization. pyAudioAnalysis is licensed under the Apache License and is available at GitHub (https://github.com/tyiannak/pyAudioAnalysis/). Here we present the theoretical background behind the wide range of the implemented methodologies, along with evaluation metrics for some of the methods. pyAudioAnalysis has been already used in several audio analysis research applications: smart-home functionalities through audio event detection, speech emotion recognition, depression classification based on audio-visual features, music segmentation, multimodal content-based movie recommendation and health applications (e.g. monitoring eating habits). The feedback provided from all these particular audio applications has led to practical enhancement of the library.
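
    A typical feature-extraction call follows the pattern shown in the pyAudioAnalysis README; module names have changed across releases, so treat this as indicative of recent versions rather than guaranteed for every one, and "sample.wav" is a placeholder input file.

    ```python
    # Short-term feature extraction with pyAudioAnalysis (recent-release API).
    from pyAudioAnalysis import audioBasicIO, ShortTermFeatures

    fs, x = audioBasicIO.read_audio_file("sample.wav")       # placeholder file name
    features, names = ShortTermFeatures.feature_extraction(
        x, fs, int(0.050 * fs), int(0.025 * fs))             # 50 ms windows, 25 ms step
    print(features.shape, names[:3])                         # (n_features, n_frames)
    ```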

  8. Introduction of thermal-hydraulic analysis code and system analysis code for HTGR

    International Nuclear Information System (INIS)

    Tanaka, Mitsuhiro; Izaki, Makoto; Koike, Hiroyuki; Tokumitsu, Masashi

    1984-01-01

    Kawasaki Heavy Industries Ltd. has advanced the development and systematization of analysis codes, aiming at lining up the analysis codes for heat transferring flow and control characteristics, taking up HTGR plants as the main object. In order to make the model of flow when shock waves propagate to heating tubes, SALE-3D which can analyze a complex system was developed, therefore, it is reported in this paper. Concerning the analysis code for control characteristics, the method of sensitivity analysis in a topological space including an example of application is reported. The flow analysis code SALE-3D is that for analyzing the flow of compressible viscous fluid in a three-dimensional system over the velocity range from incompressibility limit to supersonic velocity. The fundamental equations and fundamental algorithm of the SALE-3D, the calculation of cell volume, the plotting of perspective drawings and the analysis of the three-dimensional behavior of shock waves propagating in heating tubes after their rupture accident are described. The method of sensitivity analysis was added to the analysis code for control characteristics in a topological space, and blow-down phenomena was analyzed by its application. (Kako, I.)

  9. Investigation on method of elasto-plastic analysis for piping system (benchmark analysis)

    International Nuclear Information System (INIS)

    Kabaya, Takuro; Kojima, Nobuyuki; Arai, Masashi

    2015-01-01

    This paper provides a method of elasto-plastic analysis for the practical seismic design of nuclear piping systems. JSME started a task to establish a method of elasto-plastic analysis for nuclear piping systems, and benchmark analyses have been performed within the task to investigate candidate methods; our company has participated in these benchmark analyses. As a result, we have settled on a method that accurately simulates the results of piping excitation tests. The recommended method of elasto-plastic analysis is as follows: 1) the elasto-plastic analysis is composed of a dynamic analysis of the piping system modeled with beam elements and a static analysis of the deformed elbow modeled with shell elements; 2) a bi-linear stress-strain property is applied, with the yield point taken as the standardized yield point multiplied by 1.2, the second gradient as 1/100 of Young's modulus, and kinematic hardening as the hardening rule; 3) the fatigue life is evaluated from strain ranges obtained by the elasto-plastic analysis, using the rainflow method and the fatigue curve of previous studies. (author)
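
    The bi-linear property in item 2) can be written as a plain function: elastic slope E up to the factored yield point, then a hardening slope of E/100. This is a sketch for monotonic loading only; the kinematic hardening rule in the text requires tracking load history, which is omitted here, and the material constants are assumed.

    ```python
    # Bi-linear stress-strain law per the recommended method (constants assumed:
    # E = 200 GPa in MPa units, nominal yield 250 MPa, factored by 1.2 as in the text).
    def bilinear_stress(strain, E=200e3, sigma_y=250.0, factor=1.2):
        sy = factor * sigma_y                 # factored yield stress
        eps_y = sy / E                        # yield strain
        if abs(strain) <= eps_y:
            return E * strain                 # elastic branch
        sign = 1.0 if strain > 0 else -1.0
        return sign * (sy + (E / 100.0) * (abs(strain) - eps_y))  # hardening branch

    print(bilinear_stress(0.001), bilinear_stress(0.01))   # elastic vs plastic (MPa)
    ```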

  10. K Basins Hazard Analysis

    International Nuclear Information System (INIS)

    WEBB, R.H.

    1999-01-01

    This report describes the methodology used in conducting the K Basins Hazard Analysis, which provides the foundation for the K Basins Safety Analysis Report (HNF-SD-WM-SAR-062/Rev.4). This hazard analysis was performed in accordance with guidance provided by DOE-STD-3009-94, Preparation Guide for U. S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports and implements the requirements of DOE Order 5480.23, Nuclear Safety Analysis Report

  11. Functional data analysis

    CERN Document Server

    Ramsay, J O

    1997-01-01

    Scientists today collect samples of curves and other functional observations. This monograph presents many ideas and techniques for such data. Included are expressions in the functional domain of such classics as linear regression, principal components analysis, linear modelling, and canonical correlation analysis, as well as specifically functional techniques such as curve registration and principal differential analysis. Data arising in real applications are used throughout for both motivation and illustration, showing how functional approaches allow us to see new things, especially by exploiting the smoothness of the processes generating the data. The data sets exemplify the wide scope of functional data analysis; they are drawn from growth analysis, meteorology, biomechanics, equine science, economics, and medicine. The book presents novel statistical technology while keeping the mathematical level widely accessible. It is designed to appeal to students, to applied data analysts, and to experienced researchers …

  12. Dimensional Analysis

    Indian Academy of Sciences (India)

    Dimensional analysis is a useful tool which finds important applications in physics and engineering. It is most effective when there exists a maximal number of dimensionless quantities constructed out of the relevant physical variables. Though a complete theory of dimensional analysis was developed way back in 1914 in a …

  13. A sensory analysis of butter cookies: An application of generalized procrustes analysis

    DEFF Research Database (Denmark)

    Juhl, Hans Jørn

    1994-01-01

    Executive Summary: 1. A sensory analysis is one of the first steps in product development in the food industry. A thorough analysis of the results from such an analysis may give important input to the development process. 2. A sensory analysis of butter cookies was conducted to evaluate … whether some butter may be replaced by vegetable fat without a significant change in the sensory profile. The conclusion is that the replacement is possible without a considerable change in the sensory profile. 3. Generalized Procrustes Analysis is used to analyze the results. It is a relatively new technique …

  14. Contributions to sensitivity analysis and generalized discriminant analysis

    International Nuclear Information System (INIS)

    Jacques, J.

    2005-12-01

    Two topics are studied in this thesis: sensitivity analysis and generalized discriminant analysis. Global sensitivity analysis of a mathematical model studies how the model's output variables react to variations in its inputs. The methods based on the study of the variance quantify the part of the variance of the model response due to each input variable and each subset of input variables. The first subject of this thesis is the impact of model uncertainty on the results of a sensitivity analysis. Two particular forms of uncertainty are studied: that due to a change of the reference model, and that due to the use of a simplified model in place of the reference model. A second problem studied in this thesis is that of models with correlated inputs. Because classical sensitivity indices lose their interpretive meaning in the presence of correlated inputs, we propose a multidimensional approach consisting of expressing the sensitivity of the model output to groups of correlated variables. Applications in the field of nuclear engineering illustrate this work. Generalized discriminant analysis consists of classifying the individuals of a test sample into groups by using information contained in a training sample, when these two samples do not come from the same population. This work extends existing methods in a Gaussian context to the case of binary data. An application in public health illustrates the utility of the generalized discrimination models thus defined. (author)
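
    For reference, the variance-based indices mentioned above are, at first order, the standard ratios of conditional-expectation variance to total variance (stated here in general form, not as the thesis's own notation):

    ```latex
    % First-order variance-based sensitivity index of input X_i for a model Y = f(X_1,...,X_p):
    S_i = \frac{\operatorname{Var}\bigl(\mathbb{E}[Y \mid X_i]\bigr)}{\operatorname{Var}(Y)},
    \qquad \sum_i S_i \le 1
    % with equality when the model is additive in independent inputs.
    ```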

  15. Chemical analysis of carbonates and carbonate rocks by atomic absorption analysis

    Energy Technology Data Exchange (ETDEWEB)

    Tardon, S

    1981-01-01

    Methods of determining the chemical composition of rocks surrounding black coal seams are evaluated. Carbonate rock samples were collected in the Ostrava-Karvina coal mines, and the sampling methods are described. Determination of the following elements and compounds in carbonate rocks is discussed: calcium, magnesium, iron, manganese, barium, silicon, aluminium, titanium, sodium, potassium, sulfur trioxide, phosphorus pentoxide, water and carbon dioxide. The proportion of water-insoluble compounds in the investigated rocks is also determined. Most of the elements, including phosphorus, are determined by atomic absorption analysis; other compounds are determined gravimetrically. The described procedure permits the weight of a rock sample to be reduced to 0.5 g without reducing the accuracy of the analysis. The results of determining carbonate rock components by X-ray analysis and by chemical analysis are compared. The equipment used for atomic absorption analysis is characterized (the 503 Perkin-Elmer and the CF-4 Optica-Milano spectrophotometers). The analyzed method permits more accurate classification of rocks surrounding coal seams and of rock impurities in run-of-mine coal. (22 refs.) (In Czech)

  16. Practical data analysis

    CERN Document Server

    Cuesta, Hector

    2013-01-01

    Each chapter of the book quickly introduces a key 'theme' of Data Analysis, before immersing you in the practical aspects of each theme. You'll learn quickly how to perform all aspects of Data Analysis.Practical Data Analysis is a book ideal for home and small business users who want to slice & dice the data they have on hand with minimum hassle.

  17. Left ventricular wall motion abnormalities evaluated by factor analysis as compared with Fourier analysis

    International Nuclear Information System (INIS)

    Hirota, Kazuyoshi; Ikuno, Yoshiyasu; Nishikimi, Toshio

    1986-01-01

    Factor analysis was applied to multigated cardiac pool scintigraphy to evaluate its ability to detect left ventricular wall motion abnormalities in 35 patients with old myocardial infarction (MI) and in 12 control cases with normal left ventriculography. All cases were also evaluated by conventional Fourier analysis. In most cases with normal left ventriculography, the ventricular and atrial factors were extracted by factor analysis. In cases with MI, a third factor was obtained in the left ventricle corresponding to the wall motion abnormality. Each case was scored according to the agreement between the findings of ventriculography and those of factor analysis or Fourier analysis. Scores were recorded for three items: the existence, location, and degree of asynergy. In cases of MI, the detection rate of asynergy was 94% by factor analysis and 83% by Fourier analysis, and the agreement with respect to location was 71% and 66%, respectively. Factor analysis had higher scores than Fourier analysis, but the difference was not significant. The interobserver error of factor analysis was less than that of Fourier analysis. Factor analysis can display the locations and dynamic motion curves of asynergy, and it is regarded as a useful method for detecting and evaluating left ventricular wall motion abnormalities. (author)
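
    For context, the conventional Fourier analysis used as the comparator reduces each pixel's gated time-activity curve to the amplitude and phase of its first harmonic. A synthetic single-pixel curve illustrates the computation; the frame count and curve parameters are invented.

    ```python
    # First-harmonic analysis of one synthetic gated time-activity curve.
    import numpy as np

    frames = 16
    t = np.arange(frames)
    curve = 100 + 20 * np.cos(2 * np.pi * t / frames - 1.0)   # hypothetical pixel curve

    c1 = np.fft.fft(curve)[1]              # first-harmonic Fourier coefficient
    amplitude = 2 * np.abs(c1) / frames    # recovers the 20-count modulation depth
    phase = np.angle(c1)                   # recovers the -1.0 rad phase offset
    print(f"amplitude {amplitude:.1f}, phase {phase:.2f} rad")
    ```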

  18. Time-frequency analysis : mathematical analysis of the empirical mode decomposition.

    Science.gov (United States)

    2009-01-01

    Invented over 10 years ago, empirical mode decomposition (EMD) provides a nonlinear time-frequency analysis with the ability to successfully analyze nonstationary signals. Mathematical Analysis of the Empirical Mode Decomposition is a …
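
    The decomposition EMD produces can be stated compactly: the signal is rewritten as a finite sum of intrinsic mode functions (IMFs) obtained by sifting, plus a monotonic residue.

    ```latex
    x(t) = \sum_{k=1}^{n} c_k(t) + r_n(t)
    % c_k(t): intrinsic mode functions extracted by sifting; r_n(t): monotonic residue.
    ```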

  19. QUANTITATIVE ANALYSIS OF FLUX REGULATION THROUGH HIERARCHICAL REGULATION ANALYSIS

    NARCIS (Netherlands)

    van Eunen, Karen; Rossell, Sergio; Bouwman, Jildau; Westerhoff, Hans V.; Bakker, Barbara M.; Jameson, D; Verma, M; Westerhoff, HV

    2011-01-01

    Regulation analysis is a methodology that quantifies to what extent a change in the flux through a metabolic pathway is regulated by either gene expression or metabolism. Two extensions to regulation analysis were developed over the past years: (i) the regulation of Vmax can be dissected into the …

  20. Quantitative analysis of flux regulation through hierarchical regulation analysis

    NARCIS (Netherlands)

    Eunen, K. van; Rossell, S.; Bouwman, J.; Westerhoff, H.V.; Bakker, B.M.

    2011-01-01

    Regulation analysis is a methodology that quantifies to what extent a change in the flux through a metabolic pathway is regulated by either gene expression or metabolism. Two extensions to regulation analysis were developed over the past years: (i) the regulation of Vmax can be dissected into the
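
    This record and the preceding, near-identical one describe the same methodology, whose core identity is the summation law: the hierarchical contribution, expressed through Vmax, and the metabolic contribution to a change in flux J sum to one.

    ```latex
    \rho_h = \frac{\Delta \ln V_{\max}}{\Delta \ln J}, \qquad \rho_m = 1 - \rho_h
    % rho_h: hierarchical (gene-expression) regulation; rho_m: metabolic regulation.
    ```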

  1. Object-Oriented Analysis, Structured Analysis, and Jackson System Development

    NARCIS (Netherlands)

    Van Assche, F.; Wieringa, Roelf J.; Moulin, B.; Rolland, C

    1991-01-01

    Conceptual modeling is the activity of producing a conceptual model of an actual or desired version of a universe of discourse (UoD). In this paper, two methods of conceptual modeling are compared, structured analysis (SA) and object-oriented analysis (OOA). This is done by transforming a model

  2. Foundations of mathematical analysis

    CERN Document Server

    Johnsonbaugh, Richard

    2010-01-01

    This classroom-tested volume offers a definitive look at modern analysis, with views of applications to statistics, numerical analysis, Fourier series, differential equations, mathematical analysis, and functional analysis. Upper-level undergraduate students with a background in calculus will benefit from its teachings, along with beginning graduate students seeking a firm grounding in modern analysis. A self-contained text, it presents the necessary background on the limit concept, and the first seven chapters could constitute a one-semester introduction to limits. Subsequent chapters discuss

  3. Advanced complex analysis a comprehensive course in analysis, part 2b

    CERN Document Server

    Simon, Barry

    2015-01-01

    A Comprehensive Course in Analysis by Poincaré Prize winner Barry Simon is a five-volume set that can serve as a graduate-level analysis textbook with a lot of additional bonus information, including hundreds of problems and numerous notes that extend the text and provide important historical background. Depth and breadth of exposition make this set a valuable reference source for almost all areas of classical analysis. Part 2B provides a comprehensive look at a number of subjects of complex analysis not included in Part 2A. Presented in this volume are the theory of conformal metrics (includ

  4. Development of Performance Analysis Program for an Axial Compressor with Meanline Analysis

    International Nuclear Information System (INIS)

    Park, Jun Young; Park, Moo Ryong; Choi, Bum Suk; Song, Je Wook

    2009-01-01

    The axial-flow compressor is one of the most important parts of a gas turbine unit, together with the axial turbine and combustor. Precise prediction of performance is therefore very important for the development of a new compressor or the modification of an existing one. Meanline analysis is a simple, fast and powerful method for performance prediction of axial-flow compressors with different geometries, so it is frequently used in the preliminary design stage and for performance analysis of a given geometry. Many correlations for meanline analysis have been developed, theoretically and experimentally, for estimating various types of losses and the flow deviation angle. In the present study, a meanline analysis program was developed to estimate compressor losses, incidence angles, deviation angles, and stall and surge conditions using many correlations. Performance prediction of single-stage axial compressors is conducted with this program, and the comparison between experimental and numerical results shows good agreement. The program can be used for various types of single-stage axial-flow compressors with different geometries, as well as for multistage axial-flow compressors.
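
    One relation at the heart of any meanline pass, standard turbomachinery theory rather than anything specific to this program, is the Euler work equation evaluated at the mean radius:

    ```latex
    \Delta h_0 = U \left( c_{\theta 2} - c_{\theta 1} \right)
    % Delta h_0: stage stagnation-enthalpy rise; U: blade speed at the mean radius;
    % c_theta: tangential (swirl) velocity upstream (1) and downstream (2) of the rotor.
    ```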

  5. Nonactivation interaction analysis. Chapter 5

    International Nuclear Information System (INIS)

    1976-01-01

    The analyses described include alpha scattering analysis, beta absorption and scattering analysis, gamma- and X-ray absorption and scattering analysis, X-ray fluorescence analysis, neutron absorption and scattering analysis, applications of the Moessbauer effect, and an analysis based on the ionizing effects of radiation. (J.P.)

  6. Goal-oriented failure analysis - a systems analysis approach to hazard identification

    International Nuclear Information System (INIS)

    Reeves, A.B.; Davies, J.; Foster, J.; Wells, G.L.

    1990-01-01

    Goal-Oriented Failure Analysis (GOFA) is a methodology being developed to identify and analyse the potential failure modes of a hazardous plant or process. The technique adopts a structured top-down approach, with a particular failure goal being systematically analysed. A systems analysis approach is used, with the analysis organised around a systems diagram of the plant or process under study. GOFA also uses checklists to supplement the analysis; these checklists are prepared in advance of a group session and help to guide the analysis, avoiding unnecessary time being spent on identifying obvious failure modes or failing to identify certain hazards or failures. GOFA is being developed with the aim of providing a hazard identification methodology that is more efficient and stimulating than the conventional approach to HAZOP. The top-down approach should ensure that the analysis is more focused, and the use of a systems diagram helps to pull the analysis together at an early stage whilst also structuring the sessions in a more stimulating way than conventional techniques. GOFA is, essentially, an extension of the HAZOP methodology. It is currently being computerised using a knowledge-based systems approach, with the Goldworks II expert systems development tool. (author)

  7. Trace analysis

    International Nuclear Information System (INIS)

    Warner, M.

    1987-01-01

    What is the current state of quantitative trace analytical chemistry? What are today's research efforts? And what challenges does the future hold? These are some of the questions addressed at a recent four-day symposium sponsored by the National Bureau of Standards (NBS) entitled Accuracy in Trace Analysis - Accomplishments, Goals, Challenges. The two plenary sessions held on the first day of the symposium reviewed the history of quantitative trace analysis, discussed the present situation from academic and industrial perspectives, and summarized future needs. The remaining three days of the symposium consisted of parallel sessions dealing with the measurement process; quantitation in materials; environmental, clinical, and nutrient analysis; and advances in analytical techniques

  8. Analysis in usability evaluations

    DEFF Research Database (Denmark)

    Følstad, Asbjørn; Lai-Chong Law, Effie; Hornbæk, Kasper

    2010-01-01

    While the planning and implementation of usability evaluations are well described in the literature, the analysis of the evaluation data is not. We present interviews with 11 usability professionals on how they conduct analysis, describing the resources, collaboration, creation of recommendations …, and prioritization involved. The interviews indicate a lack of structure in the analysis process and suggest activities, such as generating recommendations, that are unsupported by existing methods. We discuss how to better support analysis, and propose four themes for future research on analysis in usability …

  9. What Is Public Agency Strategic Analysis (PASA and How Does It Differ from Public Policy Analysis and Firm Strategy Analysis?

    Directory of Open Access Journals (Sweden)

    Aidan R. Vining

    2016-12-01

    Public agency strategic analysis (PASA) is different from public policy analysis because public agency executives face numerous constraints that those performing "unconstrained" policy analysis do not. It is also different from private sector strategic analysis. But because of similar constraints and realities, some generic and private sector strategic analysis techniques can be useful to those carrying out PASA, if appropriately modified. Analysis of the external agency environment (external forces) and of internal value-creation processes ("value chains", "modular assembly" processes or "multi-sided intermediation platforms") are the most important components of PASA. Also, agency executives must focus on feasible alternatives. In sum, PASA must be practical. But public executives need to take public value, and specifically social efficiency, seriously when engaging in PASA. Unless they do so, their strategic analyses will not have normative legitimacy, because enhancing public value is not the same as what appears in some versions of public value or in agency "profit maximization". Although similarly constrained, normatively appropriate public agency strategic analysis is not "giving clients what they want" or "making the public sector business case". PASA must be both practical and principled.

  10. Foundations of factor analysis

    CERN Document Server

    Mulaik, Stanley A

    2009-01-01

    Contents: Introduction; Factor Analysis and Structural Theories; Brief History of Factor Analysis as a Linear Model; Example of Factor Analysis. Mathematical Foundations for Factor Analysis: Introduction; Scalar Algebra; Vectors; Matrix Algebra; Determinants; Treatment of Variables as Vectors; Maxima and Minima of Functions. Composite Variables and Linear Transformations: Introduction; Composite Variables; Unweighted Composite Variables; Differentially Weighted Composites; Matrix Equations; Multi…

  11. Is activation analysis still active?

    International Nuclear Information System (INIS)

    Chai Zhifang

    2001-01-01

    This paper reviews some aspects of neutron activation analysis (NAA), covering instrumental neutron activation analysis (INAA), the k0 method, prompt gamma-ray neutron activation analysis (PGNAA), radiochemical neutron activation analysis (RNAA) and molecular activation analysis (MAA). A comparison of neutron activation analysis with other analytical techniques is also made. (author)

  12. Semiotic Analysis.

    Science.gov (United States)

    Thiemann, Francis C.

    Semiotic analysis is a method of analyzing signs (e.g., words) to reduce non-numeric data to their component parts without losing essential meanings. Semiotics dates back to Aristotle's analysis of language; it was much advanced by nineteenth-century analyses of style and logic and by Whitehead and Russell's description in this century of the role…

  13. Multivariate analysis with LISREL

    CERN Document Server

    Jöreskog, Karl G; Y Wallentin, Fan

    2016-01-01

    This book traces the theory and methodology of multivariate statistical analysis and shows how it can be conducted in practice using the LISREL computer program. It presents not only the typical uses of LISREL, such as confirmatory factor analysis and structural equation models, but also several other multivariate analysis topics, including regression (univariate, multivariate, censored, logistic, and probit), generalized linear models, multilevel analysis, and principal component analysis. It provides numerous examples from several disciplines and discusses and interprets the results, illustrated with sections of output from the LISREL program, in the context of the example. The book is intended for master's and PhD students and researchers in the social, behavioral, economic and many other sciences who require a basic understanding of multivariate statistical theory and methods for their analysis of multivariate data. It can also be used as a textbook on various topics of multivariate statistical analysis.

  14. An Array of Qualitative Data Analysis Tools: A Call for Data Analysis Triangulation

    Science.gov (United States)

    Leech, Nancy L.; Onwuegbuzie, Anthony J.

    2007-01-01

    One of the most important steps in the qualitative research process is analysis of data. The purpose of this article is to provide elements for understanding multiple types of qualitative data analysis techniques available and the importance of utilizing more than one type of analysis, thus utilizing data analysis triangulation, in order to…

  15. Essential real analysis

    CERN Document Server

    Field, Michael

    2017-01-01

    This book provides a rigorous introduction to the techniques and results of real analysis, metric spaces and multivariate differentiation, suitable for undergraduate courses. Starting from the very foundations of analysis, it offers a complete first course in real analysis, including topics rarely found in such detail in an undergraduate textbook such as the construction of non-analytic smooth functions, applications of the Euler-Maclaurin formula to estimates, and fractal geometry.  Drawing on the author’s extensive teaching and research experience, the exposition is guided by carefully chosen examples and counter-examples, with the emphasis placed on the key ideas underlying the theory. Much of the content is informed by its applicability: Fourier analysis is developed to the point where it can be rigorously applied to partial differential equations or computation, and the theory of metric spaces includes applications to ordinary differential equations and fractals. Essential Real Analysis will appeal t...

  16. Robust multivariate analysis

    CERN Document Server

    J Olive, David

    2017-01-01

    This text presents methods that are robust to the assumption of a multivariate normal distribution or methods that are robust to certain types of outliers. Instead of using exact theory based on the multivariate normal distribution, the simpler and more applicable large sample theory is given. The text develops some of the first practical robust regression and robust multivariate location and dispersion estimators backed by theory. The robust techniques are illustrated for methods such as principal component analysis, canonical correlation analysis, and factor analysis. A simple way to bootstrap confidence regions is also provided. Much of the research on robust multivariate analysis in this book is being published for the first time. The text is suitable for a first course in Multivariate Statistical Analysis or a first course in Robust Statistics. This graduate text is also useful for people who are familiar with the traditional multivariate topics, but want to know more about handling data sets with...

  17. Data analysis workbench

    International Nuclear Information System (INIS)

    Goetz, A.; Gerring, M.; Svensson, O.; Brockhauser, S.

    2012-01-01

    Data Analysis Workbench (DAWB) is a new software tool being developed at the ESRF. Its goal is to provide a tool both for online data analysis, which can be used on the beamlines, and for offline data analysis, which users can use during experiments or take home. The tool includes support for data visualization and workflows. Workflows allow algorithms that exploit parallel architectures to be designed from existing high-level modules for data analysis in combination with data collection. The workbench uses Passerelle as the workflow engine and EDNA plug-ins for data analysis. Actors talking to Tango are used for sending commands to a limited set of hardware to start existing data collection algorithms. A Tango server allows workflows to be executed from existing applications. There are scripting interfaces to Python, Javascript and SPEC. Currently the workbench is being tested on a selected number of beamlines at the ESRF. (authors)
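
    As a rough illustration of the Tango scripting route mentioned above, the following Python sketch drives a Tango device through the standard PyTango bindings; the device name and the idea of a workflow-runner server are hypothetical, not DAWB's actual interface.

        import tango  # PyTango, the standard Python binding for Tango

        # hypothetical device name; every Tango device exposes State/Status
        proxy = tango.DeviceProxy("id11/dawb/workflow_runner")
        print(proxy.state())                    # current server state
        print(proxy.command_inout("Status"))    # invoke a standard command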

  18. Data-variant kernel analysis

    CERN Document Server

    Motai, Yuichi

    2015-01-01

    Describes and discusses the variants of kernel analysis methods for data types that have been intensely studied in recent years This book covers kernel analysis topics ranging from the fundamental theory of kernel functions to its applications. The book surveys the current status, popular trends, and developments in kernel analysis studies. The author discusses multiple kernel learning algorithms and how to choose the appropriate kernels during the learning phase. Data-Variant Kernel Analysis is a new pattern analysis framework for different types of data configurations. The chapters include

  19. Human reliability analysis

    International Nuclear Information System (INIS)

    Dougherty, E.M.; Fragola, J.R.

    1988-01-01

    The authors present a treatment of human reliability analysis incorporating an introduction to probabilistic risk assessment for nuclear power generating stations. They treat the subject according to the framework established for general systems theory, drawing upon reliability analysis, psychology, human factors engineering, and statistics, and integrating elements of these fields within a systems framework. The book provides a history of human reliability analysis and includes examples of the application of the systems approach.

  20. Emission spectrochemical analysis

    International Nuclear Information System (INIS)

    Rives, R.D.; Bruks, R.R.

    1983-01-01

    The emission spectrochemical method of analysis, based on the fact that atoms of elements can be excited in an electric arc or in a laser beam and will emit radiation with characteristic wavelengths, is considered. The review contains data on the spectrochemical analysis of liquids and geological materials and a scheme of a laser microprobe. The main characteristics of emission spectroscopy, atomic absorption spectroscopy and X-ray fluorescence analysis are generalized.

  1. Comparative risk analysis

    International Nuclear Information System (INIS)

    Niehaus, F.

    1988-01-01

    In this paper, the risks of various energy systems are discussed, considering severe accident analysis, particularly probabilistic safety analysis and probabilistic safety criteria, and the applications of this analysis and these criteria. Comparative risk analysis has demonstrated that the largest source of risk in every society is daily small accidents. Nevertheless, we have to be more concerned about severe accidents. The comparative risk analysis of five different energy systems (coal, oil, gas, LWR and STEC (solar)) for the public has shown that the main sources of risk are coal and oil. The latest comparative risk study of various energy systems, conducted in the USA, has revealed that the number of victims from coal is 42 times as many as the victims from nuclear. A study of severe accidents from hydro-dams in the United States has estimated the probability of dam failure at 1 in 10,000 years and the number of victims between 11,000 and 260,000. The average occupational risk from coal is one fatal accident per 1,000 workers/year. Probabilistic safety analysis is a method that can be used to assess nuclear energy risks, to analyze severe accidents, and to model all possible accident sequences and consequences. 'Fault tree' analysis is used to determine the probability of failure of the different systems at each point of the accident sequences and to calculate the probability of risks. After calculating the probability of failure, criteria for judging the numerical results have to be developed, that is, quantitative and qualitative goals. To achieve these goals, several systems have been devised by various member countries of the IAEA. The probabilistic safety analysis method has been developed by establishing a computer program permitting different categories of safety-related information to be known. 19 tabs. (author)

  2. Fault tree analysis

    International Nuclear Information System (INIS)

    1981-09-01

    Suggestions are made concerning the method of fault tree analysis and the use of certain symbols in the examination of system failures. The purpose of the fault tree analysis is to find logical connections of component or subsystem failures leading to undesirable occurrences. The results of these examinations are part of the system assessment concerning operation and safety. The objectives of the analysis are: systematic identification of all possible failure combinations (causes) leading to a specific undesirable occurrence, and determination of reliability parameters such as the frequency of failure combinations, the frequency of the undesirable occurrence, or the non-availability of the system when required. The fault tree analysis provides a clear and reconstructable documentation of the examination. (orig./HP)
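
    The AND/OR gate logic described above reduces to simple probability arithmetic when the basic events are independent. A minimal Python sketch with hypothetical event probabilities:

        # hypothetical basic-event probabilities for independent failures
        p_pump_fails = 0.01
        p_valve_fails = 0.02
        p_power_lost = 0.005

        def p_or(*ps):
            # OR gate: probability that at least one independent event occurs
            q = 1.0
            for p in ps:
                q *= 1.0 - p
            return 1.0 - q

        def p_and(*ps):
            # AND gate: probability that all independent events occur
            out = 1.0
            for p in ps:
                out *= p
            return out

        # top event: cooling is lost if power is lost OR both redundant
        # trains fail; a train fails if its pump OR its valve fails
        p_train = p_or(p_pump_fails, p_valve_fails)
        p_top = p_or(p_power_lost, p_and(p_train, p_train))
        print(f"top event probability: {p_top:.6f}")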

  3. Social Set Analysis

    DEFF Research Database (Denmark)

    Vatrapu, Ravi; Mukkamala, Raghava Rao; Hussain, Abid

    2016-01-01

    …automata and agent-based modeling). However, when it comes to organizational and societal units of analysis, there exists no approach to conceptualize, model, analyze, explain, and predict social media interactions as individuals' associations with ideas, values, identities, and so on. To address… conceptual and formal models of social data, and an analytical framework for combining big social data sets with organizational and societal data sets. Three empirical studies of big social data are presented to illustrate and demonstrate social set analysis in terms of fuzzy set-theoretical sentiment analysis, crisp set-theoretical interaction analysis, and event-studies-oriented set-theoretical visualizations. Implications for big data analytics, current limitations of the set-theoretical approach, and future directions are outlined…

  4. Hermeneutic phenomenological analysis: the 'possibility' beyond 'actuality' in thematic analysis.

    Science.gov (United States)

    Ho, Ken H M; Chiang, Vico C L; Leung, Doris

    2017-07-01

    This article discusses the ways researchers may become open to manifold interpretations of lived experience through thematic analysis that follows the tradition of hermeneutic phenomenology. Martin Heidegger's thinking about historical contexts of understandings and the notions of 'alētheia' and 'techne' disclose what he called the meaning of lived experience, as the 'unchanging Being of changing beings'. While these notions remain central to hermeneutic phenomenological research, novice phenomenologists usually face the problem of how to incorporate these philosophical tenets into thematic analysis. Discussion paper. This discussion paper is based on our experiences of hermeneutic analysis supported by the writings of Heidegger. Literature reviewed for this paper ranges from 1927 to 2014. We draw on data from a study of foreign domestic helpers in Hong Kong to demonstrate how 'dwelling' in the language of participants' 'ek-sistence' supported us in a process of thematic analysis. Data were collected from December 2013 to February 2016. Nurses doing hermeneutic phenomenology have to develop self-awareness of their own 'taken-for-granted' thinking to disclose the unspoken meanings hidden in the language of participants. Understanding the philosophical tenets of hermeneutic phenomenology allows nurses to preserve possibilities of interpretations in thinking. In so doing, methods of thematic analysis can uncover and present the structure of the meaning of lived experience. We provide our readers with vicarious experience of how to begin cultivating thinking that is aligned with hermeneutic phenomenological philosophical tenets to conduct thematic analysis. © 2017 John Wiley & Sons Ltd.

  5. Physicochemical properties of different corn varieties by principal components analysis and cluster analysis

    International Nuclear Information System (INIS)

    Zeng, J.; Li, G.; Sun, J.

    2013-01-01

    Principal components analysis and cluster analysis were used to investigate the properties of different corn varieties. The chemical compositions and some properties of corn flour processed by dry milling were determined. The results showed that the chemical compositions and physicochemical properties differed significantly among the twenty-six corn varieties. The quality of corn flour was captured by five principal components from the principal component analysis, with starch pasting properties making the most important contribution, accounting for 48.90%. The twenty-six corn varieties could be classified into four groups by cluster analysis. The consistency between the principal components analysis and the cluster analysis indicated that multivariate analyses are feasible in the study of corn variety properties. (author)
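
    A minimal sketch of the pipeline the abstract describes (standardize, extract five principal components, cluster the scores into four groups), using scikit-learn; the data matrix below is a random placeholder, not the corn measurements:

        import numpy as np
        from sklearn.preprocessing import StandardScaler
        from sklearn.decomposition import PCA
        from sklearn.cluster import KMeans

        rng = np.random.default_rng(0)
        X = rng.normal(size=(26, 8))   # placeholder: 26 varieties x 8 properties

        Xs = StandardScaler().fit_transform(X)          # standardize properties
        pca = PCA(n_components=5).fit(Xs)               # five principal components
        scores = pca.transform(Xs)
        print(pca.explained_variance_ratio_.cumsum())   # cumulative contribution

        groups = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(scores)
        print(groups)                                   # four groups of varieties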

  6. Compatibility analysis of DUPIC fuel(4) - thermal hydraulic analysis

    Energy Technology Data Exchange (ETDEWEB)

    Park, Jee Won; Chae, Kyung Myung; Choi, Hang Bok

    2000-07-01

    Thermal-hydraulic compatibility of the DUPIC fuel bundle in the CANDU reactor has been studied. The critical channel power, the critical power ratio, the channel exit quality and the channel flow are calculated for the DUPIC and the standard fuels by using the NUCIRC code. The physical models and associated parametric values for the NUCIRC analysis of the fuels are also presented. Based upon the slave channel analysis, the critical channel power and the critical power ratios have been found to be very similar for the two fuel types. The same dryout model is used in this study for the standard and the DUPIC fuel bundles. To assess the dryout characteristics of the DUPIC fuel bundle, the ASSERT-PV code has been used for the subchannel analysis. Based upon the results of the subchannel analysis, it is found that the dryout location and the power for the two fuel types are indeed very similar. This study shows that thermal performance of the DUPIC fuel is not significantly different from that of the standard fuel.

  7. The ATLAS Analysis Model

    CERN Multimedia

    Amir Farbin

The ATLAS Analysis Model is a continually developing vision of how to reconcile physics analysis requirements with the ATLAS offline software and computing model constraints. In the past year this vision has influenced the evolution of the ATLAS Event Data Model, the Athena software framework, and physics analysis tools. These developments, along with the October Analysis Model Workshop and the planning for CSC analyses, have led to a rapid refinement of the ATLAS Analysis Model in the past few months. This article introduces some of the relevant issues and presents the current vision of the future ATLAS Analysis Model. Event Data Model The ATLAS Event Data Model (EDM) consists of several levels of detail, each targeted at a specific set of tasks. For example the Event Summary Data (ESD) stores calorimeter cells and tracking system hits, thereby permitting many calibration and alignment tasks, but will only be accessible at particular computing sites, with potentially large latency. In contrast, the Analysis...

  8. Hazard Analysis Database Report

    Energy Technology Data Exchange (ETDEWEB)

    GAULT, G.W.

    1999-10-13

    The Hazard Analysis Database was developed in conjunction with the hazard analysis activities conducted in accordance with DOE-STD-3009-94, Preparation Guide for US Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports, for the Tank Waste Remediation System (TWRS) Final Safety Analysis Report (FSAR). The FSAR is part of the approved TWRS Authorization Basis (AB). This document describes, identifies, and defines the contents and structure of the TWRS FSAR Hazard Analysis Database and documents the configuration control changes made to the database. The TWRS Hazard Analysis Database contains the collection of information generated during the initial hazard evaluations and the subsequent hazard and accident analysis activities. The database supports the preparation of Chapters 3, 4, and 5 of the TWRS FSAR and the USQ process and consists of two major, interrelated data sets: (1) the Hazard Evaluation Database - data from the results of the hazard evaluations; and (2) the Hazard Topography Database - data from the system familiarization and hazard identification.

  9. A sensory analysis of butter cookies: An application of generalized procrustes analysis

    OpenAIRE

    Juhl, Hans Jørn

    1994-01-01

    Executive Summary: 1. A sensory analysis is one of the first steps in product development in the food industry. A thorough analysis of the results from such an analysis may give important input to the development process. 2. A sensory analysis on butter cookies is conducted in order to evaluate if some butter may be replaced by vegetable fat without a significant change in the sensory profile. The conclusion is that the replacement is possible without a considerable change in the sensory prof...

  10. Data analysis and approximate models model choice, location-scale, analysis of variance, nonparametric regression and image analysis

    CERN Document Server

    Davies, Patrick Laurie

    2014-01-01

    Introduction; Approximate Models; Notation; Two Modes of Statistical Analysis; Towards One Mode of Analysis; Approximation, Randomness, Chaos, Determinism; A Concept of Approximation; Approximating a Data Set by a Model; Approximation Regions; Functionals and Equivariance; Regularization and Optimality; Metrics and Discrepancies; Strong and Weak Topologies; On Being (almost) Honest; Simulations and Tables; Degree of Approximation and p-values; Scales; Stability of Analysis; The Choice of En(α, P); Independence; Procedures, Approximation and Vagueness; Discrete Models; The Empirical Density; Metrics and Discrepancies; The Total Variation Metric; The Kullback-Leibler and Chi-Squared Discrepancies; The Po(λ) Model; The b(k, p) and nb(k, p) Models; The Flying Bomb Data; The Student Study Times Data; Outliers; Outliers, Data Analysis and Models; Breakdown Points and Equivariance; Identifying Outliers and Breakdown; Outliers in Multivariate Data; Outliers in Linear Regression; Outliers in Structured Data; The Location...

  11. Frontier Assignment for Sensitivity Analysis of Data Envelopment Analysis

    Science.gov (United States)

    Naito, Akio; Aoki, Shingo; Tsuji, Hiroshi

    To extend the sensitivity analysis capability of DEA (Data Envelopment Analysis), this paper proposes frontier assignment based DEA (FA-DEA). The basic idea of FA-DEA is to allow a decision maker to decide the frontier intentionally, while the traditional DEA and Super-DEA decide the frontier computationally. The features of FA-DEA are as follows: (1) it provides chances to exclude extra-influential DMUs (Decision Making Units) and finds extra-ordinal DMUs, and (2) it includes the functionality of the traditional DEA and Super-DEA so that it is able to deal with sensitivity analysis more flexibly. A simple numerical study has shown the effectiveness of the proposed FA-DEA and the difference from the traditional DEA.
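
    For readers unfamiliar with DEA, the frontier is found by solving one linear program per DMU. The sketch below is a minimal input-oriented CCR model with scipy; FA-DEA's frontier assignment can be imitated by restricting the columns of X and Y to an intentionally chosen reference set. This is an illustrative reconstruction, not the paper's code:

        import numpy as np
        from scipy.optimize import linprog

        def ccr_efficiency(X, Y, o):
            """Input-oriented CCR efficiency of DMU o.
            X: (m, n) input matrix, Y: (s, n) output matrix for n DMUs."""
            m, n = X.shape
            s = Y.shape[0]
            c = np.zeros(n + 1)
            c[0] = 1.0                                  # minimize theta
            A_in = np.hstack([-X[:, [o]], X])           # X @ lam <= theta * x_o
            A_out = np.hstack([np.zeros((s, 1)), -Y])   # Y @ lam >= y_o
            A_ub = np.vstack([A_in, A_out])
            b_ub = np.concatenate([np.zeros(m), -Y[:, o]])
            res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                          bounds=[(0, None)] * (n + 1), method="highs")
            return res.fun                              # efficiency in (0, 1]

        X = np.array([[2.0, 3.0, 6.0], [4.0, 2.0, 7.0]])   # 2 inputs, 3 DMUs
        Y = np.array([[1.0, 1.0, 1.0]])                    # 1 output
        print([round(ccr_efficiency(X, Y, o), 3) for o in range(3)])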

  12. Overview of methods for uncertainty analysis and sensitivity analysis in probabilistic risk assessment

    International Nuclear Information System (INIS)

    Iman, R.L.; Helton, J.C.

    1985-01-01

    Probabilistic Risk Assessment (PRA) is playing an increasingly important role in the nuclear reactor regulatory process. The assessment of uncertainties associated with PRA results is widely recognized as an important part of the analysis process. One of the major criticisms of the Reactor Safety Study was that its representation of uncertainty was inadequate. The desire for the capability to treat uncertainties with the MELCOR risk code being developed at Sandia National Laboratories is indicative of the current interest in this topic. However, as yet, uncertainty analysis and sensitivity analysis in the context of PRA is a relatively immature field. In this paper, available methods for uncertainty analysis and sensitivity analysis in a PRA are reviewed. This review first treats methods for use with individual components of a PRA and then considers how these methods could be combined in the performance of a complete PRA. In the context of this paper, the goal of uncertainty analysis is to measure the imprecision in PRA outcomes of interest, and the goal of sensitivity analysis is to identify the major contributors to this imprecision. There are a number of areas that must be considered in uncertainty analysis and sensitivity analysis for a PRA: (1) information, (2) systems analysis, (3) thermal-hydraulic phenomena/fission product behavior, (4) health and economic consequences, and (5) display of results. Each of these areas and the synthesis of them into a complete PRA are discussed
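
    In the sense used here, uncertainty analysis propagates input imprecision to the outcome of interest, and sensitivity analysis apportions it among the inputs. A minimal Monte Carlo sketch with hypothetical failure-rate distributions:

        import numpy as np

        rng = np.random.default_rng(1)
        n = 10_000

        # hypothetical uncertain inputs: lognormal component failure rates
        lam_pump = rng.lognormal(np.log(1e-3), 0.5, n)    # per hour
        lam_valve = rng.lognormal(np.log(5e-4), 0.8, n)   # per hour
        t = 24.0                                          # mission time, h

        # outcome: unavailability of a series system of the two components
        q = 1.0 - np.exp(-(lam_pump + lam_valve) * t)

        lo, hi = np.percentile(q, [5, 95])
        print(f"uncertainty: mean {q.mean():.3e}, 5th-95th {lo:.3e} to {hi:.3e}")

        # sensitivity: correlation of each input with the outcome
        for name, x in (("pump", lam_pump), ("valve", lam_valve)):
            print(name, round(float(np.corrcoef(x, q)[0, 1]), 2))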

  13. Meta-Analysis for Primary and Secondary Data Analysis: The Super-Experiment Metaphor.

    Science.gov (United States)

    Jackson, Sally

    1991-01-01

    Considers the relation between meta-analysis statistics and analysis of variance statistics. Discusses advantages and disadvantages as a primary data analysis tool. Argues that the two approaches are partial paraphrases of one another. Advocates an integrative approach that introduces the best of meta-analytic thinking into primary analysis…

  14. Canonical Information Analysis

    DEFF Research Database (Denmark)

    Vestergaard, Jacob Schack; Nielsen, Allan Aasbjerg

    2015-01-01

    Canonical correlation analysis is an established multivariate statistical method in which correlation between linear combinations of multivariate sets of variables is maximized. In canonical information analysis, introduced here, linear correlation as a measure of association between variables is replaced by the information theoretical, entropy based measure mutual information, which is a much more general measure of association. We make canonical information analysis feasible for large sample problems, including for example multispectral images, due to the use of a fast kernel density estimator for entropy estimation. Canonical information analysis is applied successfully to (1) simple simulated data to illustrate the basic idea and evaluate performance, (2) fusion of weather radar and optical geostationary satellite data in a situation with heavy precipitation, and (3) change detection in optical...
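
    The substitution of mutual information for linear correlation is easy to demonstrate: for a nonlinear association, correlation can be near zero while mutual information is clearly positive. A simple histogram-based sketch (an illustration only, not the paper's fast kernel density estimator):

        import numpy as np

        def mutual_information(x, y, bins=32):
            """Histogram estimate of mutual information I(X;Y) in nats."""
            pxy, _, _ = np.histogram2d(x, y, bins=bins)
            pxy = pxy / pxy.sum()
            px = pxy.sum(axis=1, keepdims=True)
            py = pxy.sum(axis=0, keepdims=True)
            nz = pxy > 0
            return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

        rng = np.random.default_rng(0)
        x = rng.normal(size=5000)
        y = x**2 + 0.1 * rng.normal(size=5000)   # nonlinear association
        print(np.corrcoef(x, y)[0, 1])           # near zero: correlation misses it
        print(mutual_information(x, y))          # clearly positive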

  15. Biorefinery Sustainability Analysis

    DEFF Research Database (Denmark)

    J. S. M. Silva, Carla; Prunescu, Remus Mihail; Gernaey, Krist

    2017-01-01

    This chapter deals with sustainability analysis of biorefinery systems in terms of environmental and socio-economic indicators. Life cycle analysis has methodological issues related to the functional unit (FU), allocation, land use and biogenic carbon neutrality of the reference system and of the biorefinery-based system. Socio-economic criteria and indicators used in sustainability assessment frameworks are presented and discussed. There is not one single methodology that can aptly cover the synergies of environmental, economic, social and governance issues required to assess the sustainable...

  16. Application of optical deformation analysis system on wedge splitting test and its inverse analysis

    DEFF Research Database (Denmark)

    Skocek, Jan; Stang, Henrik

    2010-01-01

    … Results of the inverse analysis are compared with traditional inverse analysis based on clip gauge data. Then the optically measured crack profile and crack tip position are compared with predictions made by the non-linear hinge model and a finite element analysis. It is shown that the inverse analysis based on the optically measured data can provide material parameters of the fictitious crack model matching favorably those obtained by classical inverse analysis based on the clip gauge data. Further advantages of the optical deformation analysis lie in the identification of such effects...

  17. Handbook of Applied Analysis

    CERN Document Server

    Papageorgiou, Nikolaos S

    2009-01-01

    Offers an examination of important theoretical methods and procedures in applied analysis. This book details the important theoretical trends in nonlinear analysis and applications to different fields. It is suitable for those working on nonlinear analysis.

  18. How Content Analysis may Complement and Extend the Insights of Discourse Analysis

    Directory of Open Access Journals (Sweden)

    Tracey Feltham-King

    2016-02-01

    Although discourse analysis is a well-established qualitative research methodology, little attention has been paid to how discourse analysis may be enhanced through careful supplementation with the quantification allowed in content analysis. In this article, we report on a research study that involved the use of both Foucauldian discourse analysis (FDA) and directed content analysis based on social constructionist theory and our qualitative research findings. The research focused on the discourses deployed, and the ways in which women were discursively positioned, in relation to abortion in 300 newspaper articles, published in 25 national and regional South African newspapers over 28 years, from 1978 to 2005. While the FDA was able to illuminate the constitutive network of power relations constructing women as subjects of a particular kind, questions emerged that were beyond the scope of the FDA. These questions concerned understanding the relative weightings of various discourses and tracing historical changes in the deployment of these discourses. In this article, we show how the decision to combine FDA and content analysis affected our sampling methodology. Using specific examples, we illustrate the contribution of the FDA to the study. Then, we indicate how subject positioning formed the link between the FDA and the content analysis. Drawing on the same examples, we demonstrate how the content analysis supplemented the FDA through tracking changes over time and providing empirical evidence of the extent to which subject positionings were deployed.

  19. Abstract interfaces for data analysis - component architecture for data analysis tools

    International Nuclear Information System (INIS)

    Barrand, G.; Binko, P.; Doenszelmann, M.; Pfeiffer, A.; Johnson, A.

    2001-01-01

    The fast turnover of software technologies, in particular in the domain of interactivity (covering user interfaces and visualisation), makes it difficult for a small group of people to produce complete and polished software tools before the underlying technologies make them obsolete. At the HepVis'99 workshop, a working group was formed to improve the production of software tools for data analysis in HENP. Besides promoting a distributed development organisation, one goal of the group is to systematically design a set of abstract interfaces using modern OO analysis and OO design techniques. An initial domain analysis has come up with several categories (components) found in typical data analysis tools: Histograms, Ntuples, Functions, Vectors, Fitter, Plotter, Analyzer and Controller. Special emphasis was put on reducing the couplings between the categories to a minimum, thus optimising the re-use and maintainability of each component individually. The interfaces have been defined in Java and C++ and implementations exist in the form of libraries and tools using C++ (Anaphe/Lizard, OpenScientist) and Java (Java Analysis Studio). A special implementation aims at accessing the Java libraries (through their Abstract Interfaces) from C++. The authors give an overview of the architecture and design of the various components for data analysis as discussed in AIDA.

  20. LULU analysis program

    International Nuclear Information System (INIS)

    Crawford, H.J.; Lindstrom, P.J.

    1983-06-01

    Our analysis program LULU has proven very useful in all stages of experiment analysis, from prerun detector debugging through final data reduction. It has solved our problem of having arbitrary word length events and is easy enough to use that many separate experimenters are now analyzing with LULU. The ability to use the same software for all stages of experiment analysis greatly eases the programming burden. We may even get around to making the graphics elegant someday

  1. Stochastic Analysis 2010

    CERN Document Server

    Crisan, Dan

    2011-01-01

    "Stochastic Analysis" aims to provide mathematical tools to describe and model high dimensional random systems. Such tools arise in the study of Stochastic Differential Equations and Stochastic Partial Differential Equations, Infinite Dimensional Stochastic Geometry, Random Media and Interacting Particle Systems, Super-processes, Stochastic Filtering, Mathematical Finance, etc. Stochastic Analysis has emerged as a core area of late 20th century Mathematics and is currently undergoing a rapid scientific development. The special volume "Stochastic Analysis 2010" provides a sa

  2. Ecosystem Analysis Program

    International Nuclear Information System (INIS)

    Burgess, R.L.

    1978-01-01

    Progress is reported on the following research programs: analysis and modeling of ecosystems; EDFB/IBP data center; biome analysis studies; land/water interaction studies; and computer programs for development of models

  3. Trend analysis

    International Nuclear Information System (INIS)

    Smith, M.; Jones, D.R.

    1991-01-01

    The goal of exploration is to find reserves that will earn an adequate rate of return on the capital invested. Neither exploration nor economics is an exact science. The authors must therefore explore in those trends (plays) that have the highest probability of achieving this goal. Trend analysis is a technique for organizing the available data to make these strategic exploration decisions objectively, in conformance with one's goals and risk attitudes. Trend analysis differs from resource estimation in its purpose: it seeks to determine the probability of economic success for an exploration program, not the ultimate results of the total industry effort. Thus the recent past is assumed to be the best estimate of the exploration probabilities for the near future. This information is combined with economic forecasts. The computer software tools necessary for trend analysis are: (1) an information database - requirements and sources; (2) a data conditioning program - assignment to trends, correction of errors, and conversion into usable form; (3) a statistical processing program - calculation of the probability of success and the discovery size probability distribution; (4) analytical processing - Monte Carlo simulation to develop the probability distribution of the economic return/investment ratio for a trend. Limited capital (short-run) effects are analyzed using the Gambler's Ruin concept in the Monte Carlo simulation and by a short-cut method. Multiple trend analysis is concerned with comparing and ranking trends, allocating funds among acceptable trends, and characterizing program risk by using risk profiles. In summary, trend analysis is a reality check for long-range exploration planning.
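
    The Gambler's Ruin effect mentioned above - a program with positive expected value per well can still go broke on a run of dry holes - is straightforward to estimate by Monte Carlo. A sketch with hypothetical economics (units are arbitrary, e.g. millions of dollars):

        import numpy as np

        def p_ruin(p_success, capital, well_cost, payoff,
                   n_sim=20_000, max_wells=200):
            """Probability the program cannot fund its next well before
            drilling max_wells wells (all economics hypothetical)."""
            rng = np.random.default_rng(0)
            ruined = 0
            for _ in range(n_sim):
                cash = capital
                for _ in range(max_wells):
                    cash -= well_cost
                    if rng.random() < p_success:
                        cash += payoff
                    if cash < well_cost:      # ruin: next well unaffordable
                        ruined += 1
                        break
            return ruined / n_sim

        # 20% success rate, positive expected value per well (0.2*40 - 5 = 3)
        print(p_ruin(p_success=0.2, capital=30.0, well_cost=5.0, payoff=40.0))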

  4. Radiation analysis devices, radiation analysis methods, and articles of manufacture

    Science.gov (United States)

    Roybal, Lyle Gene

    2010-06-08

    Radiation analysis devices include circuitry configured to determine respective radiation count data for a plurality of sections of an area of interest and combine the radiation count data of individual of sections to determine whether a selected radioactive material is present in the area of interest. An amount of the radiation count data for an individual section is insufficient to determine whether the selected radioactive material is present in the individual section. An article of manufacture includes media comprising programming configured to cause processing circuitry to perform processing comprising determining one or more correction factors based on a calibration of a radiation analysis device, measuring radiation received by the radiation analysis device using the one or more correction factors, and presenting information relating to an amount of radiation measured by the radiation analysis device having one of a plurality of specified radiation energy levels of a range of interest.
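
    The core claim - counts that are insufficient per section can still establish detection when sections are combined - can be illustrated with a Currie-style decision level (L_c = 2.33*sqrt(B) for a paired background measurement). All numbers below are hypothetical:

        import math

        def decision_level(bkg_counts):
            # Currie decision level for a paired background measurement
            return 2.33 * math.sqrt(bkg_counts)

        net = [12, 15, 9, 14]        # net counts per section (hypothetical)
        bkg = [100, 100, 100, 100]   # background counts per section

        # per-section tests: every section is below its decision level
        print([n > decision_level(b) for n, b in zip(net, bkg)])
        # combined test: the summed counts exceed the combined level
        print(sum(net) > decision_level(sum(bkg)))   # True -> detected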

  5. Incorporation of advanced accident analysis methodology into safety analysis reports

    International Nuclear Information System (INIS)

    2003-05-01

    The IAEA Safety Guide on Safety Assessment and Verification states that the aim of safety analysis should be, by means of appropriate analytical tools, to establish and confirm the design basis for the items important to safety, and to ensure that the overall plant design is capable of meeting the prescribed and acceptable limits for radiation doses and releases for each plant condition category. Practical guidance on how to perform accident analyses of nuclear power plants (NPPs) is provided by the IAEA Safety Report on Accident Analysis for Nuclear Power Plants. The safety analyses are performed in the form of both deterministic and probabilistic analyses for NPPs. It is customary to refer to deterministic safety analyses as accident analyses. This report discusses the aspects of using advanced accident analysis methods to carry out accident analyses for introduction into Safety Analysis Reports (SARs). In relation to the SAR, the purposes of deterministic safety analysis can be further specified as: (1) to demonstrate compliance with specific regulatory acceptance criteria; (2) to complement other analyses and evaluations in defining a complete set of design and operating requirements; (3) to identify and quantify limiting safety system set points and limiting conditions for operation to be used in the NPP limits and conditions; (4) to justify the appropriateness of the technical solutions employed in the fulfillment of predetermined safety requirements. The essential parts of accident analyses are performed by applying sophisticated computer code packages which have been specifically developed for this purpose. These code packages mainly include thermal-hydraulic system codes and reactor dynamics codes meant for transient and accident analyses. There are also specific codes, such as those for containment thermal-hydraulics, for radiological consequences and for severe accident analyses. In some cases, codes of a more general nature such

  6. Professionalizing Intelligence Analysis

    Directory of Open Access Journals (Sweden)

    James B. Bruce

    2015-09-01

    This article examines the current state of professionalism in national security intelligence analysis in the U.S. Government. Since the introduction of major intelligence reforms directed by the Intelligence Reform and Terrorism Prevention Act (IRTPA) in December 2004, we have seen notable strides in many aspects of intelligence professionalization, including in analysis. But progress is halting, uneven, and by no means permanent. To consolidate its gains, and if it is to continue improving, the U.S. intelligence community (IC) should commit itself to accomplishing a new program of further professionalization of analysis to ensure that it will develop an analytic cadre that is fully prepared to deal with the complexities of an emerging multipolar and highly dynamic world that the IC itself is forecasting. Some recent reforms in intelligence analysis can be assessed against established standards of more fully developed professions; these may well fall short of moving the IC closer to the more fully professionalized analytical capability required for producing the kind of analysis needed now by the United States.

  7. Dynamic Chest Image Analysis: Model-Based Perfusion Analysis in Dynamic Pulmonary Imaging

    Directory of Open Access Journals (Sweden)

    Kiuru Aaro

    2003-01-01

    The "Dynamic Chest Image Analysis" project aims to develop model-based computer analysis and visualization methods for showing focal and general abnormalities of lung ventilation and perfusion, based on a sequence of digital chest fluoroscopy frames collected with the dynamic pulmonary imaging technique. We have proposed and evaluated a multiresolutional method with an explicit ventilation model for ventilation analysis. This paper presents a new model-based method for pulmonary perfusion analysis. According to perfusion properties, we first devise a novel mathematical function to form a perfusion model. A simple yet accurate approach is further introduced to extract cardiac systolic and diastolic phases from the heart, so that this cardiac information may be utilized to accelerate the perfusion analysis and improve its sensitivity in detecting pulmonary perfusion abnormalities. This makes perfusion analysis not only fast but also robust in computation; consequently, perfusion analysis becomes computationally feasible without using contrast media. Our clinical case studies with 52 patients show that this technique is effective for pulmonary embolism even without using contrast media, demonstrating consistent correlations with computed tomography (CT) and nuclear medicine (NM) studies. This fluoroscopic examination takes only about 2 seconds for a perfusion study, with only a low radiation dose to the patient, involving no preparation, no radioactive isotopes, and no contrast media.

  8. Electronic Circuit Analysis Language (ECAL)

    Science.gov (United States)

    Chenghang, C.

    1983-03-01

    The computer-aided design technique is an important development in computer applications and an important component of computer science. A special language for electronic circuit analysis is the foundation of computer-aided design or computer-aided circuit analysis (abbreviated as CACD and CACA) of simulated circuits. Electronic circuit analysis language (ECAL) is a comparatively simple and easy-to-use circuit analysis special language which uses the FORTRAN language to carry out the explanatory executions. It is capable of conducting dc analysis, ac analysis, and transient analysis of a circuit. Furthermore, the results of the dc analysis can be used directly as the initial conditions for the ac and transient analyses.
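
    The dc analysis such a language performs boils down to nodal analysis: solving G·v = i for the node voltages. A tiny sketch in Python rather than FORTRAN, with hypothetical component values:

        import numpy as np

        # DC nodal analysis of a 2-node resistive circuit (hypothetical values):
        # 10 V source behind R1=1k into node 1, R2=2k from node 1 to node 2,
        # R3=1k from node 2 to ground. Solve G @ v = i for node voltages.
        G1, G2, G3 = 1/1e3, 1/2e3, 1/1e3
        G = np.array([[G1 + G2, -G2],
                      [-G2,      G2 + G3]])
        i = np.array([10.0 * G1, 0.0])     # Norton equivalent of the source
        v = np.linalg.solve(G, i)
        print(v)                           # node voltages: [7.5, 2.5] volts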

  9. NeoAnalysis: a Python-based toolbox for quick electrophysiological data processing and analysis.

    Science.gov (United States)

    Zhang, Bo; Dai, Ji; Zhang, Tao

    2017-11-13

    In a typical electrophysiological experiment, especially one that includes studying animal behavior, the data collected normally contain spikes, local field potentials, behavioral responses and other associated data. In order to obtain informative results, the data must be analyzed simultaneously with the experimental settings. However, most open-source toolboxes currently available for data analysis were developed to handle only a portion of the data and did not take into account the sorting of experimental conditions. Additionally, these toolboxes require that the input data be in a specific format, which can be inconvenient to users. Therefore, the development of a highly integrated toolbox that can process multiple types of data regardless of input data format and perform basic analysis for general electrophysiological experiments is incredibly useful. Here, we report the development of a Python-based open-source toolbox, referred to as NeoAnalysis, to be used for quick electrophysiological data processing and analysis. The toolbox can import data from different data acquisition systems regardless of their formats and automatically combine different types of data into a single file with a standardized format. In cases where additional spike sorting is needed, NeoAnalysis provides a module to perform efficient offline sorting with a user-friendly interface. Then, NeoAnalysis can perform regular analog signal processing, spike train and local field potential analysis, behavioral response (e.g. saccade) detection and extraction, with several options available for data plotting and statistics. In particular, it can automatically generate sorted results without requiring users to manually sort data beforehand. In addition, NeoAnalysis can organize all of the relevant data into an informative table on a trial-by-trial basis for data visualization. Finally, NeoAnalysis supports analysis at the population level. With the multitude of general-purpose functions provided
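
    As a generic illustration of the kind of trial-sorted spike-train analysis such a toolbox automates (this is not NeoAnalysis's actual API), a peri-stimulus time histogram in plain numpy:

        import numpy as np

        def psth(spike_times, trial_starts, window=(-0.2, 0.8), bin_w=0.02):
            """Peri-stimulus time histogram: trial-averaged firing rate.
            spike_times and trial_starts are 1-D arrays of times in seconds."""
            edges = np.arange(window[0], window[1] + bin_w, bin_w)
            counts = np.zeros(edges.size - 1)
            for t0 in trial_starts:
                rel = spike_times - t0          # spike times relative to event
                counts += np.histogram(rel, bins=edges)[0]
            rate = counts / (len(trial_starts) * bin_w)   # spikes per second
            return edges[:-1], rate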

  10. Bearing defect signature analysis using advanced nonlinear signal analysis in a controlled environment

    Science.gov (United States)

    Zoladz, T.; Earhart, E.; Fiorucci, T.

    1995-01-01

    Utilizing high-frequency data from a highly instrumented rotor assembly, seeded bearing defect signatures are characterized using both conventional linear approaches, such as power spectral density analysis, and recently developed nonlinear techniques such as bicoherence analysis. Traditional low-frequency (less than 20 kHz) analysis and high-frequency envelope analysis of both accelerometer and acoustic emission data are used to recover characteristic bearing distress information buried deeply in acquired data. The successful coupling of newly developed nonlinear signal analysis with recovered wideband envelope data from accelerometers and acoustic emission sensors is the innovative focus of this research.
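
    High-frequency envelope analysis recovers the low-frequency defect repetition rate that amplitude-modulates a structural resonance. A sketch on a synthetic signal (all parameters hypothetical, not the rotor-assembly data):

        import numpy as np
        from scipy.signal import hilbert

        fs = 50_000                          # sample rate, Hz (hypothetical)
        t = np.arange(0, 1.0, 1/fs)
        # simulate a defect: 3 kHz resonance amplitude-modulated at a
        # 120 Hz defect repetition rate, buried in noise
        x = (1 + 0.8*np.sin(2*np.pi*120*t)) * np.sin(2*np.pi*3000*t)
        x += 0.5 * np.random.default_rng(0).normal(size=t.size)

        envelope = np.abs(hilbert(x))        # demodulate the carrier
        spec = np.abs(np.fft.rfft(envelope - envelope.mean()))
        freqs = np.fft.rfftfreq(envelope.size, 1/fs)
        print(freqs[np.argmax(spec)])        # ~120 Hz: the defect signature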

  11. New trends in applied harmonic analysis sparse representations, compressed sensing, and multifractal analysis

    CERN Document Server

    Cabrelli, Carlos; Jaffard, Stephane; Molter, Ursula

    2016-01-01

    This volume is a selection of written notes corresponding to courses taught at the CIMPA School: "New Trends in Applied Harmonic Analysis: Sparse Representations, Compressed Sensing and Multifractal Analysis". New interactions between harmonic analysis and signal and image processing have seen striking development in the last 10 years, and several technological deadlocks have been solved through the resolution of deep theoretical problems in harmonic analysis. New Trends in Applied Harmonic Analysis focuses on two particularly active areas that are representative of such advances: multifractal analysis, and sparse representation and compressed sensing. The contributions are written by leaders in these areas, and covers both theoretical aspects and applications. This work should prove useful not only to PhD students and postdocs in mathematics and signal and image processing, but also to researchers working in related topics.

  12. Introductory numerical analysis

    CERN Document Server

    Pettofrezzo, Anthony J

    2006-01-01

    Written for undergraduates who require a familiarity with the principles behind numerical analysis, this classical treatment encompasses finite differences, least squares theory, and harmonic analysis. Over 70 examples and 280 exercises. 1967 edition.

  13. Regression analysis by example

    CERN Document Server

    Chatterjee, Samprit

    2012-01-01

    Praise for the Fourth Edition: "This book is . . . an excellent source of examples for regression analysis. It has been and still is readily readable and understandable." - Journal of the American Statistical Association. Regression analysis is a conceptually simple method for investigating relationships among variables. Carrying out a successful application of regression analysis, however, requires a balance of theoretical results, empirical rules, and subjective judgment. Regression Analysis by Example, Fifth Edition has been expanded

  14. A novel bi-level meta-analysis approach: applied to biological pathway analysis.

    Science.gov (United States)

    Nguyen, Tin; Tagett, Rebecca; Donato, Michele; Mitrea, Cristina; Draghici, Sorin

    2016-02-01

    The accumulation of high-throughput data in public repositories creates a pressing need for integrative analysis of multiple datasets from independent experiments. However, study heterogeneity, study bias, outliers and the lack of power of available methods present a real challenge in integrating genomic data. One practical drawback of many P-value-based meta-analysis methods, including Fisher's, Stouffer's, minP and maxP, is that they are sensitive to outliers. Another drawback is that, because they perform just one statistical test for each individual experiment, they may not fully exploit the potentially large number of samples within each study. We propose a novel bi-level meta-analysis approach that employs the additive method and the Central Limit Theorem within each individual experiment and also across multiple experiments. We prove that the bi-level framework is robust against bias, less sensitive to outliers than other methods, and more sensitive to small changes in signal. For comparative analysis, we demonstrate that the intra-experiment analysis has more power than the equivalent statistical test performed on a single large experiment. For pathway analysis, we compare the proposed framework versus classical meta-analysis approaches (Fisher's, Stouffer's and the additive method) as well as against a dedicated pathway meta-analysis package (MetaPath), using 1252 samples from 21 datasets related to three human diseases: acute myeloid leukemia (9 datasets), type II diabetes (5 datasets) and Alzheimer's disease (7 datasets). Our framework outperforms its competitors in correctly identifying pathways relevant to the phenotypes. The framework is sufficiently general to be applied to any type of statistical meta-analysis. The R scripts are available on demand from the authors. sorin@wayne.edu Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e
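
    The outlier sensitivity of the classical P-value combiners is easy to reproduce. A sketch of Stouffer's and Fisher's methods with hypothetical per-study p-values; note how a single extreme study dominates both, the drawback the bi-level approach is designed to mitigate:

        import numpy as np
        from scipy import stats

        def stouffer(pvals, weights=None):
            """Stouffer's Z-method: combine one-sided p-values across studies."""
            p = np.asarray(pvals, dtype=float)
            w = np.ones_like(p) if weights is None else np.asarray(weights, float)
            z = stats.norm.isf(p)                     # per-study z-scores
            z_comb = (w * z).sum() / np.sqrt((w**2).sum())
            return stats.norm.sf(z_comb)              # combined p-value

        def fisher(pvals):
            """Fisher's method: -2*sum(log p) ~ chi-square with 2k dof."""
            p = np.asarray(pvals, dtype=float)
            return stats.chi2.sf(-2.0 * np.log(p).sum(), df=2 * p.size)

        pvals = [0.04, 0.10, 0.03, 0.60]              # hypothetical studies
        print(stouffer(pvals), fisher(pvals))
        p_out = [0.04, 0.10, 0.03, 1e-12]             # one extreme outlier
        print(stouffer(p_out), fisher(p_out))         # outlier dominates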

  15. Synovial fluid analysis

    Science.gov (United States)

    Joint fluid analysis; Joint fluid aspiration ... El-Gabalawy HS. Synovial fluid analysis, synovial biopsy, and synovial pathology. In: Firestein GS, Budd RC, Gabriel SE, McInnes IB, O'Dell JR, eds. Kelly's Textbook of ...

  16. Study on mixed analysis method for fatigue analysis of oblique safety injection nozzle on main piping

    International Nuclear Information System (INIS)

    Lu Xifeng; Zhang Yixiong; Ai Honglei; Wang Xinjun; He Feng

    2014-01-01

    The simplified analysis method and the detailed analysis method are both used for the fatigue analysis of nozzles on the main piping. Because the structure of the oblique safety injection nozzle is complex and it is subjected to a number of severe transients, the results obtained with the simplified analysis method are heavily penalized and cannot be validated. The detailed analysis method is less conservative, but it is more complex, time-consuming and laborious. To reduce the conservatism and save time, a mixed analysis method, combining the simplified analysis method with the detailed analysis method, is used for the fatigue analysis. The heat transfer parameters between the fluid and the structure used in the analysis were obtained from a heat transfer property experiment. The results show that the mixed analysis, in which heat transfer properties are considered, can reduce the conservatism effectively, and that the mixed analysis method is a more effective and practical method for the fatigue analysis of the oblique safety injection nozzle. (authors)

  17. Factor analysis

    CERN Document Server

    Gorsuch, Richard L

    2013-01-01

    Comprehensive and comprehensible, this classic covers the basic and advanced topics essential for using factor analysis as a scientific tool in psychology, education, sociology, and related areas. Emphasizing the usefulness of the techniques, it presents sufficient mathematical background for understanding and sufficient discussion of applications for effective use. This includes not only theory but also the empirical evaluations of the importance of mathematical distinctions for applied scientific analysis.

  18. Cask crush pad analysis using detailed and simplified analysis methods

    International Nuclear Information System (INIS)

    Uldrich, E.D.; Hawkes, B.D.

    1997-01-01

    A crush pad has been designed and analyzed to absorb the kinetic energy of a hypothetically dropped spent nuclear fuel shipping cask into a 44-ft deep cask unloading pool at the Fluorinel and Storage Facility (FAST). This facility, located at the Idaho Chemical Processing Plant (ICPP) at the Idaho National Engineering and Environmental Laboratory (INEEL), is a US Department of Energy site. The basis for this study is an analysis by Uldrich and Hawkes. The purpose of this analysis was to evaluate various hypothetical cask drop orientations to ensure that the crush pad design was adequate and the cask deceleration at impact was less than 100 g. It is demonstrated herein that a large spent fuel shipping cask, when dropped onto a foam crush pad, can be analyzed either by hand methods or by sophisticated dynamic finite element analysis using computer codes such as ABAQUS. Results from the two methods are compared to evaluate the accuracy of the simplified hand analysis approach
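
    The hand method reduces to an energy balance: the kinetic energy of the falling cask is absorbed over the crush depth of the foam. A back-of-envelope sketch with hypothetical numbers (not the cask or pad parameters from the study):

        import math

        # a cask dropped from height h crushes the foam pad a depth d;
        # assuming a constant crush force, deceleration a = v^2 / (2 d)
        g = 9.81          # m/s^2
        h = 9.0           # drop height, m (hypothetical)
        d = 0.5           # foam crush depth, m (hypothetical)

        v = math.sqrt(2 * g * h)          # impact velocity
        a = v**2 / (2 * d)                # average deceleration during crush
        print(f"impact velocity: {v:.1f} m/s, deceleration: {a / g:.0f} g")
        # a 9.0 m drop into 0.5 m of crush gives 18 g, well under 100 g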

  19. Plug-in Based Analysis Framework for LHC Post-Mortem Analysis

    CERN Document Server

    Gorbonosov, R; Zerlauth, M; Baggiolini, V

    2014-01-01

    Plug-in based software architectures [1] are extensible, enforce modularity and allow several teams to work in parallel. But they have certain technical and organizational challenges, which we discuss in this paper. We gained our experience when developing the Post-Mortem Analysis (PMA) system, which is a mission-critical system for the Large Hadron Collider (LHC). We used a plugin-based architecture with a general-purpose analysis engine, for which physicists and equipment experts code plugins containing the analysis algorithms. We have over 45 analysis plugins developed by a dozen domain experts. This paper focuses on the design challenges we faced in order to mitigate the risks of executing third-party code: assurance that even a badly written plugin doesn't perturb the work of the overall application; plugin execution control which allows plugin misbehaviour to be detected and reacted to; a robust communication mechanism between plugins; facilitation of diagnostics in case of plugin failure; testing of the plugins be...
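
    The first design challenge listed - a badly written plugin must not perturb the overall application - is typically met by containing each plugin call. A minimal Python sketch of that pattern (illustrative only, not the PMA implementation):

        import logging
        from typing import Callable, Dict

        # a failing plugin is logged and skipped, never crashing the host
        REGISTRY: Dict[str, Callable[[dict], dict]] = {}

        def plugin(name: str):
            def register(fn: Callable[[dict], dict]):
                REGISTRY[name] = fn
                return fn
            return register

        @plugin("mean_current")
        def mean_current(data: dict) -> dict:      # a well-behaved plugin
            samples = data["current"]
            return {"mean": sum(samples) / len(samples)}

        @plugin("buggy")
        def buggy(data: dict) -> dict:             # a badly written plugin
            raise RuntimeError("third-party bug")

        def run_all(data: dict) -> dict:
            results = {}
            for name, fn in REGISTRY.items():
                try:
                    results[name] = fn(data)
                except Exception:                  # contain, diagnose, continue
                    logging.exception("plugin %r failed; continuing", name)
            return results

        print(run_all({"current": [1.0, 2.0, 3.0]}))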

  20. Functional analysis and applications

    CERN Document Server

    Siddiqi, Abul Hasan

    2018-01-01

    This self-contained textbook discusses all major topics in functional analysis. Combining classical materials with new methods, it supplies numerous relevant solved examples and problems and discusses the applications of functional analysis in diverse fields. The book is unique in its scope, and a variety of applications of functional analysis and operator-theoretic methods are devoted to each area of application. Each chapter includes a set of problems, some of which are routine and elementary, and some of which are more advanced. The book is primarily intended as a textbook for graduate and advanced undergraduate students in applied mathematics and engineering. It offers several attractive features making it ideally suited for courses on functional analysis intended to provide a basic introduction to the subject and the impact of functional analysis on applied and computational mathematics, nonlinear functional analysis and optimization. It introduces emerging topics like wavelets, Gabor system, inverse pro...

  1. Easy instrumental analysis

    International Nuclear Information System (INIS)

    Ko, Myeong Su; Kim, Tae Hwa; Park, Gyu Hyeon; Yang, Jong Beom; Oh, Chang Hwan; Lee, Kyoung Hye

    2010-04-01

    This textbook describes instrumental analysis in an easy way across twelve chapters. The contents cover pH measurement (principle, pH meters, measurement procedure, example experiments), centrifugation, absorptiometry, fluorescence methods, atomic absorption analysis, gas chromatography, gas chromatography-mass spectrometry, high performance liquid chromatography, liquid chromatography-mass spectrometry, electrophoresis (practical cases and analysis of results, with examples), PCR (principle, devices, applications and examples) and enzyme-linked immunosorbent assay (indirect ELISA, sandwich ELISA and ELISA readers).

  2. Fundamentals of PIXE analysis

    International Nuclear Information System (INIS)

    Ishii, Keizo

    1997-01-01

    Elemental analysis based on particle induced X-ray emission (PIXE) is a novel technique to analyze trace elements. It is a very simple method, its sensitivity is very high, multiple elements in a sample can be analyzed simultaneously, and a few tens of μg of a sample are enough for analysis. Owing to these characteristics, PIXE analysis is now used in many fields (e.g. biology, medicine, dentistry, environmental pollution, archaeology, cultural assets, etc.). Fundamentals of PIXE analysis are described here: the production of characteristic X-rays and inner-shell ionization by heavy charged particles, the continuous background in the PIXE spectrum, quantitative formulae of PIXE analysis, the detection limit of PIXE analysis, etc. (author)

  3. msBiodat analysis tool, big data analysis for high-throughput experiments.

    Science.gov (United States)

    Muñoz-Torres, Pau M; Rokć, Filip; Belužic, Robert; Grbeša, Ivana; Vugrek, Oliver

    2016-01-01

    Mass spectrometry (MS) comprises a group of high-throughput techniques used to increase knowledge about biomolecules. These techniques produce a large amount of data, presented as lists of hundreds or thousands of proteins. Filtering those data efficiently is the first step in extracting biologically relevant information. The filtering may be improved by merging previous data with data obtained from public databases, resulting in an accurate list of proteins which meet the predetermined conditions. In this article we present msBiodat Analysis Tool, a web-based application designed to bring proteomics closer to big data analysis. With this tool, researchers can easily select the most relevant information from their MS experiments using an easy-to-use web interface. An interesting feature of the msBiodat Analysis Tool is the possibility of selecting proteins by their annotation on Gene Ontology using their Gene ID, Ensembl or UniProt codes. The msBiodat Analysis Tool is a web-based application that allows researchers with any level of programming experience to take advantage of efficient database querying. Its versatility and user-friendly interface make it easy to perform fast and accurate data screening using complex queries. Once the analysis is finished, the result is delivered by e-mail. The msBiodat Analysis Tool is freely available at http://msbiodata.irb.hr.

  5. Instrumental neutron activation analysis as a routine method for rock analysis

    International Nuclear Information System (INIS)

    Rosenberg, R.J.

    1977-06-01

    Instrumental neutron activation methods for the analysis of geological samples have been developed. Special emphasis has been laid on improving sensitivity and accuracy in order to maximize the quality of the analyses. Furthermore, the procedures have been automated as far as possible in order to minimize the cost of the analysis. A short review of the basic literature is given, followed by a description of the principles of the method. All aspects concerning sensitivity are discussed thoroughly in view of the analyst's possibilities of influencing them. Experimentally determined detection limits for Na, Al, K, Ca, Sc, Cr, Ti, V, Mn, Fe, Ni, Co, Rb, Zr, Sb, Cs, Ba, La, Ce, Nd, Sm, Eu, Gd, Tb, Dy, Yb, Lu, Hf, Ta, Th and U are given. The errors of the method are discussed, followed by the actions taken to avoid them. The most significant error was caused by flux deviation, but this was avoided by building a rotating sample holder that rotates the samples during irradiation. A scheme for the INAA of 32 elements is proposed. The method has been automated as far as possible, and an automatic γ-spectrometer and a computer program for the automatic calculation of the results are described. Furthermore, a completely automated uranium analyzer based on delayed neutron counting is described. The methods are discussed in view of their applicability to rock analysis. The sensitivity varies considerably from element to element: instrumental activation analysis is an excellent method for the analysis of some specific elements, such as the lanthanides, thorium and uranium, but less so for many other elements. The accuracy is good, varying from 2% to 10% for most elements. For most elements instrumental activation analysis is a rather expensive method, with a few exceptions, however. The most important of these is uranium. The analysis of uranium by delayed neutron counting is an inexpensive means for the analysis of the large numbers of samples needed for
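
    As background to the sensitivity discussion, the standard activation equation (textbook form, not quoted from the report) relates the induced activity to the irradiation conditions:

        \[
          A = N \,\sigma\,\varphi \left(1 - e^{-\lambda t_i}\right) e^{-\lambda t_d},
        \]

    where N is the number of target nuclei, \sigma the activation cross section, \varphi the neutron flux, \lambda the decay constant, and t_i and t_d the irradiation and decay times. The flux-deviation error addressed by the rotating sample holder enters directly through \varphi.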

  6. Numerical analysis

    CERN Document Server

    Rao, G Shanker

    2006-01-01

    About the Book: This book provides an introduction to Numerical Analysis for students of Mathematics and Engineering. The book is designed in accordance with the common core syllabus of Numerical Analysis of the universities of Andhra Pradesh, and also the syllabus prescribed in most Indian universities. Salient features: Approximate and Numerical Solutions of Algebraic and Transcendental Equations; Interpolation of Functions; Numerical Differentiation and Integration; and Numerical Solution of Ordinary Differential Equations. The last three chapters deal with Curve Fitting, Eigenvalues and Eigenvectors of a Matrix, and Regression Analysis. Each chapter is supplemented with a number of worked-out examples as well as a number of problems to be solved by the students. This helps in a better understanding of the subject. Contents: Errors Solution of Algebraic and Transcendental Equations Finite Differences Interpolation with Equal Intervals Interpolation with Unequal Int...

  7. Wind energy analysis system

    OpenAIRE

    2014-01-01

    M.Ing. (Electrical & Electronic Engineering) One of the most important steps to be taken before a site is selected for the extraction of wind energy is the analysis of the energy within the wind on that particular site. No wind energy analysis system exists for the measurement and analysis of wind power. This dissertation documents the design and development of a Wind Energy Analysis System (WEAS). Using a micro-controller based design in conjunction with sensors, WEAS measures, calcu...

  8. Discourse analysis and Foucault's “Archaeology of Knowledge”

    Directory of Open Access Journals (Sweden)

    Jansen I.

    2008-01-01

    Discourse analysis is a method which up to now has been less recognized in nursing science, although more recently nursing scientists have been discovering it for their purposes. However, several authors have criticized that discourse analysis is often misinterpreted because of a lack of understanding of its theoretical background. In this article, I reconstruct Foucault's writings in his “Archaeology of Knowledge” to provide a theoretical base for future archaeological discourse analysis, which can be categorized as a socio-linguistic discourse analysis.

  9. Slice hyperholomorphic Schur analysis

    CERN Document Server

    Alpay, Daniel; Sabadini, Irene

    2016-01-01

    This book defines and examines the counterpart of Schur functions and Schur analysis in the slice hyperholomorphic setting. It is organized into three parts: the first introduces readers to classical Schur analysis, while the second offers background material on quaternions, slice hyperholomorphic functions, and quaternionic functional analysis. The third part represents the core of the book and explores quaternionic Schur analysis and its various applications. The book includes previously unpublished results and provides the basis for new directions of research.

  10. Sensitivity and uncertainty analysis

    CERN Document Server

    Cacuci, Dan G; Navon, Ionel Michael

    2005-01-01

    As computer-assisted modeling and analysis of physical processes have continued to grow and diversify, sensitivity and uncertainty analyses have become indispensable scientific tools. Sensitivity and Uncertainty Analysis. Volume I: Theory focused on the mathematical underpinnings of two important methods for such analyses: the Adjoint Sensitivity Analysis Procedure and the Global Adjoint Sensitivity Analysis Procedure. This volume concentrates on the practical aspects of performing these analyses for large-scale systems. The applications addressed include two-phase flow problems, a radiative c

  11. WHAT IF (Sensitivity Analysis

    Directory of Open Access Journals (Sweden)

    Iulian N. BUJOREANU

    2011-01-01

    Sensitivity analysis represents such a well-known and deeply analyzed subject that anyone entering the field feels unable to add anything new. Still, there are many facets to be taken into consideration. The paper introduces the reader to the various ways sensitivity analysis is implemented and the reasons why it has to be implemented in most analyses in decision-making processes. Risk analysis is of utmost importance in dealing with resource allocation, and is presented at the beginning of the paper as the initial reason to implement sensitivity analysis. Different views and approaches are added during the discussion of sensitivity analysis so that the reader develops as thorough an opinion as possible on the use and utility of sensitivity analysis. Finally, a round-up conclusion brings us to the question of the possibility of generating the future and analyzing it before it unfolds, so that when it happens it brings less uncertainty.

  12. Fundamentals of functional analysis

    CERN Document Server

    Farenick, Douglas

    2016-01-01

    This book provides a unique path for graduate or advanced undergraduate students to begin studying the rich subject of functional analysis with fewer prerequisites than is normally required. The text begins with a self-contained and highly efficient introduction to topology and measure theory, which focuses on the essential notions required for the study of functional analysis, and which are often buried within full-length overviews of the subjects. This is particularly useful for those in applied mathematics, engineering, or physics who need to have a firm grasp of functional analysis, but not necessarily some of the more abstruse aspects of topology and measure theory normally encountered. The reader is assumed to only have knowledge of basic real analysis, complex analysis, and algebra. The latter part of the text provides an outstanding treatment of Banach space theory and operator theory, covering topics not usually found together in other books on functional analysis. Written in a clear, concise manner,...

  13. From analysis to surface

    DEFF Research Database (Denmark)

    Bemman, Brian; Meredith, David

    In recent years, a significant body of research has focused on developing algorithms for computing analyses of musical works automatically from encodings of these works' surfaces [3,4,7,10,11]. The quality of the output of such analysis algorithms is typically evaluated by comparing it with a “ground truth” analysis of the same music produced by a human expert (see, in particular, [5]). In this paper, we explore the problem of generating an encoding of the musical surface of a work automatically from a systematic encoding of an analysis. The ability to do this depends on one having an effective (i.e., computable), correct and complete description of some aspect of the structure of the music. Generating the surface structure of a piece from an analysis in this manner serves as a proof of the analysis' correctness, effectiveness and completeness. We present a reductive analysis...

  14. Functional Object Analysis

    DEFF Research Database (Denmark)

    Raket, Lars Lau

    We propose a direction in the field of statistics which we will call functional object analysis. This subfield considers the analysis of functional objects defined on continuous domains. In this setting we focus on model-based statistics, with a particular emphasis on mixed-effect formulations, where the observed functional signal is assumed to consist of both fixed and random functional effects. This thesis takes the initial steps toward the development of likelihood-based methodology for functional objects. We first consider analysis of functional data defined on high...

  15. Basic stress analysis

    CERN Document Server

    Iremonger, M J

    1982-01-01

    BASIC Stress Analysis aims to help students become proficient at BASIC programming by actually using it in an important engineering subject. It also enables the student to use computing as a means of learning stress analysis, because writing a program is analogous to teaching: it is necessary to understand the subject matter. The book begins by introducing the BASIC approach and the concept of stress analysis at first- and second-year undergraduate level. Subsequent chapters contain a summary of relevant theory, worked examples containing computer programs, and a set of problems. Topics c

  16. Biological sequence analysis

    DEFF Research Database (Denmark)

    Durbin, Richard; Eddy, Sean; Krogh, Anders Stærmose

    This book provides an up-to-date and tutorial-level overview of sequence analysis methods, with particular emphasis on probabilistic modelling. Discussed methods include pairwise alignment, hidden Markov models, multiple alignment, profile searches, RNA secondary structure analysis, and phylogene...

  17. Confirmatory Composite Analysis

    NARCIS (Netherlands)

    Schuberth, Florian; Henseler, Jörg; Dijkstra, Theo K.

    2018-01-01

    We introduce confirmatory composite analysis (CCA) as a structural equation modeling technique that aims at testing composite models. CCA entails the same steps as confirmatory factor analysis: model specification, model identification, model estimation, and model testing. Composite models are

  18. Evaluation of pavement life cycle cost analysis: Review and analysis

    Directory of Open Access Journals (Sweden)

    Peyman Babashamsi

    2016-07-01

    The cost of road construction consists of design expenses, material extraction, construction equipment, maintenance and rehabilitation strategies, and operations over the entire service life. An economic analysis process known as Life-Cycle Cost Analysis (LCCA) is used to evaluate the cost-efficiency of alternatives based on the Net Present Value (NPV) concept. It is essential to evaluate the above-mentioned cost aspects in order to obtain optimum pavement life-cycle costs. However, pavement managers are often unable to consider each important element that may be required for performing future maintenance tasks. Over the last few decades, several approaches have been developed by agencies and institutions for pavement Life-Cycle Cost Analysis (LCCA). While the transportation community has increasingly been utilising LCCA as an essential practice, several organisations have even designed computer programs for their LCCA approaches in order to assist with the analysis. Current LCCA methods are analysed and LCCA software is introduced in this article. Subsequently, a list of economic indicators is provided along with their substantial components. Collecting previous literature will help highlight and study the weakest aspects so as to mitigate the shortcomings of existing LCCA methods and processes. LCCA research will become more robust if improvements are made, facilitating private industries and government agencies to accomplish their economic aims. Keywords: Life-Cycle Cost Analysis (LCCA), Pavement management, LCCA software, Net Present Value (NPV)
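
    As a minimal illustration of the NPV concept on which LCCA rests (the figures below are hypothetical, not from the article):

        # Net Present Value: discount a stream of yearly costs to year 0.
        def npv(rate, cashflows):
            """cashflows[t] is the cost incurred in year t."""
            return sum(c / (1.0 + rate) ** t for t, c in enumerate(cashflows))

        # Construction now, rehabilitation at years 10 and 20, 4% discount rate.
        costs = [1_000_000.0] + [0.0] * 9 + [150_000.0] + [0.0] * 9 + [120_000.0]
        print(f"life-cycle cost (NPV) = {npv(0.04, costs):,.0f}")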

  19. Real time analysis under EDS

    International Nuclear Information System (INIS)

    Schneberk, D.

    1985-07-01

    This paper describes the analysis component of the Enrichment Diagnostic System (EDS) developed for the Atomic Vapor Laser Isotope Separation Program (AVLIS) at Lawrence Livermore National Laboratory (LLNL). Four different types of analysis are performed on data acquired through EDS: (1) absorption spectroscopy on laser-generated spectral lines, (2) mass spectrometer analysis, (3) general purpose waveform analysis, and (4) separation performance calculations. The information produced from this data includes: measures of particle density and velocity, partial pressures of residual gases, and overall measures of isotope enrichment. The analysis component supports a variety of real-time modeling tasks, a means for broadcasting data to other nodes, and a great degree of flexibility for tailoring computations to the exact needs of the process. A particular data base structure and program flow is common to all types of analysis. Key elements of the analysis component are: (1) a fast access data base which can configure all types of analysis, (2) a selected set of analysis routines, (3) a general purpose data manipulation and graphics package for the results of real time analysis. Each of these components is described with an emphasis upon how each contributes to overall system capability. 3 figs

  20. Trend Analysis Using Microcomputers.

    Science.gov (United States)

    Berger, Carl F.

    A trend analysis statistical package and additional programs for the Apple microcomputer are presented. They illustrate strategies of data analysis suitable to the graphics and processing capabilities of the microcomputer. The programs analyze data sets using examples of: (1) analysis of variance with multiple linear regression; (2) exponential…
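
    The package itself targets the Apple microcomputer; as a present-day sketch of the simplest case it covers, a linear trend fitted by least squares, consider the following (data invented for illustration):

        import numpy as np

        years = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
        scores = np.array([2.1, 2.4, 2.9, 3.2, 3.8, 4.1])

        slope, intercept = np.polyfit(years, scores, deg=1)  # linear trend
        print(f"trend = {slope:.3f} per year, intercept = {intercept:.3f}")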

  1. Life-Cycle Cost-Benefit Analysis

    DEFF Research Database (Denmark)

    Thoft-Christensen, Palle

    2010-01-01

    The future use of Life-Cycle Cost-Benefit (LCCB) analysis is discussed in this paper. A more complete analysis, including not only the traditional factors and user costs but also factors which are difficult to include in the analysis, is needed in the future.

  2. Evaluating Style Analysis

    NARCIS (Netherlands)

    F.A. de Roon (Frans); T.E. Nijman (Theo); B.J.M. Werker

    2000-01-01

    In this paper we evaluate applications of (return based) style analysis. The portfolio and positivity constraints imposed by style analysis are useful in constructing mimicking portfolios without short positions. Such mimicking portfolios can be used e.g. to construct efficient

  3. Strictness Analysis for Attribute Grammars

    DEFF Research Database (Denmark)

    Rosendahl, Mads

    1992-01-01

    interpretation of attribute grammars. The framework is used to construct a strictness analysis for attribute grammars. Results of the analysis enable us to transform an attribute grammar such that attributes are evaluated during parsing, if possible. The analysis is proved correct by relating it to a fixpoint semantics for attribute grammars. An implementation of the analysis is discussed and some extensions to the analysis are mentioned....

  4. Earth analysis methods, subsurface feature detection methods, earth analysis devices, and articles of manufacture

    Science.gov (United States)

    West, Phillip B [Idaho Falls, ID; Novascone, Stephen R [Idaho Falls, ID; Wright, Jerry P [Idaho Falls, ID

    2011-09-27

    Earth analysis methods, subsurface feature detection methods, earth analysis devices, and articles of manufacture are described. According to one embodiment, an earth analysis method includes engaging a device with the earth, analyzing the earth in a single substantially lineal direction using the device during the engaging, and providing information regarding a subsurface feature of the earth using the analysis.

  5. Rescaled Range Analysis and Detrended Fluctuation Analysis: Finite Sample Properties and Confidence Intervals

    Czech Academy of Sciences Publication Activity Database

    Krištoufek, Ladislav

    4/2010, č. 3 (2010), s. 236-250 ISSN 1802-4696 R&D Projects: GA ČR GD402/09/H045; GA ČR GA402/09/0965 Grant - others:GA UK(CZ) 118310 Institutional research plan: CEZ:AV0Z10750506 Keywords : rescaled range analysis * detrended fluctuation analysis * Hurst exponent * long-range dependence Subject RIV: AH - Economics http://library.utia.cas.cz/separaty/2010/E/kristoufek-rescaled range analysis and detrended fluctuation analysis finite sample properties and confidence intervals.pdf
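
    For readers unfamiliar with the techniques named in the keywords, a bare-bones rescaled range (R/S) estimate of the Hurst exponent looks as follows (an illustrative sketch only; the paper's finite-sample treatment is far more careful):

        import numpy as np

        def rs_statistic(x):
            """R/S of one window: range of cumulative deviations over std."""
            y = np.cumsum(x - x.mean())
            return (y.max() - y.min()) / x.std(ddof=1)

        rng = np.random.default_rng(0)
        x = rng.normal(size=4096)            # uncorrelated noise, H should be near 0.5
        sizes = [64, 128, 256, 512, 1024]
        rs = [np.mean([rs_statistic(w) for w in np.split(x, len(x) // n)])
              for n in sizes]
        hurst = np.polyfit(np.log(sizes), np.log(rs), 1)[0]  # slope of log-log fit
        print(f"estimated Hurst exponent: {hurst:.2f}")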

  6. SMART performance analysis methodology

    International Nuclear Information System (INIS)

    Lim, H. S.; Kim, H. C.; Lee, D. J.

    2001-04-01

    To ensure the required and desired operation over the plant lifetime, the performance analysis for the SMART NSSS design is done by means of specified analysis methodologies for the performance-related design basis events (PRDBE). A PRDBE is an occurrence (event) that shall be accommodated in the design of the plant and whose consequence would be no more severe than the normal service effects of the plant equipment. The performance analysis methodology, which systematizes the methods and procedures to analyze the PRDBEs, is as follows. Based on the operation modes suitable to the characteristics of the SMART NSSS, the corresponding PRDBEs and the allowable ranges of process parameters for these events are deduced. With the control logic developed for each operation mode, the system thermal-hydraulics are analyzed for the chosen PRDBEs using the system analysis code. In particular, because the system characteristics of SMART differ from those of existing commercial nuclear power plants, the operation modes, PRDBEs, control logic, and analysis code should be consistent with the SMART design. This report presents the categories of PRDBEs chosen based on each operation mode, the transitions among them, and the acceptance criteria for each PRDBE. It also includes the analysis methods and procedures for each PRDBE and the concept of the control logic for each operation mode. Therefore this report, in which the overall details for SMART performance analysis are specified based on the current SMART design, can be utilized as a guide for the detailed performance analysis.

  7. Isogeometric failure analysis

    NARCIS (Netherlands)

    Verhoosel, C.V.; Scott, M.A.; Borden, M.J.; Borst, de R.; Hughes, T.J.R.; Mueller-Hoeppe, D.; Loehnert, S.; Reese, S.

    2011-01-01

    Isogeometric analysis is a versatile tool for failure analysis. On the one hand, the excellent control over the inter-element continuity conditions enables a natural incorporation of continuum constitutive relations that incorporate higher-order strain gradients, as in gradient plasticity or damage.

  8. Cognitive task analysis

    NARCIS (Netherlands)

    Schraagen, J.M.C.

    2000-01-01

    Cognitive task analysis is defined as the extension of traditional task analysis techniques to yield information about the knowledge, thought processes and goal structures that underlie observable task performance. Cognitive task analyses are conducted for a wide variety of purposes, including the

  9. Evaluating Style Analysis

    NARCIS (Netherlands)

    de Roon, F.A.; Nijman, T.E.; Ter Horst, J.R.

    2000-01-01

    In this paper we evaluate applications of (return based) style analysis. The portfolio and positivity constraints imposed by style analysis are useful in constructing mimicking portfolios without short positions. Such mimicking portfolios can be used, e.g., to construct efficient portfolios of mutual

  10. Circuit analysis with Multisim

    CERN Document Server

    Baez-Lopez, David

    2011-01-01

    This book is concerned with circuit simulation using National Instruments Multisim. It focuses on the use and comprehension of the working techniques for electrical and electronic circuit simulation. The first chapters are devoted to basic circuit analysis. It starts by describing in detail how to perform a DC analysis using only resistors and independent and controlled sources. Then, it introduces capacitors and inductors to make a transient analysis. In the case of transient analysis, it is possible to have an initial condition either in the capacitor voltage or in the inductor current, or bo

  11. Extending and automating a Systems-Theoretic hazard analysis for requirements generation and analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Thomas, John (Massachusetts Institute of Technology)

    2012-05-01

    Systems Theoretic Process Analysis (STPA) is a powerful new hazard analysis method designed to go beyond traditional safety techniques - such as Fault Tree Analysis (FTA) - that overlook important causes of accidents like flawed requirements, dysfunctional component interactions, and software errors. While proving to be very effective on real systems, no formal structure has been defined for STPA and its application has been ad-hoc with no rigorous procedures or model-based design tools. This report defines a formal mathematical structure underlying STPA and describes a procedure for systematically performing an STPA analysis based on that structure. A method for using the results of the hazard analysis to generate formal safety-critical, model-based system and software requirements is also presented. Techniques to automate both the analysis and the requirements generation are introduced, as well as a method to detect conflicts between the safety and other functional model-based requirements during early development of the system.

  12. Panel Analysis

    DEFF Research Database (Denmark)

    Brænder, Morten; Andersen, Lotte Bøgh

    2014-01-01

    Based on our 2013 article, ”Does Deployment to War Affect Soldiers' Public Service Motivation – A Panel Study of Soldiers Before and After their Service in Afghanistan”, we present panel analysis as a methodological discipline. Panels consist of multiple units of analysis, observed at two or more points in time. In comparison with traditional cross-sectional studies, the advantage of using panel studies is that the time dimension enables us to study effects. Whereas experimental designs may have a clear advantage in regard to causal inference, the strength of panel studies is difficult to match in research settings where it is not possible to distribute units of analysis randomly or where the independent variables cannot be manipulated. The greatest disadvantage in regard to using panel studies is that data may be difficult to obtain. This is most clearly vivid in regard to the use of panel surveys...

  13. Real analysis

    CERN Document Server

    DiBenedetto, Emmanuele

    2016-01-01

    The second edition of this classic textbook presents a rigorous and self-contained introduction to real analysis with the goal of providing a solid foundation for future coursework and research in applied mathematics. Written in a clear and concise style, it covers all of the necessary subjects as well as those often absent from standard introductory texts. Each chapter features a “Problems and Complements” section that includes additional material that briefly expands on certain topics within the chapter and numerous exercises for practicing the key concepts. The first eight chapters explore all of the basic topics for training in real analysis, beginning with a review of countable sets before moving on to detailed discussions of measure theory, Lebesgue integration, Banach spaces, functional analysis, and weakly differentiable functions. More topical applications are discussed in the remaining chapters, such as maximal functions, functions of bounded mean oscillation, rearrangements, potential theory, a...

  14. Numerical analysis

    CERN Document Server

    Scott, L Ridgway

    2011-01-01

    Computational science is fundamentally changing how technological questions are addressed. The design of aircraft, automobiles, and even racing sailboats is now done by computational simulation. The mathematical foundation of this new approach is numerical analysis, which studies algorithms for computing expressions defined with real numbers. Emphasizing the theory behind the computation, this book provides a rigorous and self-contained introduction to numerical analysis and presents the advanced mathematics that underpin industrial software, including complete details that are missing from most textbooks. Using an inquiry-based learning approach, Numerical Analysis is written in a narrative style, provides historical background, and includes many of the proofs and technical details in exercises. Students will be able to go beyond an elementary understanding of numerical simulation and develop deep insights into the foundations of the subject. They will no longer have to accept the mathematical gaps that ex...

  15. Modern real analysis

    CERN Document Server

    Ziemer, William P

    2017-01-01

    This first year graduate text is a comprehensive resource in real analysis based on a modern treatment of measure and integration. Presented in a definitive and self-contained manner, it features a natural progression of concepts from simple to difficult. Several innovative topics are featured, including differentiation of measures, elements of Functional Analysis, the Riesz Representation Theorem, Schwartz distributions, the area formula, Sobolev functions and applications to harmonic functions. Together, the selection of topics forms a sound foundation in real analysis that is particularly suited to students going on to further study in partial differential equations. This second edition of Modern Real Analysis contains many substantial improvements, including the addition of problems for practicing techniques, and an entirely new section devoted to the relationship between Lebesgue and improper integrals. Aimed at graduate students with an understanding of advanced calculus, the text will also appeal to mo...

  16. A PROOF Analysis Framework

    International Nuclear Information System (INIS)

    González Caballero, I; Cuesta Noriega, A; Rodríguez Marrero, A; Fernández del Castillo, E

    2012-01-01

    The analysis of the complex LHC data usually follows a standard path that aims at minimizing not only the amount of data but also the number of observables used. After a number of steps of slimming and skimming the data, the remaining few terabytes of ROOT files hold a selection of the events and a flat structure for the variables needed, which can be more easily inspected and traversed in the final stages of the analysis. PROOF arises at this point as an efficient mechanism to distribute the analysis load by taking advantage of all the cores in modern CPUs through PROOF Lite, or by using PROOF Cluster or PROOF on Demand tools to build dynamic PROOF clusters on computing facilities with spare CPUs. However, using PROOF at the level required for a serious analysis introduces some difficulties that may scare new adopters. We have developed the PROOF Analysis Framework (PAF) to facilitate the development of new analyses by uniformly exposing the PROOF related configurations across technologies and by taking care of the routine tasks as much as possible. We describe the details of the PAF implementation as well as how we succeeded in engaging a group of CMS physicists to use PAF as their daily analysis framework.

  17. Marketing Mix Formulation for Higher Education: An Integrated Analysis Employing Analytic Hierarchy Process, Cluster Analysis and Correspondence Analysis

    Science.gov (United States)

    Ho, Hsuan-Fu; Hung, Chia-Chi

    2008-01-01

    Purpose: The purpose of this paper is to examine how a graduate institute at National Chiayi University (NCYU), by using a model that integrates analytic hierarchy process, cluster analysis and correspondence analysis, can develop effective marketing strategies. Design/methodology/approach: This is primarily a quantitative study aimed at…

  18. Manual versus Automated Narrative Analysis of Agrammatic Production Patterns: The Northwestern Narrative Language Analysis and Computerized Language Analysis

    Science.gov (United States)

    Hsu, Chien-Ju; Thompson, Cynthia K.

    2018-01-01

    Purpose: The purpose of this study is to compare the outcomes of the manually coded Northwestern Narrative Language Analysis (NNLA) system, which was developed for characterizing agrammatic production patterns, and the automated Computerized Language Analysis (CLAN) system, which has recently been adopted to analyze speech samples of individuals…

  19. Cost analysis guidelines

    International Nuclear Information System (INIS)

    Strait, R.S.

    1996-01-01

    The first phase of the Depleted Uranium Hexafluoride Management Program (Program)--management strategy selection--consists of several program elements: Technology Assessment, Engineering Analysis, Cost Analysis, and preparation of an Environmental Impact Statement (EIS). Cost Analysis will estimate the life-cycle costs associated with each of the long-term management strategy alternatives for depleted uranium hexafluoride (UF6). The scope of Cost Analysis will include all major expenditures, from the planning and design stages through decontamination and decommissioning. The costs will be estimated at a scoping or preconceptual design level and are intended to assist decision makers in comparing alternatives for further consideration. They will not be absolute costs or bid-document costs. The purpose of the Cost Analysis Guidelines is to establish a consistent approach to the analysis of cost alternatives for managing the Department of Energy's (DOE's) stocks of depleted uranium hexafluoride (DUF6). The component modules that make up the DUF6 management program differ substantially in operational maintenance (O and M), process options, requirements for R and D, equipment, facilities, regulatory compliance, and operations risk. To facilitate a consistent and equitable comparison of costs, the guidelines offer common definitions, assumptions or bases, and limitations integrated with a standard approach to the analysis. Further, the goal is to evaluate total net life-cycle costs and display them in a way that gives DOE the capability to evaluate a variety of overall DUF6 management strategies, including commercial potential. The cost estimates reflect the preconceptual level of the designs. They will be appropriate for distinguishing among management strategies.

  20. Radiation and environmental data analysis computer (REDAC) hardware, software and analysis procedures

    International Nuclear Information System (INIS)

    Hendricks, T.J.

    1985-01-01

    The REDAC was conceived originally as a tape verifier for the Radiation and Environmental Data Acquisition Recorder (REDAR). From that simple beginning in 1971, the REDAC has evolved into a family of systems used for complete analysis of data obtained by the REDAR and other acquisition systems. Portable or mobile REDACs are deployed to support checkout and analysis tasks in the field. Laboratory systems are additionally used for software development, physics investigations, data base management and graphics. System configurations range from man-portable systems to a large laboratory-based system which supports time-shared analysis and development tasks. Custom operating software allows the analyst to process data either interactively or by batch procedures. Analysis packages are provided for numerous necessary functions. All these analysis procedures can be performed even on the smallest man-portable REDAC. Examples of the multi-isotope stripping and radiation isopleth mapping are presented. Techniques utilized for these operations are also presented

  1. Systems engineering and analysis

    CERN Document Server

    Blanchard, Benjamin S

    2010-01-01

    For senior-level undergraduate and first and second year graduate systems engineering and related courses. A total life-cycle approach to systems and their analysis. This practical introduction to systems engineering and analysis provides the concepts, methodologies, models, and tools needed to understand and implement a total life-cycle approach to systems and their analysis. The authors focus first on the process of bringing systems into being--beginning with the identification of a need and extending that need through requirements determination, functional analysis and allocation, design synthesis, evaluation, and validation, operation and support, phase-out, and disposal. Next, the authors discuss the improvement of systems currently in being, showing that by employing the iterative process of analysis, evaluation, feedback, and modification, most systems in existence can be improved in their affordability, effectiveness, and stakeholder satisfaction.

  2. Numerical analysis

    CERN Document Server

    Brezinski, C

    2012-01-01

    Numerical analysis has witnessed many significant developments in the 20th century. This book brings together 16 papers dealing with historical developments, survey papers and papers on recent trends in selected areas of numerical analysis, such as: approximation and interpolation, solution of linear systems and eigenvalue problems, iterative methods, quadrature rules, solution of ordinary, partial and integral equations. The papers are reprinted from the 7-volume project of the Journal of Computational and Applied Mathematics ('/homepage/sac/cam/na2000/index.html').

  3. Automation of activation analysis

    International Nuclear Information System (INIS)

    Ivanov, I.N.; Ivanets, V.N.; Filippov, V.V.

    1985-01-01

    The basic data on the methods and equipment of activation analysis are presented. Recommendations on the selection of activation analysis techniques, especially techniques envisaging the use of short-lived isotopes, are given. The possibilities of the equipment to increase data-channel throughput, using modern computers for the automation of the analysis and data-processing procedures, are shown.

  4. Intelligent audio analysis

    CERN Document Server

    Schuller, Björn W

    2013-01-01

    This book provides the reader with the knowledge necessary for comprehension of the field of Intelligent Audio Analysis. It first introduces standard methods and discusses the typical Intelligent Audio Analysis chain going from audio data to audio features to audio recognition. Further, an introduction to audio source separation, enhancement and robustness is given. After the introductory parts, the book shows several applications for the three types of audio: speech, music, and general sound. Each task is briefly introduced, followed by a description of the specific data and methods applied, experiments and results, and a conclusion for this specific task. The book provides benchmark results and standardized test-beds for a broader range of audio analysis tasks. The main focus thereby lies on the parallel advancement of realism in audio analysis, as too often today's results are overly optimistic owing to idealized testing conditions, and it serves to stimulate synergies arising from transfer of ...

  5. Advantages of Wavelet analysis compared to Fourier analysis for the interpretation of electrochemical noise

    International Nuclear Information System (INIS)

    Espada, L.; Sanjurjo, M.; Urrejola, S.; Bouzada, F.; Rey, G.; Sanchez, A.

    2003-01-01

    Given its simplicity and low cost compared to other methodologies, the measurement and interpretation of electrochemical noise is consolidating itself as one of the analysis methods most frequently used for the interpretation of corrosion. As the technique is still evolving, standard treatment methodologies for the data retrieved in experiments do not yet exist. To date, statistical analysis and Fourier analysis are commonly used in order to establish the parameters that may characterize the recording of potential and current electrochemical noise. This study introduces a new methodology based on wavelet analysis and presents its advantages with regard to Fourier analysis: it distinguishes periodic and non-periodic variations of the signal power in both time and frequency, as opposed to Fourier analysis, which only considers frequency. (Author) 15 refs
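
    The contrast drawn in the abstract can be made concrete with a small sketch (synthetic signal, PyWavelets assumed available; this is not the authors' code or data):

        import numpy as np
        import pywt

        t = np.linspace(0.0, 10.0, 2048)
        signal = np.sin(2 * np.pi * 1.5 * t)    # periodic component
        signal[1000:1020] += 2.0                # localized, non-periodic event

        # Fourier: frequency content only; the event's position in time is lost.
        spectrum = np.abs(np.fft.rfft(signal))

        # Wavelet: the multilevel decomposition keeps time and scale together, so
        # the transient appears as large detail coefficients near its location.
        coeffs = pywt.wavedec(signal, "db4", level=5)
        print(f"dominant FFT bin: {spectrum.argmax()}")
        print(f"largest fine-scale coefficient near index: {np.abs(coeffs[-1]).argmax()}")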

  6. 21 CFR 1230.34 - Analysis.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Analysis. 1230.34 Section 1230.34 Food and Drugs... POISON ACT Administrative Procedures § 1230.34 Analysis. (a) The methods of examination or analysis..., provided, however, that any method of analysis or examination satisfactory to the Food and Drug...

  7. Real analysis

    CERN Document Server

    Loeb, Peter A

    2016-01-01

    This textbook is designed for a year-long course in real analysis taken by beginning graduate and advanced undergraduate students in mathematics and other areas such as statistics, engineering, and economics. Written by one of the leading scholars in the field, it elegantly explores the core concepts in real analysis and introduces new, accessible methods for both students and instructors. The first half of the book develops both Lebesgue measure and, with essentially no additional work for the student, general Borel measures for the real line. Notation indicates when a result holds only for Lebesgue measure. Differentiation and absolute continuity are presented using a local maximal function, resulting in an exposition that is both simpler and more general than the traditional approach. The second half deals with general measures and functional analysis, including Hilbert spaces, Fourier series, and the Riesz representation theorem for positive linear functionals on continuous functions with compact support....

  8. Numerical analysis

    CERN Document Server

    Jacques, Ian

    1987-01-01

    This book is primarily intended for undergraduates in mathematics, the physical sciences and engineering. It introduces students to most of the techniques forming the core component of courses in numerical analysis. The text is divided into eight chapters which are largely self-contained. However, with a subject as intricately woven as mathematics, there is inevitably some interdependence between them. The level of difficulty varies and, although emphasis is firmly placed on the methods themselves rather than their analysis, we have not hesitated to include theoretical material when we consider it to be sufficiently interesting. However, it should be possible to omit those parts that do seem daunting while still being able to follow the worked examples and to tackle the exercises accompanying each section. Familiarity with the basic results of analysis and linear algebra is assumed since these are normally taught in first courses on mathematical methods. For reference purposes a list of theorems used in the t...

  9. Physics and Video Analysis

    Science.gov (United States)

    Allain, Rhett

    2016-05-01

    We currently live in a world filled with videos. There are videos on YouTube, feature movies and even videos recorded with our own cameras and smartphones. These videos present an excellent opportunity to not only explore physical concepts, but also inspire others to investigate physics ideas. With video analysis, we can explore the fantasy world in science-fiction films. We can also look at online videos to determine if they are genuine or fake. Video analysis can be used in the introductory physics lab and it can even be used to explore the make-believe physics embedded in video games. This book covers the basic ideas behind video analysis along with the fundamental physics principles used in video analysis. The book also includes several examples of the unique situations in which video analysis can be used.

  10. Real analysis and applications

    CERN Document Server

    Botelho, Fabio Silva

    2018-01-01

    This textbook introduces readers to real analysis in one and n dimensions. It is divided into two parts: Part I explores real analysis in one variable, starting with key concepts such as the construction of the real number system, metric spaces, and real sequences and series. In turn, Part II addresses the multi-variable aspects of real analysis. Further, the book presents detailed, rigorous proofs of the implicit theorem for the vectorial case by applying the Banach fixed-point theorem and the differential forms concept to surfaces in Rn. It also provides a brief introduction to Riemannian geometry. With its rigorous, elegant proofs, this self-contained work is easy to read, making it suitable for undergraduate and beginning graduate students seeking a deeper understanding of real analysis and applications, and for all those looking for a well-founded, detailed approach to real analysis.

  11. Comparison of descriptive sensory analysis and chemical analysis for oxidative changes in milk

    DEFF Research Database (Denmark)

    Hedegaard, R V; Kristensen, D; Nielsen, Jacob Holm

    2006-01-01

    Oxidation in 3 types of bovine milk with different fatty acid profiles obtained through manipulation of feed was evaluated by analytical methods quantifying the content of potential antioxidants, the tendency of formation of free radicals, and the accumulation of primary and secondary oxidation products. The milk samples were evaluated in parallel by descriptive sensory analysis by a trained panel, and the correlation between the chemical analysis and the descriptive sensory analysis was evaluated. The fatty acid composition of the 3 types of milk was found to influence the oxidative and lipolytic changes occurring in the milk during chill storage for 4 d. Sensory analysis and chemical analysis showed high correlation between the typical descriptors for oxidation such as cardboard, metallic taste, and boiled milk and specific chemical markers for oxidation such as hexanal. Notably, primary...

  12. Comparison of descriptive sensory analysis and chemical analysis for oxidative changes in milk

    DEFF Research Database (Denmark)

    Hedegaard, Rikke Susanne Vingborg; Kristensen, D.; Nielsen, J. H.

    2006-01-01

    Oxidation in 3 types of bovine milk with different fatty acid profiles obtained through manipulation of feed was evaluated by analytical methods quantifying the content of potential antioxidants, the tendency of formation of free radicals, and the accumulation of primary and secondary oxidation products. The milk samples were evaluated in parallel by descriptive sensory analysis by a trained panel, and the correlation between the chemical analysis and the descriptive sensory analysis was evaluated. The fatty acid composition of the 3 types of milk was found to influence the oxidative and lipolytic changes occurring in the milk during chill storage for 4 d. Sensory analysis and chemical analysis showed high correlation between the typical descriptors for oxidation such as cardboard, metallic taste, and boiled milk and specific chemical markers for oxidation such as hexanal. Notably, primary oxidation products (i.e., lipid hydroperoxides) and even the tendency of formation of radicals as measured by electron spin resonance spectroscopy were also highly correlated to the sensory descriptors for oxidation. Electron spin resonance spectroscopy should accordingly be further explored as a routine...

  13. Containment vessel stability analysis

    International Nuclear Information System (INIS)

    Harstead, G.A.; Morris, N.F.; Unsal, A.I.

    1983-01-01

    The stability analysis for a steel containment shell is presented herein. The containment is a freestanding shell consisting of a vertical cylinder with a hemispherical dome. It is stiffened by large ring stiffeners and relatively small longitudinal stiffeners. The containment vessel is subjected to both static and dynamic loads which can cause buckling. These loads must be combined prior to their use in a stability analysis. The buckling loads were computed with the aid of the ASME Code case N-284 used in conjunction with general purpose computer codes and in-house programs. The equations contained in the Code case were used to compute the knockdown factors due to shell imperfections. After these knockdown factors were applied to the critical stress states determined by freezing the maximum dynamic stresses and combining them with other static stresses, a linear bifurcation analysis was carried out with the aid of the BOSOR4 program. Since the containment shell contained large penetrations, the Code case had to be supplemented by a local buckling analysis of the shell area surrounding the largest penetration. This analysis was carried out with the aid of the NASTRAN program. Although the factor of safety against buckling obtained in this analysis was satisfactory, it is claimed that the use of the Code case knockdown factors are unduly conservative when applied to the analysis of buckling around penetrations. (orig.)

  14. Statistical data analysis

    International Nuclear Information System (INIS)

    Hahn, A.A.

    1994-11-01

    The complexity of instrumentation sometimes requires data analysis to be done before the result is presented to the control room. This tutorial reviews some of the theoretical assumptions underlying the more popular forms of data analysis and presents simple examples to illuminate the advantages and hazards of different techniques

  15. Elemental analysis of granite by instrumental neutron activation analysis (INAA) and X-ray fluorescence analysis (XRF)

    International Nuclear Information System (INIS)

    El-Taher, A.

    2012-01-01

    The instrumental neutron activation analysis technique (INAA) was used for qualitative and quantitative analysis of granite samples collected from four locations in the Aswan area in South Egypt. The samples were prepared together with their standards and simultaneously irradiated in a neutron flux of 7×10¹¹ n cm⁻² s⁻¹ in the TRIGA Mainz research reactor. Gamma-ray spectra from a hyper-pure germanium detector were analyzed. The present study provides basic data on the elemental concentrations of granite rocks. The following elements have been determined: Na, Mg, K, Fe, Mn, Sc, Cr, Ti, Co, Zn, Ga, Rb, Zr, Nb, Sn, Ba, Cs, La, Ce, Nd, Sm, Eu, Yb, Lu, Hf, Ta, Th and U. X-ray fluorescence (XRF) was used for comparison and to detect elements which can be detected only by XRF, such as F, S, Cl, Co, Cu, Mo, Ni, Pb, Se and V. The data presented here are our contribution to understanding the elemental composition of the granite rocks. Because there are no existing databases for the elemental analysis of granite, our results are a start toward establishing a database for Egyptian granite. It is hoped that the data presented here will be useful to those dealing with geochemistry, granite chemistry and related fields. - Highlights: ► The instrumental neutron activation analysis technique (INAA) was used for qualitative and quantitative analysis of granite. ► The samples were prepared together with their standards and simultaneously irradiated in a neutron flux of 7×10¹¹ n cm⁻² s⁻¹ in the TRIGA Mainz research reactor. ► The following elements have been determined: Na, Mg, K, Fe, Mn, Sc, Cr, Ti, Co, Zn, Ga, Rb, Zr, Nb, Sn, Ba, Cs, La, Ce, Nd, Sm, Eu, Yb, Lu, Hf, Ta, Th and U.

  16. Regional energy facility siting analysis

    International Nuclear Information System (INIS)

    Eberhart, R.C.; Eagles, T.W.

    1976-01-01

    Results of the energy facility siting analysis portion of a regional pilot study performed for the anticipated National Energy Siting and Facility Report are presented. The question of cell analysis versus site-specific analysis is explored, including an evaluation of the difference in depth between the two approaches. A discussion of the possible accomplishments of regional analysis is presented. It is concluded that regional siting analysis could be of use in a national siting study, if its inherent limits are recognized.

  17. Analysis of Lead and Zinc by Mercury-Free Potentiometric Stripping Analysis

    DEFF Research Database (Denmark)

    Andersen, Jens Enevold Thaulov

    1997-01-01

    A method is presented for trace-element analysis of lead and zinc by potentiometric stripping analysis (PSA) where both the glassy-carbon working electrode and the electrolyte are free of mercury. Analysis of zinc requires an activation procedure of the glassy-carbon electrode. The activation...... is performed by pre-concentrating zinc on glassy carbon at -1400 mV(SCE) in a mercury-free electrolyte containing 0.1 M HCl and 2 ppm Zn2+, followed by stripping at approx. -1050 mV. A linear relationship between stripping peak areas, recorded in the derivative mode, and concentration was found...
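
    The reported linear relationship between stripping peak area and concentration is what makes quantification possible; a hedged sketch of such a calibration follows (the numbers are invented placeholders, not data from the paper):

        import numpy as np

        conc_ppm = np.array([0.5, 1.0, 2.0, 4.0, 8.0])    # Pb2+ standards
        peak_area = np.array([0.9, 2.1, 4.0, 8.3, 16.1])  # derivative-mode areas

        slope, intercept = np.polyfit(conc_ppm, peak_area, 1)  # linear calibration
        unknown_area = 6.5
        print(f"estimated concentration: {(unknown_area - intercept) / slope:.2f} ppm")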

  18. Evaluation and presentation of analysis methods for reception analysis in reprocessing

    International Nuclear Information System (INIS)

    Mainka, E.

    1985-01-01

    The fissile material content in the dissolving or balancing tank of a reprocessing plant has special significance in nuclear fuel balancing. This is the first opportunity for destructive analysis of the fuel content of the material after burn-up of fuel elements in the reactor. In the current state-of-the-art, all balancing methods are based directly or indirectly on data obtained by chemical analysis. The following methods are evaluated: Mass-spectroscopic isotope dilution analysis, X-ray fluorescence spectroscopy, Isotopic correlation, Gamma absorptiometry, Redox titration, Emission spectroscopy after plasma excitation, Alpha spectroscopy, and Laser Raman spectroscopy
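
    For the first method listed, mass-spectroscopic isotope dilution, the standard working relation (textbook form, assumed here for orientation) determines the analyte amount from the measured isotope ratio of the spiked sample:

        \[
          N_x = N_s \,\frac{R_s - R_m}{R_m - R_x},
        \]

    where N_s is the amount of spike, R_s and R_x are the isotope ratios of spike and sample, R_m is the measured ratio of the blend, and amounts are counted in moles of the reference isotope.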

  19. Analysis apparatus and method of analysis

    International Nuclear Information System (INIS)

    1976-01-01

    A continuous streaming method developed for the execution of immunoassays is described in this patent. In addition, a suitable apparatus for the method was developed, whereby magnetic particles are automatically employed for the consecutive analysis of a series of liquid samples via the RIA technique.

  20. Defining disease beyond conceptual analysis: an analysis of conceptual analysis in philosophy of medicine.

    Science.gov (United States)

    Lemoine, Maël

    2013-08-01

    Conceptual analysis of health and disease is portrayed as consisting in the confrontation of a set of criteria--a "definition"--with a set of cases, called instances of either "health" or "disease." Apart from logical counter-arguments, there is no other way to refute an opponent's definition than by providing counter-cases. As resorting to intensional stipulation (stipulation of meaning) is not forbidden, several contenders can therefore be deemed to have succeeded. This implies that conceptual analysis alone is not likely to decide between naturalism and normativism. An alternative to this approach would be to examine whether the concept of disease can be naturalized.

  1. Mastering Clojure data analysis

    CERN Document Server

    Rochester, Eric

    2014-01-01

    This book takes a practical, example-oriented approach that aims to help you learn how to use Clojure for data analysis quickly and efficiently. It is great for those who have experience with Clojure and who need to use it to perform data analysis, and it will also be hugely beneficial for readers with basic experience in data analysis and statistics.

  2. Evaluation of Analysis by Cross-Validation, Part II: Diagnostic and Optimization of Analysis Error Covariance

    Directory of Open Access Journals (Sweden)

    Richard Ménard

    2018-02-01

    We present a general theory of estimation of analysis error covariances based on cross-validation, as well as a geometric interpretation of the method. In particular, we use the variance of passive observation-minus-analysis residuals and show that the true analysis error variance can be estimated without relying on the optimality assumption. This approach is used to obtain near-optimal analyses that are then used to evaluate the air quality analysis error using several different methods at active and passive observation sites. We compare the estimates according to the method of Hollingsworth-Lönnberg, Desroziers et al., a new diagnostic we developed, and the perceived analysis error computed from the analysis scheme, to conclude that, as long as the analysis is near optimal, all estimates agree within a certain error margin.
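
    The idea behind the passive-residual diagnostic can be stated compactly (standard notation, assuming observation and analysis errors are uncorrelated at sites withheld from the analysis):

        \[
          \operatorname{Var}(O - A) = \sigma_o^2 + \sigma_a^2 ,
        \]

    so once the observation error variance \sigma_o^2 is known, the analysis error variance \sigma_a^2 at passive sites follows from the residual variance, with no optimality assumption needed.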

  3. Convex analysis

    CERN Document Server

    Rockafellar, Ralph Tyrell

    2015-01-01

    Available for the first time in paperback, R. Tyrrell Rockafellar's classic study presents readers with a coherent branch of nonlinear mathematical analysis that is especially suited to the study of optimization problems. Rockafellar's theory differs from classical analysis in that differentiability assumptions are replaced by convexity assumptions. The topics treated in this volume include: systems of inequalities, the minimum or maximum of a convex function over a convex set, Lagrange multipliers, minimax theorems and duality, as well as basic results about the structure of convex sets and

  4. Outlier analysis

    CERN Document Server

    Aggarwal, Charu C

    2013-01-01

    With the increasing advances in hardware technology for data collection, and advances in software technology (databases) for data organization, computer scientists have increasingly participated in the latest advancements of the outlier analysis field. Computer scientists, specifically, approach this field based on their practical experiences in managing large amounts of data, and with far fewer assumptions: the data can be of any type, structured or unstructured, and may be extremely large. Outlier Analysis is a comprehensive exposition, as understood by data mining experts, statisticians and

  5. Elementary analysis

    CERN Document Server

    Snell, K S; Langford, W J; Maxwell, E A

    1966-01-01

    Elementary Analysis, Volume 2 introduces several of the ideas of modern mathematics in a casual manner and provides the practical experience in algebraic and analytic operations that lays a sound foundation of basic skills. This book focuses on the nature of number, algebraic and logical structure, groups, rings, fields, vector spaces, matrices, sequences, limits, functions and inverse functions, complex numbers, and probability. The logical structure of analysis given through the treatment of differentiation and integration, with applications to the trigonometric and logarithmic functions, is

  6. Task analysis and computer aid development for human reliability analysis in nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Yoon, W. C.; Kim, H.; Park, H. S.; Choi, H. H.; Moon, J. M.; Heo, J. Y.; Ham, D. H.; Lee, K. K.; Han, B. T. [Korea Advanced Institute of Science and Technology, Taejeon (Korea)

    2001-04-01

    The importance of human reliability analysis (HRA), which predicts the possibility of error occurrence in quantitative and qualitative manners, is gradually increasing because of the effects of human error on system safety. HRA needs task analysis as a preliminary step, but extant task analysis techniques have the problem that the collection of information about the situation in which the human error occurs depends entirely on the HRA analyzers. This problem makes the results of the task analysis inconsistent and unreliable. To complement this problem, KAERI developed the structural information analysis (SIA), which helps to analyze a task's structure and situations systematically. In this study, the SIA method was evaluated by HRA experts, and a prototype computerized supporting system named CASIA (Computer Aid for SIA) was developed for the purpose of supporting the performance of HRA using the SIA method. Additionally, by applying the SIA method to emergency operating procedures, we derived generic task types used in emergencies and accumulated the analysis results in the database of the CASIA. The CASIA is expected to help HRA analyzers perform the analysis more easily and consistently. If more analyses are performed and more data are accumulated in the CASIA's database, HRA analyzers can freely share and smoothly spread their analysis experiences, and thereby the quality of the HRA analysis will be improved. 35 refs., 38 figs., 25 tabs. (Author)

  7. Structured information analysis for human reliability analysis of emergency tasks in nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Won Dea; Kim, Jae Whan; Park, Jin Kyun; Ha, Jae Joo [Korea Atomic Energy Research Institute, Taejeon (Korea)

    2000-02-01

    More than twenty HRA (Human Reliability Analysis) methodologies have been developed and used for the safety analysis in nuclear field during the past two decades. However, no methodology appears to have universally been accepted, as various limitations have been raised for more widely used ones. One of the most important limitations of conventional HRA is insufficient analysis of the task structure and problem space. To resolve this problem, we suggest SIA (Structured Information Analysis) for HRA. The proposed SIA consists of three parts. The first part is the scenario analysis that investigates the contextual information related to the given task on the basis of selected scenarios. The second is the goals-means analysis to define the relations between the cognitive goal and task steps. The third is the cognitive function analysis module that identifies the cognitive patterns and information flows involved in the task. Through the three-part analysis, systematic investigation is made possible from the macroscopic information on the tasks to the microscopic information on the specific cognitive processes. It is expected that analysts can attain a structured set of information that helps to predict the types and possibility of human error in the given task. 48 refs., 12 figs., 11 tabs. (Author)

  8. Maritime transportation risk analysis: Review and analysis in light of some foundational issues

    International Nuclear Information System (INIS)

    Goerlandt, Floris; Montewka, Jakub

    2015-01-01

    Many methods and applications for maritime transportation risk analysis have been presented in the literature. In parallel, there is a recent focus on foundational issues in risk analysis, with calls for intensified research on fundamental concepts and principles underlying the scientific field. This paper presents a review and analysis of risk definitions, perspectives and scientific approaches to risk analysis found in the maritime transportation application area, focusing on applications addressing accidental risk of shipping in a sea area. For this purpose, a classification of risk definitions, an overview of elements in risk perspectives and a classification of approaches to risk analysis science are applied. Results reveal that in the application area, risk is strongly tied to probability, both in definitions and perspectives, while alternative views exist. A diffuse situation is also found concerning the scientific approach to risk analysis, with realist, proceduralist and constructivist foundations co-existing. Realist approaches dominate the application area. Very few applications systematically account for uncertainty, neither concerning the evidence base nor in relation to the limitations of the risk model in relation to the space of possible outcomes. Some suggestions are made to improve the current situation, aiming to strengthen the scientific basis for risk analysis. - Highlights: • Risk analyses in maritime transportation analysed in light of foundational issues. • Focus on definitions, perspectives and scientific approaches to risk analysis. • Probability-based definitions and realist approaches dominate the field. • Findings support calls for increased focus on foundational issues in risk research. • Some suggestions are made to improve the current situation

  9. Analysis of a Braking System on the Basis of Structured Analysis Methods

    OpenAIRE

    Ben Salem J.; Lakhoua M.N.; El Amraoui L.

    2016-01-01

    In this paper, we present the general context of research in the domain of analysis and modeling of mechatronic systems. We present a bibliographic review of research works on the systemic analysis of mechatronic systems. To better understand their characteristics, we start with an introduction to mechatronic systems and the various fields related to them, then present a few analysis and design methods applied to mechatronic systems. Finally, we apply the two...

  10. Integrated piping structural analysis system

    International Nuclear Information System (INIS)

    Motoi, Toshio; Yamadera, Masao; Horino, Satoshi; Idehata, Takamasa

    1979-01-01

    Structural analysis of piping systems for nuclear power plants has become larger in scale and quantity. In addition, higher quality analysis is nowadays regarded as of major importance from the point of view of nuclear plant safety. In order to fulfill the above requirements, an integrated piping structural analysis system (ISAP-II) has been developed. The basic philosophy of this system is as follows: 1. To apply a database system in which all information is concentrated. 2. To minimize the manual process in analysis, evaluation and documentation, especially by applying the graphic system as much as possible. On the basis of the above philosophy, four subsystems were made: 1. Data control subsystem. 2. Analysis subsystem. 3. Plotting subsystem. 4. Report subsystem. The function of the data control subsystem is to control all information in the database. Piping structural analysis can be performed using the analysis subsystem. Isometric piping drawings and mode shapes, etc., can be plotted using the plotting subsystem. A total analysis report can be produced without manual processing through the report subsystem. (author)

  11. Longitudinal Meta-analysis

    NARCIS (Netherlands)

    Hox, J.J.; Maas, C.J.M.; Lensvelt-Mulders, G.J.L.M.

    2004-01-01

    The goal of meta-analysis is to integrate the research results of a number of studies on a specific topic. Characteristic for meta-analysis is that in general only the summary statistics of the studies are used and not the original data. When the published research results to be integrated

  12. Enabling interdisciplinary analysis

    Science.gov (United States)

    L. M. Reid

    1996-01-01

    New requirements for evaluating environmental conditions in the Pacific Northwest have led to increased demands for interdisciplinary analysis of complex environmental problems. Procedures for watershed analysis have been developed for use on public and private lands in Washington State (Washington Forest Practices Board 1993) and for federal lands in the Pacific...

  13. Incorporating technical analysis in undergraduate curricula

    Directory of Open Access Journals (Sweden)

    Michael R. Melton

    2017-11-01

    Purpose – The purpose of this paper is to introduce instruction of technical analysis at the undergraduate level that can coincide with traditional teachings of fundamental analysis. Design/methodology/approach – Through examples using the latest security analysis technology, this paper illustrates the importance of technical security analysis. Findings – This research illustrates how technical analysis techniques may be used to make more significant investment decisions. Originality/value – Kirkpatrick and Dahlquist define technical analysis as a security analysis discipline for forecasting the future direction of prices through the study of past market data, primarily price and volume. This form of analysis has stood in direct contrast to the fundamental analysis approach, whereby the actual facts of the company, its industry and sector may be ignored. Understanding this contrast, much of academia has chosen to continue to focus its finance curricula on fundamental analysis techniques. As more universities implement trading rooms to reflect those of industry, they must recognize that any large brokerage trading group or financial institution will typically have both a technical analysis and a fundamental analysis team. Thus the need to incorporate technical analysis into undergraduate finance curricula.
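
    As a classroom-style complement to the discussion above, the sketch below shows one elementary technical-analysis rule, a simple moving-average crossover, of the kind such curricula typically begin with. It is a generic illustration, not code from the paper; the price series and window lengths are invented.

```python
# Hypothetical moving-average crossover signal (illustrative data).
import pandas as pd

prices = pd.Series([10, 11, 12, 11, 13, 14, 13, 15, 16, 15, 14, 13],
                   name="close")  # invented closing prices

fast = prices.rolling(window=3).mean()  # short-term trend
slow = prices.rolling(window=5).mean()  # longer-term trend

# +1 marks a bullish crossover (fast rises above slow); -1 a bearish one.
signal = (fast > slow).astype(int).diff()
print(signal.dropna())
```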

  14. Survival analysis models and applications

    CERN Document Server

    Liu, Xian

    2012-01-01

    Survival analysis concerns sequential occurrences of events governed by probabilistic laws.  Recent decades have witnessed many applications of survival analysis in various disciplines. This book introduces both classic survival models and theories along with newly developed techniques. Readers will learn how to perform analysis of survival data by following numerous empirical illustrations in SAS. Survival Analysis: Models and Applications: Presents basic techniques before leading onto some of the most advanced topics in survival analysis.Assumes only a minimal knowledge of SAS whilst enablin

  15. WWW-based remote analysis framework for UniSampo and Shaman analysis software

    International Nuclear Information System (INIS)

    Aarnio, P.A.; Ala-Heikkilae, J.J.; Routti, J.T.; Nikkinen, M.T.

    2005-01-01

    UniSampo and Shaman are well-established analytical tools for gamma-ray spectrum analysis and the subsequent radionuclide identification. These tools are normally run locally on a Unix or Linux workstation in interactive mode. However, it is also possible to run them in batch/non-interactive mode by starting them with the correct parameters. This is how they are used in the standard analysis pipeline operation. This functionality also makes it possible to use them for remote operation over the network. A framework for running UniSampo and Shaman analyses using the standard WWW protocol has been developed. A WWW server receives requests from the client WWW browser and runs the analysis software via a set of CGI scripts. Authentication, input data transfer, and output and display of the final analysis results are all carried out using standard WWW mechanisms. This WWW framework can be utilized, for example, by organizations that have radioactivity surveillance stations in a wide area. A computer with a standard internet/intranet connection suffices for on-site analyses. (author)

  16. Integrated genetic analysis microsystems

    International Nuclear Information System (INIS)

    Lagally, Eric T; Mathies, Richard A

    2004-01-01

    With the completion of the Human Genome Project and the ongoing DNA sequencing of the genomes of other animals, bacteria, plants and others, a wealth of new information about the genetic composition of organisms has become available. However, as the demand for sequence information grows, so does the workload required both to generate this sequence and to use it for targeted genetic analysis. Microfabricated genetic analysis systems are well poised to assist in the collection and use of these data through increased analysis speed, lower analysis cost and higher parallelism leading to increased assay throughput. In addition, such integrated microsystems may point the way to targeted genetic experiments on single cells and in other areas that are otherwise very difficult. Concomitant with these advantages, such systems, when fully integrated, should be capable of forming portable systems for high-speed in situ analyses, enabling a new standard in disciplines such as clinical chemistry, forensics, biowarfare detection and epidemiology. This review will discuss the various technologies available for genetic analysis on the microscale, and efforts to integrate them to form fully functional robust analysis devices. (topical review)

  17. Physics Metacognition Inventory Part II: Confirmatory factor analysis and Rasch analysis

    Science.gov (United States)

    Taasoobshirazi, Gita; Bailey, MarLynn; Farley, John

    2015-11-01

    The Physics Metacognition Inventory was developed to measure physics students' metacognition for problem solving. In one of our earlier studies, an exploratory factor analysis provided evidence of preliminary construct validity, revealing six components of students' metacognition when solving physics problems including knowledge of cognition, planning, monitoring, evaluation, debugging, and information management. The college students' scores on the inventory were found to be reliable and related to students' physics motivation and physics grade. However, the results of the exploratory factor analysis indicated that the questionnaire could be revised to improve its construct validity. The goal of this study was to revise the questionnaire and establish its construct validity through a confirmatory factor analysis. In addition, a Rasch analysis was applied to the data to better understand the psychometric properties of the inventory and to further evaluate the construct validity. Results indicated that the final, revised inventory is a valid, reliable, and efficient tool for assessing student metacognition for physics problem solving.

  18. Abstract Interfaces for Data Analysis Component Architecture for Data Analysis Tools

    CERN Document Server

    Barrand, G; Dönszelmann, M; Johnson, A; Pfeiffer, A

    2001-01-01

    The fast turnover of software technologies, in particular in the domain of interactivity (covering user interface and visualisation), makes it difficult for a small group of people to produce complete and polished software-tools before the underlying technologies make them obsolete. At the HepVis '99 workshop, a working group has been formed to improve the production of software tools for data analysis in HENP. Beside promoting a distributed development organisation, one goal of the group is to systematically design a set of abstract interfaces based on using modern OO analysis and OO design techniques. An initial domain analysis has come up with several categories (components) found in typical data analysis tools: Histograms, Ntuples, Functions, Vectors, Fitter, Plotter, Analyzer and Controller. Special emphasis was put on reducing the couplings between the categories to a minimum, thus optimising re-use and maintainability of any component individually. The interfaces have been defined in Java and C++ and i...

  19. Virtual data in CMS analysis

    International Nuclear Information System (INIS)

    Arbree, A.

    2003-01-01

    The use of virtual data for enhancing the collaboration between large groups of scientists is explored in several ways: by defining "virtual" parameter spaces which can be searched and shared in an organized way by a collaboration of scientists in the course of their analysis; by providing a mechanism to log the provenance of results and the ability to trace them back to the various stages in the analysis of real or simulated data; by creating "check points" in the course of an analysis to permit collaborators to explore their own analysis branches by refining selections, improving the signal to background ratio, varying the estimation of parameters, etc.; by facilitating the audit of an analysis and the reproduction of its results by a different group, or in a peer review context. We describe a prototype for the analysis of data from the CMS experiment based on the virtual data system Chimera and the object-oriented data analysis framework ROOT. The Chimera system is used to chain together several steps in the analysis process including the Monte Carlo generation of data, the simulation of detector response, the reconstruction of physics objects and their subsequent analysis, histogramming and visualization using the ROOT framework

  20. Contrast and Critique of Two Approaches to Discourse Analysis: Conversation Analysis and Speech Act Theory

    Directory of Open Access Journals (Sweden)

    Nguyen Van Han

    2014-08-01

    Discourse analysis, as Murcia and Olshtain (2000) assume, is a vast study of language in use that extends beyond the sentence level, and it involves a more cognitive and social perspective on language use and communication exchanges. Encompassing a wide range of phenomena about language in relation to society, culture and thought, discourse analysis contains various approaches: speech act theory, pragmatics, conversation analysis, variation analysis, and critical discourse analysis. Each approach works in its own domain of discourse. On one dimension, it shares the same assumptions or general problems in discourse analysis with the other approaches: for instance, the explanation of how we organize language into units beyond sentence boundaries, or how language is used to convey information about the world, ourselves and human relationships (Schiffrin 1994: viii). On other dimensions, each approach holds distinctive characteristics contributing to the vastness of discourse analysis. This paper will mainly discuss two approaches to discourse analysis, conversation analysis and speech act theory, and will attempt to point out some similarities as well as contrasting features between the two approaches, followed by a short reflection on their strengths and weaknesses. The organizational and discourse features in the exchanges among three teachers at the College of Finance and Customs in Vietnam will be analysed in terms of conversation analysis and speech act theory.

  1. ADAGE signature analysis: differential expression analysis with data-defined gene sets.

    Science.gov (United States)

    Tan, Jie; Huyck, Matthew; Hu, Dongbo; Zelaya, René A; Hogan, Deborah A; Greene, Casey S

    2017-11-22

    Gene set enrichment analysis and overrepresentation analyses are commonly used methods to determine the biological processes affected by a differential expression experiment. This approach requires biologically relevant gene sets, which are currently curated manually, limiting their availability and accuracy in many organisms without extensively curated resources. New feature learning approaches can now be paired with existing data collections to directly extract functional gene sets from big data. Here we introduce a method to identify perturbed processes. In contrast with methods that use curated gene sets, this approach uses signatures extracted from public expression data. We first extract expression signatures from public data using ADAGE, a neural network-based feature extraction approach. We next identify signatures that are differentially active under a given treatment. Our results demonstrate that these signatures represent biological processes that are perturbed by the experiment. Because these signatures are directly learned from data without supervision, they can identify uncurated or novel biological processes. We implemented ADAGE signature analysis for the bacterial pathogen Pseudomonas aeruginosa. For the convenience of different user groups, we implemented both an R package (ADAGEpath) and a web server ( http://adage.greenelab.com ) to run these analyses. Both are open-source to allow easy expansion to other organisms or signature generation methods. We applied ADAGE signature analysis to an example dataset in which wild-type and ∆anr mutant cells were grown as biofilms on the Cystic Fibrosis genotype bronchial epithelial cells. We mapped active signatures in the dataset to KEGG pathways and compared with pathways identified using GSEA. The two approaches generally return consistent results; however, ADAGE signature analysis also identified a signature that revealed the molecularly supported link between the MexT regulon and Anr. We designed
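
    The mechanics of differential signature activity can be sketched in a few lines: project expression data onto learned signature weight vectors, then test each signature's activity between conditions. This is a schematic illustration only, not the ADAGEpath API; the expression matrix, weights, and group sizes are random stand-ins.

```python
# Schematic signature-activity test (random stand-ins, not real ADAGE output).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
expression = rng.normal(size=(100, 6))  # 100 genes x 6 samples (3 WT, 3 mutant)
weights = rng.normal(size=(100, 10))    # 100 genes x 10 learned signatures

activity = weights.T @ expression       # signature activity per sample (10 x 6)

# Per-signature two-sample t-test: columns 0-2 (WT) vs. columns 3-5 (mutant).
t, p = stats.ttest_ind(activity[:, :3], activity[:, 3:], axis=1)
print("differentially active signatures:", np.flatnonzero(p < 0.05))
```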

  2. Gait Analysis Using Wearable Sensors

    Directory of Open Access Journals (Sweden)

    Hutian Feng

    2012-02-01

    Gait analysis using wearable sensors is an inexpensive, convenient, and efficient manner of providing useful information for multiple health-related applications. As a clinical tool applied in the rehabilitation and diagnosis of medical conditions and sport activities, gait analysis using wearable sensors shows great prospects. The current paper reviews available wearable sensors and ambulatory gait analysis methods based on the various wearable sensors. After an introduction of the gait phases, the principles and features of wearable sensors used in gait analysis are provided. The gait analysis methods based on wearable sensors are divided into gait kinematics, gait kinetics, and electromyography. Studies on the current methods are reviewed, and applications in sports, rehabilitation, and clinical diagnosis are summarized separately. With the development of sensor technology and the analysis method, gait analysis using wearable sensors is expected to play an increasingly important role in clinical applications.

  3. Gait Analysis Using Wearable Sensors

    Science.gov (United States)

    Tao, Weijun; Liu, Tao; Zheng, Rencheng; Feng, Hutian

    2012-01-01

    Gait analysis using wearable sensors is an inexpensive, convenient, and efficient manner of providing useful information for multiple health-related applications. As a clinical tool applied in the rehabilitation and diagnosis of medical conditions and sport activities, gait analysis using wearable sensors shows great prospects. The current paper reviews available wearable sensors and ambulatory gait analysis methods based on the various wearable sensors. After an introduction of the gait phases, the principles and features of wearable sensors used in gait analysis are provided. The gait analysis methods based on wearable sensors are divided into gait kinematics, gait kinetics, and electromyography. Studies on the current methods are reviewed, and applications in sports, rehabilitation, and clinical diagnosis are summarized separately. With the development of sensor technology and the analysis method, gait analysis using wearable sensors is expected to play an increasingly important role in clinical applications. PMID:22438763
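
    One of the gait-kinematics steps summarized above can be sketched concretely: estimate step events, and hence cadence, by peak-picking a vertical acceleration trace. The sketch uses a synthetic signal and an assumed 100 Hz sampling rate; the thresholds are illustrative, not values from the review.

```python
# Toy cadence estimate from a synthetic accelerometer trace.
import numpy as np
from scipy.signal import find_peaks

fs = 100.0                                    # sampling rate in Hz (assumed)
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(1)
accel = np.sin(2 * np.pi * 1.8 * t) + 0.1 * rng.standard_normal(t.size)

# Heel strikes appear as prominent peaks (~1.8 steps/s in this fake trace).
peaks, _ = find_peaks(accel, height=0.5, distance=0.4 * fs)
cadence = 60.0 * len(peaks) / (t[-1] - t[0])  # steps per minute
print(f"estimated cadence: {cadence:.0f} steps/min")
```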

  4. Safety analysis for research reactors

    International Nuclear Information System (INIS)

    2008-01-01

    The aim of safety analysis for research reactors is to establish and confirm the design basis for items important to safety using appropriate analytical tools. The design, manufacture, construction and commissioning should be integrated with the safety analysis to ensure that the design intent has been incorporated into the as-built reactor. Safety analysis assesses the performance of the reactor against a broad range of operating conditions, postulated initiating events and other circumstances, in order to obtain a complete understanding of how the reactor is expected to perform in these situations. Safety analysis demonstrates that the reactor can be kept within the safety operating regimes established by the designer and approved by the regulatory body. This analysis can also be used as appropriate in the development of operating procedures, periodic testing and inspection programmes, proposals for modifications and experiments and emergency planning. The IAEA Safety Requirements publication on the Safety of Research Reactors states that the scope of safety analysis is required to include analysis of event sequences and evaluation of the consequences of the postulated initiating events and comparison of the results of the analysis with radiological acceptance criteria and design limits. This Safety Report elaborates on the requirements established in IAEA Safety Standards Series No. NS-R-4 on the Safety of Research Reactors, and the guidance given in IAEA Safety Series No. 35-G1, Safety Assessment of Research Reactors and Preparation of the Safety Analysis Report, providing detailed discussion and examples of related topics. Guidance is given in this report for carrying out safety analyses of research reactors, based on current international good practices. The report covers all the various steps required for a safety analysis; that is, selection of initiating events and acceptance criteria, rules and conventions, types of safety analysis, selection of

  5. Fast neutron activation analysis

    International Nuclear Information System (INIS)

    Pepelnik, R.

    1986-01-01

    Since 1981, numerous 14 MeV neutron activation analyses have been performed at Korona. On the basis of that work, the advantages of this analysis technique and the results obtained with it are compared with those of other analytical methods. The procedure of activation analysis, the characteristics of Korona, some analytical investigations in environmental research and materials physics, as well as sources of systematic errors in trace analysis are described. (orig.) [de]

  6. The ATLAS Analysis Architecture

    International Nuclear Information System (INIS)

    Cranmer, K.S.

    2008-01-01

    We present an overview of the ATLAS analysis architecture including the relevant aspects of the computing model and the major architectural aspects of the Athena framework. Emphasis will be given to the interplay between the analysis use cases and the technical aspects of the architecture including the design of the event data model, transient-persistent separation, data reduction strategies, analysis tools, and ROOT interoperability

  7. Dynamical coupled-channel analysis at EBAC. (Excited Baryon Analysis Center)

    International Nuclear Information System (INIS)

    Lee, T.-S.H.; Thomas Jefferson National Accelerator Facility, Newport News, VA

    2008-01-01

    In this contribution, the author reports on the dynamical coupled-channels analysis being pursued at the Excited Baryon Analysis Center (EBAC) of Jefferson Laboratory. EBAC was established in January 2006. Its objective is to extract the parameters associated with the excited states (N*) of the nucleon from the world data of meson production reactions, and to also develop theoretical interpretations of the extracted N* parameters

  8. Citation analysis of meta-analysis articles on posttraumatic stress disorder.

    Science.gov (United States)

    Liao, Xi-Ming; Chen, Ping-Yan

    2011-04-01

    In the past two decades an enormous amount of scientific research on posttraumatic stress disorder (PTSD) has been undertaken and many related meta-analyses have been published. Citation analysis was used to obtain comprehensive perspectives on meta-analysis articles (MA articles) on PTSD, for the purpose of helping researchers, physicians and policy-makers to understand PTSD. MA articles on PTSD in any language from January 1980 to March 2009 were included if they presented meta-analytical methods and received at least one citation recorded in the Web of Science (WoS). Studies in which effect sizes for PTSD were not distinguished from those for other psychological disorders were excluded. Citations to and by identified MA articles were documented based on records in WoS. Citation analysis was used to examine distribution patterns of the characteristics and citation impact of MA articles on PTSD. Canonical analysis was used to explore the relationship between the characteristics of MA articles and citation impact. Thirty-four MA articles published between 1998 and 2008 were identified, revealing multiple study topics on PTSD: 10 (29.4%) were about epidemiology, 13 (38.2%) about treatment or intervention, 6 (17.6%) about pathophysiology, neurophysiology or neuroendocrine aspects, 3 (8.8%) about childhood and 2 (5.9%) about psychosocial adversity. The two articles cited most frequently, with 456 and 145 counts, were published in the Journal of Consulting and Clinical Psychology by Brewin (2000) and Psychological Bulletin by Ozer (2003), respectively. The mean cited count was 7.48 ± 10.56 and the mean age (year 2009 minus article publication year) was (4.24 ± 2.91) years. The articles had been cited by approximately 67 disciplines and by authors from 42 countries or territories. Characteristics of the meta-analyses correlated highly with citation impact, reflected by a canonical correlation of 0.899 (P < 0.00001). The age of MA articles predicted their citation impact. Citation analysis would

  9. Clustering analysis

    International Nuclear Information System (INIS)

    Romli

    1997-01-01

    Cluster analysis is the name of a group of multivariate techniques whose principal purpose is to distinguish similar entities from the characteristics they possess. Several algorithms can be used for this analysis, and this topic therefore focuses on them: similarity measures, and hierarchical clustering, which includes the single linkage, complete linkage and average linkage methods. Non-hierarchical clustering, popularly known as the K-means method, will also be discussed. Finally, this paper describes the advantages and disadvantages of each method.
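
    The two families named above can each be run in a few lines with standard scientific Python libraries; the sketch below pairs average-linkage hierarchical clustering with K-means on a handful of invented 2-D points.

```python
# Hierarchical (single/complete/average linkage) vs. K-means on toy data.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from sklearn.cluster import KMeans

X = np.array([[1.0, 1.1], [1.2, 0.9], [5.0, 5.2], [5.1, 4.8], [9.0, 9.1]])

# 'method' may be "single", "complete" or "average", as discussed above.
Z = linkage(X, method="average")
labels_h = fcluster(Z, t=2, criterion="maxclust")  # cut the tree into 2 clusters

labels_k = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
print(labels_h, labels_k)
```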

  10. Reentry analysis

    International Nuclear Information System (INIS)

    Biehl, F.A.

    1984-05-01

    This paper presents the criteria, previous nuclear experience in space, analysis techniques, and possible breakup enhancement devices applicable to an acceptable SP-100 reentry from space. Reactor operation in nuclear-safe orbit will minimize the radiological risk; the remaining safeguards criteria need to be defined. A simple analytical point mass reentry technique and a more comprehensive analysis method that considers vehicle dynamics and orbit insertion malfunctions are presented. Vehicle trajectory, attitude, and possible breakup enhancement devices will be integrated in the simulation as required to ensure an adequate representation of the reentry process

  11. Cluster analysis

    CERN Document Server

    Everitt, Brian S; Leese, Morven; Stahl, Daniel

    2011-01-01

    Cluster analysis comprises a range of methods for classifying multivariate data into subgroups. By organizing multivariate data into such subgroups, clustering can help reveal the characteristics of any structure or patterns present. These techniques have proven useful in a wide range of areas such as medicine, psychology, market research and bioinformatics.This fifth edition of the highly successful Cluster Analysis includes coverage of the latest developments in the field and a new chapter dealing with finite mixture models for structured data.Real life examples are used throughout to demons

  12. NGNP Data Management and Analysis System Analysis and Web Delivery Capabilities

    Energy Technology Data Exchange (ETDEWEB)

    Cynthia D. Gentillon

    2011-09-01

    Projects for the Very High Temperature Reactor (VHTR) Technology Development Office provide data in support of Nuclear Regulatory Commission licensing of the very high temperature reactor. Fuel and materials to be used in the reactor are tested and characterized to quantify performance in high-temperature and high-fluence environments. The NGNP Data Management and Analysis System (NDMAS) at the Idaho National Laboratory has been established to ensure that VHTR data are (1) qualified for use, (2) stored in a readily accessible electronic form, and (3) analyzed to extract useful results. This document focuses on the third NDMAS objective. It describes capabilities for displaying the data in meaningful ways and for data analysis to identify useful relationships among the measured quantities. The capabilities are described from the perspective of NDMAS users, starting with those who just view experimental data and analytical results on the INL NDMAS web portal. Web display and delivery capabilities are described in detail. Also the current web pages that show Advanced Gas Reactor, Advanced Graphite Capsule, and High Temperature Materials test results are itemized. Capabilities available to NDMAS developers are more extensive, and are described using a second series of examples. Much of the data analysis efforts focus on understanding how thermocouple measurements relate to simulated temperatures and other experimental parameters. Statistical control charts and correlation monitoring provide an ongoing assessment of instrument accuracy. Data analysis capabilities are virtually unlimited for those who use the NDMAS web data download capabilities and the analysis software of their choice. Overall, the NDMAS provides convenient data analysis and web delivery capabilities for studying a very large and rapidly increasing database of well-documented, pedigreed data.
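
    The statistical control charts mentioned above can be illustrated with a minimal sketch: derive mean ± 3σ control limits from an in-control baseline and flag later readings that fall outside them. The temperatures and limits below are invented, not NDMAS data.

```python
# Minimal Shewhart-style control limits for instrument monitoring.
import numpy as np

rng = np.random.default_rng(2)
baseline = rng.normal(850.0, 2.0, size=200)  # in-control readings, deg C (fake)

center = baseline.mean()
sigma = baseline.std(ddof=1)
ucl, lcl = center + 3 * sigma, center - 3 * sigma  # upper/lower control limits

new_readings = np.array([851.2, 849.7, 858.9, 850.4])
flagged = new_readings[(new_readings > ucl) | (new_readings < lcl)]
print("out-of-control readings:", flagged)
```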

  13. Blind Analysis in Particle Physics

    International Nuclear Information System (INIS)

    Roodman, A

    2003-01-01

    A review of the blind analysis technique, as used in particle physics measurements, is presented. The history of blind analyses in physics is briefly discussed. Next the dangers of and the advantages of a blind analysis are described. Three distinct kinds of blind analysis in particle physics are presented in detail. Finally, the BABAR collaboration's experience with the blind analysis technique is discussed
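
    One widely used blinding scheme, hidden-offset blinding, is easy to sketch: a random offset generated from a secret seed is added to the fitted parameter and subtracted only once the analysis is frozen. The numbers and seed handling below are illustrative, not BABAR's actual procedure.

```python
# Hidden-offset blinding sketch (all values illustrative).
import numpy as np

rng = np.random.default_rng(seed=20031234)  # seed kept secret until unblinding
hidden_offset = rng.uniform(-1.0, 1.0)

measured_value = 0.42                            # result of the (blinded) fit
blinded_value = measured_value + hidden_offset   # the only value analysts see

# ... cuts and systematics are finalized while still blind ...

unblinded_value = blinded_value - hidden_offset  # revealed once, at the end
print(f"unblinded result: {unblinded_value:.2f}")
```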

  14. PIXE analysis of thin samples

    International Nuclear Information System (INIS)

    Kiss, Ildiko; Koltay, Ede; Szabo, Gyula; Laszlo, S.; Meszaros, A.

    1985-01-01

    Particle-induced X-ray emission (PIXE) multielemental analyses of thin film samples are reported. Calibration methods for the K and L X-ray lines are discussed. The application of PIXE analysis to aerosol monitoring and multielement aerosol analysis is described. Results of PIXE analysis of samples from two locations in Hungary are compared with results for aerosol samples from Scandinavia and the USA. (D.Gy.)

  15. Proton exciting X ray analysis

    International Nuclear Information System (INIS)

    Ma Xinpei

    1986-04-01

    The analytical capability of proton-excited X-ray analysis for different elements in organisms was discussed, dealing with examples of trace element analysis in the human body and animal organisms, such as blood serum, urine, and hair. The sensitivity, accuracy, and multielement analysis capability were discussed, and the method's strong points for trace element analysis in biomedicine were explained

  16. COMPARATIVE ANALYSIS BETWEEN THE FUNDAMENTAL AND TECHNICAL ANALYSIS OF STOCKS

    Directory of Open Access Journals (Sweden)

    Nada Petrusheva

    2016-04-01

    In the world of investing and trading, in order to have a definite advantage and consistently create profit, you need to have a strategic approach. Generally speaking, the two main schools of thought and strategies in financial markets are fundamental and technical analysis. Fundamental and technical analysis differ in several aspects, such as the way of functioning and execution, the time horizon used, the tools used and their objective. These differences lead to certain advantages and disadvantages of each of the analyses. Fundamental and technical analysis are also the subject of critical reviews by the academic and scientific community, and many of these reviews concern the methods of their application, i.e. the possibility of combining the two analyses and using them complementarily to fully utilize their strengths and advantages.

  17. Canister storage building hazard analysis report

    International Nuclear Information System (INIS)

    POWERS, T.B.

    1999-01-01

    This report describes the methodology used in conducting the Canister Storage Building (CSB) hazard analysis to support the CSB final safety analysis report (FSAR) and documents the results. The hazard analysis was performed in accordance with DOE-STD-3009-94, "Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports", and meets the intent of HNF-PRO-704, "Hazard and Accident Analysis Process". This hazard analysis implements the requirements of DOE Order 5480.23, "Nuclear Safety Analysis Reports"

  18. Drift Degradation Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Dwayne C. Kicker

    2001-09-28

    A statistical description of the probable block sizes formed by fractures around the emplacement drifts has been developed for each of the lithologic units of the repository host horizon. A range of drift orientations with the drift azimuth varied in 15° increments has been considered in the static analysis. For the quasi-static seismic analysis, and the time-dependent and thermal effects analysis, two drift orientations have been considered: a drift azimuth of 105° and the current emplacement drift azimuth of 75°. The change in drift profile resulting from progressive deterioration of the emplacement drifts has been assessed both with and without backfill. Drift profiles have been determined for four different time increments, including static (i.e., upon excavation), 200 years, 2,000 years, and 10,000 years. The effect of seismic events on rock fall has been analyzed. Block size distributions and drift profiles have been determined for three seismic levels, including a 1,000-year event, a 5,000-year event, and a 10,000-year event. Data developed in this modeling and analysis activity have been entered into the TDMS (DTN: MO0109RDDAAMRR.003). The following conclusions have resulted from this drift degradation analysis: (1) The available fracture data are suitable for supporting a detailed key block analysis of the repository host horizon rock mass. The available data from the north-south Main Drift and the east-west Cross Drift provide a sufficient representative fracture sample of the repository emplacement drift horizon. However, the Tptpln fracture data are only available from a relatively small section of the Cross Drift, resulting in a smaller fracture sample size compared to the other lithologic units. This results in a lower degree of confidence that the key block data based on the Tptpln data set is actually representative of the overall Tptpln key block population. (2) The seismic effect on the rock fall size distribution for all events

  19. Longitudinal analysis of meta-analysis literatures in the database of ISI Web of Science.

    Science.gov (United States)

    Zhu, Changtai; Jiang, Ting; Cao, Hao; Sun, Wenguang; Chen, Zhong; Liu, Jinming

    2015-01-01

    The meta-analysis is regarded as an important form of evidence for making scientific decisions. The database of ISI Web of Science collects a great number of high quality publications, including meta-analysis articles. It is therefore important to understand the general characteristics of the meta-analysis literature in order to outline the perspective of meta-analysis. In the present study, we summarized and clarified some features of these publications in the database of ISI Web of Science. We retrieved the meta-analysis publications in the database of ISI Web of Science, including SCI-E, SSCI, A&HCI, CPCI-S, CPCI-SSH, CCR-E, and IC. The annual growth rate, literature category, language, funding, index citation, agencies and countries/territories of the meta-analysis publications were analyzed, respectively. A total of 95,719 records, which account for 0.38% (99% CI: 0.38%-0.39%) of all publications, were found in the database. From 1997 to 2012, the annual growth rate of meta-analysis publications was 18.18%. The publications spanned many categories, languages, funding sources, citations, publication agencies, and countries/territories. Interestingly, the index citation frequencies of meta-analyses were significantly higher than those of other literature types such as multi-centre studies, randomized controlled trials, cohort studies, case control studies, and case reports. Meta-analysis has been becoming more and more prominent in recent years. In future, in order to promote the validity of meta-analysis, the CONSORT and PRISMA standards should be continuously popularized in the field of evidence-based medicine.
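
    The 18.18% figure quoted above is a compound annual growth rate over the 1997-2012 window, and the arithmetic can be checked directly. The endpoint record counts below are hypothetical, chosen only to reproduce the reported rate (roughly a twelvefold rise over 15 years).

```python
# Compound annual growth rate: r = (N_end / N_start) ** (1 / years) - 1
n_1997, n_2012 = 1_000, 12_250  # illustrative counts, not from the paper
years = 2012 - 1997

cagr = (n_2012 / n_1997) ** (1 / years) - 1
print(f"annual growth rate: {cagr:.2%}")  # ~18.18%
```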

  20. IPAD: the Integrated Pathway Analysis Database for Systematic Enrichment Analysis.

    Science.gov (United States)

    Zhang, Fan; Drabier, Renee

    2012-01-01

    Next-Generation Sequencing (NGS) technologies and Genome-Wide Association Studies (GWAS) generate millions of reads and hundreds of datasets, and there is an urgent need for a better way to accurately interpret and distill such large amounts of data. Extensive pathway and network analysis allow for the discovery of highly significant pathways from a set of disease vs. healthy samples in the NGS and GWAS. Knowledge of activation of these processes will lead to elucidation of the complex biological pathways affected by drug treatment, to patient stratification studies of new and existing drug treatments, and to understanding the underlying anti-cancer drug effects. There are approximately 141 biological human pathway resources as of Jan 2012 according to the Pathguide database. However, most currently available resources do not contain disease, drug or organ specificity information such as disease-pathway, drug-pathway, and organ-pathway associations. Systematically integrating pathway, disease, drug and organ specificity together becomes increasingly crucial for understanding the interrelationships between signaling, metabolic and regulatory pathway, drug action, disease susceptibility, and organ specificity from high-throughput omics data (genomics, transcriptomics, proteomics and metabolomics). We designed the Integrated Pathway Analysis Database for Systematic Enrichment Analysis (IPAD, http://bioinfo.hsc.unt.edu/ipad), defining inter-association between pathway, disease, drug and organ specificity, based on six criteria: 1) comprehensive pathway coverage; 2) gene/protein to pathway/disease/drug/organ association; 3) inter-association between pathway, disease, drug, and organ; 4) multiple and quantitative measurement of enrichment and inter-association; 5) assessment of enrichment and inter-association analysis with the context of the existing biological knowledge and a "gold standard" constructed from reputable and reliable sources; and 6) cross-linking of

  1. Traffic analysis toolbox volume XI : weather and traffic analysis, modeling and simulation.

    Science.gov (United States)

    2010-12-01

    This document presents a weather module for the traffic analysis tools program. It provides traffic engineers, transportation modelers and decisions makers with a guide that can incorporate weather impacts into transportation system analysis and mode...

  2. Reliability on intra-laboratory and inter-laboratory data of hair mineral analysis comparing with blood analysis.

    Science.gov (United States)

    Namkoong, Sun; Hong, Seung Phil; Kim, Myung Hwa; Park, Byung Cheol

    2013-02-01

    Nowadays, institutions utilize hair mineral analysis although its clinical value remains controversial. Arguments about the reliability of hair mineral analysis persist, and there have been evaluations of commercial laboratories performing hair mineral analysis. The objective of this study was to assess the reliability of intra-laboratory and inter-laboratory data at three commercial laboratories conducting hair mineral analysis, compared to serum mineral analysis. Two divided hair samples taken from near the scalp of one healthy volunteer were submitted for analysis at the same time to all laboratories. Each laboratory sent a report consisting of quantitative results and their interpretation of health implications. Differences among intra-laboratory and inter-laboratory data were analyzed using SPSS version 12.0 (SPSS Inc., USA). All the laboratories used identical methods for quantitative analysis, and they generated consistent numerical results according to Friedman analysis of variance. However, the normal reference ranges of each laboratory varied; as such, each laboratory interpreted the patient's health differently. On intra-laboratory data, Wilcoxon analysis suggested that the laboratories generated relatively coherent data, but laboratory B failed to do so for one element, so its reliability was doubtful. In comparison with the blood test, laboratory C generated identical results, but laboratories A and B did not. Hair mineral analysis has its limitations in the reliability of inter- and intra-laboratory analysis compared with blood analysis. As such, clinicians should be cautious when applying hair mineral analysis as an ancillary tool. Each laboratory included in this study requires continuous refinement in order to establish standardized normal reference levels.
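
    The Friedman comparison described above can be reproduced schematically with scipy: each laboratory contributes one related sample of measurements over the same elements. The concentrations below are fabricated; with near-identical values across laboratories the test, as in the study, does not reject consistency.

```python
# Friedman test across three labs measuring the same five elements (fake data).
from scipy import stats

lab_a = [12.1, 0.80, 150.0, 3.1, 45.0]
lab_b = [11.8, 0.95, 148.0, 3.4, 44.0]
lab_c = [12.3, 0.78, 151.0, 3.0, 46.0]

stat, p = stats.friedmanchisquare(lab_a, lab_b, lab_c)
print(f"Friedman chi-square = {stat:.2f}, p = {p:.3f}")  # large p -> consistent
```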

  3. Query-Driven Visualization and Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Ruebel, Oliver; Bethel, E. Wes; Prabhat, Mr.; Wu, Kesheng

    2012-11-01

    This report focuses on an approach to high performance visualization and analysis, termed query-driven visualization and analysis (QDV). QDV aims to reduce the amount of data that needs to be processed by the visualization, analysis, and rendering pipelines. The goal of the data reduction process is to separate out data that is "scientifically interesting" and to focus visualization, analysis, and rendering on that interesting subset. The premise is that for any given visualization or analysis task, the data subset of interest is much smaller than the larger, complete data set. This strategy, extracting smaller data subsets of interest and focusing the visualization processing on these subsets, is complementary to the approach of increasing the capacity of the visualization, analysis, and rendering pipelines through parallelism. This report discusses the fundamental concepts in QDV, their relationship to different stages in the visualization and analysis pipelines, and presents QDV's application to problems in diverse areas, ranging from forensic cybersecurity to high energy physics.
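
    The core of the query-driven idea can be reduced to a sketch: evaluate a compound range query cheaply first, then run the expensive visualization or analysis only on the qualifying subset. The field names, thresholds, and sizes below are invented.

```python
# Query-driven data reduction: select the "interesting" subset before analysis.
import numpy as np

rng = np.random.default_rng(3)
temperature = rng.uniform(0, 100, size=1_000_000)
pressure = rng.uniform(0, 10, size=1_000_000)

mask = (temperature > 95.0) & (pressure > 9.0)  # the compound query
subset = temperature[mask]                      # only this goes downstream

frac = 100 * subset.size / temperature.size
print(f"analyzing {subset.size} of {temperature.size} cells ({frac:.2f}%)")
```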

  4. Bayesian data analysis for newcomers.

    Science.gov (United States)

    Kruschke, John K; Liddell, Torrin M

    2018-02-01

    This article explains the foundational concepts of Bayesian data analysis using virtually no mathematical notation. Bayesian ideas already match your intuitions from everyday reasoning and from traditional data analysis. Simple examples of Bayesian data analysis are presented that illustrate how the information delivered by a Bayesian analysis can be directly interpreted. Bayesian approaches to null-value assessment are discussed. The article clarifies misconceptions about Bayesian methods that newcomers might have acquired elsewhere. We discuss prior distributions and explain how they are not a liability but an important asset. We discuss the relation of Bayesian data analysis to Bayesian models of mind, and we briefly discuss what methodological problems Bayesian data analysis is not meant to solve. After you have read this article, you should have a clear sense of how Bayesian data analysis works and the sort of information it delivers, and why that information is so intuitive and useful for drawing conclusions from data.
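
    In the spirit of the article's simple examples, a beta-binomial coin analysis shows how directly a posterior can be read off. The data (7 heads in 10 flips) and the uniform prior are illustrative choices, not taken from the paper.

```python
# Beta-binomial posterior for a coin's bias.
from scipy import stats

heads, flips = 7, 10
prior_a, prior_b = 1, 1  # uniform Beta(1, 1) prior

posterior = stats.beta(prior_a + heads, prior_b + flips - heads)

print(f"posterior mean of bias: {posterior.mean():.2f}")
lo, hi = posterior.interval(0.95)
print(f"95% credible interval: ({lo:.2f}, {hi:.2f})")
```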

  5. Computer system for environmental sample analysis and data storage and analysis

    International Nuclear Information System (INIS)

    Brauer, F.P.; Fager, J.E.

    1976-01-01

    A mini-computer based environmental sample analysis and data storage system has been developed. The system is used for analytical data acquisition, computation, storage of analytical results, and tabulation of selected or derived results for data analysis, interpretation and reporting. This paper discusses the structure, performance and applications of the system

  6. Distortion Analysis Toolkit—A Software Tool for Easy Analysis of Nonlinear Audio Systems

    Directory of Open Access Journals (Sweden)

    Jyri Pakarinen

    2010-01-01

    Several audio effects devices deliberately add nonlinear distortion to the processed signal in order to create a desired sound. When creating virtual analog models of nonlinearly distorting devices, it would be very useful to carefully analyze the type of distortion, so that the model could be made as realistic as possible. While traditional system analysis tools such as the frequency response give detailed information on the operation of linear and time-invariant systems, they are less useful for analyzing nonlinear devices. Furthermore, although there do exist separate algorithms for nonlinear distortion analysis, there is currently no unified, easy-to-use tool for rapid analysis of distorting audio systems. This paper offers a remedy by introducing a new software tool for easy analysis of distorting effects. A comparison between a well-known guitar tube amplifier and two commercial software simulations is presented as a case study. This freely available software is written in Matlab language, but the analysis tool can also run as a standalone program, so the user does not need to have Matlab installed in order to perform the analysis.
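
    One measurement such a tool automates, total harmonic distortion (THD), can be sketched by driving a stand-in nonlinearity with a pure sine and reading the harmonic magnitudes off the FFT. The sample rate, test tone, and tanh clipping stage are all assumptions; window leakage makes this a rough estimate.

```python
# Rough THD estimate of a memoryless clipping stage.
import numpy as np

fs, f0, n = 48_000, 1_000, 48_000       # 1 Hz bin spacing by construction
t = np.arange(n) / fs
clean = np.sin(2 * np.pi * f0 * t)
distorted = np.tanh(3.0 * clean)        # stand-in nonlinear device

spectrum = np.abs(np.fft.rfft(distorted * np.hanning(n)))
fundamental = spectrum[f0]              # bin index equals frequency in Hz here
harmonics = spectrum[[2 * f0, 3 * f0, 4 * f0, 5 * f0]]

thd = np.sqrt(np.sum(harmonics ** 2)) / fundamental
print(f"THD: {100 * thd:.1f}%")
```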

  7. Principal Angle Enrichment Analysis (PAEA): Dimensionally Reduced Multivariate Gene Set Enrichment Analysis Tool.

    Science.gov (United States)

    Clark, Neil R; Szymkiewicz, Maciej; Wang, Zichen; Monteiro, Caroline D; Jones, Matthew R; Ma'ayan, Avi

    2015-11-01

    Gene set analysis of differential expression, which identifies collectively differentially expressed gene sets, has become an important tool for biology. The power of this approach lies in its reduction of the dimensionality of the statistical problem and its incorporation of biological interpretation by construction. Many approaches to gene set analysis have been proposed, but benchmarking their performance in the setting of real biological data is difficult due to the lack of a gold standard. In a previously published work we proposed a geometrical approach to differential expression which performed highly in benchmarking tests and compared well to the most popular methods of differential gene expression. As reported, this approach has a natural extension to gene set analysis which we call Principal Angle Enrichment Analysis (PAEA). PAEA employs dimensionality reduction and a multivariate approach for gene set enrichment analysis. However, the performance of this method had not been assessed, nor had it been implemented as a web-based tool. Here we describe new benchmarking protocols for gene set analysis methods and find that PAEA performs highly. The PAEA method is implemented as a user-friendly web-based tool, which contains 70 gene set libraries and is freely available to the community.
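
    The geometric core of the method, principal angles between two subspaces, is available directly in scipy. The sketch below uses random matrices as stand-ins for a gene set's span and the differential-expression directions; it shows the computation only, not PAEA's full enrichment statistic.

```python
# Principal angles between the column spaces of two matrices.
import numpy as np
from scipy.linalg import subspace_angles

rng = np.random.default_rng(4)
A = rng.normal(size=(50, 3))    # stand-in: span of one gene set
B = rng.normal(size=(50, 4))    # stand-in: differential-expression directions

angles = subspace_angles(A, B)  # in radians
print(f"smallest principal angle: {np.degrees(angles.min()):.1f} deg")
```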

  8. Harmonic and geometric analysis

    CERN Document Server

    Citti, Giovanna; Pérez, Carlos; Sarti, Alessandro; Zhong, Xiao

    2015-01-01

    This book presents an expanded version of four series of lectures delivered by the authors at the CRM. Harmonic analysis, understood in a broad sense, has a very wide interplay with partial differential equations and in particular with the theory of quasiconformal mappings and its applications. Some areas in which real analysis has been extremely influential are PDE's and geometric analysis. Their foundations and subsequent developments made extensive use of the Calderón–Zygmund theory, especially the Lp inequalities for Calderón–Zygmund operators (Beurling transform and Riesz transform, among others) and the theory of Muckenhoupt weights.  The first chapter is an application of harmonic analysis and the Heisenberg group to understanding human vision, while the second and third chapters cover some of the main topics on linear and multilinear harmonic analysis. The last serves as a comprehensive introduction to a deep result from De Giorgi, Moser and Nash on the regularity of elliptic partial differen...

  9. Biosensors for Cell Analysis.

    Science.gov (United States)

    Zhou, Qing; Son, Kyungjin; Liu, Ying; Revzin, Alexander

    2015-01-01

    Biosensors first appeared several decades ago to address the need for monitoring physiological parameters such as oxygen or glucose in biological fluids such as blood. More recently, a new wave of biosensors has emerged in order to provide more nuanced and granular information about the composition and function of living cells. Such biosensors exist at the confluence of technology and medicine and often strive to connect cell phenotype or function to physiological or pathophysiological processes. Our review aims to describe some of the key technological aspects of biosensors being developed for cell analysis. The technological aspects covered in our review include biorecognition elements used for biosensor construction, methods for integrating cells with biosensors, approaches to single-cell analysis, and the use of nanostructured biosensors for cell analysis. Our hope is that the spectrum of possibilities for cell analysis described in this review may pique the interest of biomedical scientists and engineers and may spur new collaborations in the area of using biosensors for cell analysis.

  10. K Basin safety analysis

    International Nuclear Information System (INIS)

    Porten, D.R.; Crowe, R.D.

    1994-01-01

    The purpose of this accident safety analysis is to document in detail analyses whose results were reported in summary form in the K Basins Safety Analysis Report WHC-SD-SNF-SAR-001. The safety analysis addressed the potential for release of radioactive and non-radioactive hazardous material located in the K Basins and their supporting facilities. The safety analysis covers the hazards associated with normal K Basin fuel storage and handling operations, fuel encapsulation, sludge encapsulation, and canister clean-up and disposal. After a review of the Criticality Safety Evaluation of the K Basin activities, the following postulated events were evaluated: Crane failure and casks dropped into loadout pit; Design basis earthquake; Hypothetical loss of basin water accident analysis; Combustion of uranium fuel following dryout; Crane failure and cask dropped onto floor of transfer area; Spent ion exchange shipment for burial; Hydrogen deflagration in ion exchange modules and filters; Release of Chlorine; Power availability and reliability; and Ashfall

  11. Dual-use tools and systematics-aware analysis workflows in the ATLAS Run-II analysis model

    CERN Document Server

    FARRELL, Steven; The ATLAS collaboration

    2015-01-01

    The ATLAS analysis model has been overhauled for the upcoming run of data collection in 2015 at 13 TeV. One key component of this upgrade was the Event Data Model (EDM), which now allows for greater flexibility in the choice of analysis software framework and provides powerful new features that can be exploited by analysis software tools. A second key component of the upgrade is the introduction of a dual-use tool technology, which provides abstract interfaces for analysis software tools to run in either the Athena framework or a ROOT-based framework. The tool interfaces, including a new interface for handling systematic uncertainties, have been standardized for the development of improved analysis workflows and consolidation of high-level analysis tools. This presentation will cover the details of the dual-use tool functionality, the systematics interface, and how these features fit into a centrally supported analysis environment.

  12. Dual-use tools and systematics-aware analysis workflows in the ATLAS Run-2 analysis model

    CERN Document Server

    FARRELL, Steven; The ATLAS collaboration; Calafiura, Paolo; Delsart, Pierre-Antoine; Elsing, Markus; Koeneke, Karsten; Krasznahorkay, Attila; Krumnack, Nils; Lancon, Eric; Lavrijsen, Wim; Laycock, Paul; Lei, Xiaowen; Strandberg, Sara Kristina; Verkerke, Wouter; Vivarelli, Iacopo; Woudstra, Martin

    2015-01-01

    The ATLAS analysis model has been overhauled for the upcoming run of data collection in 2015 at 13 TeV. One key component of this upgrade was the Event Data Model (EDM), which now allows for greater flexibility in the choice of analysis software framework and provides powerful new features that can be exploited by analysis software tools. A second key component of the upgrade is the introduction of a dual-use tool technology, which provides abstract interfaces for analysis software tools to run in either the Athena framework or a ROOT-based framework. The tool interfaces, including a new interface for handling systematic uncertainties, have been standardized for the development of improved analysis workflows and consolidation of high-level analysis tools. This paper will cover the details of the dual-use tool functionality, the systematics interface, and how these features fit into a centrally supported analysis environment.

  13. Systems analysis-independent analysis and verification

    Energy Technology Data Exchange (ETDEWEB)

    Badin, J.S.; DiPietro, J.P. [Energetics, Inc., Columbia, MD (United States)

    1995-09-01

    The DOE Hydrogen Program is supporting research, development, and demonstration activities to overcome the barriers to the integration of hydrogen into the Nation's energy infrastructure. Much work is required to gain acceptance of hydrogen energy system concepts and to develop them for implementation. A systems analysis database has been created that includes a formal documentation of technology characterization profiles and cost and performance information. Through a systematic and quantitative approach, system developers can understand and address important issues and thereby assure effective and timely commercial implementation. This project builds upon and expands the previously developed and tested pathway model and provides the basis for a consistent and objective analysis of all hydrogen energy concepts considered by the DOE Hydrogen Program Manager. This project can greatly accelerate the development of a system by minimizing the risk of costly design evolutions, and by stimulating discussions, feedback, and coordination of key players and allows them to assess the analysis, evaluate the trade-offs, and to address any emerging problem areas. Specific analytical studies will result in the validation of the competitive feasibility of the proposed system and identify system development needs. Systems that are investigated include hydrogen bromine electrolysis, municipal solid waste gasification, electro-farming (biomass gasifier and PEM fuel cell), wind/hydrogen hybrid system for remote sites, home electrolysis and alternate infrastructure options, renewable-based electrolysis to fuel PEM fuel cell vehicle fleet, and geothermal energy used to produce hydrogen. These systems are compared to conventional and benchmark technologies. Interim results and findings are presented. Independent analyses emphasize quality, integrity, objectivity, a long-term perspective, corporate memory, and the merging of technical, economic, operational, and programmatic expertise.

  14. Probabilistic Design and Analysis Framework

    Science.gov (United States)

    Strack, William C.; Nagpal, Vinod K.

    2010-01-01

    PRODAF is a software package designed to aid analysts and designers in conducting probabilistic analysis of components and systems. PRODAF can integrate multiple analysis programs to ease the tedious process of conducting a complex analysis that requires the use of multiple software packages. The work uses a commercial finite element analysis (FEA) program with modules from NESSUS to conduct a probabilistic analysis of a hypothetical turbine blade, disk, and shaft model. PRODAF applies the response surface method at the component level and extrapolates the component-level responses to the system level. Hypothetical components of a gas turbine engine are first deterministically modeled using FEA. Variations in selected geometrical dimensions and loading conditions are analyzed to determine their effects on the stress state within each component. Geometric variations include chord length and height for the blade, and inner radius, outer radius, and thickness for the disk. Probabilistic analysis is carried out using software packages under development, such as System Uncertainty Analysis (SUA) and PRODAF. PRODAF was used with a commercial deterministic FEA program, in conjunction with modules from the probabilistic analysis program NESTEM, to perturb loads and geometries and provide a reliability and sensitivity analysis. PRODAF simplified the handling of data among the various programs involved, and will work with many commercial and open-source deterministic programs, probabilistic programs, or modules.
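
    The response-surface step can be sketched in miniature: fit a cheap quadratic surrogate to a handful of expensive analysis runs, then run Monte Carlo on the surrogate to estimate a failure probability. The stand-in stress model, variable ranges, and failure limit below are invented for illustration and are not taken from PRODAF or NESTEM.

        import numpy as np

        rng = np.random.default_rng(0)

        def fea_stress(chord, load):          # stand-in for a costly FEA run
            return 200.0 + 50.0 * load / chord + 5.0 * load ** 2

        # 1. Small design of experiments around the nominal point (chord=1, load=1).
        chord = rng.uniform(0.9, 1.1, 30)
        load = rng.uniform(0.8, 1.2, 30)
        y = fea_stress(chord, load)

        # 2. Fit a quadratic response surface by least squares.
        X = np.column_stack([np.ones_like(chord), chord, load,
                             chord ** 2, load ** 2, chord * load])
        coef, *_ = np.linalg.lstsq(X, y, rcond=None)

        # 3. Cheap Monte Carlo on the surrogate to estimate P(stress > limit).
        n = 100_000
        c = rng.normal(1.0, 0.03, n)
        l = rng.normal(1.0, 0.10, n)
        Xs = np.column_stack([np.ones(n), c, l, c ** 2, l ** 2, c * l])
        stress = Xs @ coef
        print("P(stress > 270) ~", (stress > 270.0).mean())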

  15. Ca analysis: an Excel based program for the analysis of intracellular calcium transients including multiple, simultaneous regression analysis.

    Science.gov (United States)

    Greensmith, David J

    2014-01-01

    Here I present an Excel based program for the analysis of intracellular Ca transients recorded using fluorescent indicators. The program can perform all the necessary steps to convert recorded raw voltage changes into meaningful physiological information. The program performs two fundamental processes. (1) It can prepare the raw signal by several methods. (2) It can then be used to analyze the prepared data to provide information such as absolute intracellular Ca levels. Also, the rates of change of Ca can be measured using multiple, simultaneous regression analysis. I demonstrate that this program performs as well as commercially available software, but has numerous advantages, namely creating a simplified, self-contained analysis workflow. Copyright © 2013 The Author. Published by Elsevier Ireland Ltd. All rights reserved.
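
    The rate measurement described above can be approximated in a few lines: fit least-squares regression lines over several windows of the transient at once and report their slopes. The synthetic transient and window choices below are illustrative only, not the author's data or exact algorithm.

        import numpy as np

        t = np.linspace(0, 2.0, 2001)                           # s
        ca = 0.1 + 0.9 * np.exp(-(t - 0.2) / 0.3) * (t > 0.2)   # uM, toy transient

        def rate(t, y, t0, t1):
            """Slope (uM/s) from a least-squares line fit on [t0, t1]."""
            m = (t >= t0) & (t <= t1)
            slope, _ = np.polyfit(t[m], y[m], 1)
            return slope

        # Several windows analysed "simultaneously" over the decay phase:
        for t0, t1 in [(0.25, 0.45), (0.45, 0.85), (0.85, 1.50)]:
            print(f"{t0:.2f}-{t1:.2f} s: {rate(t, ca, t0, t1):+.3f} uM/s")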

  16. Assessment of non-linear analysis finite element program (NONSAP) for inelastic analysis

    International Nuclear Information System (INIS)

    Chang, T.Y.; Prachuktam, S.; Reich, M.

    1976-11-01

    An assessment of a nonlinear structural analysis finite element program called NONSAP is given with respect to its inelastic analysis capability for pressure vessels and components. The assessment was made from a review of its theoretical basis and benchmark problem runs. It was found that NONSAP has only limited capability for inelastic analysis. However, the program is written flexibly enough that it can easily be extended or modified to suit the user's needs. Moreover, some of the numerical difficulties in using NONSAP are pointed out.

  17. Insight and Evidence Motivating the Simplification of Dual-Analysis Hybrid Systems into Single-Analysis Hybrid Systems

    Science.gov (United States)

    Todling, Ricardo; Diniz, F. L. R.; Takacs, L. L.; Suarez, M. J.

    2018-01-01

    Many hybrid data assimilation systems currently used for NWP employ some form of dual-analysis approach. Typically, a hybrid variational analysis is responsible for creating initial conditions for high-resolution forecasts, and an ensemble analysis system is responsible for creating the sample perturbations used to form the flow-dependent part of the background error covariance required in the hybrid analysis component. In many of these, the two analysis components employ different methodologies, e.g., variational and ensemble Kalman filter. In such cases, it is not uncommon to have observations treated rather differently between the two analysis components; recentering of the ensemble analysis around the hybrid analysis is used to compensate for such differences. Furthermore, in many cases, the hybrid variational high-resolution system implements some type of four-dimensional approach, whereas the underlying ensemble system relies on a three-dimensional approach, which again introduces discrepancies in the overall system. Connected to these is the expectation that one can reliably estimate observation impact on forecasts issued from hybrid analyses by using an ensemble approach based on the underlying ensemble strategy of dual-analysis systems. The mere realization that the ensemble analysis makes substantially different use of observations than its hybrid counterpart should serve as evidence enough of the implausibility of such an expectation. This presentation assembles numerous pieces of anecdotal evidence to illustrate the fact that hybrid dual-analysis systems must, at the very minimum, strive for consistent use of the observations in both analysis sub-components. Simpler than that, this work suggests that hybrid systems can reliably be constructed without the need to employ a dual-analysis approach. In practice, the idea of relying on a single analysis system is appealing from a cost-maintenance perspective. More generally, single-analysis systems avoid such inconsistencies altogether.

  18. Distributed Analysis in CMS

    CERN Document Server

    Fanfani, Alessandra; Sanches, Jose Afonso; Andreeva, Julia; Bagliesi, Giusepppe; Bauerdick, Lothar; Belforte, Stefano; Bittencourt Sampaio, Patricia; Bloom, Ken; Blumenfeld, Barry; Bonacorsi, Daniele; Brew, Chris; Calloni, Marco; Cesini, Daniele; Cinquilli, Mattia; Codispoti, Giuseppe; D'Hondt, Jorgen; Dong, Liang; Dongiovanni, Danilo; Donvito, Giacinto; Dykstra, David; Edelmann, Erik; Egeland, Ricky; Elmer, Peter; Eulisse, Giulio; Evans, Dave; Fanzago, Federica; Farina, Fabio; Feichtinger, Derek; Fisk, Ian; Flix, Josep; Grandi, Claudio; Guo, Yuyi; Happonen, Kalle; Hernandez, Jose M; Huang, Chih-Hao; Kang, Kejing; Karavakis, Edward; Kasemann, Matthias; Kavka, Carlos; Khan, Akram; Kim, Bockjoo; Klem, Jukka; Koivumaki, Jesper; Kress, Thomas; Kreuzer, Peter; Kurca, Tibor; Kuznetsov, Valentin; Lacaprara, Stefano; Lassila-Perini, Kati; Letts, James; Linden, Tomas; Lueking, Lee; Maes, Joris; Magini, Nicolo; Maier, Gerhild; McBride, Patricia; Metson, Simon; Miccio, Vincenzo; Padhi, Sanjay; Pi, Haifeng; Riahi, Hassen; Riley, Daniel; Rossman, Paul; Saiz, Pablo; Sartirana, Andrea; Sciaba, Andrea; Sekhri, Vijay; Spiga, Daniele; Tuura, Lassi; Vaandering, Eric; Vanelderen, Lukas; Van Mulders, Petra; Vedaee, Aresh; Villella, Ilaria; Wicklund, Eric; Wildish, Tony; Wissing, Christoph; Wurthwein, Frank

    2009-01-01

    The CMS experiment expects to manage several Pbytes of data each year during the LHC programme, distributing them over many computing sites around the world and enabling data access at those centers for analysis. CMS has identified the distributed sites as the primary location for physics analysis, to support a wide community of thousands of potential users. This represents an unprecedented experimental challenge in terms of the scale of distributed computing resources and the number of users. An overview of the computing architecture, the software tools and the distributed infrastructure is reported. Summaries of the experience in establishing efficient and scalable operations to get prepared for CMS distributed analysis are presented, followed by the user experience in their current analysis activities.

  19. Energy Sector Market Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Arent, D.; Benioff, R.; Mosey, G.; Bird, L.; Brown, J.; Brown, E.; Vimmerstedt, L.; Aabakken, J.; Parks, K.; Lapsa, M.; Davis, S.; Olszewski, M.; Cox, D.; McElhaney, K.; Hadley, S.; Hostick, D.; Nicholls, A.; McDonald, S.; Holloman, B.

    2006-10-01

    This paper presents the results of energy market analysis sponsored by the Department of Energy's (DOE) Weatherization and International Program (WIP) within the Office of Energy Efficiency and Renewable Energy (EERE). The analysis was conducted by a team of DOE laboratory experts from the National Renewable Energy Laboratory (NREL), Oak Ridge National Laboratory (ORNL), and Pacific Northwest National Laboratory (PNNL), with additional input from Lawrence Berkeley National Laboratory (LBNL). The analysis was structured to identify those markets and niches where government can create the biggest impact by informing management decisions in the private and public sectors. The analysis identifies those markets and niches where opportunities exist for increasing energy efficiency and renewable energy use.

  20. Design-for-analysis or the unintended role of analysis in the design of piping systems

    International Nuclear Information System (INIS)

    Antaki, G.A.

    1991-01-01

    The paper discusses the evolution of piping design in the nuclear industry with its increasing reliance on dynamic analysis. While it is well recognized that the practice has evolved from "design-by-rule" to "design-by-analysis," examples are provided of cases where the choice of analysis technique has determined the hardware configuration, which could be called "design-for-analysis." The paper presents practical solutions to some of these cases and summarizes the important recent industry and regulatory developments which, if successful, will reverse the trend towards "design-for-analysis." 14 refs

  1. Numerical Limit Analysis:

    DEFF Research Database (Denmark)

    Damkilde, Lars

    2007-01-01

    Limit State analysis has a long history and many prominent researchers have contributed. The theoretical foundation is based on the upper- and lower-bound theorems, which give a very comprehensive and elegant formulation of complicated physical problems. In the pre-computer age, Limit State analysis also enabled engineers to solve practical problems within reinforced concrete, steel structures and geotechnics.

  2. Reactor Safety Analysis

    International Nuclear Information System (INIS)

    Arien, B.

    2000-01-01

    The objective of SCK-CEN's programme on reactor safety is to develop expertise in probabilistic and deterministic reactor safety analysis. The research programme consists of two main activities, in particular the development of software for reliability analysis of large systems and participation in the international PHEBUS-FP programme for severe accidents. Main achievements in 1999 are reported

  3. Qualitative Content Analysis

    OpenAIRE

    Satu Elo; Maria Kääriäinen; Outi Kanste; Tarja Pölkki; Kati Utriainen; Helvi Kyngäs

    2014-01-01

    Qualitative content analysis is commonly used for analyzing qualitative data. However, few articles have examined the trustworthiness of its use in nursing science studies. The trustworthiness of qualitative content analysis is often presented by using terms such as credibility, dependability, confirmability, transferability, and authenticity. This article focuses on trustworthiness based on a review of previous studies.

  4. High resolution melting (HRM) analysis of DNA--its role and potential in food analysis.

    Science.gov (United States)

    Druml, Barbara; Cichna-Markl, Margit

    2014-09-01

    DNA based methods play an increasing role in food safety control and food adulteration detection. Recent papers show that high resolution melting (HRM) analysis is an interesting approach. It involves amplification of the target of interest in the presence of a saturation dye by the polymerase chain reaction (PCR) and subsequent melting of the amplicons by gradually increasing the temperature. Since the melting profile depends on the GC content, length, sequence and strand complementarity of the product, HRM analysis is highly suitable for the detection of single-base variants and small insertions or deletions. The review gives an introduction to HRM analysis, covers important aspects in the development of an HRM analysis method and describes how HRM data are analysed and interpreted. We then discuss the potential of HRM analysis based methods in food analysis, i.e. for the identification of closely related species and cultivars and of pathogenic microorganisms. Copyright © 2014 Elsevier Ltd. All rights reserved.
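
    The core HRM readout lends itself to a short sketch: the negative derivative of fluorescence with respect to temperature, -dF/dT, peaks at the melting temperature Tm, and small sequence differences shift or reshape that peak. The sigmoid melt curve below is synthetic and for illustration only.

        import numpy as np

        temp = np.arange(70.0, 95.0, 0.1)                       # deg C
        tm_true = 84.3
        fluor = 1.0 / (1.0 + np.exp((temp - tm_true) / 0.6))    # toy melt curve

        neg_dfdt = -np.gradient(fluor, temp)                    # -dF/dT melt peak
        tm_est = temp[np.argmax(neg_dfdt)]
        print(f"estimated Tm = {tm_est:.1f} C")                 # ~84.3 C

        # Two amplicons differing by a single base typically shift Tm by only a
        # fraction of a degree, so curves are compared after normalisation
        # rather than by Tm alone, as the review discusses.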

  5. Failure analysis of a UAV flight control system using Markov analysis

    African Journals Online (AJOL)

    Failure analysis of a flight control system proposed for the Air Force Institute of Technology (AFIT) Unmanned Aerial Vehicle (UAV) was studied using Markov Analysis (MA). It was perceived that understanding the number of failure states and the probability of being in those states is of paramount importance in order to ...
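
    The Markov Analysis idea can be sketched in a few lines: enumerate the system states, encode the transition probabilities in a matrix, and propagate the state-probability vector over time. The three-state model and rates below are invented for illustration and are not AFIT's actual flight control model.

        import numpy as np

        # Per-flight-hour transition matrix for states 0 = healthy, 1 = degraded,
        # 2 = failed (absorbing). Rows sum to 1.
        P = np.array([[0.995, 0.004, 0.001],
                      [0.000, 0.990, 0.010],
                      [0.000, 0.000, 1.000]])

        p = np.array([1.0, 0.0, 0.0])          # system starts healthy
        for h in range(1, 101):
            p = p @ P
            if h in (1, 10, 100):
                print(f"t = {h:3d} h  P(healthy, degraded, failed) = {np.round(p, 4)}")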

  6. The Potential for Meta-Analysis to Support Decision Analysis in Ecology

    Science.gov (United States)

    Mengersen, Kerrie; MacNeil, M. Aaron; Caley, M. Julian

    2015-01-01

    Meta-analysis and decision analysis are underpinned by well-developed methods that are commonly applied to a variety of problems and disciplines. While these two fields have been closely linked in some disciplines such as medicine, comparatively little attention has been paid to the potential benefits of linking them in ecology, despite reasonable…

  7. Elements of stock market analysis

    Directory of Open Access Journals (Sweden)

    Suciu, T.

    2013-12-01

    Full Text Available The paper represents a starting point in the presentation of the two types of stock market analysis: fundamental analysis and technical analysis. Fundamental analysis consists of the assessment of the financial and economic status of the company, together with the context and macroeconomic environment in which it operates. Technical analysis deals with the demand and supply of securities and the evolution of their trend on the market, using a range of graphics and charts to illustrate market tendencies and to quickly identify the best moments to buy or sell.

  8. An easy guide to factor analysis

    CERN Document Server

    Kline, Paul

    2014-01-01

    Factor analysis is a statistical technique widely used in psychology and the social sciences. With the advent of powerful computers, factor analysis and other multivariate methods are now available to many more people. An Easy Guide to Factor Analysis presents and explains factor analysis as clearly and simply as possible. The author, Paul Kline, carefully defines all statistical terms and demonstrates step-by-step how to work out a simple example of principal components analysis and rotation. He further explains other methods of factor analysis, including confirmatory and path analysis, a

  9. Project delay analysis of HRSG

    Science.gov (United States)

    Silvianita; Novega, A. S.; Rosyid, D. M.; Suntoyo

    2017-08-01

    Completion of an HRSG (Heat Recovery Steam Generator) fabrication project sometimes does not meet the target date written in the contract. A delay in the fabrication process can cause several disadvantages for the fabricator, including penalty payments, delay in the HRSG construction process, and ultimately delay of the HRSG trials. In this paper, the authors apply a semi-quantitative analysis to HRSG pressure-part fabrication delay, for a plant configuration of 1 GT (Gas Turbine) + 1 HRSG + 1 STG (Steam Turbine Generator), using the bow-tie analysis method. Bow-tie analysis is a combination of FTA (Fault Tree Analysis) and ETA (Event Tree Analysis) used to develop the risk matrix of the HRSG. The results from the FTA are used as threat inputs for preventive measures, and the results from the ETA as impacts of the fabrication delay.

  10. Networks and Bargaining in Policy Analysis

    DEFF Research Database (Denmark)

    Bogason, Peter

    2006-01-01

    A discussion of the fight between proponents of rationalistic policy analysis and of more political interaction models for policy analysis. The latter group is the foundation for the many network models of policy analysis of today.

  11. Analysis

    DEFF Research Database (Denmark)

    Mathiesen, Brian Vad; Liu, Wen; Zhang, Xiliang

    2014-01-01

    three major technological changes: energy savings on the demand side, efficiency improvements in energy production, and the replacement of fossil fuels by various sources of renewable energy. Consequently, the analysis of these systems must include strategies for integrating renewable sources...

  12. COMPUTER METHODS OF GENETIC ANALYSIS.

    Directory of Open Access Journals (Sweden)

    A. L. Osipov

    2017-02-01

    Full Text Available The article describes the basic statistical methods used in the genetic analysis of human traits: segregation analysis, linkage analysis, and allelic association analysis. Software supporting the implementation of these methods has been developed.

  13. The Application of Structured Job Analysis Information Based on the Position Analysis Questionnaire (PAQ).

    Science.gov (United States)

    Position Analysis Questionnaire (PAQ). This job analysis instrument consists of 187 job elements organized into six divisions. In the analysis of a job with the PAQ, the relevance of the individual elements to the job is rated using any of several rating scales, such as importance or time.

  14. ROCKS & MINERALS DETERMINATION AND ANALYSIS

    Institute of Scientific and Technical Information of China (English)

    2015-01-01

    20150204 Abaydulla Alimjan (Department of Chemistry and Environmental Sciences, Kashgar Teachers College, Kashgar 844006, China); Cheng Chunying Non-Metallic Element Composition Analysis of Non-Ferrous Metal Ores from Oytagh Town, Xinjiang (Rock and Mineral Analysis, ISSN 0254-5357, CN 11-2131/TD, 33(1), 2014, p.44-50, 5 illus., 4 tables, 28 refs.) Key words: nonferrous metals ore, nonmetals, chemical analysis, thermogravimetric analysis Anions in non-ferrous ore materials

  15. A study of environmental analysis of urban river sediments using activation analysis

    International Nuclear Information System (INIS)

    Tanaka, Y.; Kuno, A.; Matsuo, M.

    2003-01-01

    Sediments of the Kitajukkengawa River (Sumida-ku, Tokyo, Japan) were analyzed by activation analyses. Concentrations of 36 elements in each sample were determined by instrumental neutron activation analysis (INAA) and neutron-induced prompt gamma-ray analysis (PGA). Based on the correlation matrix between the elements in the vertical distribution, principal component analysis (PCA) was performed. The degree of chemical weathering of silicate minerals was highest in the middle layer of the Kitajukkengawa River sediment, and the adsorbed amounts of trace metals such as Cd and Cr increased along with chemical weathering. (author)

  16. Conducting Qualitative Data Analysis: Qualitative Data Analysis as a Metaphoric Process

    Science.gov (United States)

    Chenail, Ronald J.

    2012-01-01

    In the second of a series of "how-to" essays on conducting qualitative data analysis, Ron Chenail argues the process can best be understood as a metaphoric process. From this orientation he suggests researchers follow Kenneth Burke's notion of metaphor and see qualitative data analysis as the analyst systematically considering the "this-ness" of…

  17. Automatic analysis of the micronucleus test in primary human lymphocytes using image analysis.

    Science.gov (United States)

    Frieauff, W; Martus, H J; Suter, W; Elhajouji, A

    2013-01-01

    The in vitro micronucleus test (MNT) is a well-established test for early screening of new chemical entities in industrial toxicology. For assessing the clastogenic or aneugenic potential of a test compound, micronucleus induction in cells has been shown repeatedly to be a sensitive and specific parameter. Various automated systems to replace the tedious and time-consuming visual slide analysis procedure, as well as flow cytometric approaches, have been discussed. The ROBIAS (Robotic Image Analysis System) for both automatic cytotoxicity assessment and micronucleus detection in human lymphocytes was developed at Novartis, where the assay has been used to validate positive results obtained in the MNT in TK6 cells, which serves as the primary screening system for genotoxicity profiling in early drug development. In addition, the in vitro MNT has become an accepted alternative to support clinical studies and will be used for regulatory purposes as well. The comparison of visual with automatic analysis results showed a high degree of concordance for 25 independent experiments conducted for the profiling of 12 compounds. For concentration series of cyclophosphamide and carbendazim, a very good correlation between automatic and visual analysis by two examiners could be established, both for the relative division index used as the cytotoxicity parameter and for micronuclei scoring in mono- and binucleated cells. Generally, false-positive micronucleus decisions could be controlled by fast and simple relocation of the automatically detected patterns. The possibility of analysing 24 slides within 65 h of automatic analysis over the weekend, and the high reproducibility of the results, make automatic image processing a powerful tool for micronucleus analysis in primary human lymphocytes. The automated slide analysis for the MNT in human lymphocytes complements the portfolio of image analysis applications on ROBIAS, which supports various assays at Novartis.

  18. Bridge Diagnosis by Using Nonlinear Independent Component Analysis and Displacement Analysis

    Science.gov (United States)

    Zheng, Juanqing; Yeh, Yichun; Ogai, Harutoshi

    A daily diagnosis system for bridge monitoring and maintenance is developed based on wireless sensors, signal processing, structure analysis, and displacement analysis. The vibration acceleration data of a bridge are first collected through the wireless sensor network. Nonlinear independent component analysis (ICA) and spectral analysis are used to extract the vibration frequencies of the bridge. After that, the vibration displacement is calculated through a band-pass filter and Simpson's rule, and the vibration model is obtained to diagnose the bridge. Since linear ICA algorithms work efficiently only in linear mixing environments, a nonlinear ICA model, which is more complicated, is more practical for bridge diagnosis systems. In this paper, we first use the post-nonlinear method to transform the signal data, then perform linear separation by FastICA, and calculate the vibration displacement of the bridge. The processed data can be used to understand phenomena like corrosion and cracking, and to evaluate the health condition of the bridge. We apply this system to Nakajima Bridge in Yahata, Kitakyushu, Japan.
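
    The displacement-recovery step can be sketched as band-pass filtering followed by double numerical integration. For brevity the sketch uses cumulative trapezoidal integration where the paper applies Simpson's rule; the 2 Hz test signal, sampling rate, and filter band are invented for illustration.

        import numpy as np
        from scipy.integrate import cumulative_trapezoid
        from scipy.signal import butter, filtfilt

        rng = np.random.default_rng(1)
        fs = 200.0                                   # Hz, sensor sampling rate
        t = np.arange(0.0, 20.0, 1.0 / fs)
        f0 = 2.0                                     # Hz, a "bridge mode"
        acc = np.sin(2 * np.pi * f0 * t) + 0.3 * rng.normal(size=t.size)

        b, a = butter(4, [0.5, 10.0], btype="band", fs=fs)
        acc_f = filtfilt(b, a, acc)                  # drop drift and high-freq noise

        vel = cumulative_trapezoid(acc_f, t, initial=0.0)
        vel -= vel.mean()                            # suppress integration drift
        disp = cumulative_trapezoid(vel, t, initial=0.0)
        disp -= disp.mean()

        # For a unit-amplitude sine at f0, displacement amplitude is 1/(2*pi*f0)^2
        print("peak displacement:", float(np.max(np.abs(disp))))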

  19. Loss of coolant accident analysis (thermal hydraulic analysis) - Japanese industries experience

    International Nuclear Information System (INIS)

    Okabe, K.

    1995-01-01

    An overview of LOCA analysis in Japanese industry is presented. The BASH-M code, developed for large-scale LOCA reflooding analysis, is given as an example of the verification and improvement of US computer programs. The code's application to operational safety analysis concerns the following main areas: the 1D drift-flux-model-based computer program CANAC; a CANAC-based advanced training simulator; and emergency operating procedures. The author also considers the code's application to the following new PWR safety design concepts: use of steam generators for decay heat removal under LOCA conditions, and use of a horizontal steam generator for maintaining two-phase natural circulation while the reactor coolant system is submerged. 9 figs

  20. Risk analysis

    International Nuclear Information System (INIS)

    Baron, J.H.; Nunez McLeod, J.; Rivera, S.S.

    1997-01-01

    This book contains a selection of research works performed in the CEDIAC Institute (Cuyo National University) in the area of Risk Analysis, with specific orientation to the subjects of uncertainty and sensitivity studies, software reliability, severe accident modeling, etc. This volume presents important material for all those researchers who want insight into the risk analysis field as a tool for solving several problems frequently found in engineering and the applied sciences, as well as for academic teachers who want to keep up to date with the new developments and improvements continuously arising in this field [es

  1. METHODOLOGICAL ELEMENTS OF SITUATIONAL ANALYSIS

    Directory of Open Access Journals (Sweden)

    Tetyana KOVALCHUK

    2016-07-01

    Full Text Available The article investigates the theoretical and methodological principles of situational analysis and demonstrates its necessity under modern conditions. The notion of "situational analysis" is defined: situational analysis is a continuous, systematic study whose purpose is to identify the signs of a dangerous situation, to evaluate such signs comprehensively as they are influenced by a system of objective and subjective factors, to search for motivated, targeted actions that eliminate the adverse effects of the situation on the system now and in the future, and to develop the managerial actions needed to bring the system back to normal. A methodological approach to situational analysis is developed, its goal is substantiated, and the expediency of its diagnostic, evaluative and search functions is demonstrated. The principal methodological elements of situational analysis are grounded; their substantiation will enable the analyst to develop adaptive methods that take into account the peculiar features of a unique object, namely a situation that has emerged in a complex system, to diagnose such a situation and subject it to systematic, in-depth analysis, to identify risks and opportunities, and to make timely management decisions as required by a particular period.

  2. Virtual Data in CMS Analysis

    CERN Document Server

    Arbree, A; Bourilkov, D; Cavanaugh, R J; Graham, G; Rodríguez, J; Wilde, M; Zhao, Y

    2003-01-01

    The use of virtual data for enhancing the collaboration between large groups of scientists is explored in several ways: - by defining "virtual" parameter spaces which can be searched and shared in an organized way by a collaboration of scientists in the course of their analysis - by providing a mechanism to log the provenance of results and the ability to trace them back to the various stages in the analysis of real or simulated data - by creating "check points" in the course of an analysis to permit collaborators to explore their own analysis branches by refining selections, improving the signal to background ratio, varying the estimation of parameters, etc. - by facilitating the audit of an analysis and the reproduction of its results by a different group, or in a peer review context. We describe a prototype for the analysis of data from the CMS experiment based on the virtual data system Chimera and the object-oriented data analysis framework ROOT. The Chimera system is used to chain together several s...

  3. Real analysis with economic applications

    CERN Document Server

    Ok, Efe A

    2011-01-01

    There are many mathematics textbooks on real analysis, but they focus on topics not readily helpful for studying economic theory or they are inaccessible to most graduate students of economics. Real Analysis with Economic Applications aims to fill this gap by providing an ideal textbook and reference on real analysis tailored specifically to the concerns of such students. The emphasis throughout is on topics directly relevant to economic theory. In addition to addressing the usual topics of real analysis, this book discusses the elements of order theory, convex analysis, optimization, correspondences, linear and nonlinear functional analysis, fixed-point theory, dynamic programming, and calculus of variations. Efe Ok complements the mathematical development with applications that provide concise introductions to various topics from economic theory, including individual decision theory and games, welfare economics, information theory, general equilibrium and finance, and intertemporal economics. Moreover, a...

  4. Analysis from concepts to applications

    CERN Document Server

    Penot, Jean-Paul

    2016-01-01

    This textbook covers the main results and methods of real analysis in a single volume. Taking a progressive approach to equations and transformations, this book starts with the very foundations of real analysis (set theory, order, convergence, and measure theory) before presenting powerful results that can be applied to concrete problems. In addition to classical results of functional analysis, differential calculus and integration, Analysis discusses topics such as convex analysis, dissipative operators and semigroups which are often absent from classical treatises. Acknowledging that analysis has significantly contributed to the understanding and development of the present world, the book further elaborates on techniques which pervade modern civilization, including wavelets in information theory, the Radon transform in medical imaging and partial differential equations in various mechanical and physical phenomena. Advanced undergraduate and graduate students, engineers as well as practitioners wishing to fa...

  5. Pathway analysis of IMC

    DEFF Research Database (Denmark)

    Skrypnyuk, Nataliya; Nielson, Flemming; Pilegaard, Henrik

    2009-01-01

    We present the ongoing work on the pathway analysis of a stochastic calculus. Firstly we present a particular stochastic calculus that we have chosen for our modeling - the Interactive Markov Chains calculus, IMC for short. After that we specify a few restrictions that we have introduced into the syntax of IMC in order to make our analysis feasible. Finally we describe the analysis itself together with several theoretical results that we have proved for it.

  6. INTERNAL ENVIRONMENT ANALYSIS TECHNIQUES

    Directory of Open Access Journals (Sweden)

    Caescu Stefan Claudiu

    2011-12-01

    Full Text Available Theme The situation analysis, as a separate component of strategic planning, involves collecting and analysing relevant types of information on the components of the marketing environment and their evolution on the one hand, and on the organization's resources and capabilities on the other. Objectives of the Research The main purpose of the study of the analysis techniques of the internal environment is to provide insight into those aspects that are of strategic importance to the organization. Literature Review The marketing environment consists of two distinct components: the internal environment, made up of specific variables within the organization, and the external environment, made up of variables external to the organization. Although analysing the external environment is essential for corporate success, it is not enough unless it is backed by a detailed analysis of the internal environment of the organization. The internal environment includes all elements that are endogenous to the organization, which are influenced to a great extent and totally controlled by it. The study of the internal environment must answer all resource-related questions, solve all resource management issues, and represents the first step in drawing up the marketing strategy. Research Methodology The present paper presents a documentary study of the main techniques used for the analysis of the internal environment. Results The literature emphasizes that differences in performance from one organization to another are primarily dependent not on differences between fields of activity, but especially on differences between resources and capabilities and the ways these are capitalized on. The main methods of analysing the internal environment addressed in this paper are: the analysis of organizational resources, the performance analysis, the value chain analysis and the functional analysis. Implications Basically such

  7. B plant mission analysis report

    International Nuclear Information System (INIS)

    Lund, D.P.

    1995-01-01

    This report further develops the mission for B Plant originally defined in WHC-EP-0722, "System Engineering Functions and Requirements for the Hanford Cleanup Mission: First Issue." The B Plant mission analysis will be the basis for a functional analysis that breaks down the B Plant mission statement into the necessary activities to accomplish the mission. These activities are the product of the functional analysis and will then be used in subsequent steps of the systems engineering process, such as identifying requirements and allocating those requirements to B Plant functions. The information in this mission analysis and the functional and requirements analysis are a part of the B Plant technical baseline

  8. Non-commutative analysis

    CERN Document Server

    Jorgensen, Palle

    2017-01-01

    The book features new directions in analysis, with an emphasis on Hilbert space, mathematical physics, and stochastic processes. We interpret 'non-commutative analysis' broadly to include representations of non-Abelian groups and non-Abelian algebras, with emphasis on Lie groups and operator algebras (C*-algebras and von Neumann algebras). A second theme is commutative and non-commutative harmonic analysis, spectral theory, operator theory and their applications. The list of topics includes shift-invariant spaces, group action in differential geometry, and frame theory (over-complete bases) and their applications to engineering (signal processing and multiplexing), projective multi-resolutions, and free probability algebras. The book serves as an accessible introduction, offering a timeless presentation, attractive and accessible to students, both in mathematics and in neighboring fields.

  9. Reactor safety analysis

    International Nuclear Information System (INIS)

    Arien, B.

    1998-01-01

    Risk assessments of nuclear installations require accurate safety and reliability analyses to estimate the consequences of accidental events and their probability of occurrence. The objective of the work performed in this field at the Belgian Nuclear Research Centre SCK-CEN is to develop expertise in probabilistic and deterministic reactor safety analysis. The four main activities of the research project on reactor safety analysis are: (1) the development of software for the reliable analysis of large systems; (2) the development of an expert system to aid diagnosis; (3) the development and application of a probabilistic reactor-dynamics method; and (4) participation in the international PHEBUS-FP programme for severe accidents. Progress in research during 1997 is described

  10. Fuzzy data analysis

    CERN Document Server

    Bandemer, Hans

    1992-01-01

    Fuzzy data, such as marks, scores, verbal evaluations, imprecise observations, experts' opinions and grey tone pictures, are quite common. In Fuzzy Data Analysis the authors collect their recent results, providing the reader with ideas, approaches and methods for processing such data when looking for sub-structures in knowledge bases for the evaluation of functional relationships, e.g. in order to specify diagnostic or control systems. The modelling presented uses ideas from fuzzy set theory and the suggested methods solve problems usually tackled by data analysis if the data are real numbers. Fuzzy Data Analysis is self-contained and is addressed to mathematicians oriented towards applications and to practitioners in any field of application who have some background in mathematics and statistics.

  11. Physics analysis workstation

    International Nuclear Information System (INIS)

    Johnstad, H.

    1989-06-01

    The Physics Analysis Workstation (PAW) is a high-level program providing data presentation and statistical or mathematical analysis. PAW has been developed at CERN as an instrument to assist physicists in the analysis and presentation of their data. The program is interfaced to a high-level graphics package built on a basic underlying graphics layer; 3-D graphics capabilities are being implemented. The major objects in PAW are 1- or 2-dimensional binned event data with a fixed number of entries per event, vectors, functions, graphics pictures, and macros. Command input is handled by an integrated user interface package, which allows a variety of input choices, either with typed commands or in a tree-structured, menu-driven mode. 6 refs., 1 fig.

  12. Malware analysis and reverse engineering

    OpenAIRE

    Šváb, Martin

    2014-01-01

    The focus of this thesis is reverse engineering in information technology, closely linked with malware analysis. It explains the fundamentals of the IA-32 processor architecture and the basics of the Microsoft Windows operating system. The main part of the thesis is dedicated to malware analysis, including a description of the creation of a tool that simplifies the static part of the analysis. In conclusion, the various approaches to malware analysis described in earlier parts of the thesis are practic...

  13. [Cluster analysis in biomedical researches].

    Science.gov (United States)

    Akopov, A S; Moskovtsev, A A; Dolenko, S A; Savina, G D

    2013-01-01

    Cluster analysis is one of the most popular methods for the analysis of multi-parameter data. Cluster analysis reveals the internal structure of the data, grouping separate observations by their degree of similarity. The review provides definitions of the basic concepts of cluster analysis and discusses the most popular clustering algorithms: k-means, hierarchical algorithms, and Kohonen network algorithms. Examples of the use of these algorithms in biomedical research are given.
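
    Of the algorithms listed, k-means is compact enough to sketch in full: alternate assigning each observation to its nearest centroid and re-estimating the centroids until the assignments stabilise. The two-cluster synthetic data below are illustrative.

        import numpy as np

        rng = np.random.default_rng(0)
        X = np.vstack([rng.normal(0, 0.5, (50, 2)),
                       rng.normal(3, 0.5, (50, 2))])

        k = 2
        centroids = X[rng.choice(len(X), k, replace=False)]
        for _ in range(100):
            d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
            labels = d.argmin(axis=1)                   # assignment step
            new = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                            else centroids[j] for j in range(k)])  # update step
            if np.allclose(new, centroids):             # converged
                break
            centroids = new

        print("centroids:\n", centroids.round(2))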

  14. Semi-classical signal analysis

    KAUST Repository

    Laleg-Kirati, Taous-Meriem

    2012-09-30

    This study introduces a new signal analysis method, based on a semi-classical approach. The main idea in this method is to interpret a pulse-shaped signal as a potential of a Schrödinger operator and then to use the discrete spectrum of this operator for the analysis of the signal. We present some numerical examples and the first results obtained with this method on the analysis of arterial blood pressure waveforms. © 2012 Springer-Verlag London Limited.
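
    The central construction can be sketched numerically: discretise a Schrödinger operator whose potential is the negated signal, H = -d^2/dx^2 - y(x), and read off its discrete spectrum, i.e. the negative eigenvalues. The finite-difference discretisation and the sech^2 test pulse below are illustrative assumptions, not the authors' exact formulation, which also involves a tunable semi-classical parameter.

        import numpy as np

        n, L = 400, 20.0
        x = np.linspace(-L / 2, L / 2, n)
        h = x[1] - x[0]
        y = 4.0 / np.cosh(x) ** 2                  # pulse-shaped test signal

        # d^2/dx^2 via the standard three-point stencil; H = -lap - diag(y)
        lap = (np.diag(np.full(n, -2.0)) + np.diag(np.ones(n - 1), 1)
               + np.diag(np.ones(n - 1), -1)) / h ** 2
        H = -lap - np.diag(y)

        evals = np.linalg.eigvalsh(H)
        bound = evals[evals < 0]                   # discrete spectrum used for analysis
        print("negative eigenvalues:", bound.round(3))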

  15. A Large Dimensional Analysis of Regularized Discriminant Analysis Classifiers

    KAUST Repository

    Elkhalil, Khalil

    2017-11-01

    This article carries out a large dimensional analysis of standard regularized discriminant analysis classifiers designed on the assumption that data arise from a Gaussian mixture model with different means and covariances. The analysis relies on fundamental results from random matrix theory (RMT) when both the number of features and the cardinality of the training data within each class grow large at the same pace. Under mild assumptions, we show that the asymptotic classification error approaches a deterministic quantity that depends only on the means and covariances associated with each class as well as the problem dimensions. Such a result permits a better understanding of the performance of regularized discriminant analysis in practical, large but finite, dimensions, and can be used to determine and pre-estimate the optimal regularization parameter that minimizes the misclassification error probability. Despite being theoretically valid only for Gaussian data, our findings are shown to yield high accuracy in predicting the performance achieved with real data sets drawn from the popular USPS database, thereby making an interesting connection between theory and practice.

  16. Glucocorticosteroids for sepsis : systematic review with meta-analysis and trial sequential analysis

    NARCIS (Netherlands)

    Volbeda, M.; Wetterslev, J.; Gluud, C.; Zijlstra, J. G.; van der Horst, I. C. C.; Keus, F.

    Glucocorticosteroids (steroids) are widely used for sepsis patients. However, the potential benefits and harms of both high and low dose steroids remain unclear. A systematic review of randomised clinical trials with meta-analysis and trial sequential analysis (TSA) might shed light on this

  17. Functional Principal Component Analysis and Randomized Sparse Clustering Algorithm for Medical Image Analysis

    Science.gov (United States)

    Lin, Nan; Jiang, Junhai; Guo, Shicheng; Xiong, Momiao

    2015-01-01

    Due to advances in sensor technology, growing large medical image data make it possible to visualize anatomical changes in biological tissues. As a consequence, medical images have the potential to enhance the diagnosis of disease, the prediction of clinical outcomes and the characterization of disease progression. At the same time, the growing data dimensions pose great methodological and computational challenges for the representation and selection of features in image cluster analysis. To address these challenges, we first extend functional principal component analysis (FPCA) from one dimension to two dimensions to fully capture the spatial variation of the image signals. The image signals contain a large number of redundant features which provide no additional information for clustering analysis. The widely used methods for removing the irrelevant features are sparse clustering algorithms using a lasso-type penalty to select the features. However, the accuracy of clustering using a lasso-type penalty depends on the selection of the penalty parameters and the threshold value, which in practice are difficult to determine. Recently, randomized algorithms have received a great deal of attention in big data analysis. This paper presents a randomized algorithm for accurate feature selection in image clustering analysis. The proposed method is applied to both the liver and kidney cancer histology image data from the TCGA database. The results demonstrate that the randomized feature selection method coupled with functional principal component analysis substantially outperforms the current sparse clustering algorithms in image cluster analysis. PMID:26196383

  18. Probabilistic Structural Analysis Program

    Science.gov (United States)

    Pai, Shantaram S.; Chamis, Christos C.; Murthy, Pappu L. N.; Stefko, George L.; Riha, David S.; Thacker, Ben H.; Nagpal, Vinod K.; Mital, Subodh K.

    2010-01-01

    NASA/NESSUS 6.2c is a general-purpose, probabilistic analysis program that computes probability of failure and probabilistic sensitivity measures of engineered systems. Because NASA/NESSUS uses highly computationally efficient and accurate analysis techniques, probabilistic solutions can be obtained even for extremely large and complex models. Once the probabilistic response is quantified, the results can be used to support risk-informed decisions regarding reliability for safety-critical and one-of-a-kind systems, as well as for maintaining a level of quality while reducing manufacturing costs for larger-quantity products. NASA/NESSUS has been successfully applied to a diverse range of problems in aerospace, gas turbine engines, biomechanics, pipelines, defense, weaponry, and infrastructure. This program combines state-of-the-art probabilistic algorithms with general-purpose structural analysis and life-prediction methods to compute the probabilistic response and reliability of engineered structures. Uncertainties in load, material properties, geometry, boundary conditions, and initial conditions can be simulated. The structural analysis methods include non-linear finite-element methods, heat-transfer analysis, polymer/ceramic matrix composite analysis, monolithic (conventional metallic) materials life-prediction methodologies, boundary element methods, and user-written subroutines. Several probabilistic algorithms are available, such as the advanced mean value method and the adaptive importance sampling method. NASA/NESSUS 6.2c is structured in a modular format with 15 elements.

  19. Correspondence analysis of longitudinal data

    NARCIS (Netherlands)

    Van der Heijden, P.G.M.

    2005-01-01

    Correspondence analysis is an exploratory tool for the analysis of associations between categorical variables, the results of which may be displayed graphically. For longitudinal data with two time points, an analysis of the transition matrix (showing the relative frequencies for pairs of

  20. Critical Analysis of Multimodal Discourse

    DEFF Research Database (Denmark)

    van Leeuwen, Theo

    2013-01-01

    This is an encyclopaedia article which defines the fields of critical discourse analysis and multimodality studies, argues that within critical discourse analysis more attention should be paid to multimodality, and within multimodality to critical analysis, and ends reviewing a few examples of re...

  1. Wavelet analysis for nonstationary signals

    International Nuclear Information System (INIS)

    Penha, Rosani Maria Libardi da

    1999-01-01

    Mechanical vibration signals play an important role in the identification of anomalies resulting from equipment malfunction. Traditionally, Fourier spectral analysis is used, where the signals are assumed to be stationary. However, occasional transient impulses and start-up processes are examples of nonstationary signals that can be found in mechanical vibrations. These signals can provide important information about the equipment condition, such as early fault detection. Fourier analysis cannot adequately be applied to nonstationary signals because the results provide data about the frequency composition averaged over the duration of the signal. In this work, two methods for nonstationary signal analysis are used: the Short Time Fourier Transform (STFT) and the wavelet transform. The STFT is a method of adapting Fourier spectral analysis to nonstationary applications in the time-frequency domain. Its main limitation is having a single, fixed resolution throughout the entire time-frequency domain. The wavelet transform is a newer analysis technique suitable for nonstationary signals which overcomes the STFT drawbacks, providing multi-resolution frequency analysis and time localization in a single time-scale graphic. The multiple frequency resolutions are obtained by scaling (dilation/compression) the wavelet function. A comparison of the conventional Fourier transform, the STFT and the wavelet transform is made by applying these techniques to simulated signals, a rotor rig vibration signal and a rotating machine vibration signal. A Hanning window was used for the STFT analysis. Daubechies and harmonic wavelets were used for continuous, discrete and multi-resolution wavelet analysis. The results show that Fourier analysis was not able to detect changes in the signal frequencies or discontinuities, and the STFT detected the changes in the signal frequencies but with time-frequency resolution problems. The continuous and discrete wavelet transforms proved to be highly efficient tools to detect these nonstationary features.
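
    The STFT half of the comparison is easy to reproduce: a fixed-length window slides over the signal and a Fourier transform is taken per slice, so one window length must serve all frequencies, which is exactly the resolution compromise noted above. The synthetic frequency-step signal below is illustrative.

        import numpy as np
        from scipy.signal import stft

        fs = 1000.0
        t = np.arange(0, 2.0, 1 / fs)
        sig = np.where(t < 1.0, np.sin(2 * np.pi * 50 * t),
                       np.sin(2 * np.pi * 120 * t))
        sig[1500] += 5.0                         # transient impulse at t = 1.5 s

        f, tt, Z = stft(sig, fs=fs, nperseg=256)  # Hann window by default
        mag = np.abs(Z)
        # Dominant frequency per time slice: ~50 Hz then ~120 Hz
        print(f[mag.argmax(axis=0)][::4].round())

    A wavelet transform would resolve the 1.5 s impulse sharply at small scales while keeping good frequency resolution at large scales, which is the multi-resolution advantage the study demonstrates.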

  2. Analysis and Application of Reliability

    International Nuclear Information System (INIS)

    Jeong, Hae Seong; Park, Dong Ho; Kim, Jae Ju

    1999-05-01

    This book covers the analysis and application of reliability, including the definition, importance and historical background of reliability; the reliability function and failure rate; life distributions and reliability assumptions; the reliability of non-repairable systems; the reliability of repairable systems; reliability sampling tests; failure analysis, such as failure analysis by FMEA and FTA, with cases; accelerated life testing, including its basic concepts, acceleration and acceleration factors, and the analysis of accelerated life testing data; and maintenance policies concerning replacement and inspection.
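
    The reliability function and failure rate mentioned above take a one-line form for the widely used Weibull life distribution: R(t) = exp(-(t/eta)^beta) and h(t) = (beta/eta)(t/eta)^(beta-1). The sketch below uses illustrative parameters; it is not an example from the book.

        import numpy as np

        beta, eta = 1.8, 1000.0        # shape (>1: wear-out) and scale, hours

        def reliability(t):
            return np.exp(-(t / eta) ** beta)

        def hazard(t):
            return (beta / eta) * (t / eta) ** (beta - 1)

        for t in (100.0, 500.0, 1000.0):
            print(f"t={t:6.0f} h  R={reliability(t):.3f}  h={hazard(t):.2e} /h")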

  3. Comparing methods of classifying life courses: Sequence analysis and latent class analysis

    NARCIS (Netherlands)

    Elzinga, C.H.; Liefbroer, Aart C.; Han, Sapphire

    2017-01-01

    We compare life course typology solutions generated by sequence analysis (SA) and latent class analysis (LCA). First, we construct an analytic protocol to arrive at typology solutions for both methodologies and present methods to compare the empirical quality of alternative typologies. We apply this

  4. Comparing methods of classifying life courses: sequence analysis and latent class analysis

    NARCIS (Netherlands)

    Han, Y.; Liefbroer, A.C.; Elzinga, C.

    2017-01-01

    We compare life course typology solutions generated by sequence analysis (SA) and latent class analysis (LCA). First, we construct an analytic protocol to arrive at typology solutions for both methodologies and present methods to compare the empirical quality of alternative typologies. We apply this

  5. Kosice meteorite analysis

    International Nuclear Information System (INIS)

    Sitek, J.; Degmova, J.; Dekan, J.

    2011-01-01

    The Kosice meteorite fell on 28 February 2010 near Kosice and represents a unique find, because the last observed meteorite fall in Slovakia occurred in 1895. It is supposed that the orbit of this meteorite in space can be calculated; this is one of the most important aspects, because to date only 13 meteorite finds in the world have had their cosmic orbits calculated. Slovakia is a member of the international bolide network dealing with meteorite analysis in Central Europe. Analysis of the Kosice meteorite will also concern the long-lived and short-lived nuclides. The results should contribute to the determination of radiation and formation ages. Structural analysis will make it possible to compare the meteorite with similar types. In this work, Moessbauer spectroscopy will be used for phase analysis of the iron-containing components, with the aim of identifying magnetic and non-magnetic fractions. From the analysis of the magnetic part we find that the first sextet, with a hyperfine magnetic field of 33.5 T, corresponds to the bcc Fe-Ni alloy (kamacite), and the second, with a field of 31.5 T, to FeS (troilite). Meteorites with the mentioned composition belong to the mineral group of chondrites. Comparing our parameters with results of measurements on similar meteorites, we conclude that the Kosice meteorite contains the same components. According to all Moessbauer parameters, we can also include this meteorite in the mineral group of chondrites. (authors)

  6. Control system design and analysis using the INteractive Controls Analysis (INCA) program

    Science.gov (United States)

    Bauer, Frank H.; Downing, John P.

    1987-01-01

    The INteractive Controls Analysis (INCA) program was developed at the Goddard Space Flight Center to provide a user-friendly, efficient environment for the design and analysis of linear control systems. Since its inception, INCA has found extensive use in the design, development, and analysis of control systems for spacecraft, instruments, robotics, and pointing systems. Moreover, the results of the analytic tools embedded in INCA have been flight proven with at least three currently orbiting spacecraft. This paper describes the INCA program and illustrates, using a flight-proven example, how the package can perform complex design analyses with relative ease.

  7. A Beginner’s Guide to Factor Analysis: Focusing on Exploratory Factor Analysis

    Directory of Open Access Journals (Sweden)

    An Gie Yong

    2013-10-01

    Full Text Available The following paper discusses exploratory factor analysis and gives an overview of the statistical technique and how it is used in various research designs and applications. A basic outline of how the technique works and its criteria, including its main assumptions are discussed as well as when it should be used. Mathematical theories are explored to enlighten students on how exploratory factor analysis works, an example of how to run an exploratory factor analysis on SPSS is given, and finally a section on how to write up the results is provided. This will allow readers to develop a better understanding of when to employ factor analysis and how to interpret the tables and graphs in the output.
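
    The extraction step the paper walks through can be reproduced in miniature: take the principal components of the correlation matrix, with loadings given by each eigenvector times the square root of its eigenvalue, and use the Kaiser criterion (eigenvalue greater than 1) to decide how many factors to retain; rotation would follow. The six-variable, two-factor synthetic data below are illustrative, not the paper's SPSS example.

        import numpy as np

        rng = np.random.default_rng(0)
        n = 500
        f1, f2 = rng.normal(size=n), rng.normal(size=n)   # two latent factors
        X = np.column_stack([f1 + 0.3 * rng.normal(size=n) for _ in range(3)] +
                            [f2 + 0.3 * rng.normal(size=n) for _ in range(3)])

        R = np.corrcoef(X, rowvar=False)          # analyse the correlation matrix
        evals, evecs = np.linalg.eigh(R)
        order = np.argsort(evals)[::-1]           # largest eigenvalues first
        evals, evecs = evals[order], evecs[:, order]

        keep = evals > 1.0                        # Kaiser criterion
        loadings = evecs[:, keep] * np.sqrt(evals[keep])
        print("retained factors:", int(keep.sum()))
        print(np.round(loadings, 2))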

  8. Oncological image analysis.

    Science.gov (United States)

    Brady, Sir Michael; Highnam, Ralph; Irving, Benjamin; Schnabel, Julia A

    2016-10-01

    Cancer is one of the world's major healthcare challenges and, as such, an important application of medical image analysis. After a brief introduction to cancer, we summarise some of the major developments in oncological image analysis over the past 20 years, concentrating on those in the authors' laboratories, and then outline opportunities and challenges for the next decade. Copyright © 2016 Elsevier B.V. All rights reserved.

  9. Medical decision making tools: Bayesian analysis and ROC analysis

    International Nuclear Information System (INIS)

    Lee, Byung Do

    2006-01-01

    During the diagnostic process for various oral and maxillofacial lesions, we should consider the following: 'When should we order diagnostic tests? What tests should be ordered? How should we interpret the results clinically? And how should we use this frequently imperfect information to make optimal medical decisions?' To help clinicians make proper judgements, several decision-making tools are suggested. This article discusses the concept of diagnostic accuracy (sensitivity and specificity values) together with several decision-making tools, such as the decision matrix, ROC analysis and Bayesian analysis. The article also explains the introductory concept of the ORAD program
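
    The Bayesian part of the toolkit reduces to one formula: combining a test's sensitivity and specificity with the pre-test (prior) probability gives the post-test probability of disease. The numbers below are illustrative, not from the article.

        def post_test_probability(prior, sensitivity, specificity, positive=True):
            if positive:   # P(D|T+) = sens*prior / (sens*prior + (1-spec)*(1-prior))
                num = sensitivity * prior
                den = num + (1.0 - specificity) * (1.0 - prior)
            else:          # P(D|T-) = (1-sens)*prior / ((1-sens)*prior + spec*(1-prior))
                num = (1.0 - sensitivity) * prior
                den = num + specificity * (1.0 - prior)
            return num / den

        # A lesion with 10% pre-test probability, test with sens 0.85 / spec 0.90:
        print(post_test_probability(0.10, 0.85, 0.90, positive=True))   # ~0.486
        print(post_test_probability(0.10, 0.85, 0.90, positive=False))  # ~0.018

    Sweeping the decision threshold of a test and plotting sensitivity against 1 - specificity yields the ROC curve that the article pairs with this analysis.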

  10. Compatibility analysis of DUPIC fuel (Part II) - Reactor physics design and analysis

    Energy Technology Data Exchange (ETDEWEB)

    Jeong, Chang Joon; Choi, Hang Bok; Rhee, Bo Wook; Roh, Gyu Hong; Kim, Do Hun [Korea Atomic Energy Research Institute, Taejeon (Korea)

    2000-03-01

    The compatibility of DUPIC fuel with a CANDU reactor has been assessed. This study includes the fuel composition adjustment, a comparison of lattice properties, a performance analysis of reactivity devices, determination of the regional over-power (ROP) trip setpoint, and an uncertainty estimation of core performance parameters. For the DUPIC fuel composition adjustment, three options have been proposed, which can produce uniform neutronic characteristics of the DUPIC fuel. The lattice analysis has shown that the characteristics of the DUPIC fuel are compatible with those of natural uranium fuel. The reactivity devices of the CANDU-6 reactor maintain their functional requirements even for the DUPIC fuel system. The ROP analysis has shown that the trip setpoint is not sacrificed for the DUPIC fuel system, owing to a power shape that provides more thermal margin. The uncertainty analysis of the core performance parameters has shown that the uncertainty associated with the fuel composition variation is reduced appreciably, primarily due to the fuel composition adjustment and secondarily to the on-power refueling feature and spatial control function of the CANDU reactor. The reactor physics calculations have also shown that it is feasible to use spent PWR fuel directly in CANDU reactors without deteriorating the CANDU-6 core physics design requirements. 29 refs., 67 figs., 60 tabs. (Author)

  11. Analysis of some Egyptian cosmetic samples by fast neutron activation analysis

    International Nuclear Information System (INIS)

    Medhat, M.E.; Ali, M.A.; Hassan, M.F.

    2001-01-01

    A description of a D-T neutron generator (NG) is presented. This generator can be used for fast neutron activation analysis to determine selected elements, especially light elements, in different materials. The concentrations of the elements Na, Mg, Al, Si, K, Cl, Ca and Fe were determined in two domestic brands of face powder by using 14 MeV neutron activation analysis.

  12. Building energy analysis tool

    Science.gov (United States)

    Brackney, Larry; Parker, Andrew; Long, Nicholas; Metzger, Ian; Dean, Jesse; Lisell, Lars

    2016-04-12

    A building energy analysis system includes a building component library configured to store a plurality of building components; a modeling tool configured to access the building component library and create a building model of a building under analysis using building spatial data and selected building components from the library; a building analysis engine configured to operate the building model, generate a baseline energy model of the building under analysis, and apply one or more energy conservation measures to the baseline energy model in order to generate one or more corresponding optimized energy models; and a recommendation tool configured to assess the one or more optimized energy models against the baseline energy model and generate recommendations for substitute building components or modifications.

  13. Liquid Effluents Program mission analysis

    International Nuclear Information System (INIS)

    Lowe, S.S.

    1994-01-01

    Systems engineering is being used to identify work to clean up the Hanford Site. The systems engineering process transforms an identified mission need into a set of performance parameters and a preferred system configuration. Mission analysis is the first step in the process. Mission analysis supports early decision-making by clearly defining the program objectives and evaluating the feasibility and risks associated with achieving those objectives. The results of the mission analysis provide a consistent basis for subsequent systems engineering work. A mission analysis was performed earlier for the overall Hanford Site. This work was continued by a "capstone" team which developed a top-level functional analysis. Continuing in a top-down manner, systems engineering is now being applied at the program and project levels. A mission analysis was conducted for the Liquid Effluents Program; the results are described herein. This report identifies the initial conditions and acceptable final conditions, defines the programmatic and physical interfaces and sources of constraints, estimates the resources needed to carry out the mission, and establishes measures of success. The mission analysis reflects current program planning for the Liquid Effluents Program as described in the Liquid Effluents FY 1995 Multi-Year Program Plan.

  14. Fault tree analysis: concepts and techniques

    International Nuclear Information System (INIS)

    Fussell, J.B.

    1976-01-01

    Concepts and techniques of fault tree analysis have been developed over the past decade, and predictions from this type of analysis are now important considerations in the design of many systems, such as aircraft, ships and their electronic systems, missiles, and nuclear reactor systems. Routine, hardware-oriented fault tree construction can be automated; however, considerable effort is needed in this area to bring the methodology to production status. When that status is achieved, the entire analysis of hardware systems will be automated except for the system definition step. Automated analysis is not undesirable; on the contrary, once verified on adequately complex systems, automated analysis could well become routine. It could also provide an excellent start for a more in-depth fault tree analysis that includes environmental effects, common mode failures, and human errors. Automated analysis is extremely fast, frees the analyst from routine hardware-oriented fault tree construction, and eliminates logic errors and errors of oversight in this part of the analysis. It thus affords the analyst a powerful tool that allows his prime efforts to be devoted to unearthing the more subtle aspects of the system's failure modes.
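
    As a minimal illustration of the quantitative step that follows tree construction, the sketch below evaluates a small hypothetical fault tree under the usual assumption of independent basic events; the gate structure and probabilities are invented.

        # Top-event probability for a toy fault tree: AND gates multiply
        # probabilities, OR gates combine them as 1 - prod(1 - p).
        from functools import reduce

        def and_gate(probs):
            return reduce(lambda a, b: a * b, probs, 1.0)

        def or_gate(probs):
            return 1.0 - reduce(lambda a, b: a * (1.0 - b), probs, 1.0)

        # Top event: (pump fails AND backup fails) OR control failure.
        p_pump, p_backup, p_control = 1e-2, 5e-2, 1e-4
        p_top = or_gate([and_gate([p_pump, p_backup]), p_control])
        print(f"Top event probability: {p_top:.2e}")  # ~6.0e-04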

  15. Dynamic analysis program for frame structure

    International Nuclear Information System (INIS)

    Ando, Kozo; Chiba, Toshio

    1975-01-01

    A general purpose computer program named ISTRAN/FD (IHI STRucture ANalysis/Frame structure, Dynamic analysis) has been developed for dynamic analysis of three-dimensional frame structures. This program has functions for free vibration analysis, seismic response analysis, graphic display by plotter and CRT, etc. This paper introduces ISTRAN/FD; examples of its application to various problems are shown: idealization of a cantilever, dynamic analysis of the main tower of a suspension bridge, three-dimensional vibration of a plate girder bridge, seismic response of a boiler steel structure, and dynamic properties of an underground LNG tank. In this last example, solid elements, in addition to beam elements, are used for the analysis. (auth.)
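
    The free vibration analysis that such programs perform reduces to a generalized eigenvalue problem; the sketch below solves it for a hypothetical two-storey shear frame (the mass and stiffness values are invented and have nothing to do with ISTRAN/FD itself).

        # Natural frequencies and mode shapes from K x = w^2 M x.
        import numpy as np
        from scipy.linalg import eigh

        m = 1.0e4                      # storey mass [kg]
        k = 2.0e7                      # storey stiffness [N/m]
        M = np.diag([m, m])
        K = np.array([[2 * k, -k],
                      [-k,     k]])

        w2, modes = eigh(K, M)         # eigenvalues are w^2
        freqs_hz = np.sqrt(w2) / (2 * np.pi)
        print("Natural frequencies [Hz]:", freqs_hz.round(2))
        print("Mode shapes (columns):\n", modes.round(3))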

  16. Bridging ImmunoGenomic Data Analysis Workflow Gaps (BIGDAWG): An integrated case-control analysis pipeline.

    Science.gov (United States)

    Pappas, Derek J; Marin, Wesley; Hollenbach, Jill A; Mack, Steven J

    2016-03-01

    Bridging ImmunoGenomic Data-Analysis Workflow Gaps (BIGDAWG) is an integrated data-analysis pipeline designed for the standardized analysis of highly-polymorphic genetic data, specifically for the HLA and KIR genetic systems. Most modern genetic analysis programs are designed for the analysis of single nucleotide polymorphisms, but the highly polymorphic nature of HLA and KIR data require specialized methods of data analysis. BIGDAWG performs case-control data analyses of highly polymorphic genotype data characteristic of the HLA and KIR loci. BIGDAWG performs tests for Hardy-Weinberg equilibrium, calculates allele frequencies and bins low-frequency alleles for k×2 and 2×2 chi-squared tests, and calculates odds ratios, confidence intervals and p-values for each allele. When multi-locus genotype data are available, BIGDAWG estimates user-specified haplotypes and performs the same binning and statistical calculations for each haplotype. For the HLA loci, BIGDAWG performs the same analyses at the individual amino-acid level. Finally, BIGDAWG generates figures and tables for each of these comparisons. BIGDAWG obviates the error-prone reformatting needed to traffic data between multiple programs, and streamlines and standardizes the data-analysis process for case-control studies of highly polymorphic data. BIGDAWG has been implemented as the bigdawg R package and as a free web application at bigdawg.immunogenomics.org. Copyright © 2015 American Society for Histocompatibility and Immunogenetics. Published by Elsevier Inc. All rights reserved.
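
    BIGDAWG itself is distributed as an R package; purely to illustrate the binning-plus-chi-squared step described above, here is a hypothetical Python re-creation of that one step with invented allele counts (none of the function or variable names come from BIGDAWG).

        # Bin rare alleles, run a k x 2 chi-squared test, and compute a
        # per-allele odds ratio, in the spirit of the pipeline above.
        import numpy as np
        from scipy.stats import chi2_contingency

        # Allele counts at one locus: {allele: [cases, controls]}.
        counts = {"A*01": [40, 30], "A*02": [55, 60],
                  "A*03": [3, 2], "A*11": [2, 8]}

        threshold = 5   # alleles this rare in either group get binned
        common = {a: c for a, c in counts.items() if min(c) >= threshold}
        binned = np.sum([c for c in counts.values() if min(c) < threshold],
                        axis=0)

        table = np.array(list(common.values()) + [binned])  # k x 2
        chi2, p, dof, _ = chi2_contingency(table)
        print(f"chi2={chi2:.2f}, dof={dof}, p={p:.3f}")

        # 2 x 2 odds ratio for one allele against all others.
        a_case, a_ctrl = counts["A*01"]
        o_case, o_ctrl = table.sum(axis=0) - [a_case, a_ctrl]
        print("OR(A*01) =", (a_case * o_ctrl) / (a_ctrl * o_case))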

  17. A Technical Analysis Information Fusion Approach for Stock Price Analysis and Modeling

    Science.gov (United States)

    Lahmiri, Salim

    In this paper, we address the problem of technical analysis information fusion in improving stock market index-level prediction. We present an approach for analyzing stock market price behavior based on different categories of technical analysis metrics and a multiple predictive system. Each category of technical analysis measures is used to characterize stock market price movements. The predictive system is based on an ensemble of neural networks (NN) coupled with particle swarm intelligence for parameter optimization, where each single neural network is trained with a specific category of technical analysis measures. The experimental evaluation on three international stock market indices and three individual stocks shows that the presented ensemble-based technical indicators fusion system significantly improves forecasting accuracy in comparison with a single NN. It also outperforms the classical neural network trained with index-level lagged values and the NN trained with stationary wavelet transform details and approximation coefficients. As a result, technical information fusion in an NN ensemble architecture helps improve prediction accuracy.
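
    A bare-bones version of this fusion idea is sketched below: one small network per indicator category, with predictions averaged across the ensemble. The particle swarm optimization step and the real market data of the paper are omitted; the indicators, network sizes and synthetic price series are all invented.

        # Category-wise indicator features feeding an averaged NN ensemble.
        import numpy as np
        from sklearn.neural_network import MLPRegressor

        rng = np.random.default_rng(1)
        price = np.cumsum(rng.normal(0, 1, 500)) + 100.0

        def sma(x, n):   # simple moving average (trend category)
            return np.convolve(x, np.ones(n) / n, mode="valid")

        def roc(x, n):   # rate of change (momentum category)
            return (x[n:] - x[:-n]) / x[:-n]

        # Features at day T, target = next-day price.
        trend = sma(price, 10)[1:490]   # mean of price[T-9..T]
        mom = roc(price, 10)[:489]      # 10-day rate of change at T
        y = price[11:500]

        split = 400
        models = []
        for feat in (trend, mom):       # one NN per indicator category
            X = feat.reshape(-1, 1)
            nn = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000,
                              random_state=0).fit(X[:split], y[:split])
            models.append((nn, X))

        # Fuse by averaging the category-specific predictions.
        pred = np.mean([nn.predict(X[split:]) for nn, X in models], axis=0)
        rmse = np.sqrt(np.mean((pred - y[split:]) ** 2))
        print(f"Ensemble RMSE on held-out days: {rmse:.3f}")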

  18. EMPIRICAL RESEARCH AND CONGREGATIONAL ANALYSIS ...

    African Journals Online (AJOL)

    empirical research has made to the process of congregational analysis. Part of this ... contextual congregational analysis – meeting social and divine desires") at the IAPT ... methodology of a congregational analysis should be regarded as a process. ... essential to create space for a qualitative and quantitative approach.

  19. Canister storage building hazard analysis report

    International Nuclear Information System (INIS)

    Krahn, D.E.; Garvin, L.J.

    1997-01-01

    This report describes the methodology used in conducting the Canister Storage Building (CSB) hazard analysis to support the final CSB safety analysis report (SAR) and documents the results. The hazard analysis was performed in accordance with DOE-STD-3009-94, Preparation Guide for US Department of Energy Nonreactor Nuclear Facility Safety Analysis Report, and implements the requirements of DOE Order 5480.23, Nuclear Safety Analysis Report

  20. Sensory analysis of pet foods.

    Science.gov (United States)

    Koppel, Kadri

    2014-08-01

    Pet food palatability depends first and foremost on the pet and is related to the pet food sensory properties such as aroma, texture and flavor. Sensory analysis of pet foods may be conducted by humans via descriptive or hedonic analysis, pets via acceptance or preference tests, and through a number of instrumental analysis methods. Sensory analysis of pet foods provides additional information on reasons behind palatable and unpalatable foods as pets lack linguistic capabilities. Furthermore, sensory analysis may be combined with other types of information such as personality and environment factors to increase understanding of acceptable pet foods. Most pet food flavor research is proprietary and, thus, there are a limited number of publications available. Funding opportunities for pet food studies would increase research and publications and this would help raise public awareness of pet food related issues. This mini-review addresses current pet food sensory analysis literature and discusses future challenges and possibilities. © 2014 Society of Chemical Industry.

  1. Thermal Power Plant Performance Analysis

    CERN Document Server

    2012-01-01

    The analysis of the reliability and availability of power plants is frequently based on simple indexes that do not take into account the criticality of some failures. This criticality should be evaluated using reliability concepts that consider the effect of a component failure on the performance of the entire plant. System reliability analysis tools provide a root-cause analysis leading to the improvement of the plant maintenance plan. Given that power plant performance can be evaluated not only through thermodynamics-related indexes such as heat rate, Thermal Power Plant Performance Analysis focuses on reliability-based tools used to define the performance of complex systems, and introduces the basic concepts of reliability, maintainability and risk analysis, aiming at their application as tools for power plant performance improvement, including: selection of critical equipment and components, defini...

  2. Portable and Automatic Moessbauer Analysis

    International Nuclear Information System (INIS)

    Souza, P. A. de; Garg, V. K.; Klingelhoefer, G.; Gellert, R.; Guetlich, P.

    2002-01-01

    A portable Moessbauer spectrometer, developed for extraterrestrial applications, opens up new industrial applications of Moessbauer spectroscopy. For industrial use, however, an easy-to-handle tool for fast data analysis is also required. The analysis of Moessbauer spectra and their parameters is a barrier to the popularity of this widely applicable spectroscopic technique in industry. Experience shows that the analysis of a Moessbauer spectrum is time-consuming and requires the dedication of a specialist. However, the analysis of Moessbauer spectra, from the fitting to the identification of the sample phases, can be made faster by using genetic algorithms, fuzzy logic and artificial neural networks. Industrial applications are very specific, and the data analysis can be performed using these algorithms. In combination with automatic analysis, the Moessbauer spectrometer can be used as a probe instrument covering the main industrial needs for on-line monitoring of products, processes and case studies. Some of these real industrial applications will be discussed.

  3. MGR External Events Hazards Analysis

    International Nuclear Information System (INIS)

    Booth, L.

    1999-01-01

    The purpose and objective of this analysis is to apply an external events Hazards Analysis (HA) to the License Application Design Selection Enhanced Design Alternative II (LADS EDA II design; Reference 8.32). The output of the HA is called a Hazards List (HL). This analysis supersedes the external hazards portion of Rev. 00 of the PHA (Reference 8.1). The PHA for internal events will also be updated to the LADS EDA II design, but under a separate analysis. Like the PHA methodology, the HA methodology provides a systematic method to identify potential hazards during the 100-year Monitored Geologic Repository (MGR) operating period, updated to reflect the EDA II design. The resulting events on the HL are candidates that may have potential radiological consequences, as determined during Design Basis Event (DBE) analyses. Therefore, the HL that results from this analysis will undergo further screening and analysis based on the criteria that apply during the performance of DBE analyses.

  4. Foundations of Risk Analysis

    CERN Document Server

    Aven, Terje

    2012-01-01

    Foundations of Risk Analysis presents the issues core to risk analysis: understanding what risk means, expressing risk, building risk models, addressing uncertainty, and applying probability models to real problems. The author provides readers with the knowledge and basic thinking they require to successfully manage risk and uncertainty in support of decision making. This updated edition reflects recent developments on risk and uncertainty concepts, representations and treatment. New material in Foundations of Risk Analysis includes: an up-to-date presentation of how to understand, define and...

  5. Nuclear analysis techniques and environmental sciences

    International Nuclear Information System (INIS)

    1997-10-01

    31 theses are collected in this book. They cover molecular activation analysis, micro-PIXE and micro-probe analysis, X-ray fluorescence analysis, and accelerator mass spectrometry. The applications of these nuclear analysis techniques to environmental sciences are presented and reviewed.

  6. Alternatives to Center of Gravity Analysis

    Science.gov (United States)

    2013-04-04

    [Fragments of the document's table of contents survive here: "Figure 11. SWOT Analysis", "Comparison Between COG Analysis and SMTs", "Benefits of Using SMT in COG Analysis".] ...Strengths, Weaknesses, Opportunities, and Threats (SWOT) analysis. SWOT identifies external and internal factors that impinge on the business (Figure 11). SWOT can be as...

  7. 40 CFR 763.87 - Analysis.

    Science.gov (United States)

    2010-07-01

    Title 40 (Protection of Environment), Asbestos-Containing Materials in Schools, § 763.87 Analysis. (a) Local education agencies shall have bulk samples, collected under § 763.86 and submitted for analysis, analyzed for asbestos using laboratories...

  8. Portfolio Analysis for Vector Calculus

    Science.gov (United States)

    Kaplan, Samuel R.

    2015-01-01

    Classic stock portfolio analysis provides an applied context for Lagrange multipliers that undergraduate students appreciate. Although modern methods of portfolio analysis are beyond the scope of vector calculus, classic methods reinforce the utility of this material. This paper discusses how to introduce classic stock portfolio analysis in a…
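
    For readers who want to see the calculus in action, here is a minimal sketch of the classic problem: minimize portfolio variance subject to a target return and full investment, solved through the Lagrange-multiplier (KKT) linear system. The returns and covariances are invented, and this is the textbook setup rather than anything taken from the paper.

        # Minimum-variance weights via Lagrange multipliers:
        #   minimize w' S w  s.t.  w' mu = r_target,  sum(w) = 1.
        import numpy as np

        mu = np.array([0.08, 0.10, 0.12])       # expected returns
        S = np.array([[0.10, 0.02, 0.01],       # covariance matrix
                      [0.02, 0.08, 0.03],
                      [0.01, 0.03, 0.12]])
        r_target = 0.10
        ones = np.ones(3)

        # KKT system: [2S mu 1; mu' 0 0; 1' 0 0][w, l1, l2] = [0, r, 1].
        A = np.block([[2 * S, mu[:, None], ones[:, None]],
                      [mu[None, :], np.zeros((1, 2))],
                      [ones[None, :], np.zeros((1, 2))]])
        b = np.concatenate([np.zeros(3), [r_target, 1.0]])

        w = np.linalg.solve(A, b)[:3]
        print("Weights:", w.round(3), " sum =", w.sum().round(3))
        print("Return:", (w @ mu).round(4), " Variance:", (w @ S @ w).round(4))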

  9. Combining network analysis with Cognitive Work Analysis: insights into social organisational and cooperation analysis.

    Science.gov (United States)

    Houghton, Robert J; Baber, Chris; Stanton, Neville A; Jenkins, Daniel P; Revell, Kirsten

    2015-01-01

    Cognitive Work Analysis (CWA) allows complex, sociotechnical systems to be explored in terms of their potential configurations. However, CWA does not explicitly analyse the manner in which person-to-person communication is performed in these configurations. Consequently, the combination of CWA with Social Network Analysis provides a means by which CWA output can be analysed to consider communication structure. The approach is illustrated through a case study of a military planning team. The case study shows how actor-to-actor and actor-to-function mapping can be analysed, in terms of centrality, to produce metrics of system structure under different operating conditions. In this paper, a technique for building social network diagrams from CWA is demonstrated. The approach allows analysts to appreciate the potential impact of organisational structure on a command system.
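
    The centrality computation itself is routine once the diagram exists; a minimal sketch with the networkx package is shown below, on an invented actor-to-actor network (the roles and links are hypothetical, not those of the military planning team in the case study).

        # Degree and betweenness centrality over a toy actor network.
        import networkx as nx

        edges = [("commander", "ops_officer"),
                 ("commander", "intel_officer"),
                 ("ops_officer", "logistics"),
                 ("ops_officer", "intel_officer"),
                 ("intel_officer", "analyst"),
                 ("logistics", "analyst")]
        G = nx.Graph(edges)

        degree = nx.degree_centrality(G)
        betweenness = nx.betweenness_centrality(G)
        for actor in G.nodes:
            print(f"{actor:14s} degree={degree[actor]:.2f} "
                  f"betweenness={betweenness[actor]:.2f}")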

  10. Structural analysis of fuel handling systems

    Energy Technology Data Exchange (ETDEWEB)

    Lee, L S.S. [Atomic Energy of Canada Ltd., Mississauga, ON (Canada)

    1997-12-31

    The purpose of this paper has three aspects: (i) to review 'why' and 'what' types of structural analysis, testing and reports are required for fuel handling systems according to the codes, or needed for the design of a product, (ii) to review the input requirements for analysis and the analysis procedures, and (iii) to improve the communication between analysis and other elements of the product cycle. The required or needed types of analysis and reports may be categorized into three major groups: (i) Certified Stress Reports for design by analysis, (ii) Design Reports that are not required for certification and registration but are still required by codes, and (iii) Design Calculations required by codes or needed for design. Input requirements for structural analysis include: design, code classification, loadings, and jurisdictional boundary. Examples of structural analysis for the fueling machine head and support structure are given. To improve communication between structural analysis and the other elements of the product cycle, some areas in the specification of design requirements and load rating are discussed. (author). 6 refs., 1 tab., 4 figs.

  11. Structural analysis of fuel handling systems

    International Nuclear Information System (INIS)

    Lee, L.S.S.

    1996-01-01

    The purpose of this paper has three aspects: (i) to review 'why' and 'what' types of structural analysis, testing and reports are required for fuel handling systems according to the codes, or needed for the design of a product, (ii) to review the input requirements for analysis and the analysis procedures, and (iii) to improve the communication between analysis and other elements of the product cycle. The required or needed types of analysis and reports may be categorized into three major groups: (i) Certified Stress Reports for design by analysis, (ii) Design Reports that are not required for certification and registration but are still required by codes, and (iii) Design Calculations required by codes or needed for design. Input requirements for structural analysis include: design, code classification, loadings, and jurisdictional boundary. Examples of structural analysis for the fueling machine head and support structure are given. To improve communication between structural analysis and the other elements of the product cycle, some areas in the specification of design requirements and load rating are discussed. (author). 6 refs., 1 tab., 4 figs

  12. Oscillation Baselining and Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    2017-03-27

    PNNL developed a new tool for oscillation analysis and baselining. This tool has been developed under a new DOE Grid Modernization Laboratory Consortium (GMLC) Project (GM0072 - "Suite of open-source applications and models for advanced synchrophasor analysis") and is based on the open platform for PMU analysis. The Oscillation Baselining and Analysis Tool (OBAT) performs oscillation analysis and identifies modes of oscillation (frequency, damping, energy, and shape). The tool also performs oscillation event baselining (finding correlations between oscillation characteristics and system operating conditions).
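
    The abstract does not disclose OBAT's algorithms, so the sketch below only illustrates the textbook version of mode identification on a synthetic ringdown: frequency from peak spacing and damping from the logarithmic decrement.

        # Estimate modal frequency and damping ratio from a ringdown.
        import numpy as np
        from scipy.signal import find_peaks

        fs = 30.0                              # PMU reporting rate [Hz]
        t = np.arange(0, 20, 1 / fs)
        f0, zeta = 0.5, 0.03                   # inter-area-like mode
        x = np.exp(-zeta * 2 * np.pi * f0 * t) * np.cos(2 * np.pi * f0 * t)

        peaks, _ = find_peaks(x)
        freq_est = 1.0 / np.diff(t[peaks]).mean()

        # Logarithmic decrement between successive positive peaks.
        delta = np.mean(np.log(x[peaks][:-1] / x[peaks][1:]))
        zeta_est = delta / np.sqrt(4 * np.pi ** 2 + delta ** 2)
        print(f"frequency ~ {freq_est:.3f} Hz, damping ~ {zeta_est:.3f}")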

  13. Trace analysis of semiconductor materials

    CERN Document Server

    Cali, J Paul; Gordon, L

    1964-01-01

    Trace Analysis of Semiconductor Materials is a guidebook concerned with procedures of ultra-trace analysis. This book discusses six distinct techniques of trace analysis. These techniques are the most common and can be applied to a wider variety of problems than other methods. Each of the four chapters includes an introduction to the principles and general statements. The theoretical basis for the technique involved is then briefly discussed. Practical applications of the techniques and the different instrumentations are explained. Then, the applications to trace analysis as pertaining...

  14. Activation analysis in national economy

    International Nuclear Information System (INIS)

    1974-01-01

    The collected papers are based on the materials of the III All-Union Activation Analysis Meeting. The selected papers deal with theoretical questions of activation analysis, its hardware, the latest developments in automatic analysis, and the use of computer methods in the treatment of analytical information. New techniques are described for the determination of a large number of elements in samples of biological and geological origin. Some results of the use of activation analysis in various fields of science and technology are provided. The volume reflects the present status of activation analysis techniques in the USSR and might be of interest both to specialists and to those involved in obtaining and using information on the composition of substances. (auth.)

  15. Global optimization and sensitivity analysis

    International Nuclear Information System (INIS)

    Cacuci, D.G.

    1990-01-01

    A new direction for the analysis of nonlinear models of nuclear systems is suggested to overcome fundamental limitations of the sensitivity analysis and optimization methods currently prevalent in nuclear engineering. This direction is toward a global analysis of the behavior of the respective system as its design parameters are allowed to vary over their respective design ranges. A methodology for global analysis is presented that unifies and extends the current scopes of sensitivity analysis and optimization by identifying all the critical points (maxima, minima) and solution bifurcation points, together with the corresponding sensitivities, at any design point of interest. The potential applicability of this methodology is illustrated with test problems involving multiple critical points and bifurcations and comprising both equality and inequality constraints.
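
    The paper's methodology is considerably more general, but the basic idea of sweeping the design range for multiple critical points can be illustrated with a simple multistart search; the toy response function below is invented.

        # Locate multiple local minima of a toy response by multistart
        # local optimization across the design range.
        import numpy as np
        from scipy.optimize import minimize

        def response(x):   # toy design response with two local minima
            return (x[0] ** 2 - 1.0) ** 2 + 0.5 * x[0]

        found = set()
        for s in np.linspace(-2.0, 2.0, 9):
            res = minimize(response, x0=[s])
            if res.success:
                found.add(round(res.x[0], 2))
        print("Distinct local minima at x =", sorted(found))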

  16. Analysis of Some Egyptian Cosmetic Samples by Fast Neutron Activation Analysis

    CERN Document Server

    Medhat, M E; Fayez-Hassan, M

    2001-01-01

    A description of a D-T neutron generator (NG) is presented. This generator can be used for fast neutron activation analysis to determine selected elements, especially light elements, in different materials. In our work, the concentrations of the elements Na, Mg, Al, Si, K, Cl, Ca and Fe were determined in two domestic brands of face powder by using 14 MeV neutron activation analysis.

  17. Activation analysis in Greece

    International Nuclear Information System (INIS)

    Grimanis, A.P.

    1985-01-01

    A review of research and development on NAA, together with examples of applications of this method, is presented, drawn from work carried out over the last 21 years at the Radioanalytical Laboratory of the Department of Chemistry in the Greek Nuclear Research Center "Demokritos". Improved and faster radiochemical NAA methods have been developed for the determination of Au, Ni, Cl, As, Cu, U, Cr, Eu, Hg and Mo in several materials, and for the simultaneous determination of Br and I; Mg, Sr and Ni; As and Cu; As, Sb and Hg; Mn, Sr and Ba; Cd and Zn; Se and As; and Mo and Cr in biological materials. Instrumental NAA methods have also been developed for the determination of Ag, Cl and Na in lake waters, Al, Ca, Mg and V in wines, 7 trace elements in biological materials, 17 trace elements in sediments, and 20 minor and trace elements in ceramics. A comprehensive computer program for routine activation analysis using Ge(Li) detectors has been worked out. A rather extensive charged-particle activation analysis program has been carried out over the last 10 years, including particle-induced X-ray emission (PIXE) analysis, particle-induced prompt gamma-ray emission (PIGE) analysis, other nuclear reactions, and proton activation analysis. A special neutron activation method, delayed fission neutron counting, is used for the analysis of fissionable elements, such as U, Th and Pu, in samples from the whole nuclear fuel cycle, including geological, enriched and nuclear safeguards samples.

  18. Technical Note: Introduction of variance component analysis to setup error analysis in radiotherapy

    Energy Technology Data Exchange (ETDEWEB)

    Matsuo, Yukinori, E-mail: ymatsuo@kuhp.kyoto-u.ac.jp; Nakamura, Mitsuhiro; Mizowaki, Takashi; Hiraoka, Masahiro [Department of Radiation Oncology and Image-applied Therapy, Kyoto University, 54 Shogoin-Kawaharacho, Sakyo, Kyoto 606-8507 (Japan)

    2016-09-15

    Purpose: The purpose of this technical note is to introduce variance component analysis to the estimation of systematic and random components in setup error of radiotherapy. Methods: Balanced data according to the one-factor random effect model were assumed. Results: Analysis-of-variance (ANOVA)-based computation was applied to estimate the values and their confidence intervals (CIs) for systematic and random errors and the population mean of setup errors. The conventional method overestimates systematic error, especially in hypofractionated settings. The CI for systematic error becomes much wider than that for random error. The ANOVA-based estimation can be extended to a multifactor model considering multiple causes of setup errors (e.g., interpatient, interfraction, and intrafraction). Conclusions: Variance component analysis may lead to novel applications to setup error analysis in radiotherapy.

  19. Technical Note: Introduction of variance component analysis to setup error analysis in radiotherapy

    International Nuclear Information System (INIS)

    Matsuo, Yukinori; Nakamura, Mitsuhiro; Mizowaki, Takashi; Hiraoka, Masahiro

    2016-01-01

    Purpose: The purpose of this technical note is to introduce variance component analysis to the estimation of systematic and random components in setup error of radiotherapy. Methods: Balanced data according to the one-factor random effect model were assumed. Results: Analysis-of-variance (ANOVA)-based computation was applied to estimate the values and their confidence intervals (CIs) for systematic and random errors and the population mean of setup errors. The conventional method overestimates systematic error, especially in hypofractionated settings. The CI for systematic error becomes much wider than that for random error. The ANOVA-based estimation can be extended to a multifactor model considering multiple causes of setup errors (e.g., interpatient, interfraction, and intrafraction). Conclusions: Variance component analysis may lead to novel applications to setup error analysis in radiotherapy.
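
    A minimal numerical version of the ANOVA estimators described in the note is sketched below, on simulated balanced data (p patients, n fractions each); the millimetre values are invented.

        # One-factor random-effect model: sigma_between (systematic) and
        # sigma_within (random) from the between/within mean squares.
        import numpy as np

        rng = np.random.default_rng(42)
        p, n = 20, 5
        sigma_sys, sigma_rand, mean = 1.0, 2.0, 0.5
        offsets = rng.normal(mean, sigma_sys, size=(p, 1))   # per patient
        errors = offsets + rng.normal(0, sigma_rand, size=(p, n))

        grand = errors.mean()
        pat_means = errors.mean(axis=1)
        msb = n * np.sum((pat_means - grand) ** 2) / (p - 1)
        msw = np.sum((errors - pat_means[:, None]) ** 2) / (p * (n - 1))

        rand_est = np.sqrt(msw)                        # random error
        sys_est = np.sqrt(max((msb - msw) / n, 0.0))   # systematic error
        print(f"random ~ {rand_est:.2f} mm, systematic ~ {sys_est:.2f} mm")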

  20. Windows forensic analysis toolkit advanced analysis techniques for Windows 7

    CERN Document Server

    Carvey, Harlan

    2012-01-01

    Now in its third edition, Harlan Carvey has updated "Windows Forensic Analysis Toolkit" to cover Windows 7 systems. The primary focus of this edition is on analyzing Windows 7 systems and on processes using free and open-source tools. The book covers live response, file analysis, malware detection, timeline, and much more. The author presents real-life experiences from the trenches, making the material realistic and showing the why behind the how. New to this edition, the companion and toolkit materials are now hosted online. This material consists of electronic printable checklists, cheat sheets, free custom tools, and walk-through demos. This edition complements "Windows Forensic Analysis Toolkit, 2nd Edition", (ISBN: 9781597494229), which focuses primarily on XP. It includes complete coverage and examples on Windows 7 systems. It contains Lessons from the Field, Case Studies, and War Stories. It features companion online material, including electronic printable checklists, cheat sheets, free custom tools, ...

  1. Is risk analysis scientific?

    Science.gov (United States)

    Hansson, Sven Ove; Aven, Terje

    2014-07-01

    This article discusses to what extent risk analysis is scientific in view of a set of commonly used definitions and criteria. We consider scientific knowledge to be characterized by its subject matter, its success in developing the best available knowledge in its fields of study, and the epistemic norms and values that guide scientific investigations. We proceed to assess the field of risk analysis according to these criteria. For this purpose, we use a model for risk analysis in which science is used as a base for decision making on risks. The model covers five elements: evidence, knowledge base, broad risk evaluation, managerial review and judgment, and the decision; it relates these elements to the domains of experts and decision makers, and to fact-based or value-based domains. We conclude that risk analysis is a scientific field of study when understood as consisting primarily of (i) knowledge about risk-related phenomena, processes, events, etc., and (ii) concepts, theories, frameworks, approaches, principles, methods and models to understand, assess, characterize, communicate, and manage risk, in general and for specific applications (the instrumental part). © 2014 Society for Risk Analysis.

  2. Nuclear analysis software. Pt. 2: Gamma spectrum analysis, activity calculations and neutron activiation analysis (GANAAS)

    International Nuclear Information System (INIS)

    1991-01-01

    A spectrum acquired with a multichannel analyzer is usually stored on a suitable device (tape, cassette tape, diskette, hard disk). Every manufacturer of multichannel analyzers uses its own method of storage and records the spectra in its own format. Furthermore, the formats used to save spectra evolve over time: the same manufacturer can have several formats for different generations of multichannel analyzers. A similar situation prevails with spectrum analysis programmes, which require spectra in a particular format as the input to the analysis. Again, these input formats are many and differ from each other considerably. The SPEDAC set of routines was developed to provide the spectroscopist with a tool for converting spectral formats: it can read spectra recorded in a number of formats used by different multichannel analyzers and convert them for a number of analysis programmes; in fact, all the major formats are represented. Another serious problem for the user of a stand-alone multichannel analyzer is the transfer of spectra from the MCA to the computer. For several well known types of MCAs, Version 5.0 of SPEDAC offers a set of routines for spectrum transfer using the simplest methods of interfacing. All the transfer programmes described in this manual have been carefully tested with the corresponding stand-alone multichannel analyzers.

  3. Automated Communications Analysis System using Latent Semantic Analysis

    National Research Council Canada - National Science Library

    Foltz, Peter W

    2006-01-01

    ... and during the debriefing process to assess knowledge proficiency. In this report, the contractor describes prior research on communication analysis and how it can inform assessment of individual and team cognitive processing...

  4. The same analysis approach: Practical protection against the pitfalls of novel neuroimaging analysis methods.

    Science.gov (United States)

    Görgen, Kai; Hebart, Martin N; Allefeld, Carsten; Haynes, John-Dylan

    2017-12-27

    Standard neuroimaging data analysis based on traditional principles of experimental design, modelling, and statistical inference is increasingly complemented by novel analysis methods, driven e.g. by machine learning methods. While these novel approaches provide new insights into neuroimaging data, they often have unexpected properties, generating a growing literature on possible pitfalls. We propose to meet this challenge by adopting a habit of systematic testing of experimental design, analysis procedures, and statistical inference. Specifically, we suggest to apply the analysis method used for experimental data also to aspects of the experimental design, simulated confounds, simulated null data, and control data. We stress the importance of keeping the analysis method the same in main and test analyses, because only in this way can possible confounds and unexpected properties be reliably detected and avoided. We describe and discuss this Same Analysis Approach in detail, and demonstrate it in two worked examples using multivariate decoding. With these examples, we reveal two sources of error: a mismatch between counterbalancing (crossover designs) and cross-validation, which leads to systematic below-chance accuracies, and linear decoding of a nonlinear effect, a difference in variance. Copyright © 2017 Elsevier Inc. All rights reserved.
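
    A minimal instance of this habit is sketched below: the very same decoding pipeline that would be used on real data is first run on simulated null data, where it must return chance-level accuracy. The pipeline (a linear SVM with 5-fold cross-validation) is a generic stand-in, not the specific analyses of the paper.

        # Run the identical analysis on null data to verify chance level.
        import numpy as np
        from sklearn.svm import SVC
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(0)

        def pipeline(X, y):
            # the one analysis applied everywhere, real data or null
            return cross_val_score(SVC(kernel="linear"), X, y, cv=5).mean()

        # Null data: no signal, so accuracy should hover near 0.5.
        accs = [pipeline(rng.normal(size=(40, 50)),
                         rng.permutation([0, 1] * 20))
                for _ in range(20)]
        print(f"null accuracy: {np.mean(accs):.3f} +/- {np.std(accs):.3f}")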

  5. Visual time series analysis

    DEFF Research Database (Denmark)

    Fischer, Paul; Hilbert, Astrid

    2012-01-01

    We introduce a platform which supplies an easy-to-handle, interactive, extendable, and fast analysis tool for time series analysis. In contrast to other software suites like Maple, Matlab, or R, which use a command-line-like interface and where the user has to memorize or look up the appropriate commands, our application is select-and-click-driven. It allows the user to derive many different sequences of deviations for a given time series and to visualize them in different ways in order to judge their expressive power and to reuse the procedure found. For many transformations or model fits, the user may choose between manual and automated parameter selection. The user can define new transformations and add them to the system. The application contains efficient implementations of advanced and recent techniques for time series analysis, including techniques related to extreme value analysis and filtering...

  6. Statistical Energy Analysis (SEA) and Energy Finite Element Analysis (EFEA) Predictions for a Floor-Equipped Composite Cylinder

    Science.gov (United States)

    Grosveld, Ferdinand W.; Schiller, Noah H.; Cabell, Randolph H.

    2011-01-01

    Comet Enflow is a commercially available, high-frequency vibroacoustic analysis software package founded on Energy Finite Element Analysis (EFEA) and Energy Boundary Element Analysis (EBEA). EFEA was validated on a floor-equipped composite cylinder by comparing EFEA vibroacoustic response predictions with Statistical Energy Analysis (SEA) and experimental results. The SEA predictions were made using the commercial software program VA One 2009 from ESI Group. The frequency region of interest for this study covers the one-third octave bands with center frequencies from 100 Hz to 4000 Hz.

  7. Android malware and analysis

    CERN Document Server

    Dunham, Ken

    2014-01-01

    The rapid growth and development of Android-based devices has resulted in a wealth of sensitive information on mobile devices that offer minimal malware protection. This has created an immediate demand for security professionals who understand how best to approach the subject of Android malware threats and analysis. In Android Malware and Analysis, Ken Dunham, renowned global malware expert and author, teams up with international experts to document the best tools and tactics available for analyzing Android malware. The book covers both methods of malware analysis: dynamic and static. This tact...

  8. Methods of Multivariate Analysis

    CERN Document Server

    Rencher, Alvin C

    2012-01-01

    Praise for the Second Edition: "This book is a systematic, well-written, well-organized text on multivariate analysis packed with intuition and insight ... There is much practical wisdom in this book that is hard to find elsewhere." - IIE Transactions. Filled with new and timely content, Methods of Multivariate Analysis, Third Edition provides examples and exercises based on more than sixty real data sets from a wide variety of scientific fields. It takes a "methods" approach to the subject, placing an emphasis on how students and practitioners can employ multivariate analysis in real-life sit...

  9. An analyst's self-analysis.

    Science.gov (United States)

    Calder, K T

    1980-01-01

    I have told you why I selected the topic of self-analysis, and I have described my method for it: of recording primary data such as dreams, daydreams, memories, and symptoms and of recording associations to this primary data, followed by an attempt at analyzing this written material. I have described a dream, a memory and a daydream which is also a symptom, each of which primary data I found useful in understanding myself. Finally, I reached some conclusions regarding the uses of self-analysis, including self-analysis as a research tool.

  10. Distributed analysis at LHCb

    International Nuclear Information System (INIS)

    Williams, Mike; Egede, Ulrik; Paterson, Stuart

    2011-01-01

    The distributed analysis experience to date at LHCb has been positive: job success rates are high and wait times for high-priority jobs are low. LHCb users access the grid using the GANGA job-management package, while the LHCb virtual organization manages its resources using the DIRAC package. This clear division of labor has benefitted LHCb and its users greatly; it is a major reason why distributed analysis at LHCb has been so successful. The newly formed LHCb distributed analysis support team has also proved to be a success.

  11. Analysis of metal samples

    International Nuclear Information System (INIS)

    Ramirez T, J.J.; Lopez M, J.; Sandoval J, A.R.; Villasenor S, P.; Aspiazu F, J.A.

    2001-01-01

    Elemental, metallographic and phase analyses were carried out in order to determine the oxidation states of the Fe contained in three metallic pieces of unknown material: a block, a plate and a cylinder. Results are presented from the elemental analysis, which was carried out at the Tandem Accelerator of ININ by proton-induced X-ray emission (PIXE). The phase analysis was carried out by X-ray diffraction, which made it possible to identify the type of alloy or alloys formed. The combined application of nuclear techniques with metallographic techniques allows the integral characterization of industrial metals. (Author)

  12. Real analysis and probability

    CERN Document Server

    Ash, Robert B; Lukacs, E

    1972-01-01

    Real Analysis and Probability provides the background in real analysis needed for the study of probability. Topics covered range from measure and integration theory to functional analysis and basic concepts of probability. The interplay between measure theory and topology is also discussed, along with conditional probability and expectation, the central limit theorem, and strong laws of large numbers with respect to martingale theory. Comprised of eight chapters, this volume begins with an overview of the basic concepts of the theory of measure and integration, followed by a presentation of var...

  13. Worst-case analysis of heap allocations

    DEFF Research Database (Denmark)

    Puffitsch, Wolfgang; Huber, Benedikt; Schoeberl, Martin

    2010-01-01

    the worst-case heap allocations of tasks. The analysis builds upon techniques that are well established for worst-case execution time analysis. The difference is that the cost function is not the execution time of instructions in clock cycles, but the allocation in bytes. In contrast to worst-case execution time analysis, worst-case heap allocation analysis is not processor dependent. However, the cost function depends on the object layout of the runtime system. The analysis is evaluated with several real-time benchmarks to establish the usefulness of the analysis, and to compare the memory consumption...
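
    With bytes as the cost function and an acyclic control-flow graph, the worst case is a longest-path problem; the sketch below shows that core idea on an invented graph (the real analysis in the paper handles loops, call contexts and the runtime's object layout, none of which appear here).

        # Worst-case heap allocation as a longest path over a toy CFG.
        from functools import lru_cache

        alloc_bytes = {"entry": 0, "a": 32, "b": 128, "c": 16, "exit": 0}
        successors = {"entry": ["a", "b"], "a": ["c"], "b": ["c"],
                      "c": ["exit"], "exit": []}

        @lru_cache(maxsize=None)
        def worst_alloc(block):
            rest = max((worst_alloc(s) for s in successors[block]),
                       default=0)
            return alloc_bytes[block] + rest

        print("Worst-case allocation:", worst_alloc("entry"), "bytes")  # 144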

  14. Random safety auditing, root cause analysis, failure mode and effects analysis.

    Science.gov (United States)

    Ursprung, Robert; Gray, James

    2010-03-01

    Improving quality and safety in health care is a major concern for health care providers, the general public, and policy makers. Errors and quality issues are leading causes of morbidity and mortality across the health care industry. There is evidence that patients in the neonatal intensive care unit (NICU) are at high risk for serious medical errors. To facilitate compliance with safe practices, many institutions have established quality-assurance monitoring procedures. Three techniques that have been found useful in the health care setting are failure mode and effects analysis, root cause analysis, and random safety auditing. When used together, these techniques are effective tools for system analysis and redesign focused on providing safe delivery of care in the complex NICU system. Copyright 2010 Elsevier Inc. All rights reserved.

  15. Physics analysis tools

    International Nuclear Information System (INIS)

    Kunz, P.F.

    1991-04-01

    There are many tools used for analysis in High Energy Physics (HEP). They range from low-level tools, such as a programming language, to high-level ones, such as a detector simulation package. This paper discusses some aspects of these tools that are directly associated with the process of analyzing HEP data. Physics analysis tools cover the whole range from the simulation of the interactions of particles to the display and fitting of statistical data. For the purposes of this paper, analysis is broken down into five main stages, classified as areas of generation, reconstruction, and analysis. Different detector groups use different terms for these stages, so it is useful to define what is meant by them here. The particle generation stage is a simulation of the initial interaction, the production of particles, and the decay of the short-lived particles. The detector simulation stage simulates the behavior of an event in a detector. The track reconstruction stage does pattern recognition on the measured or simulated space points, calorimeter information, etc., and reconstructs track segments of the original event. The event reconstruction stage takes the reconstructed tracks, along with particle identification information, and assigns masses to produce 4-vectors. Finally, the display and fit stage displays the statistical data accumulated in the preceding stages in the form of histograms, scatter plots, etc. The remainder of this paper considers what analysis tools are available today and what one might expect in the future. For each stage, the integration of the tools with other stages and the portability of the tools are analyzed.

  16. Identifying Effective Spelling Interventions Using a Brief Experimental Analysis and Extended Analysis

    Science.gov (United States)

    McCurdy, Merilee; Clure, Lynne F.; Bleck, Amanda A.; Schmitz, Stephanie L.

    2016-01-01

    Spelling is an important skill that is crucial to effective written communication. In this study, brief experimental analysis procedures were used to examine spelling instruction strategies (e.g., whole word correction; word study strategy; positive practice; and cover, copy, and compare) for four students. In addition, an extended analysis was…

  17. Meanline Analysis of Turbines with Choked Flow in the Object-Oriented Turbomachinery Analysis Code

    Science.gov (United States)

    Hendricks, Eric S.

    2016-01-01

    The Object-Oriented Turbomachinery Analysis Code (OTAC) is a new meanline/streamline turbomachinery modeling tool being developed at NASA GRC. During the development process, a limitation of the code was discovered in relation to the analysis of choked flow in axial turbines. This paper describes the relevant physics for choked flow as well as the changes made to OTAC to enable analysis in this flow regime.
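
    Although the paper's equations are not reproduced in the abstract, the one-dimensional relation behind choking is standard gas dynamics and worth recalling: once the flow reaches Mach 1 at the minimum area, the mass flow is fixed by the total conditions and the throat area alone,

        \dot{m}_{\mathrm{choked}} = A^{*} \, p_0 \sqrt{\frac{\gamma}{R T_0}}
            \left( \frac{2}{\gamma + 1} \right)^{\frac{\gamma + 1}{2(\gamma - 1)}}

    where A* is the throat area, p0 and T0 are the total pressure and temperature, R is the gas constant, and gamma is the ratio of specific heats. No change in downstream conditions can raise the flow beyond this value, which is precisely the regime the OTAC changes must handle; note this is the textbook relation, not a formula quoted from the paper.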

  18. Exploratory factor analysis in Rehabilitation Psychology: a content analysis.

    Science.gov (United States)

    Roberson, Richard B; Elliott, Timothy R; Chang, Jessica E; Hill, Jessica N

    2014-11-01

    Our objective was to examine the use and quality of exploratory factor analysis (EFA) in articles published in Rehabilitation Psychology. Trained raters examined 66 separate exploratory factor analyses in 47 articles published between 1999 and April 2014. The raters recorded the aim of the EFAs, the distributional statistics, sample size, factor retention method(s), extraction and rotation method(s), and whether the pattern coefficients, structure coefficients, and the matrix of association were reported. The primary use of the EFAs was scale development, but the most widely used extraction and rotation method was principal component analysis, with varimax rotation. When determining how many factors to retain, multiple methods (e.g., scree plot, parallel analysis) were used most often. Many articles did not report enough information to allow for the duplication of their results. EFA relies on authors' choices (e.g., factor retention rules, extraction and rotation methods), and few articles adhered to all of the best practices. The current findings are compared to other empirical investigations into the use of EFA in published research. Recommendations for improving EFA reporting practices in rehabilitation psychology research are provided.
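
    Since parallel analysis is one of the retention methods the review tracks, a compact sketch of Horn's procedure is given below on simulated data: retain factors whose observed eigenvalues exceed the mean eigenvalues of random data of the same shape. All numbers are invented for illustration.

        # Horn's parallel analysis on simulated data.
        import numpy as np

        rng = np.random.default_rng(3)
        n, p = 200, 8
        X = rng.normal(size=(n, p))
        X[:, :4] += rng.normal(size=(n, 1))   # inject one common factor

        obs = np.linalg.eigvalsh(np.corrcoef(X.T))[::-1]

        rand = np.array([
            np.linalg.eigvalsh(np.corrcoef(rng.normal(size=(n, p)).T))[::-1]
            for _ in range(200)])
        threshold = rand.mean(axis=0)

        print("retain", int((obs > threshold).sum()), "factor(s)")
        print("observed:", obs.round(2))
        print("random:  ", threshold.round(2))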

  19. Environmental risk analysis

    International Nuclear Information System (INIS)

    Lima-e-Silva, Pedro Paulo de

    1996-01-01

    Conventional Risk Analysis (RA) usually relates the frequency of a certain undesired event to its consequences. This technique is used nowadays in Brazil to analyze accidents and their consequences strictly from the human standpoint, valuing losses of human equipment, human structures and human lives, without considering the damage caused to the natural resources that keep life possible on Earth. This paradigm developed primarily because of Homo sapiens' lack of perception of the natural web needed to sustain his own life. In reality, the Brazilian professionals responsible today for licensing, auditing and inspecting the environmental aspects of human activities face huge difficulties in producing technical specifications and procedures that lead to acceptable levels of impact, not least because of the intrinsic difficulty of defining those levels. Therefore, in Brazil the RA technique is a weak tool for licensing for many reasons, among them its narrow scope (accident considerations only) and its wrong paradigm (direct human damages only). A paper by the author on the former was presented at the 7th International Conference on Environmetrics, July 1996, USP-SP. This one discusses the extension of the risk analysis concept to take environmental consequences into account, transforming the conventional analysis into a broader methodology named here Environmental Risk Analysis. (author)

  20. Surface analysis the principal techniques

    CERN Document Server

    Vickerman, John C

    2009-01-01

    This completely updated and revised second edition of Surface Analysis: The Principal Techniques deals with the characterisation and understanding of the outer layers of substrates: how they react, look and function, all of which are of interest to surface scientists. Within this comprehensive text, experts in each analysis area introduce the theory and practice of the principal techniques that have shown themselves to be effective in both basic research and applied surface analysis. Examples of analysis are provided to facilitate the understanding of this topic and to show readers how they c...

  1. Panel data analysis using EViews

    CERN Document Server

    Agung, I Gusti Ngurah

    2013-01-01

    A comprehensive and accessible guide to panel data analysis using EViews software. This book explores the use of EViews software in creating panel data analysis using appropriate empirical models and real datasets. Guidance is given on developing alternative descriptive statistical summaries for evaluation and providing policy analysis based on pooled panel data. Various alternative models based on panel data are explored, including univariate general linear models, fixed-effect models and causal models, and guidance on the advantages and disadvantages of each one is given. Panel Data Analysis...

  2. Activation analysis in water chemistry

    International Nuclear Information System (INIS)

    Szabo, A.; Toth, A.

    1978-01-01

    The potential applications of activation analysis in water chemistry are discussed. The principle, unit operations, radiation sources and measuring instruments of activation analysis are described. The sensitivity of activation analysis is given in tabulated form for some elements of major importance in water chemistry, and the elements readily accessible to determination by measurement of their spontaneous gamma radiation are listed. Finally, a few papers selected from the recent international literature are reviewed, in which the authors report results obtained by applying activation analysis to water chemistry. (author)

  3. Quality assurance of qualitative analysis

    DEFF Research Database (Denmark)

    Ríos, Ángel; Barceló, Damiá; Buydens, Lutgarde

    2003-01-01

    The European Commission has supported the G6MA-CT-2000-01012 project on "Metrology of Qualitative Chemical Analysis" (MEQUALAN), which was developed during 2000-2002. The final result is a document produced by a group of scientists with expertise in different areas of chemical analysis, metrology...... and quality assurance. One important part of this document deals, therefore, with aspects involved in analytical quality assurance of qualitative analysis. This article shows the main conclusions reported in the document referring to the implementation of quality principles in qualitative analysis...

  4. The fundamentals of mathematical analysis

    CERN Document Server

    Fikhtengol'ts, G M

    1965-01-01

    The Fundamentals of Mathematical Analysis, Volume 1 is a textbook that provides a systematic and rigorous treatment of the fundamentals of mathematical analysis. Emphasis is placed on the concept of limit, which plays a principal role in mathematical analysis. Examples of the application of mathematical analysis to geometry, mechanics, physics, and engineering are given. This volume is comprised of 14 chapters and begins with a discussion on real numbers, their properties and applications, and arithmetical operations over real numbers. The reader is then introduced to the concept of function, i...

  5. Subcubic Control Flow Analysis Algorithms

    DEFF Research Database (Denmark)

    Midtgaard, Jan; Van Horn, David

    We give the first direct subcubic algorithm for performing control flow analysis of higher-order functional programs. Despite the long-held belief that inclusion-based flow analysis could not surpass the "cubic bottleneck," we apply known set compression techniques to obtain an algorithm that runs in time O(n^3/log n) on a unit-cost random-access memory model machine. Moreover, we refine the initial flow analysis into two more precise analyses incorporating notions of reachability. We give subcubic algorithms for these more precise analyses and relate them to an existing analysis from...

  6. The analysis of the permanent magnet motor using the new magnetic field analysis

    International Nuclear Information System (INIS)

    Shimoji, Hiroyasu; Enokizono, Masato; Todaka, Takashi

    2002-01-01

    In this paper, iron loss analysis of a permanent magnet motor considering the anisotropy of the magnetic material is carried out. Recently, it has become possible to measure magnetic materials using a vector quantity technique. The non-oriented silicon steel sheets used for the iron core material are anisotropic, so it is necessary to carry out the analysis considering the anisotropy of the magnetic material. We used a magnetic field analysis which considers the anisotropy by combining the finite element method with the E&S (Enokizono and Soda) model. (Author)

  7. Activation analysis. Chapter 4

    International Nuclear Information System (INIS)

    1976-01-01

    The principle of activation analysis is described, together with sample and calibration standard preparation; activation by neutrons, charged particles and gamma radiation; sample transport after activation; activity measurement; and chemical sample processing. Possible applications of nondestructive activation analysis are shown. (J.P.)

  8. Retinal Imaging and Image Analysis

    Science.gov (United States)

    Abràmoff, Michael D.; Garvin, Mona K.; Sonka, Milan

    2011-01-01

    Many important eye diseases as well as systemic diseases manifest themselves in the retina. While a number of other anatomical structures contribute to the process of vision, this review focuses on retinal imaging and image analysis. Following a brief overview of the most prevalent causes of blindness in the industrialized world that includes age-related macular degeneration, diabetic retinopathy, and glaucoma, the review is devoted to retinal imaging and image analysis methods and their clinical implications. Methods for 2-D fundus imaging and techniques for 3-D optical coherence tomography (OCT) imaging are reviewed. Special attention is given to quantitative techniques for analysis of fundus photographs with a focus on clinically relevant assessment of retinal vasculature, identification of retinal lesions, assessment of optic nerve head (ONH) shape, building retinal atlases, and to automated methods for population screening for retinal diseases. A separate section is devoted to 3-D analysis of OCT images, describing methods for segmentation and analysis of retinal layers, retinal vasculature, and 2-D/3-D detection of symptomatic exudate-associated derangements, as well as to OCT-based analysis of ONH morphology and shape. Throughout the paper, aspects of image acquisition, image analysis, and clinical relevance are treated together considering their mutually interlinked relationships. PMID:22275207

  9. PIXE analysis by baby cyclotron

    International Nuclear Information System (INIS)

    Yoshida, Hyogo; Tanaka, Teruaki; Ito, Takashi; Toda, Yohjiro; Wakasa, Hideichiro

    1988-01-01

    The Japan Steel Works, Ltd. has been supplying a very small sized cyclotron (Baby Cyclotron) to hospitals and research facilities. The cyclotron is designed to produce short-lived radioisotopes for medical use. In the present study, this cyclotron is modified so that it can serve for PIXE analysis. The PIXE (particle induced X-ray emission) technique has the following features: (1) Down to 1 ng of trace material in a sample (mg - μg) can be detected, (2) An analysis run is completed in one to ten minutes, permitting economical analysis for a large number of samples, (3) Several elements can be analyzed simultaneously, with an almost constant sensitivity for a variety of elements ranging from aluminum to heavy metals, (4) Analysis can be performed nondestructively without a chemical process, and (5) The use of microbeam can provide data on the distribution of elements with a resolution of several μm. Software for analysis is developed to allow the modified equipment to perform peak search, background fitting, and identification and determination of peaks. A study is now being conducted to examine the performance of the equipment for PIXE analysis of thin samples. Satisfactory results have been obtained. The analysis time, excluding the background correction, is 5-10 min. (Nogami, K.)

  10. An expert image analysis system for chromosome analysis application

    International Nuclear Information System (INIS)

    Wu, Q.; Suetens, P.; Oosterlinck, A.; Van den Berghe, H.

    1987-01-01

    This paper reports a recent study on applying a knowledge-based system approach as a new attempt to solve the problem of chromosome classification. Based on this study, a theoretical framework for an expert image analysis system is proposed. In this scheme, chromosome classification is carried out under a hypothesize-and-verify paradigm by integrating a rule-based component, in which the expertise of chromosome karyotyping is formulated, with an existing image analysis system that uses conventional pattern recognition techniques. Results from the existing system can be used to generate hypotheses, and with the rule-based verification and modification procedures, improvement of the classification performance can be expected.
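
    A minimal sketch of the hypothesize-and-verify control loop described above, with an invented feature set and a single stand-in rule (the actual rule base and classifier of the system are not given in the record):

```python
from dataclasses import dataclass

@dataclass
class Hypothesis:
    label: str    # a chromosome class, e.g. "1".."22", "X", "Y"
    score: float  # confidence from the conventional pattern recognizer

def verify(hyp, features, rules):
    """Apply expert rules; a rule may veto a hypothesis or adjust its score."""
    score = hyp.score
    for rule in rules:
        accepted, adjustment = rule(hyp.label, features)
        if not accepted:
            return None          # rule-based rejection
        score += adjustment
    return Hypothesis(hyp.label, score)

def length_rule(label, feats):
    # stand-in rule: a long chromosome cannot belong to the smallest groups
    if feats["relative_length"] > 0.8 and label in {"21", "22", "Y"}:
        return False, 0.0
    return True, 0.05

features = {"relative_length": 0.85, "centromeric_index": 0.45}
hypotheses = [Hypothesis("1", 0.60), Hypothesis("21", 0.55)]
verified = [v for h in hypotheses if (v := verify(h, features, [length_rule]))]
print(max(verified, key=lambda h: h.score))
```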

  11. PLACE OF PRODUCTION COSTS SYSTEM ANALYSIS IN SYSTEM ANALYSIS

    Directory of Open Access Journals (Sweden)

    Mariia CHEREDNYCHENKO

    2016-12-01

    Full Text Available Current economic conditions require the development and implementation of an adequate production costs management system, one that would ensure steady growth of profit and production volumes under high competition and constantly increasing input prices and tariffs. Such a management system must be based on an integrated production costs system analysis (PCSA), which would supply all operating-costs management subsystems with the information needed to design and make better management decisions. A systematic analysis provides more opportunities for knowledge, creating the conditions for an integral understanding of the object as a mechanism of interconnected elements, each of which has its own defined and limited objectives and its own relationship with the environment.

  12. [Conversation analysis for improving nursing communication].

    Science.gov (United States)

    Yi, Myungsun

    2007-08-01

    Nursing communication has become more important than ever before, because the quality of nursing services largely depends on the quality of communication in a very competitive health care environment. This article introduces ways to improve nursing communication using conversation analysis. It is a review study of conversation analysis, critically examining previous studies in nursing communication and interpersonal relationships. The study provides the theoretical background and basic assumptions of conversation analysis, which was influenced by ethnomethodology, phenomenology, and sociolinguistics. In addition, the characteristics and analysis methods of conversation analysis are illustrated in detail. Lastly, it is shown how conversation analysis can help improve communication, by examining research that uses conversation analysis not only for ordinary conversations but also for extraordinary or difficult conversations, such as those between patients with dementia and their professional nurses. Conversation analysis can help improve nursing communication by providing various structures and patterns as well as prototypes of conversation, and by suggesting specific problems and problem-solving strategies in communication.

  13. Exploratory Analysis in Learning Analytics

    Science.gov (United States)

    Gibson, David; de Freitas, Sara

    2016-01-01

    This article summarizes the methods, observations, challenges and implications for exploratory analysis drawn from two learning analytics research projects. The cases include an analysis of a games-based virtual performance assessment and an analysis of data from 52,000 students over a 5-year period at a large Australian university. The complex…

  14. Planar Parametrization in Isogeometric Analysis

    DEFF Research Database (Denmark)

    Gravesen, Jens; Evgrafov, Anton; Nguyen, Dang-Manh

    2012-01-01

    Before isogeometric analysis can be applied to solving a partial differential equation posed over some physical domain, one needs to construct a valid parametrization of the geometry. The accuracy of the analysis is affected by the quality of the parametrization. The challenge of computing...... and maintaining a valid geometry parametrization is particularly relevant in applications of isogeometric analysis to shape optimization, where the geometry varies from one optimization iteration to another. We propose a general framework for handling the geometry parametrization in isogeometric analysis and shape...... are suitable for our framework. The non-linear methods we consider are based on solving a constrained optimization problem numerically, and are divided into two classes, geometry-oriented methods and analysis-oriented methods. Their performance is illustrated through a few numerical examples....
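
    To make "valid parametrization" concrete: the geometry map must have a strictly positive Jacobian determinant throughout the parameter domain. A hedged sketch for the simplest case, a bilinear quad patch (the paper's spline parametrizations and optimization methods are not reproduced here):

```python
import numpy as np

def bilinear(P, u, v):
    """Bilinear map of a quad with corners P[0..3], counterclockwise."""
    return (1-u)*(1-v)*P[0] + u*(1-v)*P[1] + u*v*P[2] + (1-u)*v*P[3]

def jacobian_det(P, u, v):
    dxu = -(1-v)*P[0] + (1-v)*P[1] + v*P[2] - v*P[3]   # d(map)/du
    dxv = -(1-u)*P[0] - u*P[1] + u*P[2] + (1-u)*P[3]   # d(map)/dv
    return dxu[0]*dxv[1] - dxu[1]*dxv[0]

# invented quad; sample the determinant over the parameter square
P = np.array([[0.0, 0.0], [1.0, 0.0], [1.2, 1.0], [-0.5, 0.8]])
uu = np.linspace(0.0, 1.0, 21)
dets = [jacobian_det(P, u, v) for u in uu for v in uu]
print("valid parametrization:", min(dets) > 0.0)
```

    For a bilinear patch the determinant is itself bilinear in (u, v), so checking the four corners would suffice; the grid sampling above is the generic check that also applies to higher-order patches.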

  15. Ultrastructural Analysis of Urinary Stones by Microfocus Computed Tomography and Comparison with Chemical Analysis

    Directory of Open Access Journals (Sweden)

    Tolga Karakan

    2016-06-01

    Full Text Available Objective: To investigate the ultrastructure of urinary system stones using microfocus computed tomography (MCT), which permits non-destructive analysis, and to compare the results with wet chemical analysis. Methods: This study was carried out at the Ankara Training and Research Hospital. Renal stones removed from 30 patients during percutaneous nephrolithotomy (PNL) surgery were included in the study. The stones were blindly evaluated by the specialists with MCT and chemical analysis. Results: The comparison of the stone components between chemical analysis and MCT showed that the rate of consistency was very low (p<0.05). It was also seen that there was no significant relation with the 3D structure being heterogeneous or homogeneous. Conclusion: Stone analysis with MCT is a time-consuming and costly method. The method is useful for understanding the mechanisms of stone formation and is an important guide for developing future treatment modalities.

  16. Fatigue behavior of a bolted assembly - a comparison between numerical analysis and experimental analysis

    International Nuclear Information System (INIS)

    Bosser, M.; Vagner, J.

    1987-01-01

    The fatigue behavior of a bolted assembly can be analysed either by fatigue tests or by computing the stress variations and using a fatigue curve. This paper presents the fatigue analysis of a stud-bolt and stud-flange of a steam generator manway carried out with the two methods. The experimental analysis is performed for various levels of load, according to the recommendations of the ASME Code Section III, Appendix II. The numerical analysis of the stresses is based on the results of a finite element analysis performed with the program SYSTUS. The maximum stresses are obtained in the first bolt threads. Using these stresses, the allowable number of cycles for each level of loading analysed is obtained from fatigue curves, as defined in Appendix I of Section III of the ASME Code. The analysis underlines that, for each level of load, the purely numerical approach is highly conservative compared to the experimental approach. (orig.)
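
    As a generic illustration of the final step described above (entering a fatigue curve with computed stress amplitudes to obtain allowable cycle counts), here is a sketch with an invented S-N curve and load spectrum; the actual ASME Appendix I curves are not reproduced, and the Miner's-rule accumulation is added for context:

```python
import numpy as np

# made-up S-N curve points: stress amplitude (MPa) vs allowable cycles
S = np.array([600.0, 400.0, 250.0, 150.0])
N = np.array([1e3,   1e4,   1e5,   1e6])

def allowable_cycles(s_amp):
    """Log-log interpolation of the S-N curve (xp must be increasing)."""
    return 10 ** np.interp(np.log10(s_amp), np.log10(S[::-1]), np.log10(N[::-1]))

# Miner's rule cumulative usage for a load spectrum of (amplitude, count)
spectrum = [(500.0, 200), (300.0, 5000), (180.0, 50000)]
usage = sum(n / allowable_cycles(s) for s, n in spectrum)
print(f"cumulative usage factor = {usage:.3f}  (must stay below 1.0)")
```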

  17. Contrast analysis : A tutorial

    NARCIS (Netherlands)

    Haans, A.

    2018-01-01

    Contrast analysis is a relatively simple but effective statistical method for testing theoretical predictions about differences between group means against the empirical data. Despite its advantages, contrast analysis is hardly used to date, perhaps because it is not implemented in a convenient
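
    To make the method concrete (a standard textbook formulation, not taken from the tutorial itself): a contrast is a weighted sum of group means with weights that sum to zero, tested against zero with a t statistic. A minimal sketch with invented data:

```python
import numpy as np
from scipy import stats

groups = [np.array([5.1, 6.0, 5.5, 6.2]),   # e.g. control
          np.array([6.8, 7.1, 6.5, 7.4]),   # treatment A
          np.array([7.9, 8.3, 8.0, 8.6])]   # treatment B
w = np.array([-1.0, 0.5, 0.5])              # contrast: treatments vs control

means = np.array([g.mean() for g in groups])
ns = np.array([len(g) for g in groups])
df = sum(ns) - len(groups)
mse = sum(((g - g.mean()) ** 2).sum() for g in groups) / df  # pooled variance

L = w @ means                                # contrast estimate
se = np.sqrt(mse * np.sum(w ** 2 / ns))      # its standard error
t = L / se
p = 2 * stats.t.sf(abs(t), df)
print(f"L = {L:.3f}, t({df}) = {t:.2f}, p = {p:.4f}")
```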

  18. Towards Cognitive Component Analysis

    DEFF Research Database (Denmark)

    Hansen, Lars Kai; Ahrendt, Peter; Larsen, Jan

    2005-01-01

    Cognitive component analysis (COCA) is here defined as the process of unsupervised grouping of data such that the ensuing group structure is well-aligned with that resulting from human cognitive activity. We have earlier demonstrated that independent components analysis is relevant for representing...

  19. Marketing research cluster analysis

    OpenAIRE

    Marić Nebojša

    2002-01-01

    One area of applications of cluster analysis in marketing is identification of groups of cities and towns with similar demographic profiles. This paper considers main aspects of cluster analysis by an example of clustering 12 cities with the use of Minitab software.
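
    The record's data are not included; the sketch below re-creates the same workflow in Python instead of Minitab, clustering invented city profiles with Ward's method:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

cities = ["A", "B", "C", "D", "E", "F"]
# invented demographic profiles: [population (k), median age, income index]
X = np.array([[120, 34, 1.0], [135, 35, 1.1], [800, 29, 1.6],
              [760, 30, 1.5], [45, 42, 0.8], [50, 41, 0.7]], float)
Xs = (X - X.mean(axis=0)) / X.std(axis=0)        # standardize each variable

Z = linkage(Xs, method="ward")                   # agglomerative clustering
labels = fcluster(Z, t=3, criterion="maxclust")  # cut the tree into 3 clusters
for city, lab in zip(cities, labels):
    print(city, "-> cluster", lab)
```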

  20. Amplitude and Ascoli analysis

    International Nuclear Information System (INIS)

    Hansen, J.D.

    1976-01-01

    This article discusses the partial wave analysis of two, three and four meson systems. The difference between the two approaches, referred to as amplitude and Ascoli analysis is discussed. Some of the results obtained with these methods are shown. (B.R.H.)

  1. Noise and vibration analysis system

    International Nuclear Information System (INIS)

    Johnsen, J.R.; Williams, R.L.

    1985-01-01

    The analysis of noise and vibration data from an operating nuclear plant can provide valuable information that can identify and characterize abnormal conditions. Existing plant monitoring equipment, such as loose parts monitoring systems (LPMS) and neutron flux detectors, may be capable of gathering noise data, but may lack the analytical capability to extract useful meanings hidden in the noise. By analyzing neutron noise signals, the structural motion and integrity of core components can be assessed. Computer analysis makes trending of frequency spectra within a fuel cycle and from one cycle to another a practical means of core internals monitoring. The Babcock and Wilcox Noise and Vibration Analysis System (NVAS) is a powerful, compact system that can automatically perform complex data analysis. The system can acquire, process, and store data, then produce report-quality plots of the important parameters. Software to perform neutron noise analysis and loose parts analysis operates on the same hardware package. Since the system is compact, inexpensive, and easy to operate, it allows utilities to perform more frequent analyses without incurring high costs and provides immediate results.
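
    A hedged sketch of the core signal-processing step such a system performs, estimating a frequency spectrum from a noise record via Welch's method (the NVAS internals are not described in the record; the 8 Hz "structural" peak below is synthetic):

```python
import numpy as np
from scipy.signal import welch

fs = 1000.0                                  # sampling rate, Hz
t = np.arange(0.0, 60.0, 1.0 / fs)
rng = np.random.default_rng(1)
# synthetic noise signal: broadband noise plus an 8 Hz structural resonance
x = rng.normal(0.0, 1.0, t.size) + 0.5 * np.sin(2.0 * np.pi * 8.0 * t)

f, Pxx = welch(x, fs=fs, nperseg=4096)       # averaged periodogram (PSD)
band = (f > 1.0) & (f < 50.0)                # search band of interest
peak = f[band][np.argmax(Pxx[band])]
print(f"dominant resonance near {peak:.1f} Hz")
```

    Trending would then compare such spectra within a fuel cycle and from one cycle to the next.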

  2. Quality Assessment of Urinary Stone Analysis

    DEFF Research Database (Denmark)

    Siener, Roswitha; Buchholz, Noor; Daudon, Michel

    2016-01-01

    After stone removal, accurate analysis of urinary stone composition is the most crucial laboratory diagnostic procedure for the treatment and recurrence prevention in the stone-forming patient. The most common techniques for routine analysis of stones are infrared spectroscopy, X-ray diffraction......, fulfilled the quality requirements. According to the current standard, chemical analysis is considered to be insufficient for stone analysis, whereas infrared spectroscopy or X-ray diffraction is mandatory. However, the poor results of infrared spectroscopy highlight the importance of equipment, reference...... spectra and qualification of the staff for an accurate analysis of stone composition. Regular quality control is essential in carrying out routine stone analysis....

  3. Analysis Streamlining in ATLAS

    CERN Document Server

    Heinrich, Lukas; The ATLAS collaboration

    2018-01-01

    We present recent work within the ATLAS collaboration to centrally provide tools that facilitate analysis management and highly automated, container-based analysis execution, in order both to enable non-experts to benefit from these best practices and to allow the collaboration to track and re-execute analyses independently, e.g. during their review phase. Through integration with the ATLAS GLANCE system, users can request a pre-configured but customizable version control setup, including continuous integration for automated building and testing as well as continuous Linux container image building for software preservation purposes. As analyses typically require many individual steps, analysis workflow pipelines can then be defined using such images and the yadage workflow description language. The integration into the workflow execution service REANA allows the interactive or automated reproduction of the main analysis results by orchestrating a large number of container jobs using Kubernetes. For long-term archival,...

  4. Longitudinal categorical data analysis

    CERN Document Server

    Sutradhar, Brajendra C

    2014-01-01

    This is the first book in longitudinal categorical data analysis with parametric correlation models developed based on dynamic relationships among repeated categorical responses. This book is a natural generalization of the longitudinal binary data analysis to the multinomial data setup with more than two categories. Thus, unlike the existing books on cross-sectional categorical data analysis using log linear models, this book uses multinomial probability models both in cross-sectional and longitudinal setups. A theoretical foundation is provided for the analysis of univariate multinomial responses, by developing models systematically for the cases with no covariates as well as categorical covariates, both in cross-sectional and longitudinal setups. In the longitudinal setup, both stationary and non-stationary covariates are considered. These models have also been extended to the bivariate multinomial setup along with suitable covariates. For the inferences, the book uses the generalized quasi-likelihood as w...

  5. Principles of Fourier analysis

    CERN Document Server

    Howell, Kenneth B

    2001-01-01

    Fourier analysis is one of the most useful and widely employed sets of tools for the engineer, the scientist, and the applied mathematician. As such, students and practitioners in these disciplines need a practical and mathematically solid introduction to its principles. They need straightforward verifications of its results and formulas, and they need clear indications of the limitations of those results and formulas.Principles of Fourier Analysis furnishes all this and more. It provides a comprehensive overview of the mathematical theory of Fourier analysis, including the development of Fourier series, "classical" Fourier transforms, generalized Fourier transforms and analysis, and the discrete theory. Much of the author's development is strikingly different from typical presentations. His approach to defining the classical Fourier transform results in a much cleaner, more coherent theory that leads naturally to a starting point for the generalized theory. He also introduces a new generalized theory based ...
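
    For reference, the classical transform pair the description refers to, in one common convention (the book may adopt another normalization):

```latex
\hat{f}(\omega) = \int_{-\infty}^{\infty} f(t)\, e^{-i\omega t}\, dt,
\qquad
f(t) = \frac{1}{2\pi} \int_{-\infty}^{\infty} \hat{f}(\omega)\, e^{i\omega t}\, d\omega
```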

  6. Perspectives in shape analysis

    CERN Document Server

    Bruckstein, Alfred; Maragos, Petros; Wuhrer, Stefanie

    2016-01-01

    This book presents recent advances in the field of shape analysis. Written by experts in the fields of continuous-scale shape analysis, discrete shape analysis and sparsity, and numerical computing who hail from different communities, it provides a unique view of the topic from a broad range of perspectives. Over the last decade, it has become increasingly affordable to digitize shape information at high resolution. Yet analyzing and processing this data remains challenging because of the large amount of data involved, and because modern applications such as human-computer interaction require real-time processing. Meeting these challenges requires interdisciplinary approaches that combine concepts from a variety of research areas, including numerical computing, differential geometry, deformable shape modeling, sparse data representation, and machine learning. On the algorithmic side, many shape analysis tasks are modeled using partial differential equations, which can be solved using tools from the field of n...

  7. Statistical analysis of medical data using SAS

    CERN Document Server

    Der, Geoff

    2005-01-01

    An Introduction to SAS; Describing and Summarizing Data; Basic Inference; Scatterplots, Correlation, Simple Regression and Smoothing; Analysis of Variance and Covariance; Multiple Regression; Logistic Regression; The Generalized Linear Model; Generalized Additive Models; Nonlinear Regression Models; The Analysis of Longitudinal Data I; The Analysis of Longitudinal Data II: Models for Normal Response Variables; The Analysis of Longitudinal Data III: Non-Normal Response; Survival Analysis; Analysis of Multivariate Data: Principal Components and Cluster Analysis; References.

  8. Time Based Workload Analysis Method for Safety-Related Operator Actions in Safety Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Yun Goo; Oh, Eung Se [Korea Hydro and Nuclear Power Co., Daejeon (Korea, Republic of)

    2016-05-15

    During a design basis event, the safety systems perform safety functions to mitigate the event. Most safety systems are actuated automatically; however, some manual operator actions are also needed for plant safety. These operator actions are classified as important human actions in human factors engineering design. Human factors engineering analysis and evaluation are needed for these important human actions to assure that operators can successfully perform their tasks for plant safety and operational goals. Workload analysis is one of the required analyses for the important human actions.

  9. Time Based Workload Analysis Method for Safety-Related Operator Actions in Safety Analysis

    International Nuclear Information System (INIS)

    Kim, Yun Goo; Oh, Eung Se

    2016-01-01

    During a design basis event, the safety systems perform safety functions to mitigate the event. Most safety systems are actuated automatically; however, some manual operator actions are also needed for plant safety. These operator actions are classified as important human actions in human factors engineering design. Human factors engineering analysis and evaluation are needed for these important human actions to assure that operators can successfully perform their tasks for plant safety and operational goals. Workload analysis is one of the required analyses for the important human actions.

  10. An overview of the design and analysis of simulation experiments for sensitivity analysis

    NARCIS (Netherlands)

    Kleijnen, J.P.C.

    2005-01-01

    Sensitivity analysis may serve validation, optimization, and risk analysis of simulation models. This review surveys 'classic' and 'modern' designs for experiments with simulation models. Classic designs were developed for real, non-simulated systems in agriculture, engineering, etc. These designs

  11. The integration methods of fuzzy fault mode and effect analysis and fault tree analysis for risk analysis of yogurt production

    Science.gov (United States)

    Aprilia, Ayu Rizky; Santoso, Imam; Ekasari, Dhita Murita

    2017-05-01

    Yogurt is a milk-based product with beneficial effects for health. The yogurt production process is very susceptible to failure because it involves bacteria and fermentation. For an industry, these risks may cause harm and have a negative impact. For a product to be successful and profitable, the risks that may occur during the production process must be analyzed. Risk analysis can identify the risks in detail, prevent them, and determine their handling, so that the risks can be minimized. Therefore, this study analyzes the risks of the production process with a case study in CV.XYZ. The methods used in this research are Fuzzy Failure Mode and Effect Analysis (fuzzy FMEA) and Fault Tree Analysis (FTA). The results showed that there are 6 risks arising from equipment, raw material, and process variables. These include the critical risk, the lack of an aseptic process, more specifically damage to the yogurt starter due to contamination by fungus or other bacteria, and a lack of sanitation equipment. The quantitative FTA showed that the highest probability is that of the lack of an aseptic process, with a risk of 3.902%. The recommendations for improvement include establishing SOPs (Standard Operating Procedures) covering the process, workers, and environment, controlling the yogurt starter, and improving production planning and sanitation equipment using hot water immersion.
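
    A hedged sketch of the quantitative FTA step (top-event probability from independent basic events through OR gates); the event names and probabilities below are invented and do not reproduce the CV.XYZ figures:

```python
from functools import reduce

def p_and(*ps):
    """AND gate: all independent events occur."""
    return reduce(lambda a, b: a * b, ps, 1.0)

def p_or(*ps):
    """OR gate: at least one independent event occurs."""
    return 1.0 - reduce(lambda a, b: a * (1.0 - b), ps, 1.0)

# invented basic-event probabilities
p_fungus     = 0.015   # starter contaminated by fungus
p_bacteria   = 0.020   # starter contaminated by other bacteria
p_poor_sanit = 0.010   # sanitation equipment inadequate

p_starter_damage = p_or(p_fungus, p_bacteria)
p_no_aseptic     = p_or(p_starter_damage, p_poor_sanit)
print(f"P(lack of aseptic process) = {p_no_aseptic:.4%}")
```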

  12. Stress analysis and torsional buckling analysis of U-shaped bellows

    International Nuclear Information System (INIS)

    Watanabe, Osamu; Ohtsubo, Hideomi.

    1986-01-01

    This paper presents an analysis of the elastic stress and torsional buckling of U-shaped bellows using ring elements. The expansion joint is considered to be composed of two toroidal sections and inner-connecting annular plates. The general thin shell theory is employed to derive the strain-displacement relations of shells and plates, valid for any loading. Numerical examples under internal pressure or axial loading are described and compared with the results of existing appropriate analyses. The fundamental aspects of torsional buckling, which have not been studied previously, will also be investigated. (author)

  13. An Overview of the Design and Analysis of Simulation Experiments for Sensitivity Analysis

    NARCIS (Netherlands)

    Kleijnen, J.P.C.

    2004-01-01

    Sensitivity analysis may serve validation, optimization, and risk analysis of simulation models. This review surveys classic and modern designs for experiments with simulation models. Classic designs were developed for real, non-simulated systems in agriculture, engineering, etc. These designs assume a

  14. Dried blood spot analysis of creatinine with LC-MS/MS in addition to immunosuppressants analysis

    NARCIS (Netherlands)

    Koster, Remco A.; Greijdanus, Ben; Alffenaar, Jan-Willem C.; Touw, Daan J.

    In order to monitor creatinine levels or to adjust the dosage of renally excreted or nephrotoxic drugs, the analysis of creatinine in dried blood spots (DBS) could be a useful addition to DBS analysis. We developed a LC-MS/MS method for the analysis of creatinine in the same DBS extract that was

  15. Thermal energy systems design and analysis

    CERN Document Server

    Penoncello, Steven G

    2015-01-01

    Introduction; Thermal Energy Systems Design and Analysis; Software; Thermal Energy System Topics; Units and Unit Systems; Thermophysical Properties; Engineering Design; Engineering Economics: Introduction; Common Engineering Economics Nomenclature; Economic Analysis Tool: The Cash Flow Diagram; Time Value of Money; Time Value of Money Examples; Using Software to Calculate Interest Factors; Economic Decision Making; Depreciation and Taxes; Problems; Analysis of Thermal Energy Systems: Introduction; Nomenclature; Thermophysical Properties of Substances; Suggested Thermal Energy Systems Analysis Procedure; Conserved and Balanced Quantities; Conservation of Mass; Conservation of Energy (The First Law of Thermodynamics); Entropy Balance (The Second Law of Thermodynamics); Exergy Balance: The Combined Law; Energy and Exergy Analysis of Thermal Energy Cycles; Detailed Analysis of Thermal Energy Cycles; Problems; Fluid Transport in Thermal Energy Systems: Introduction; Piping and Tubing Standards; Fluid Flow Fundamentals; Valves and Fittings; Design and Analysis of Pipe Networks; Economi...

  16. A background risk analysis. Vol. 1

    International Nuclear Information System (INIS)

    Taylor, J.R.

    1979-01-01

    This four-volume report gives a background of ideas, principles, and examples which might be of use in developing practical methods for risk analysis. Some of the risk analysis techniques described are somewhat experimental. The report is written in an introductory style, but where some point needs further justification or evaluation, this is given in the form of a chapter appendix. In this way, it is hoped that the report can serve two purposes: as a basis for starting risk analysis work and as a basis for discussing the effectiveness of risk analysis procedures. The report should be seen as a preliminary stage, prior to a program of industrial trials of risk analysis methods. Vol. 1 contains a short history of risk analysis, and chapters on risk, failures, errors and accidents, and general procedures for risk analysis. (BP)

  17. Marketing research cluster analysis

    Directory of Open Access Journals (Sweden)

    Marić Nebojša

    2002-01-01

    Full Text Available One area of applications of cluster analysis in marketing is identification of groups of cities and towns with similar demographic profiles. This paper considers main aspects of cluster analysis by an example of clustering 12 cities with the use of Minitab software.

  18. Sensitivity analysis using probability bounding

    International Nuclear Information System (INIS)

    Ferson, Scott; Troy Tucker, W.

    2006-01-01

    Probability bounds analysis (PBA) provides analysts a convenient means to characterize the neighborhood of possible results that would be obtained from plausible alternative inputs in probabilistic calculations. We show the relationship between PBA and the methods of interval analysis and probabilistic uncertainty analysis from which it is jointly derived, and indicate how the method can be used to assess the quality of probabilistic models such as those developed in Monte Carlo simulations for risk analyses. We also illustrate how a sensitivity analysis can be conducted within a PBA by pinching inputs to precise distributions or real values
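
    A minimal sketch of the "pinching" idea with plain interval arithmetic standing in for full probability boxes (real PBA carries distribution bounds, not just intervals); the model and numbers are invented:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Interval:
    lo: float
    hi: float
    def __add__(self, o):
        return Interval(self.lo + o.lo, self.hi + o.hi)
    def __mul__(self, o):
        c = [self.lo*o.lo, self.lo*o.hi, self.hi*o.lo, self.hi*o.hi]
        return Interval(min(c), max(c))
    def width(self):
        return self.hi - self.lo

def model(a, b, c):
    return a * b + c   # stand-in risk expression

a, b, c = Interval(1.0, 2.0), Interval(0.5, 1.5), Interval(0.0, 0.3)
base = model(a, b, c)
# pinch each input to its midpoint and watch the output interval tighten
for name, x in [("a", a), ("b", b), ("c", c)]:
    mid = Interval((x.lo + x.hi) / 2, (x.lo + x.hi) / 2)
    pinched = model(mid if name == "a" else a,
                    mid if name == "b" else b,
                    mid if name == "c" else c)
    reduction = 1.0 - pinched.width() / base.width()
    print(f"pinching {name}: output width reduced by {reduction:.0%}")
```

    The input whose pinching shrinks the output interval the most is the one whose uncertainty drives the result.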

  19. How to think about analysis

    CERN Document Server

    Alcock, Lara

    2014-01-01

    Analysis (sometimes called Real Analysis or Advanced Calculus) is a core subject in most undergraduate mathematics degrees. It is elegant, clever and rewarding to learn, but it is hard. Even the best students find it challenging, and those who are unprepared often find it incomprehensible at first. This book aims to ensure that no student need be unprepared. It is not like other Analysis books. It is not a textbook containing standard content. Rather, it is designed to be read before arriving at university and/or before starting an Analysis course, or as a companion text once a course is begun.

  20. Analysis of monazite samples

    International Nuclear Information System (INIS)

    Kartiwa Sumadi; Yayah Rohayati

    1996-01-01

    The 'monazit' analytical program has been set up for routine analysis of rare earth elements in monazite and xenotime mineral samples. The total relative error of the analysis is very low, less than 2.50%, and the reproducibility of the counting statistics and the stability of the instrument were excellent. The precision and accuracy of the analytical program are very good, with maximum relative percentages of 5.22% and 1.61%, respectively. The mineral compositions of the 30 monazite samples have also been calculated from their chemical constituents, and the results were compared to grain counting microscopic analysis.