WorldWideScience

Sample records for homatropine hydrobromide analysis

  1. Rectal absorption of homatropine [14C] methylbromide in the rat

    International Nuclear Information System (INIS)

    Cramer, M.B.; Cates, L.A.; Clarke, D.E.

    1978-01-01

    Homatropine [14C]methylbromide (HMB-14C) was administered to rats by intramuscular injection, oral gavage and rectal suppository. Plasma concentrations of 14C were measured over the subsequent 12 h. Peak plasma concentrations were higher and achieved more rapidly after rectal administration than by other routes, whether HMB-14C was administered in a water-soluble suppository base or in aqueous solution. Twelve hours after the suppositories were inserted and retained, 28% of the 14C had been excreted in the urine, while 56% remained in the large intestine. Unlabelled HMB, given in rectal suppositories to anaesthetized rats, caused prompt blockade of the effects of vagal stimulation on pulse rate and of intravenous acetylcholine on blood pressure. These results confirm the rapid rectal absorption of the drug. (author)

  2. Development of analytical method for the determination of carbinoxamine maleate, dextromethorphan hydrobromide and pseudoephedrine hydrochloride by HPLC

    International Nuclear Information System (INIS)

    Mahfoud, J.

    2007-01-01

    A simple and accurate HPLC method was developed for the analysis of carbinoxamine maleate, dextromethorphan hydrobromide and pseudoephedrine hydrochloride content in pure form and in pharmaceutical preparations. Analysis was conducted on a silica column (6 μm) with a mobile phase of ethanol - ammonium acetate (0.05 M) in the ratio 85:15, at a detection wavelength of 276 nm and a flow rate of 1 ml/min. Results were linear (correlation coefficient R > 0.9996) over the studied concentration ranges for the active materials. The relative standard deviations (n=6) of the intra- and interday assays were 0.931% and 1.527% for carbinoxamine maleate, 0.717% and 1.058% for dextromethorphan hydrobromide, and 0.309% and 0.891% for pseudoephedrine hydrochloride, respectively. The method proved to be easy, precise and economical, and is useful for quality control of industrial pharmaceutical samples. (author)

  3. The efficacy of hyoscine hydrobromide in reducing side-effects induced during immersion in virtual reality.

    Science.gov (United States)

    Regan, E C; Ramsey, A D

    1996-03-01

    Regan and Price (1994) investigated the frequency of occurrence and severity of side-effects of using an immersion virtual reality system in 150 subjects: 61% of the subjects reported symptoms of malaise at some point during a 20-min immersion and 10-min post-immersion period. This paper describes a double-blind placebo-controlled study that investigated whether 300 micrograms of hyoscine (scopolamine) hydrobromide administered to subjects prior to immersion in virtual reality was effective in reducing side-effects experienced during immersion. It was hypothesized that the hyoscine hydrobromide would cause a significant reduction in reported symptoms. We administered 300 micrograms of hyoscine hydrobromide to 19 subjects, and a placebo compound to 20 subjects, 40 min prior to a 20-min immersion in VR. Data on malaise were collected using a simulator sickness questionnaire and a malaise scale. A 2 x 2 chi-square analysis comparing the numbers of subjects reporting no symptoms on the malaise scale with those reporting some symptoms in the placebo and hyoscine conditions showed the difference between the two groups to be statistically significant at the 0.01 level (chi-square = 7.392 with 1 df, p = 0.007). This difference was clearly in the direction of fewer symptoms being reported in the hyoscine condition. The results of the study showed that hyoscine was effective in reducing symptoms commonly observed during immersion in virtual reality.
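The 2 x 2 chi-square statistic reported above (7.392 with 1 df) follows the standard Pearson formula for a fourfold table. A minimal sketch, using hypothetical cell counts since the abstract does not give the raw table:

```python
def chi2_2x2(a, b, c, d):
    """Pearson chi-square statistic (no continuity correction) for the
    2x2 table [[a, b], [c, d]]."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# Hypothetical counts (NOT the paper's data, which the abstract omits):
# rows = hyoscine / placebo; columns = no symptoms / some symptoms
stat = chi2_2x2(12, 7, 4, 16)
print(f"chi-square = {stat:.3f}")  # compare with the 1-df critical value 3.841 (p = 0.05)
```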

  4. Development and validation of dissolution study of sustained release dextromethorphan hydrobromide tablets.

    Science.gov (United States)

    Rajan, Sekar; Colaco, Socorrina; Ramesh, N; Meyyanathan, Subramania Nainar; Elango, K

    2014-02-01

    This study describes the development and validation of a dissolution test for sustained release dextromethorphan hydrobromide tablets using an HPLC method. Chromatographic separation was achieved on a C18 column utilizing 0.5% triethylamine (pH 7.5) and acetonitrile in the ratio of 50:50. The detection wavelength was 280 nm. Suitable conditions were decided after testing sink conditions, dissolution medium and agitation intensity, and the best dissolution conditions tested were applied to appraise the dissolution profiles of dextromethorphan hydrobromide. The method was validated and the response was found to be linear in the drug concentration range of 10-80 microg mL(-1). The method was established to have sufficient intermediate precision, as similar separation was achieved on another instrument handled by different operators. Mean recovery was 101.82%. Intra-run precisions (% RSD) for three different concentrations were 1.23, 1.10, 0.72 and 1.57, 1.69, 0.95, and inter-run precisions were 0.83, 1.36 and 1.57%, respectively. The method was successfully applied to the dissolution study of the developed dextromethorphan hydrobromide tablets.

  5. Preparation of a β-Cyclodextrin-Based Open-Tubular Capillary Electrochromatography Column and Application for Enantioseparations of Ten Basic Drugs.

    Directory of Open Access Journals (Sweden)

    Linlin Fang

    Full Text Available An open-tubular capillary electrochromatography column was prepared by chemically immobilizing β-cyclodextrin-modified gold nanoparticles onto the capillary surface after prederivatization with (3-mercaptopropyl)trimethoxysilane. The synthesized nanoparticles and the prepared column were characterized by transmission electron microscopy, scanning electron microscopy, infrared spectroscopy and ultraviolet-visible spectroscopy. When the column was employed as the chiral stationary phase alone, no enantioselectivity was observed for the ten model basic drugs, so β-cyclodextrin was added to the background electrolyte as a chiral additive in the expectation that a synergistic effect would yield a better separation. Indeed, a significant improvement in enantioselectivity was obtained for the ten pairs of drug enantiomers. The effects of β-cyclodextrin concentration and background electrolyte pH on the chiral separation were then investigated. With the developed separation mode, all the enantiomers (except for venlafaxine) were baseline separated, with resolutions of 4.49, 1.68, 1.88, 1.57, 2.52, 2.33, 3.24, 1.63 and 3.90 for zopiclone, chlorphenamine maleate, brompheniramine maleate, dioxopromethazine hydrochloride, carvedilol, homatropine hydrobromide, homatropine methylbromide, sibutramine hydrochloride and terbutaline sulfate, respectively. The possible separation mechanism involved is also discussed.
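The resolutions quoted above follow the usual chromatographic definition Rs = 2(t2 - t1)/(w1 + w2), with Rs >= 1.5 conventionally taken as baseline separation. A small illustration with invented retention times and peak widths:

```python
def resolution(t1, w1, t2, w2):
    """Chromatographic resolution Rs = 2*(t2 - t1)/(w1 + w2) from
    retention times and baseline peak widths (same time units)."""
    return 2 * (t2 - t1) / (w1 + w2)

# Invented retention times (min) and baseline widths for one enantiomer pair
rs = resolution(t1=10.2, w1=0.50, t2=11.4, w2=0.55)
print(f"Rs = {rs:.2f}")  # Rs >= 1.5 indicates baseline separation
```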

  6. Neurological manifestations in the intoxication of infants by dimethicone plus homatropine: report of 6 cases

    Directory of Open Access Journals (Sweden)

    Vicente José Assencio-Ferreira

    2001-06-01

    Full Text Available OBJECTIVE: to warn that use of the dimethicone/homatropine combination (Espasmo Luftal®) in newborns and infants up to two months of age may cause transitory extrapyramidal dysfunctional episodes. METHOD: report of 6 infants less than 2 months old, in daily use of the dimethicone/homatropine combination, who presented acute symptoms typical of basal ganglia dysfunction, characterized by repeated crises of short duration with tonic backward deviation of the head (opisthotonos), upward deviation of the eyes with fixed gaze and an expression of terror, sustained extensor hypertonia of the four limbs, and crying and/or emission of guttural sounds. RESULTS: the extrapyramidal symptoms disappeared (and did not return) after interruption of the dimethicone/homatropine combination. No abnormalities were found on neurological examination, EEG or blood tests. CONCLUSIONS: the dimethicone/homatropine combination can cause, in infants less than 2 months old, a dysfunctional disorder of the basal ganglia. It is important to differentiate it from generalized epileptic seizures, in order to avoid the erroneous use of antiepileptic drugs.

  7. Utility of eosin Y as a complexing reagent for the determination of citalopram hydrobromide in commercial dosage forms by fluorescence spectrophotometry.

    Science.gov (United States)

    Azmi, Syed Najmul Hejaz; Al-Fazari, Ahlam; Al-Badaei, Munira; Al-Mahrazi, Ruqiya

    2015-12-01

    An accurate, selective and sensitive spectrofluorimetric method was developed for the determination of citalopram hydrobromide in commercial dosage forms. The method was based on the formation of a fluorescent ion-pair complex between citalopram hydrobromide and eosin Y in the presence of a disodium hydrogen phosphate/citric acid buffer solution of pH 3.4, the complex being extractable into dichloromethane. The extracted complex showed fluorescence intensity at λem = 554 nm after excitation at 259 nm. The calibration curve was linear over the concentration range 2.0-26.0 µg/mL. Under optimized experimental conditions, the proposed method was validated as per ICH guidelines. The effect of common excipients used as additives was tested and the tolerance limit calculated. The limit of detection for the proposed method was 0.121 μg/mL. The proposed method was successfully applied to the determination of citalopram hydrobromide in commercial dosage forms. The results were compared with the reference RP-HPLC method. Copyright © 2015 John Wiley & Sons, Ltd.
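The detection limit quoted above is conventionally estimated from the ICH Q2 relations LOD = 3.3σ/S and LOQ = 10σ/S, where σ is the residual standard deviation of the response and S is the calibration slope. A sketch with hypothetical values (not taken from the paper):

```python
def ich_limits(sigma, slope):
    """ICH Q2 estimates: LOD = 3.3*sigma/S and LOQ = 10*sigma/S, where
    sigma is the residual SD of the response and S the calibration slope."""
    return 3.3 * sigma / slope, 10 * sigma / slope

# Hypothetical fluorimetric calibration figures
lod, loq = ich_limits(sigma=1.47, slope=40.0)
print(f"LOD = {lod:.3f} ug/mL, LOQ = {loq:.3f} ug/mL")
```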

  8. Research of triamcinolone acetonide with compound anisodine hydrobromide for mild central retinal vein occlusion in early stage

    Directory of Open Access Journals (Sweden)

    Jun Fan

    2016-03-01

    Full Text Available AIM: To explore the clinical significance of triamcinolone acetonide combined with compound anisodine hydrobromide injection for the treatment of mild (non-ischemic) central retinal vein occlusion (CRVO) in the early stage. METHODS: One hundred and sixteen eyes of 116 patients with non-ischemic CRVO in the early stage were randomly divided, by the completely random data method, into four groups (A, B, C and D) of 29 eyes each. Group A received no treatment. Group B was given compound anisodine hydrobromide by subcutaneous injection beside the superficial temporal artery of the eye. Group C was injected with triamcinolone acetonide beside the eyeball, and Group D was given triamcinolone acetonide combined with compound anisodine hydrobromide injection. In each group, we observed and recorded the best corrected visual acuity (BCVA) using the ETDRS chart, bleeding, optical coherence tomography (OCT) scanning for central macular thickness (CMT), and fundus fluorescence angiography (FFA) imaging for the possibility of ischemic CRVO at 1, 2, 4, 8 and 12wk respectively. The total curative effect after 3mo was compared among the groups. RESULTS: After 12 weeks' treatment, the mean BCVA was lower and the mean CMT higher in group A than before the treatment. The mean BCVA increased and the mean CMT decreased in groups B, C and D after 3mo of treatment. Comparing group D with the other groups, the changes in BCVA and CMT were statistically significant (P<0.05), while the differences among the remaining groups were not (P>0.05). Ischemic CRVO was found in 8 cases of group A, 6 cases of group B, 5 cases of group C, and 2 cases of group D, and the difference was not statistically significant (χ2=4.361; P=0.225). Flame-shaped bleeding was found in 14 cases of group A, 7 cases of group B, 9 cases of group C and 4 cases of group D, and the difference was statistically significant (χ2=8.821; P=0.032). CONCLUSION: The combination of triamcinolone acetonide and compound anisodine hydrobromide

  9. Canagliflozin prevents scopolamine-induced memory impairment in rats: Comparison with galantamine hydrobromide action.

    Science.gov (United States)

    Arafa, Nadia M S; Ali, Elham H A; Hassan, Mohamed Kamel

    2017-11-01

    Canagliflozin (CAN) is a sodium-glucose co-transporter 2 (SGLT2) inhibitor indicated to improve glycemic control in adults with type 2 diabetes mellitus. There is little information about its effect on the cholinergic system, a proposed mechanism for the memory improvement produced by SGLT2 drugs. This study aimed to estimate the effect of two weeks of CAN treatment, as compared to galantamine (GAL), on scopolamine hydrobromide (SCO)-induced memory dysfunction in experimental rats. Animals were divided into six groups: control (CON), CAN, GAL, SCO, SCO + CAN and SCO + GAL. Results indicated a significant decrease in body weights of the CAN groups as compared to control values. Moreover, in the SCO + CAN and SCO + GAL groups the number of arm entries and the number of correct alternations in the Y-maze task increased and performance in the water maze task improved; acetylcholinesterase (AChE) activities decreased significantly, while monoamine levels significantly increased, compared with the SCO group values. Changes in acetylcholine M1 receptor (M1 mAChR) levels were also recorded in the SCO + CAN and SCO + GAL groups in comparison with the SCO group. The study suggested that canagliflozin might improve memory dysfunction induced by scopolamine hydrobromide via the cholinergic and monoamine systems. Copyright © 2017 Elsevier B.V. All rights reserved.

  10. The effect of citalopram hydrobromide on 5-HT2A receptors in the impulsive-aggressive dog, as measured with 123I-5-I-R91150 SPECT

    International Nuclear Information System (INIS)

    Peremans, K.; Hoybergs, Y.; Gielen, I.; Audenaert, K.; Vervaet, M.; Heeringen, C. van; Otte, A.; Goethals, I.; Dierckx, R.; Blankaert, P.

    2005-01-01

    Involvement of the serotonergic system in impulsive aggression has been demonstrated in both human and animal studies. The purpose of the present study was to investigate the effect of citalopram hydrobromide (a selective serotonin re-uptake inhibitor) on the 5-HT2A receptor and brain perfusion in impulsive-aggressive dogs by means of single-photon emission computed tomography. The binding index of the radioligand 123I-5-I-R91150 was measured before and after treatment with citalopram hydrobromide in nine impulsive-aggressive dogs. Regional perfusion was measured with 99mTc-ethyl cysteinate dimer (ECD). Behaviour was assessed before treatment and again after 6 weeks of treatment. A correlation was found between decreased binding and behavioural improvement in eight out of nine dogs. The 5-HT2A receptor binding index was significantly reduced after citalopram hydrobromide treatment in all cortical regions but not in the subcortical area. None of the dogs displayed alterations in perfusion on the post-treatment scans. This study supports previous findings regarding the involvement of the serotonergic system in impulsive aggression in dogs in general. More specifically, the effect of treatment on the 5-HT2A receptor binding index could be demonstrated, and the decreased binding index correlated with behavioural improvement. (orig.)

  11. The effect of citalopram hydrobromide on 5-HT2A receptors in the impulsive-aggressive dog, as measured with 123I-5-I-R91150 SPECT

    Energy Technology Data Exchange (ETDEWEB)

    Peremans, K.; Hoybergs, Y.; Gielen, I. [Ghent University, Department of Medical Imaging, Faculty of Veterinary Medicine, Merelbeke (Belgium); Audenaert, K.; Vervaet, M.; Heeringen, C. van [Ghent University, Department of Psychiatry and Medical Psychology, Gent (Belgium); Otte, A.; Goethals, I.; Dierckx, R. [Ghent University Hospital, Division of Nuclear Medicine, Gent (Belgium); Blankaert, P. [Ghent University, Laboratory of Radiopharmacy, Gent (Belgium)

    2005-06-01

    Involvement of the serotonergic system in impulsive aggression has been demonstrated in both human and animal studies. The purpose of the present study was to investigate the effect of citalopram hydrobromide (a selective serotonin re-uptake inhibitor) on the 5-HT2A receptor and brain perfusion in impulsive-aggressive dogs by means of single-photon emission computed tomography. The binding index of the radioligand 123I-5-I-R91150 was measured before and after treatment with citalopram hydrobromide in nine impulsive-aggressive dogs. Regional perfusion was measured with 99mTc-ethyl cysteinate dimer (ECD). Behaviour was assessed before treatment and again after 6 weeks of treatment. A correlation was found between decreased binding and behavioural improvement in eight out of nine dogs. The 5-HT2A receptor binding index was significantly reduced after citalopram hydrobromide treatment in all cortical regions but not in the subcortical area. None of the dogs displayed alterations in perfusion on the post-treatment scans. This study supports previous findings regarding the involvement of the serotonergic system in impulsive aggression in dogs in general. More specifically, the effect of treatment on the 5-HT2A receptor binding index could be demonstrated, and the decreased binding index correlated with behavioural improvement. (orig.)

  12. PVC membrane, coated-wire, and carbon-paste ion-selective electrodes for potentiometric determination of galantamine hydrobromide in physiological fluids.

    Science.gov (United States)

    Abdel-Haleem, Fatehy M; Saad, Mohamed; Barhoum, Ahmed; Bechelany, Mikhael; Rizk, Mahmoud S

    2018-08-01

    We report on highly sensitive ion-selective electrodes (ISEs) for the potentiometric determination of galantamine hydrobromide (GB) in physiological fluids. GB was selected for this study due to its medical importance in treating Alzheimer's disease. Three different types of ISEs were investigated: a PVC membrane electrode (PVCE), a carbon-paste electrode (CPE), and a coated-wire electrode (CWE). In the construction of these electrodes, the galantaminium-reineckate (GR) ion-pair was used as the sensing species for GB in solution. Modified carbon-paste electrodes (MCPEs) were prepared using graphene oxide (MCPE-GO) and sodium tetrakis(trifluoromethyl)phenyl borate (MCPE-STFPB) as ion-exchanger. The modified CPEs (MCPE-GO and MCPE-STFPB) show improved performance in terms of Nernstian slope, selectivity, response time, and response stability compared to the unmodified CPE. The prepared electrodes PVCE, CWE, CPE, MCPE-GO and MCPE-STFPB show Nernstian slopes of 59.9, 59.5, 58.1, 58.3 and 57.0 mV/concentration decade, and detection limits of 5.0 × 10^-6, 6.3 × 10^-6, 8.0 × 10^-6, 6.0 × 10^-6 and 8.0 × 10^-6 mol L^-1, respectively. The prepared ISEs also show high selectivity against cations (i.e. Na+, K+, NH4+, Ca2+, Al3+, Fe3+), amino acids (i.e. glycine, L-alanine), and sugars (i.e. fructose, glucose, maltose, lactose). The prepared ISEs are applicable for determining GB in spiked serum, urine, and pharmaceutical preparations, using standard addition and direct potentiometric methods. The fast response time (<10 s), long lifetime (1-5 weeks), and the reversibility and stability of the measured signals facilitate the application of these sensors for routine analysis of real samples. Copyright © 2018 Elsevier B.V. All rights reserved.

  13. Steviamine, a new class of indolizidine alkaloid [(1R,2S,3R,5R,8aR)-3-hydroxymethyl-5-methyloctahydroindolizine-1,2-diol hydrobromide]

    Directory of Open Access Journals (Sweden)

    Amber L. Thompson

    2009-11-01

    Full Text Available X-ray crystallographic analysis of the title hydrobromide salt, C10H20NO3+·Br−, of (1R,2S,3R,5R,8aR)-3-hydroxymethyl-5-methyloctahydroindolizine-1,2-diol defines the absolute and relative stereochemistry at the five chiral centres in steviamine, a new class of polyhydroxylated indolizidine alkaloid isolated from the leaves of Stevia rebaudiana (Asteraceae). In the crystal structure, molecules are linked by intermolecular O—H...Br and N—H...Br hydrogen bonds, forming double chains around the twofold screw axes along the b-axis direction. Intramolecular O—H...O interactions also occur.

  14. Crystal structure and thermochemical properties of 1-decylammonium hydrobromide (C10H21NH3Br)(s)

    International Nuclear Information System (INIS)

    Zhang Lijun; Di Youying; Lu Dongfei

    2011-01-01

    Highlights: → Crystal structure of 1-decylammonium hydrobromide is reported. → Lattice potential energy of the compound was obtained. → Molar volumes of the compound and its cation were obtained. → The ionic radius of the cation was calculated. → The molar enthalpy of dissolution at infinite dilution was determined. → Hydration enthalpies of the compound and its cation were calculated. - Abstract: The crystal structure of 1-decylammonium hydrobromide was determined by X-ray crystallography. The lattice potential energy and the molar volumes of the solid compound and its cation were obtained, and the ionic radius of the cation was calculated from the corresponding effective volume of the cation. The molar enthalpies of dissolution of the compound at different concentrations m/(mol·kg^-1) at T = 298.15 K were measured with an isoperibol solution-reaction calorimeter. According to Pitzer's electrolyte solution theory, the molar enthalpy of dissolution of the compound at infinite dilution (ΔsHm∞) and the Pitzer parameters (βMX(0)L and βMX(1)L) were obtained. The apparent relative molar enthalpies (ΦL) of the title compound and the relative partial molar enthalpies (L̄2 and L̄1) of the solute and the solvent at different concentrations were derived from the experimental enthalpies of dissolution. Finally, hydration enthalpies of the compound and its cation were calculated by designing a thermochemical cycle based on the lattice potential energy and the molar enthalpy of dissolution of the title compound at infinite dilution.

  15. A novel spray-dried nanoparticles-in-microparticles system for formulating scopolamine hydrobromide into orally disintegrating tablets

    Directory of Open Access Journals (Sweden)

    Li FQ

    2011-04-01

    Full Text Available Feng-Qian Li1, Cheng Yan2, Juan Bi1, Wei-Lin Lv3, Rui-Rui Ji3, Xu Chen1, Jia-Can Su3, Jin-Hong Hu3. 1Department of Pharmaceutics, Shanghai Eighth People’s Hospital, Shanghai, People’s Republic of China; 2Department of Pharmacy, Bethune International Peace Hospital, Shijiazhuang, People’s Republic of China; 3Changhai Hospital, Second Military Medical University, Shanghai, People’s Republic of China. Abstract: Scopolamine hydrobromide (SH)-loaded microparticles were prepared from a colloidal fluid containing ionotropic-gelated chitosan nanoparticles using a spray-drying method. The spray-dried microparticles were then formulated into orally disintegrating tablets (ODTs) using a wet granulation tablet formation process. A drug entrapment efficiency of about 90% (w/w) and a loading capacity of 20% (w/w) were achieved for the microparticles, which ranged from 2 µm to 8 µm in diameter. Results of disintegration tests showed that the formulated ODTs could be completely dissolved within 45 seconds. Drug dissolution profiles suggested that SH is released more slowly from tablets made using the microencapsulation process compared with tablets containing SH that is free or in the form of nanoparticles. The time it took for 90% of the drug to be released increased significantly, from 3 minutes for conventional ODTs to 90 minutes for ODTs with crosslinked microparticles. Compared with ODTs made with noncrosslinked microparticles, it was thus possible to achieve an even lower drug release rate using tablets with appropriate chitosan crosslinking. These results indicate that the development of new ODTs designed with crosslinked microparticles might be a rational way to overcome the unwanted taste of conventional ODTs and the side effects related to SH’s intrinsic characteristics. Keywords: scopolamine hydrobromide, chitosan, nanoparticles-in-microparticles system, spray-drying, orally disintegrating tablets

  16. Pharmacologic Evaluation of Antidepressant Activity and Synthesis of 2-Morpholino-5-phenyl-6H-1,3,4-thiadiazine Hydrobromide

    Directory of Open Access Journals (Sweden)

    Alexey P. Sarapultsev

    2016-05-01

    Full Text Available Substituted thiadiazines exert a reliable therapeutic effect in treating stress, and their ability to influence all aspects of a stress response has been described schematically. This study was conducted to pharmacologically evaluate compound L-17, a substituted thiadiazine (2-morpholino-5-phenyl-6H-1,3,4-thiadiazine hydrobromide), for possible antipsychotic/antidepressant activity. Compound L-17 was synthesized by cyclocondensation of α-bromoacetophenone with the original morpholine-4-carbothionic acid hydrazide. Pharmacologic evaluations were conducted using the methods described by E.F. Lavretskaya (1985) and in accordance with published guidelines for studying drugs for neuroleptic activity. Compound L-17 was evaluated for various possible mechanisms of action, including its effects on cholinergic agonists/antagonists, dopaminergic neurotransmission, the adrenergic system, and 5-HT3 serotonin receptors. One or more of these mechanisms may be responsible for the beneficial effects shown by thiadiazine compounds in models of acute stress and acute myocardial infarction.

  17. [The protection of hydrogen-rich saline on a rat dry eye model induced by scopolamine hydrobromide].

    Science.gov (United States)

    Chu, Y Y; Hua, N; Ru, Y S; Zhao, S Z

    2017-05-11

    Objective: To evaluate the effect of hydrogen-rich saline (HRS) on dry eye in rats induced by subcutaneous injection of scopolamine hydrobromide. Methods: Experimental study. Thirty female Wistar rats about six weeks old were randomly divided into a normal group, a dry eye group, an HRS eyedrops group, a normal saline (NS) eyedrops group, an HRS intraperitoneal injection group and an NS intraperitoneal injection group, with 5 rats in each group. Dry eye was induced by subcutaneous injection of scopolamine hydrobromide in the latter five groups. Clinical signs of dry eye such as tear volume (SIt), tear break-up time (BUT) and corneal epithelial fluorescein staining scores were evaluated on days 7, 14, 21 and 28. On the 28th day, ten eyes in each group were enucleated and processed into paraffin sections for HE, PAS and immunohistochemistry staining. Analysis of variance was used to test the data, and the independent-samples t-test was used for comparisons between two groups. Two-way repeated-measures ANOVA was used to compare the differences among groups at different time points, one-way ANOVA was used for comparisons of the clinical signs at a single time point, and the LSD test was used for comparisons between two groups. Results: Before the experiment and on days 7, 14, 21 and 28, the values of SIt in the HRS eyedrops group and the HRS intraperitoneal injection group were, respectively: (3.625±1.157), (3.313±0.704), (3.250±0.535), (3.313±0.372), (3.375±0.582) mm and (3.500±1.019), (2.893±0.656), (3.321±0.668), (3.179±0.575), (3.214±0.871) mm. The values of BUT were, respectively: (2.750±0.707), (2.688±0.594), (2.813±0.753), (3.000±0.756), (2.750±0.707) s and (3.000±0.679), (2.321±0.464), (2.750±0.753), (3.214±0.699), (2.679±0.608) s. The fluorescein staining scores were, respectively: (6.250±0.707), (8.875±0.641), (8.750±0.707), (9.250±0.463), (8.250±1.282) and (6.000±0.679), (9.143±1.027), (8.857±0.770), (9.143±0.949), (8.500±0.760). The difference

  18. Protection to glycolysis by a combination of 5-hydroxy-L-tryptophan and 2-aminoethylisothiuronium bromide hydrobromide in lethally irradiated rats

    International Nuclear Information System (INIS)

    Basu, S.K.; Srinivasan, M.N.; Chuttani, K.; George, S.

    1992-01-01

    The rate of glycolysis in vivo at different time intervals following 8 Gy [LD100(30)] whole-body gamma radiation (WBGR) was evaluated by estimating liver glycogen, blood sugar, serum lactic dehydrogenase (LDH) and lactic acid concentrations in adult male Sprague Dawley rats. Within 1 h of radiation exposure, a significant fall in liver glycogen was observed in rats fed food and water ad libitum. The glycogen content increased after 24 h and had returned to the control level on the 7th day after radiation exposure. Blood sugar, serum LDH and blood lactate levels increased significantly as compared to non-irradiated controls. Pretreatment with 5-hydroxy-L-tryptophan (5-HTP; 100 mg/kg) + 2-aminoethylisothiuronium bromide hydrobromide (AET; 20 mg/kg) i.p. 30 min before 8 Gy WBGR modified these values and restored them to normal levels on the 7th day post-irradiation. (author). 24 refs

  19. Pharmacokinetic profile of dextromethorphan hydrobromide in a syrup formulation in children and adolescents.

    Science.gov (United States)

    Guenin, Eric; Armogida, Marianna; Riff, Dennis

    2014-09-01

    Dextromethorphan hydrobromide (DM) is a widely used antitussive. This study determined, for the first time, the basic pharmacokinetic profile of DM and its active metabolite, dextrorphan (DP) in children and adolescents. Thirty-eight male and female subjects at risk for developing an upper respiratory tract infection (URTI), or symptomatic with cough due to URTI, were enrolled in this single-dose, open-label study: ages 2-5 years (Group A, n = 8), 6-11 years (Group B, n = 17), 12-17 years (Group C, n = 13). Subjects were genotyped for cytochrome P450 (CYP) 2D6 polymorphisms and characterized as poor (PM) or non-poor metabolizers (non-PM). Groups A and B were dosed using an age-weight dosing schedule (DM range 7.5-24.75 mg); a 30-mg dose was used for Group C. Average exposures to total DP increased as age group increased, and average exposure to DM was highest in the adolescent group. One subject in that group was a PM. The terminal half-life (t½) values were longer in the adolescent group due in part to the single PM subject. No relationship between body weight and pharmacokinetic parameters was noted. This is the first evaluation of the pharmacokinetic characteristics of DM in children and adolescents. A single dose of DM in this population was safe, and well tolerated at all doses tested. The data are used to model and compare pediatric DM exposures with those of adults.

  20. A novel spray-dried nanoparticles-in-microparticles system for formulating scopolamine hydrobromide into orally disintegrating tablets

    Science.gov (United States)

    Li, Feng-Qian; Yan, Cheng; Bi, Juan; Lv, Wei-Lin; Ji, Rui-Rui; Chen, Xu; Su, Jia-Can; Hu, Jin-Hong

    2011-01-01

    Scopolamine hydrobromide (SH)-loaded microparticles were prepared from a colloidal fluid containing ionotropic-gelated chitosan nanoparticles using a spray-drying method. The spray-dried microparticles were then formulated into orally disintegrating tablets (ODTs) using a wet granulation tablet formation process. A drug entrapment efficiency of about 90% (w/w) and loading capacity of 20% (w/w) were achieved for the microparticles, which ranged from 2 μm to 8 μm in diameter. Results of disintegration tests showed that the formulated ODTs could be completely dissolved within 45 seconds. Drug dissolution profiles suggested that SH is released more slowly from tablets made using the microencapsulation process compared with tablets containing SH that is free or in the form of nanoparticles. The time it took for 90% of the drug to be released increased significantly from 3 minutes for conventional ODTs to 90 minutes for ODTs with crosslinked microparticles. Compared with ODTs made with noncrosslinked microparticles, it was thus possible to achieve an even lower drug release rate using tablets with appropriate chitosan crosslinking. Results obtained indicate that the development of new ODTs designed with crosslinked microparticles might be a rational way to overcome the unwanted taste of conventional ODTs and the side effects related to SH’s intrinsic characteristics. PMID:21720502
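The "time for 90% of the drug to be released" (t90) reported above is typically read off a cumulative-release profile by linear interpolation between sampling points. A sketch with an invented profile:

```python
def time_to_release(profile, target=90.0):
    """Linearly interpolate the time at which cumulative release first
    reaches `target` percent; profile is a list of (minutes, percent)."""
    for (t0, r0), (t1, r1) in zip(profile, profile[1:]):
        if r0 <= target <= r1:
            return t0 + (target - r0) * (t1 - t0) / (r1 - r0)
    raise ValueError("target not reached within the profile")

# Invented cumulative-release profile for a crosslinked-microparticle ODT
profile = [(0, 0), (15, 35), (30, 55), (60, 78), (90, 92), (120, 97)]
print(f"t90 ~ {time_to_release(profile):.0f} min")
```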

  1. Analysis of Dextromethorphan in Cough Drops and Syrups: A Medicinal Chemistry Laboratory

    Science.gov (United States)

    Hamilton, Todd M.; Wiseman, Frank L., Jr.

    2009-01-01

    Fluorescence spectroscopy is used to determine the quantity of dextromethorphan hydrobromide (DM) in over-the-counter (OTC) cough drops and syrups. This experiment is appropriate for an undergraduate medicinal chemistry laboratory course when studying OTC medicines and active ingredients. Students prepare the cough drops and syrups for analysis,…

  2. Utility of Charge Transfer and Ion-Pair Complexation for Spectrophotometric Determination of Eletriptan Hydrobromide in Pure and Dosage Forms

    Directory of Open Access Journals (Sweden)

    Ayman A. Gouda

    2013-01-01

    Full Text Available Three simple, sensitive, and accurate spectrophotometric methods have been developed for the determination of eletriptan hydrobromide (ELT) in pure and dosage forms. The first two methods are based on charge-transfer complex formation between ELT and the chromogenic reagents quinalizarin (Quinz) and alizarin red S (ARS), producing charge-transfer complexes with absorption maxima at 569 and 533 nm for Quinz and ARS, respectively. The third method is based on the formation of an ion-pair complex between ELT and the molybdenum(V)-thiocyanate inorganic complex in hydrochloric acid medium, followed by extraction of the colored ion-pair with dichloromethane and measurement at 470 nm. Different variables affecting the reactions were studied and optimized. Beer's law is obeyed in the concentration ranges 2.0–18, 1.0–8.0, and 2.0–32 μg mL−1 for Quinz, ARS, and Mo(V)-thiocyanate, respectively. The molar absorptivity, Sandell sensitivity, and detection and quantification limits were also calculated. The correlation coefficients were ≥0.9994 with a relative standard deviation (RSD) of ≤0.925%. The proposed methods were successfully applied to the determination of ELT in tablets with good accuracy and precision and without interference from common additives; validity was assessed by applying the standard addition technique, and the results were compared with those obtained using a reported method.
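Detection and quantification limits like those quoted above are conventionally estimated from the calibration regression (ICH Q2 style: LOD = 3.3σ/S, LOQ = 10σ/S, with S the slope and σ the residual standard deviation). A minimal sketch with invented calibration data, not the paper's:

```python
import numpy as np

def lod_loq(conc, signal):
    """ICH-style LOD/LOQ estimates from a linear calibration:
    LOD = 3.3*sigma/S and LOQ = 10*sigma/S, where S is the slope and
    sigma the residual standard deviation of the regression."""
    x = np.asarray(conc, dtype=float)
    y = np.asarray(signal, dtype=float)
    slope, intercept = np.polyfit(x, y, 1)
    resid = y - (slope * x + intercept)
    sigma = np.sqrt(np.sum(resid**2) / (len(x) - 2))  # n-2 degrees of freedom
    return 3.3 * sigma / slope, 10 * sigma / slope

# Hypothetical Beer's-law calibration (concentration in µg/mL vs absorbance)
lod, loq = lod_loq([2, 4, 8, 12, 16, 18], [0.11, 0.21, 0.40, 0.62, 0.80, 0.91])
```

By construction LOQ/LOD = 10/3.3 regardless of the data; the absolute values depend on the calibration's scatter and slope.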

  3. Pattern of Refractive Errors Among Ophthalmic Outpatients of ...

    African Journals Online (AJOL)

    The etiologic mechanism of RE can be both genetic ... system of a nonaccommodating eye fails to bring parallel rays of light to focus on the .... homatropine 1% eye‑drops. .... Abdull MM, Sivasubramaniam S, Murthy GV, Gilbert C, Abubakar T,.

  4. Development and validation of RP-HPLC method for analysis of multicomponent cough-cold syrup formulation

    OpenAIRE

    Ivković, Branka; Marković, Bojan; Vladimirov, Sote

    2014-01-01

    In this study a reversed-phase HPLC method for rapid and simultaneous identification and quantification of doxylamine succinate, ephedrine sulfate, dextromethorphan hydrobromide, paracetamol and sodium benzoate in a cough-cold syrup formulation is described. Separation was carried out on an XTerraTM RP18 column, Waters (150 mm x 4.6 mm, 5 μm particle size). For the analysis of the investigated substances, gradient elution was used employing water, pH adjusted to 2.5 with 85 % orthophosphoric acid as ...

  5. LC for analysis of two sustained-release mixtures containing cough cold suppressant drugs.

    Science.gov (United States)

    El-Gindy, Alaa; Sallam, Shehab; Abdel-Salam, Randa A

    2010-07-01

    A liquid chromatographic method was applied to the analysis of two sustained-release mixtures containing dextromethorphan hydrobromide and carbinoxamine maleate with either phenylephrine hydrochloride in pharmaceutical capsules (Mix 1) or phenylpropanolamine, methylparaben, and propylparaben, in which the drug base is bound to an ion-exchange resin, in pharmaceutical syrup (Mix 2). The method was used for their simultaneous determination using a CN column with a mobile phase consisting of acetonitrile-12 mM ammonium acetate in the ratio of 60:40 (v/v, pH 6.0) for Mix 1 and 45:55 (v/v, pH 6.0) for Mix 2.

  6. Lattice potential energy and standard molar enthalpy in the formation of 1-dodecylamine hydrobromide(1-C12H25NH3·Br)(s)

    Institute of Scientific and Technical Information of China (English)

    Liu Yu-Pu; Di You-Ying; Dan Wen-Yan; He Dong-Hua; Kong Yu-Xia; Yang Wei-Wei

    2011-01-01

    This paper reports that 1-dodecylamine hydrobromide (1-C12H25NH3·Br)(s) has been synthesized using the liquid phase reaction method. The lattice potential energy of the compound 1-C12H25NH3·Br and the ionic volume and radius of the 1-C12H25NH3+ cation are obtained from the crystallographic data and other auxiliary thermodynamic data. The constant-volume energy of combustion of 1-C12H25NH3·Br(s) is measured to be ΔcUm°(1-C12H25NH3·Br, s) = -(7369.03±3.28) kJ·mol-1 by means of an RBC-II precision rotating-bomb combustion calorimeter at T = (298.15±0.001) K. The standard molar enthalpy of combustion of the compound is derived to be ΔcHm°(1-C12H25NH3·Br, s) = -(7384.52±3.28) kJ·mol-1 from the constant-volume energy of combustion. The standard molar enthalpy of formation of the compound is calculated to be ΔfHm°(1-C12H25NH3·Br, s) = -(1317.86±3.67) kJ·mol-1 from the standard molar enthalpy of combustion of the title compound and other auxiliary thermodynamic quantities through a thermochemical cycle.
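The step from the constant-volume (bomb) combustion energy to the standard molar enthalpy of combustion uses ΔcH° = ΔcU° + Δn(g)·R·T. A sketch of that conversion; note the gas-mole change Δn(g) = −6.25 used below is inferred from the two quoted values, since the abstract does not state the balanced combustion equation:

```python
R = 8.314462618e-3  # gas constant, kJ/(mol·K)

def combustion_enthalpy(dU_kJ_mol, delta_n_gas, T=298.15):
    """Standard molar enthalpy of combustion from the constant-volume
    combustion energy: ΔcH° = ΔcU° + Δn(g)·R·T."""
    return dU_kJ_mol + delta_n_gas * R * T

# Δn(g) = -6.25 is an inference that reproduces the paper's ΔcHm°,
# not a value stated in the abstract.
dH = combustion_enthalpy(-7369.03, -6.25)
```

With these inputs dH comes out within rounding of the quoted −7384.52 kJ·mol⁻¹.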

  7. Preparation of Chemicals and Bulk Drug Substances for the U.S. Army Drug Development Program

    Science.gov (United States)

    1997-12-01

    Turning to Chart No. 7, dextromethorphan hydrobromide was converted to the free base, then treated with 1-chloroethyl chloroformate to give the N...route, shown in Chart No. 9, was used in the current resynthesis. Commercial dextromethorphan hydrobromide was treated with concentrated hydrobromic

  8. Spectrophotometric determination of fenoterol hydrobromide in pharmaceutical preparations

    Directory of Open Access Journals (Sweden)

    Maria Lucilia Motinha Zamuner

    2008-12-01

    Full Text Available A simple spectrophotometric method has been developed for the determination of fenoterol hydrobromide (FH) in tablets, drops and syrup, as the only active principle and in association with ibuprofen. The method is based on the oxidative coupling reaction of FH with 3-methyl-2-benzothiazolinone hydrazone (MBTH) and ceric sulfate as the oxidizing reagent. The mixture of the drug, MBTH and ceric sulfate, in acid medium, produces a red-brown compound with an absorption maximum at 475 nm. The calibration curve was linear over the concentration range of 3.0 to 12.0 µg/mL, with a correlation coefficient of 0.9998. The experimental parameters affecting the development and stability of the colored product were carefully studied and optimized. The method was applied to commercial and simulated samples, giving coefficients of variation between 0.25% and 0.82% and mean recoveries of the standard ranging from 98% to 102%. The proposed method proved to be accurate, precise and linear, and free from excipient interference for the tablet and drop dosage forms. There was no interference from the ibuprofen present, in association with FH, in one of the formulations analyzed. For the syrup, the vehicle interfered, suggesting reactions of its components with MBTH.

  9. Validation of an HPLC method for the simultaneous determination of eletriptan and UK 120.413

    Directory of Open Access Journals (Sweden)

    LJILJANA ZIVANOVIC

    2006-11-01

    Full Text Available A rapid and sensitive RP-HPLC method was developed for the routine control analysis of eletriptan hydrobromide and its organic impurity UK 120.413 in Relpax® tablets. The chromatography was performed at 20 °C using a C18 XTerraTM (5 μm, 150 × 4.6 mm) column at a flow rate of 1.0 ml/min. The drug and its impurity were detected at 225 nm. The mobile phase consisted of TEA (1 %) – methanol (67.2:32.8 v/v), the pH of which was adjusted to 6.8 with 85 % orthophosphoric acid. Quantification was accomplished by the internal standard method. The developed RP-HPLC method was validated by testing: accuracy, precision, repeatability, specificity, detection limit, quantification limit, linearity, robustness and sensitivity. High linearity of the analytical procedure was confirmed over the concentration range of 0.05 – 1.00 mg/ml for eletriptan hydrobromide and 0.10 – 1.50 µg/ml for UK 120.413, with correlation coefficients greater than r = 0.995. The low values of the RSD expressed the good repeatability and precision of the method. Experimental design and a response surface method were used to test the robustness of the analytical procedure and to evaluate the effect of variation of the method parameters, namely the mobile phase composition, pH and temperature; these showed small deviations from the method settings. The good recovery and low RSD confirm the suitability of the proposed RP-HPLC method for the routine determination of eletriptan hydrobromide and its impurity UK 120.413 in Relpax® tablets.

  10. Dgroup: DG01283 [KEGG MEDICUS

    Lifescience Database Archive (English)

    Full Text Available igliptin hydrobromide hydrate (JAN) ... Antidiabetic agent ... DG01601 ... DPP-4 inhibitor Unclassified ... DG02044 ... Hypoglycemics ... DG01601 ... DPP-4 inhibitor ... DPP4 inhibitor, antidiabetics DPP4 [HSA:1803] [KO:K01278] ...

  11. Drug: D04068 [KEGG MEDICUS

    Lifescience Database Archive (English)

    Full Text Available D04068 Drug Estrazinol hydrobromide (USAN) ... C20H25NO2. HBr D04068.gif ... Other ... DG01584 ... Estrogen receptor agonist ... DG01986 ... Estrogen ... Estrogen ... CAS: 15179-97-2 PubChem: 47206025 ChEMBL: CHEMBL2106221 LigandBox: D04068 ...

  12. A convenient synthesis of 14C-cotinine from 14C-nicotine

    Energy Technology Data Exchange (ETDEWEB)

    Desai, D.H.; Djordjevic, M.V.; Amin, S. (American Health Foundation, Valhalla, NY (USA). Naylor Dana Inst. for Disease Prevention)

    1991-03-01

    A convenient synthesis with analytical monitoring of 14C-cotinine is reported. 14C-Nicotine was converted into 14C-dibromocotinine hydrobromide perbromide. Debromination, achieved by using Zn dust/acetic acid, resulted in high yields (71%) of 14C-cotinine. (author).

  13. The GABAA Antagonist DPP-4-PIOL Selectively Antagonises Tonic over Phasic GABAergic Currents in Dentate Gyrus Granule Cells

    DEFF Research Database (Denmark)

    Boddum, Kim; Frølund, Bente; Kristiansen, Uffe

    2014-01-01

    that phasic and tonic GABAA receptor currents can be selectively inhibited by the antagonists SR 95531 and the 4-PIOL derivative, 4-(3,3-diphenylpropyl)-5-(4-piperidyl)-3-isoxazolol hydrobromide (DPP-4-PIOL), respectively. In dentate gyrus granule cells, SR 95531 was found approximately 4 times as potent...

  14. Author Details

    African Journals Online (AJOL)

    Ofoefule, SI. Vol 11 (2006) - Articles: In vitro evaluation and application of Carbopol 940-tragacanth binary mixtures in the formulation of bio-adhesive hyoscine hydrobromide tablet. Vol 9 (2004) - Articles: Use of mathematical approximations in the elucidation of drug release mechanisms from a high viscosity ...

  15. 76 FR 82302 - Determination That HYCODAN (Hydrocodone Bitartrate and Homatropine Methylbromide) Tablets, 5...

    Science.gov (United States)

    2011-12-30

    ... discontinued from marketing for reasons other than safety or effectiveness. ANDAs that refer to HYCODAN... Effectiveness AGENCY: Food and Drug Administration, HHS. ACTION: Notice. SUMMARY: The Food and Drug... effectiveness. This determination will allow FDA to approve abbreviated new drug applications (ANDAs) for...

  16. Evaluation of some anionic exchange resins as potential tablet ...

    African Journals Online (AJOL)

    The effect of resin concentration and compression force on the properties of tablets using the selected resin was investigated. In addition, the disintegrant efficacy of the selected resin in the tablet formulations containing either a basic drug, e.g., dextromethorphan hydrobromide (DMP), or an acidic drug, e.g., diclofenac ...

  17. Features of Acute Treatment of Bronchial Obstruction Syndrome in Infants

    Directory of Open Access Journals (Sweden)

    Ye.N. Okhotnikova

    2012-04-01

    Full Text Available The paper is devoted to the problem of bronchial obstruction in infants. It considers the pathogenesis of this pathology, its clinical manifestations and complications, and features of treatment, focusing on combination therapy with Berodual, a medication containing the β2-agonist fenoterol hydrobromide and the anticholinergic drug ipratropium bromide.

  18. Transdermal hyoscine induced unilateral mydriasis.

    LENUS (Irish Health Repository)

    Hannon, Breffni

    2012-03-20

    The authors present a case of unilateral mydriasis in a teenager prescribed transdermal hyoscine hydrobromide (scopolamine) for chemotherapy induced nausea and vomiting. The authors discuss the ocular side-effects associated with this particular drug and delivery system and the potential use of transdermal hyoscine as an antiemetic agent in this group.

  19. Journal of Phytomedicine and Therapeutics - Vol 8 (2003)

    African Journals Online (AJOL)

    In vitro evaluation and application of Carbopol 940-tragacanth binary mixtures in the formulation of bio-adhesive hyoscine hydrobromide tablet. SI Ofoefula. http://dx.doi.org/10.4314/jopat.v8i1.47058 ...

  20. Effects of Capsule Yi-Zhi on learning and memory disorder and beta-amyloid peptide induced neurotoxicity in rats

    Institute of Scientific and Technical Information of China (English)

    XU Jiang-Ping; WU Hang-Yu; LI Lin

    2004-01-01

    AIM To investigate the effects of Capsule Yi-Zhi (CYZ) on learning and memory disorder and beta-amyloid protein induced neurotoxicity in rats. Methods Various doses of CYZ were administered to Sprague-Dawley (SD) rats for 8 days, twice a day. Then scopolamine hydrobromide (Sco) intraperitoneal injection was performed on each rat and the

  1. Studies on chemical protectors against radiation, 18

    International Nuclear Information System (INIS)

    Shinoda, Masato; Ohta, Setsuko; Hayase, Yukitoshi

    1978-01-01

    The radiation-protective effect of S,2-aminomethylisothiuronium bromide hydrobromide and 2-mercaptoethylamine hydrochloride was tested on mice irradiated with soft X-rays of 70 kVp, using the life-prolongation effect as an index. These compounds showed a marked effect on mice irradiated with 11000--13000 R, using a 10 mm acrylate filter. This method seemed to be usable as a potency test for chemical radioprotectors. (auth.)

  2. Development of specific dopamine D-1 agonists and antagonists

    International Nuclear Information System (INIS)

    Sakolchai, S.

    1987-01-01

    To develop potentially selective dopamine D-1 agonists and to investigate the structural requirements for D-1 activity, derivatives of dibenzocycloheptadiene were synthesized and pharmacologically evaluated. The target compounds are 5-aminomethyl-10,11-dihydro-1,2-dihydroxy-5H-dibenzo[a,d]cycloheptene hydrobromide 10 and 9,10-dihydroxy-1,2,3,7,8,12b-hexahydrobenzo[1,2]cyclohepta[3,4,5d,e]isoquinoline hydrobromide 11. In a dopamine-sensitive rat retinal adenylate cyclase assay, a model for D-1 activity, compound 10 is essentially inert with respect to both agonist and antagonist activity. In contrast, compound 11 is approximately equipotent to dopamine in activation of the D-1 receptor. Based on radioligand binding data, the IC50 of compound 11 for displacement of 3H-SCH 23390, a D-1 ligand, is about 7-fold lower than that for displacement of 3H-spiperone, a D-2 ligand. These data indicate that compound 11 is a potent, selective dopamine D-1 agonist. This study provides a new structural class of dopamine D-1 acting agents, the dihydroxybenzocycloheptadiene analogs, which can serve as lead compounds for further drug development and as probes for investigating the nature of the dopamine D-1 receptor.

  3. Death rattle: prevalence, prevention and treatment.

    Science.gov (United States)

    Wildiers, Hans; Menten, Johan

    2002-04-01

    A retrospective analysis was performed to study the occurrence and treatment of death rattle (DR) in 107 consecutive dying patients on the palliative care unit of the University Hospital Leuven. The incidence of DR (23%) is lower than reported in the literature, possibly due to low hydration. We found two types of rattle: "real DR" generally responds very well to anticholinergic therapy and is probably caused by non-expectorated secretions; "pseudo DR" is poorly responsive to therapy and is probably caused by bronchial secretions due to pulmonary pathology, such as infection, tumor, fluid retention, or aspiration. Rattle disappeared in >90% of the patients with real DR. Real DR is a strong predictor of death, and 76% (19/25) of patients died within 48 h of onset. Administration of subcutaneous hyoscine hydrobromide, as a bolus or continuous infusion, is an effective therapy for real DR and is comfortable for the patient and caregivers.

  4. Synthesis of potential schistosomicides: new 2-(alkylamino)-1-octanethiosulfuric acids

    International Nuclear Information System (INIS)

    Oliveira Penido, M.L. de; Nelson, D.L.; Pilo-Veloso, D.

    1990-01-01

    Four new 2-(alkylamino)-1-octanethiosulfuric acids (1) were synthesized from 1-octene. 1-Octene was epoxidized with MCPBA or with a two-phase system composed of H2O2, sodium tungstate, phosphoric acid, 1-octene and a phase-transfer agent. Reaction of the 1,2-epoxyoctane with primary amines furnished 1-(alkylamino)-2-octanols, which were converted to the respective N-alkyl-2-bromo-1-octanamine hydrobromides by reaction with hydrobromic acid, followed by phosphorus tribromide. Finally, substitution of the bromide ion with sodium thiosulfate was accompanied by rearrangement via an aziridine intermediate, resulting in formation of the products 1. The intermediates and the final products were screened for activity against infection by Schistosoma mansoni; only the final products in which the N-alkyl group was sec-butyl or isopropyl exhibited activity. Nuclear magnetic resonance, infrared and mass spectrometry analyses are presented. (author) [pt

  5. Protection by DABCO against inactivation of transforming DNA by near-ultraviolet light: action spectra and implications for involvement of singlet oxygen

    International Nuclear Information System (INIS)

    Peak, J.G.; Peak, M.J.; Foote, C.S.

    1981-01-01

    1,4-Diazabicyclo[2.2.2]octane (DABCO) protects the genetic activity of purified transforming Bacillus subtilis DNA against inactivation by near-, but not far-, UV light. The maximum dose-modifying factor is 0.4, at 0.1 M DABCO. Maximal protection is at about 350 nm and no protection occurs below 313 nm. The spectrum for protection is similar to that described for 2-aminoethylisothiouronium bromide hydrobromide. The relevance of these observations with regard to the role of singlet oxygen in near-UV effects is discussed. (author)

  6. Anticholinergic syndrome following an unintentional overdose of scopolamine

    Directory of Open Access Journals (Sweden)

    Carmela E Corallo

    2009-09-01

    Full Text Available Carmela E Corallo1, Ann Whitfield2, Adeline Wu2; 1Department of Pharmacy, The Alfred, Melbourne, Victoria, Australia; 2Intensive Care Unit, Box Hill Hospital, Melbourne, Victoria, Australia. Abstract: Scopolamine hydrobromide (hyoscine) is an antimuscarinic drug which is primarily used in the prophylaxis and treatment of motion sickness and as a premedication to dry bronchial and salivary secretions. In acute overdosage, the main clinical problem is central nervous system (CNS) depression. In Australia, tablets containing scopolamine hydrobromide 0.3 mg are available over the counter in packs of ten. The recommended dose for adults is one to two tablets as a single dose, repeated four to six hours later, if required. The maximum dose stated on the pack is four tablets over a 24-hour period, with a caution regarding drowsiness and blurred vision. We describe a patient who presented with symptoms of anticholinergic syndrome secondary to an unintentional overdose of scopolamine. Whilst at work, the patient noticed that he had forgotten his prescribed medication, domperidone, at home; a friend gave him some travel sickness medication containing scopolamine for relief of nausea. On a previous occasion, he had experienced a similar, less severe reaction with another anticholinergic agent, loperamide. This report highlights the need to consider nonprescription products, i.e., over-the-counter medications and herbal/nutritional supplements, as causes of anticholinergic syndrome when a patient presents with symptoms suggestive of this diagnosis. Keywords: domperidone, scopolamine, nonprescription drugs, toxicity, anticholinergic syndrome

  7. Pharmacokinetic Effects of Isavuconazole Coadministration With the Cytochrome P450 Enzyme Substrates Bupropion, Repaglinide, Caffeine, Dextromethorphan, and Methadone in Healthy Subjects

    OpenAIRE

    Yamazaki, Takao; Desai, Amit; Goldwater, Ronald; Han, David; Howieson, Corrie; Akhtar, Shahzad; Kowalski, Donna; Lademacher, Christopher; Pearlman, Helene; Rammelsberg, Diane; Townsend, Robert

    2016-01-01

    Abstract This report describes phase 1 clinical trials performed to assess interactions of oral isavuconazole at the clinically targeted dose (200 mg, administered as isavuconazonium sulfate 372 mg, 3 times a day for 2 days; 200 mg once daily [QD] thereafter) with single oral doses of the cytochrome P450 (CYP) substrates: bupropion hydrochloride (CYP2B6; 100 mg; n = 24), repaglinide (CYP2C8/CYP3A4; 0.5 mg; n = 24), caffeine (CYP1A2; 200 mg; n = 24), dextromethorphan hydrobromide (CYP2D6/CYP3A...

  8. [Study of acetylsalicylic acid role in the potentiation of antiamnesic and neuroprotective properties of piracetam in rats with alloxan diabetes].

    Science.gov (United States)

    Zhiliuk, V I; Levykh, A E; Mamchur, V I

    2013-01-01

    It has been established that prolonged alloxan-induced hyperglycemia in rats potentiates the amnesic properties of scopolamine hydrobromide, characterized by a 44% shortening of the latent period. Combination of piracetam with acetylsalicylic acid was accompanied by an expressed antiamnesic potential: a 21.7% reduction of early markers of protein degradation (aldehydephenylhydrazones, APH) and superiority over piracetam alone in the effect upon KPH. The NO2-/NO3- level was also decreased by 30.3%. The effect may be assumed to be directly related to the ability of acetylsalicylic acid to improve microcirculation in the ischemic areas of the brain in diabetes and probably to its neuroprotective potential.

  9. Capillary electrophoretic enantioseparation of basic drugs using a new single-isomer cyclodextrin derivative and theoretical study of the chiral recognition mechanism.

    Science.gov (United States)

    Liu, Yongjing; Deng, Miaoduo; Yu, Jia; Jiang, Zhen; Guo, Xingjie

    2016-05-01

    A novel single-isomer cyclodextrin derivative, heptakis {2,6-di-O-[3-(1,3-dicarboxyl propylamino)-2-hydroxypropyl]}-β-cyclodextrin (glutamic acid-β-cyclodextrin), was synthesized and used as a chiral selector in capillary electrophoresis for the enantioseparation of 12 basic drugs, including terbutaline, clorprenaline, tulobuterol, clenbuterol, procaterol, carvedilol, econazole, miconazole, homatropine methyl bromide, brompheniramine, chlorpheniramine and pheniramine. The primary factors affecting separation efficiency, which include the background electrolyte pH, the concentration of glutamic acid-β-cyclodextrin and the phosphate buffer concentration, were investigated. Satisfactory enantioseparations were obtained using an uncoated fused-silica capillary of 50 cm (effective length 40 cm) × 50 μm i.d. with 120 mM phosphate buffer (pH 2.5-4.0) containing 0.5-4.5 mM glutamic acid-β-cyclodextrin as background electrolyte. A voltage of 20 kV was applied and the capillary temperature was kept at 20°C. The results proved that glutamic acid-β-cyclodextrin was an effective chiral selector for the 12 basic drugs studied. Moreover, the possible chiral recognition mechanism of brompheniramine, chlorpheniramine and pheniramine on glutamic acid-β-cyclodextrin was investigated using the semi-empirical Parametric Method 3. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  10. Protonation of inorganic 5-Fluorocytosine salts

    Science.gov (United States)

    Souza, Matheus S.; da Silva, Cecília C. P.; Almeida, Leonardo R.; Diniz, Luan F.; Andrade, Marcelo B.; Ellena, Javier

    2018-06-01

    5-Fluorocytosine (5-FC) has been widely used for the treatment of fungal infections and was recently found to exert extraordinary antineoplastic activity in gene-directed prodrug therapy. However, despite its intense use, 5-FC exhibits tabletability issues due to its physical instability in humid environments, leading to transition from the anhydrous to the monohydrate phase. Considering that salt formation is an interesting strategy to overcome this problem, in this paper a crystal engineering approach was applied to the supramolecular synthesis of new 5-FC salts with sulfuric, hydrobromic and methanesulfonic inorganic acids. A total of four structures were obtained, namely 5-FC sulfate monohydrate (1:1:1), 5-FC hydrogen sulfate (1:1), 5-FC mesylate (1:1) and 5-FC hydrobromide (1:1), the last one being a polymorphic form of a structure already reported in the literature. These novel salts were structurally characterized by single-crystal X-ray diffraction and their supramolecular organization was analyzed by Hirshfeld surface analysis. The vibrational behavior was evaluated by Raman spectroscopy and was found to be consistent with the crystal structures.

  11. Synthesis, characterization and theoretical study of a new asymmetrical tripodal amine containing morpholine moiety

    Directory of Open Access Journals (Sweden)

    Majid Rezaeivala

    2016-11-01

    Full Text Available A new asymmetrical tripodal amine, [H3L2]Br3, containing a morpholine moiety was prepared by reacting one equivalent of N-(3-aminopropyl)morpholine and two equivalents of tosylaziridine, followed by detosylation with HBr/CH3COOH. The products were characterized by various spectroscopic methods such as FAB-MS, elemental analysis, and 1H and 13C NMR spectroscopy. The crystal structure of the hydrobromide salt of the latter amine, [H3L2]Br3, was also determined. For the triprotonated form of the ligand L2, several microspecies and/or conformers can be considered. A theoretical study at the B3LYP/6-31G∗∗ level of theory showed that the characterized microspecies is the most stable microspecies for the triprotonated form of the ligand. It was shown that the experimental NMR data for [H3L2]Br3 in solution correlate well with the corresponding calculated data for the most stable microspecies of [H3L2]3+ in the gas phase.

  12. Simultaneous quantitative analysis of dextromethorphan, dextrorphan and chlorphenamine in human plasma by liquid chromatography-electrospray tandem mass spectrometry.

    Science.gov (United States)

    Ding, Ying; Huang, Kai; Chen, Lan; Yang, Jie; Xu, Wen-Yan; Xu, Xue-Jiao; Duan, Ru; Zhang, Jing; He, Qing

    2014-03-01

    A sensitive and accurate HPLC-MS/MS method was developed for the simultaneous determination of dextromethorphan, dextrorphan and chlorphenamine in human plasma. The three analytes were extracted from plasma by liquid-liquid extraction using ethyl acetate and separated on a Kromasil 60-5CN column (3 µm, 2.1 × 150 mm) with a mobile phase of acetonitrile-water (containing 0.1% formic acid; 50:50, v/v) at a flow rate of 0.2 mL/min. Quantification was performed on a triple quadrupole tandem mass spectrometer in multiple reaction monitoring mode using positive electrospray ionization. The calibration curve was linear over the range of 0.01-5 ng/mL for dextromethorphan, 0.02-5 ng/mL for dextrorphan and 0.025-20 ng/mL for chlorphenamine. The lower limits of quantification for dextromethorphan, dextrorphan and chlorphenamine were 0.01, 0.02 and 0.025 ng/mL, respectively. The intra- and inter-day precisions were within 11% and accuracies were in the range of 92.9-102.5%. All analytes proved to be stable during sample storage, preparation and analytical procedures. This method was applied, for the first time, to a pharmacokinetic study in healthy Chinese volunteers after a single oral dose of a formulation containing dextromethorphan hydrobromide (18 mg) and chlorpheniramine maleate (8 mg). Copyright © 2013 John Wiley & Sons, Ltd.

  13. Development and validation of a sensitive UHPLC-MS/MS method for the simultaneous analysis of tramadol, dextromethorphan chlorpheniramine and their major metabolites in human plasma in forensic context: application to pharmacokinetics.

    Science.gov (United States)

    Heneedak, Hala M; Salama, Ismail; Mostafa, Samia; El-Kady, Ehab; El-Sadek, Mohamed

    2015-07-01

    The prerequisites for forensic confirmatory analysis by LC-MS/MS with respect to European Union guidelines are chromatographic separation, a minimum of two MS/MS transitions to obtain the required identification points, and predefined thresholds for the variability of the relative intensities of the MS/MS (MRM) transitions in samples and reference standards. In the present study, a fast, sensitive and robust method to quantify tramadol, chlorpheniramine, dextromethorphan and their major metabolites, O-desmethyltramadol, desmethyl-chlorpheniramine and dextrorphan, respectively, in human plasma using ibuprofen as internal standard (IS) is described. The analytes and the IS were extracted from plasma by a liquid-liquid extraction method using ethyl acetate-diethyl ether (1:1). Extracted samples were analyzed by ultra-high-performance liquid chromatography coupled to electrospray ionization tandem mass spectrometry (UHPLC-ESI-MS/MS). Chromatographic separation was performed by pumping the mobile phase, containing acetonitrile, water and formic acid (89.2:11.7:0.1), for 2.0 min at a flow rate of 0.25 mL/min into a Hypersil Gold C18 column, 20 × 2.0 mm (1.9 µm), from Thermo Scientific, New York, USA. The calibration curve was linear for the six analytes. The intraday precision (RSD) and accuracy (RE) of the method were 3-9.8% and -1.7 to 4.5%, respectively. The analytical procedure herein described was used to assess the pharmacokinetics of the analytes in 24 healthy volunteers after a single oral dose containing 50 mg of tramadol hydrochloride, 3 mg of chlorpheniramine maleate and 15 mg of dextromethorphan hydrobromide. Copyright © 2014 John Wiley & Sons, Ltd.
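The precision (RSD) and accuracy (RE) figures quoted in these validation abstracts reduce to simple replicate statistics: RSD = 100·SD/mean and RE = 100·(mean − nominal)/nominal. A sketch with hypothetical QC replicates, not data from the study:

```python
import statistics

def precision_accuracy(measured, nominal):
    """Intra-day precision as %RSD and accuracy as % relative error (RE),
    the figures of merit quoted in bioanalytical method validation."""
    mean = statistics.mean(measured)
    rsd = 100 * statistics.stdev(measured) / mean  # sample SD, n-1
    re = 100 * (mean - nominal) / nominal
    return rsd, re

# Hypothetical QC replicates at a 1 ng/mL nominal concentration
rsd, re = precision_accuracy([0.97, 1.02, 0.99, 1.05, 0.98], 1.0)
```

In practice these are computed per QC level (low, mid, high) and per day to give intra- and inter-day values.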

  14. Chemical radioprotection to bone marrow stem cells after whole body gamma irradiation to mice

    Energy Technology Data Exchange (ETDEWEB)

    Dey, J.; Dey, T.B.; Ganguly, S.K.; Nagpal, K.K.; Ghose, A.

    1988-11-01

    Protection of mouse bone marrow stem cells has been noted as early as two days after whole-body gamma-ray exposure by prior treatment with a combination of hydroxytryptophan (HT) and one of two thiol drugs, viz. aminoethylisothiuronium bromide hydrobromide (AET) (20 mg/kg body weight) and β-mercaptopropionylglycine (MPG). The levels of protection to bone marrow stem cells thus obtained have been compared with that obtained by treating with the optimum radioprotecting dose of AET (200 mg/kg body weight). The study reports the bone marrow stem cell status two days after 3 Gy, 5 Gy and 10 Gy whole-body gamma irradiation in relation to the mentioned radioprotective treatments, as studied by the spleen colony forming method.

  15. Using an innovative combination of quality-by-design and green analytical chemistry approaches for the development of a stability indicating UHPLC method in pharmaceutical products.

    Science.gov (United States)

    Boussès, Christine; Ferey, Ludivine; Vedrines, Elodie; Gaudin, Karen

    2015-11-10

    An innovative combination of green chemistry and a quality by design (QbD) approach is presented through the development of a UHPLC method for the analysis of the main degradation products of dextromethorphan hydrobromide. The QbD strategy was integrated into the field of green analytical chemistry to improve method understanding while assuring quality and minimizing environmental impact and analyst exposure. This analytical method was thoroughly evaluated by applying risk assessment and multivariate analysis tools. After a scouting phase aimed at selecting a suitable stationary phase and an organic solvent in accordance with green chemistry principles, quality risk assessment tools were applied to determine the critical process parameters (CPPs). The effects of the CPPs on critical quality attributes (CQAs), i.e., resolutions, efficiencies, and solvent consumption, were further evaluated by means of a screening design. A response surface methodology was then carried out to model the CQAs as a function of the selected CPPs, and the optimal separation conditions were determined through a desirability analysis. The resulting contour plots made it possible to establish the design space (DS) (method operable design region) where all CQAs fulfilled the requirements. An experimental validation of the DS proved that quality within the DS was guaranteed; therefore, no further robustness study was required before validation. Finally, this UHPLC method was validated using the concept of total error and was used to analyze a pharmaceutical drug product. Copyright © 2015 Elsevier B.V. All rights reserved.
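The desirability analysis mentioned above can be sketched as follows (not the authors' code): each CQA is mapped onto a [0, 1] desirability scale and the individual scores are combined as a geometric mean, in the spirit of Derringer-type desirability functions. The thresholds and CQA values below are hypothetical.

```python
# One-sided ("larger is better") desirability plus geometric-mean
# combination, as used in response-surface optimization of CQAs.
def desirability_larger_is_better(y, low, target):
    """0 at or below `low`, 1 at or above `target`, linear in between."""
    if y <= low:
        return 0.0
    if y >= target:
        return 1.0
    return (y - low) / (target - low)

def overall_desirability(ds):
    """Geometric mean of individual desirabilities."""
    prod = 1.0
    for d in ds:
        prod *= d
    return prod ** (1.0 / len(ds))

# Hypothetical CQAs: critical-pair resolution and column efficiency
d_res = desirability_larger_is_better(1.8, low=1.5, target=2.0)
d_eff = desirability_larger_is_better(9000.0, low=5000.0, target=10000.0)
print(round(overall_desirability([d_res, d_eff]), 3))
```

Because the geometric mean is zero whenever any single desirability is zero, a candidate operating point failing one CQA is rejected outright, which is what makes the approach suitable for mapping a design space.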

  16. Preoperative topical flurbiprofen-Na+ in extracapsular lens extraction role in maintaining intraoperative pupillary dilatation

    Directory of Open Access Journals (Sweden)

    Chaudhary K

    1992-01-01

    Induction of intraoperative pupillary constriction is predominantly a prostaglandin-mediated process. Flurbiprofen, the most potent antiprostaglandin NSAID, was used topically to study its efficacy against this constriction. In a prospective double-blind clinical study of 50 brown eyes undergoing planned E.C.C.E., the pupils were dilated with 10% phenylephrine and 2% homatropine/1% tropicamide. 25 eyes received 0.03% flurbiprofen-Na+ eye drops half-hourly, starting two hours before surgery. The maintained intraoperative mydriasis in the two groups before anterior chamber entry (stage I) vs at the end of complete cortex wash (stage III) was: in the control group, 8.46 +/- 0.48 mm (stage I) vs 3.56 +/- 0.43 mm (stage III) (highly statistically significant); in the flurbiprofen group, 8.60 +/- 0.48 mm (stage I) vs 8.01 +/- 0.63 mm (stage III) (not significant). The pupillary area available for surgical manipulation in the control group decreased significantly from 56.18 mm2 in stage I to 9.94 mm2 in stage III, while in the flurbiprofen group it changed insignificantly from 58.05 mm2 in stage I to 50.24 mm2 in stage III. Postoperatively, after-cataract was observed in 44% of eyes in the control group as compared with only 8% of eyes in the flurbiprofen group. Thus the maintained intraoperative mydriasis in the flurbiprofen group led to better E.C.C.E., which is a mandatory prerequisite to the preferred present-day posterior chamber IOL implantation.
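The reported pupillary areas follow directly from the circular-area formula A = πd²/4 applied to the mean pupil diameters; a quick arithmetic check against the abstract's figures:

```python
# Verify that the stated pupillary areas are pi * d^2 / 4 of the
# stated mean diameters (values taken from the abstract).
import math

def pupil_area_mm2(diameter_mm):
    return math.pi * diameter_mm ** 2 / 4.0

print(round(pupil_area_mm2(8.46), 2))  # control, stage I   (~56.2 mm^2)
print(round(pupil_area_mm2(3.56), 2))  # control, stage III (~9.95 mm^2)
print(round(pupil_area_mm2(8.60), 2))  # flurbiprofen, stage I (~58.1 mm^2)
```

The small discrepancies against the published 56.18, 9.94 and 58.05 mm² are consistent with rounding of the mean diameters.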

  17. Thermodynamic properties of amphiphilic antidepressant drug citalopram HBr

    International Nuclear Information System (INIS)

    Usman, M.; Khan, A.

    2010-01-01

    The association characteristics of the antidepressant drug citalopram hydrobromide in water have been examined and its thermodynamic parameters calculated using tensiometry and conductometry. The critical micelle concentration (cmc) was determined by surface tension measurement at 30 deg. C, and surface activity was studied by measuring surface parameters, i.e. surface pressure (π), surface excess concentration, area per drug molecule and the standard Gibbs free energy of adsorption (ΔG°ads). The electrical conductivity was measured as a function of concentration at various temperatures and the cmc was calculated over the temperature range 20-50 deg. C. Thermodynamic parameters, i.e. the standard free energy of micellization (ΔG°m), standard enthalpy of micellization (ΔH°m) and standard entropy of micellization (ΔS°m), were calculated from the cmc values using the closed association model. (author)
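Under the closed association (pseudo-phase) model for an ionic amphiphile, a commonly used expression for the standard free energy of micellization is ΔG°m = (1 + β)RT ln x_cmc, where x_cmc is the cmc expressed as a mole fraction and β is the degree of counterion binding. A minimal sketch with hypothetical inputs (not the paper's measured values):

```python
# Pseudo-phase estimate of dG_mic for an ionic amphiphile.
# beta (counterion binding) and the cmc below are assumed for illustration.
import math

R = 8.314              # gas constant, J mol^-1 K^-1
WATER_MOLARITY = 55.5  # mol L^-1, converts a molar cmc to mole fraction

def dG_mic_kJ(cmc_mol_L, T_K, beta):
    x_cmc = cmc_mol_L / WATER_MOLARITY
    return (1.0 + beta) * R * T_K * math.log(x_cmc) / 1000.0  # kJ/mol

print(round(dG_mic_kJ(0.06, 303.0, 0.6), 1))  # hypothetical drug, 30 deg. C
```

Because ln x_cmc is negative, ΔG°m comes out negative, i.e. micellization is spontaneous above the cmc.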

  18. Disposition and pharmacokinetics in rats of McN-5707, a potential antidepressant drug

    International Nuclear Information System (INIS)

    Ng, K.T.; Holland, M.L.; Hills, J.F.; Uetz, J.A.

    1986-01-01

    A single 80 mg/kg oral solution dose of McN-5707-14C·HBr [trans-6-(2-chlorophenyl)-1,2,3,5,6,10b-hexahydropyrrolo[2,1-a]isoquinoline hydrobromide (1:1)] was administered to 40 Wistar rats. Total 14C concentrations in plasma were high (> 4.5 μg equiv/mL) for at least 24 hours after dosing. Unchanged McN-5707 represented 14C concentrations in plasma at 45 min and < 1% at 24 hours after dosing. In the 8 days following dose administration, 23% of the dose was excreted in urine and 70% of the dose was excreted in feces. Analysis (HPLC and TLC) of Glusulase-treated urine, plasma and fecal samples revealed the presence of multiple metabolites of McN-5707. Unchanged McN-5707 was found only in fecal extracts (2-7% of the dose). Single solution doses of McN-5707·HBr were administered p.o. (20 mg/kg) and i.v. (4 mg/kg) to 39 Wistar rats. Plasma samples were analyzed for McN-5707 using a capillary GC assay. These studies indicated that McN-5707 was well absorbed and extensively metabolized in rats following oral doses.

  19. Surface Plasmon Resonance Based Biosensors for Exploring the Influence of Alkaloids on Aggregation of Amyloid-β Peptide

    Directory of Open Access Journals (Sweden)

    Hanna Radecka

    2011-04-01

    The main objective of the presented study was the development of a simple analytical tool for exploring the influence of naturally occurring compounds on the aggregation of amyloid-β peptide (Aβ40) in order to find potential anti-neurodegenerative drugs. The gold discs used for surface plasmon resonance (SPR) measurements were modified with thioaliphatic acid. The surface functionalized with carboxylic groups was used for covalent attachment of the Aβ40 probe via amide bond formation in the presence of EDC/NHS. The modified SPR gold discs were used for exploring the Aβ40 aggregation process in the presence of selected alkaloids: arecoline hydrobromide, pseudopelletierine hydrochloride, trigonelline hydrochloride and α-lobeline hydrochloride. The obtained results were discussed together with other parameters which govern the phenomenon studied, such as lipophilicity/hydrophilicity and Aβ40-alkaloid association constants.

  20. Effects of gamma radiation on the concentration of 5-hydroxy-L-tryptophan and 5-hydroxytryptamine in presence of radioprotector in Sprague Dawley rats

    International Nuclear Information System (INIS)

    Upadhyay, S.N.; Sharma, Ashok; Nagpal, K.K.; Saini, S.K.

    1997-01-01

    The results of the variation of 5-hydroxy-L-tryptophan (HT) and 5-hydroxytryptamine (5-HT) in different tissues of control and gamma-irradiated Sprague Dawley rats, with and without a combination of HT and the radioprotector β-aminoethylisothiuronium bromide hydrobromide (AET), i.e. HT+AET, have been studied. The retention of HT in the tissues studied decreased after a lethal dose (10.5 Gy), but for 5-HT no such trend was observed after incorporation of HT+AET. A slight tendency of both metabolites to return to control levels was also observed in the Sprague Dawley rats. In urine, the concentration of HT was lower than that of 5-HT at the lethal dose (10.5 Gy). After incorporation of HT+AET, the turnover rates of HT and 5-HT were found to be maximal when injected through the intraperitoneal route. (author)

  1. Methadone radioimmunoassay: two simple methods

    International Nuclear Information System (INIS)

    Robinson, K.; Smith, R.N.

    1983-01-01

    Two simple and economical radioimmunoassays for methadone in blood or urine are described. Haemolysis, decomposition, common anticoagulants and sodium fluoride do not affect the results. One assay uses commercially available [1-3H](−)-methadone hydrobromide as the label, while the other uses a radioiodinated conjugate of 4-dimethylamino-2,2-diphenylpentanoic acid and L-tyrosine methyl ester. A commercially available antiserum is used in both assays. Normethadone and α-methadol cross-react to a small extent with the antiserum, while methadone metabolites, dextropropoxyphene, dipipanone and phenadoxone have negligible cross-reactivities. The 'cut-offs' of the two assays as described are 30 and 33 ng/ml for blood, and 24 and 21 ng/ml for urine. The assay using the radioiodinated conjugate can be made more sensitive if required by increasing the specific activity of the label. (author)

  2. Role of the autonomic nervous system and baroreflex in stress-evoked cardiovascular responses in rats.

    Science.gov (United States)

    Dos Reis, Daniel Gustavo; Fortaleza, Eduardo Albino Trindade; Tavares, Rodrigo Fiacadori; Corrêa, Fernando Morgan Aguiar

    2014-07-01

    Restraint stress (RS) is an experimental model to study stress-related cardiovascular responses, characterized by sustained pressor and tachycardiac responses. We used pharmacologic and surgical procedures to investigate the role played by sympathetic nervous system (SNS) and parasympathetic nervous system (PSNS) in the mediation of stress-evoked cardiovascular responses. Ganglionic blockade with pentolinium significantly reduced RS-evoked pressor and tachycardiac responses. Intravenous treatment with homatropine methyl bromide did not affect the pressor response but increased tachycardia. Pretreatment with prazosin reduced the pressor and increased the tachycardiac response. Pretreatment with atenolol did not affect the pressor response but reduced tachycardia. The combined treatment with atenolol and prazosin reduced both pressor and tachycardiac responses. Adrenal demedullation reduced the pressor response without affecting tachycardia. Sinoaortic denervation increased pressor and tachycardiac responses. The results indicate that: (1) the RS-evoked cardiovascular response is mediated by the autonomic nervous system without an important involvement of humoral factors; (2) hypertension results primarily from sympathovascular and sympathoadrenal activation, without a significant involvement of the cardiac sympathetic component (CSNS); (3) the abrupt initial peak in the hypertensive response to restraint is sympathovascular-mediated, whereas the less intense but sustained hypertensive response observed throughout the remaining restraint session is mainly mediated by sympathoadrenal activation and epinephrine release; (4) tachycardia results from CSNS activation, and not from PSNS inhibition; (5) RS evokes simultaneous CSNS and PSNS activation, and heart rate changes are a vector of both influences; (6) the baroreflex is functional during restraint, and modulates both the vascular and cardiac responses to restraint.

  3. Thermodynamics of micellization of cationic surfactants based on O-alkyl and O-perfluoro-N,N'-diisopropylisourea: Effect of the counter ion nature

    International Nuclear Information System (INIS)

    Lehanine, Zineb; Badache, Leila

    2011-01-01

    Graphical abstract: The pseudo-phase thermodynamic model was applied to study the micellization of O-dodecyl- and O-tridecafluorooctyl-N,N'-diisopropylisourea hydrohalogenides. The free energy of micellization is weakly temperature dependent and decreases slightly with the size of the counterion, indicating more favored micellization. Enthalpies and entropies of micellization decrease strongly with increasing temperature. The enthalpy-entropy compensation phenomenon was observed for all the surfactants studied. - Abstract: A thermodynamic study of the micellization process of two series of cationic surfactants, viz. the O-dodecyl-N,N'-diisopropylisourea hydrochloride, hydrobromide and hydroiodide and the O-tridecafluorooctyl-N,N'-diisopropylisourea hydrochloride and hydrobromide, is reported. In order to explain the effect of the counterion nature on the micellization process, the thermodynamic parameters of micellization, including the standard free energy (ΔG°mic), enthalpy (ΔH°mic), entropy (ΔS°mic) and heat capacity (ΔC°p,mic), are discussed. These parameters are evaluated from the variation of the critical micelle concentration (CMC) with temperature by fitting these values to expressions derived on the basis of a thermodynamic model of micellization. The heat capacities of micellization (ΔC°p,mic) were determined from the temperature dependence of ΔH°mic. For both surfactant series, the free energy of
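The route from CMC(T) data to ΔH°mic and ΔC°p,mic can be sketched with a van 't Hoff treatment under the pseudo-phase model, ΔH°mic = −(1 + β)RT² d ln(x_cmc)/dT, with ΔC°p,mic taken from the slope of ΔH°mic versus T. The cmc values and β below are hypothetical, chosen only to reproduce the qualitative trends described in the abstract (enthalpy decreasing with temperature, negative heat capacity):

```python
# van 't Hoff extraction of dH_mic and dCp_mic from cmc(T).
# x holds mole-fraction cmc values (hypothetical, U-shaped in T).
import math

R = 8.314  # J mol^-1 K^-1

def dln_dT(T, x, i):
    """Central finite-difference slope of ln(x_cmc) at index i."""
    return (math.log(x[i + 1]) - math.log(x[i - 1])) / (T[i + 1] - T[i - 1])

def dH_mic(T, x, i, beta=0.6):
    """Enthalpy of micellization (J/mol) at temperature T[i]."""
    return -(1.0 + beta) * R * T[i] ** 2 * dln_dT(T, x, i)

T = [293.0, 303.0, 313.0, 323.0]
x = [1.2e-3, 1.1e-3, 1.15e-3, 1.3e-3]
H1 = dH_mic(T, x, 1)
H2 = dH_mic(T, x, 2)
dCp = (H2 - H1) / (T[2] - T[1])  # J mol^-1 K^-1
print(round(H1), round(H2), round(dCp, 1))
```

With a cmc minimum in this temperature range, ΔH°mic changes sign from positive to negative as T rises, and ΔC°p,mic is strongly negative, the usual signature of hydrophobic dehydration on micelle formation.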

  4. Serotonergic modulation of receptor occupancy in rats treated with L-DOPA after unilateral 6-OHDA lesioning

    DEFF Research Database (Denmark)

    Nahimi, Adjmal; Høltzermann, Mette; Landau, Anne M.

    2012-01-01

    Recent studies suggest that l-3,4-dihydroxyphenylalanine (L-DOPA)-induced dyskinesia (LID), a severe complication of conventional L-DOPA therapy of Parkinson's disease, may be caused by dopamine (DA) release originating in serotonergic neurons. To evaluate the in vivo effect of a 5-HT(1A) agonist [(±)-8-hydroxy-2-(dipropylamino)tetralin hydrobromide, 8-OH-DPAT] on the L-DOPA-induced increase in extracellular DA and decrease in [11C]raclopride binding in an animal model of advanced Parkinson's disease and LID, we measured extracellular DA in response to L-DOPA or a combination of L-DOPA and the 5-HT(1A) agonist 8-OH-DPAT with microdialysis, and determined [11C]raclopride binding to DA receptors with micro-positron emission tomography as the surrogate marker of DA release. Rats with unilateral 6-hydroxydopamine lesions had micro-positron emission tomography scans with [11C

  5. Urinary excretion of creatine and creatinine in gamma irradiated rats

    Energy Technology Data Exchange (ETDEWEB)

    Basu, S K; Srinivasan, M N; Chuttani, K; Bhatnagar, A; Ghose, A

    1985-06-01

    Dose-response relationships of creatine and creatinine excretion and their ratio in 24 hr urine samples have been studied on each individual day up to 4 days after 1-7 Gy whole body gamma irradiation of rats. Creatine excretion peaks on the 2nd day, while creatinine excretion peaks on the first day, and a plateau is maintained up to the 4th day in each case. A good dose-response correlation is maintained for creatine and creatinine levels up to the 4th day and for the creatine/creatinine ratio up to the 3rd day. Separate dose-response curves are needed for each individual day when using these parameters for biological dosimetry purposes. Administration of the radioprotectors, viz. a combination of 5-hydroxytryptophan (HT) and 2-aminoethylisothiuronium bromide hydrobromide (AET), HT alone, or the optimum radioprotective dose of AET, before 5 Gy whole body gamma irradiation was not of help in reducing creatinuria. (author).
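Biological dosimetry with such day-specific dose-response curves amounts to inverse prediction: fit excretion against known doses, then invert the fit for an animal whose dose is unknown. A minimal sketch with hypothetical excretion values (not the paper's data):

```python
# Day-specific linear dose-response calibration and inverse prediction
# of absorbed dose from 24 h creatine excretion (hypothetical numbers).
def linear_fit(x, y):
    """Ordinary least-squares slope and intercept."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = sum((a - mx) * (b - my) for a, b in zip(x, y)) / \
            sum((a - mx) ** 2 for a in x)
    return slope, my - slope * mx

doses_Gy = [1.0, 3.0, 5.0, 7.0]
creatine_mg = [4.0, 7.9, 12.1, 15.8]   # hypothetical day-2 excretion
m, c = linear_fit(doses_Gy, creatine_mg)

def estimate_dose(excretion_mg):
    """Invert the calibration: dose = (response - intercept) / slope."""
    return (excretion_mg - c) / m

print(round(estimate_dose(10.0), 2))   # dose estimate for 10 mg/24 h
```

Because the abstract stresses that each post-irradiation day needs its own curve, a real implementation would carry one (slope, intercept) pair per sampling day.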

  6. Factors influencing radiation-induced impairment of rat liver mitochondrial oxidative phosphorylation

    International Nuclear Information System (INIS)

    Alexander, K.C.; Aiyar, A.S.; Sreenivasan, A.

    1975-01-01

    The influence of some experimental conditions on the radiation-induced impairment of oxidative phosphorylation in rat liver mitochondria has been studied. Shielding of the liver during whole body irradiation of the animal does not significantly alter the decreased efficiency of phosphorylation. There exists a great disparity in the in vivo and in vitro radiation doses required for the manifestation of damage to liver mitochondria. While these observations point to the abscopal nature of the radiation effects, direct involvement of the adrenals has been ruled out by studies with adrenalectomised rats. Prior administration of the well known radio-protective agents, serotonin or 2-aminoethyl isothiouronium bromide hydrobromide, is effective in preventing the derangement of mitochondrial function following radioexposure. The hypocholesterolemic drug ethyl-α-p-chlorophenoxy isobutyrate, which is known to influence hepatic mitochondrial turnover, does not afford any significant protection against either mitochondrial damage or the mortality of the animals due to whole body irradiation. (author)

  7. Effect of age on upregulation of the cardiac adrenergic beta receptors

    International Nuclear Information System (INIS)

    Tumer, N.; Houck, W.T.; Roberts, J.

    1990-01-01

    Radioligand binding studies were performed to determine whether upregulation of postjunctional beta receptors occurs in sympathectomized hearts of aged animals. Fischer 344 rats 6, 12, and 24 months of age (n = 10) were used in these experiments. To produce sympathectomy, rats were injected with 6-hydroxydopamine hydrobromide (6-OHDA; 2 x 50 mg/kg iv) on days 1 and 8; the animals were decapitated on day 15. The depletion of norepinephrine in the heart was about 86% in each age group. 125I-Iodopindolol (IPIN), a beta adrenergic receptor antagonist, was employed to determine the affinity and total number of beta adrenergic receptors in the ventricles of the rat heart. The maximal number of binding sites (Bmax) was significantly elevated by 37%, 48%, and 50% in hearts from sympathectomized 6-, 12-, and 24-month-old rats, respectively. These results indicate that beta receptor mechanisms in older hearts can respond to procedures that cause upregulation of the beta adrenergic receptors.
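Bmax and Kd in saturation binding experiments of this kind are classically estimated by Scatchard linearization, B/F = Bmax/Kd − B/Kd, so a straight-line fit of bound/free against bound gives slope −1/Kd and intercept Bmax/Kd. A sketch on synthetic, noise-free data (not the study's measurements):

```python
# Scatchard analysis: recover Bmax and Kd from a saturation curve.
# Synthetic data generated from Bmax = 60 (fmol/mg) and Kd = 0.05 nM.
def scatchard(free_nM, bound):
    ratio = [b / f for b, f in zip(bound, free_nM)]
    n = len(bound)
    mx, my = sum(bound) / n, sum(ratio) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(bound, ratio))
    sxx = sum((x - mx) ** 2 for x in bound)
    slope = sxy / sxx
    intercept = my - slope * mx
    kd = -1.0 / slope          # slope = -1/Kd
    return intercept * kd, kd  # intercept = Bmax/Kd  ->  (Bmax, Kd)

BMAX, KD = 60.0, 0.05
free = [0.01, 0.025, 0.05, 0.1, 0.2, 0.4]          # nM free ligand
bound = [BMAX * f / (KD + f) for f in free]         # one-site hyperbola
print(scatchard(free, bound))
```

On real, noisy counts a nonlinear fit of the one-site hyperbola is preferred, but the Scatchard form makes the Bmax/Kd interpretation transparent.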

  8. Urinary excretion of creatine and creatinine in gamma irradiated rats

    International Nuclear Information System (INIS)

    Basu, S.K.; Srinivasan, M.N.; Chuttani, K.; Bhatnagar, A.; Ghose, A.

    1985-01-01

    Dose-response relationships of creatine and creatinine excretion and their ratio in 24 hr urine samples have been studied on each individual day up to 4 days after 1-7 Gy whole body gamma irradiation of rats. Creatine excretion peaks on the 2nd day, while creatinine excretion peaks on the first day, and a plateau is maintained up to the 4th day in each case. A good dose-response correlation is maintained for creatine and creatinine levels up to the 4th day and for the creatine/creatinine ratio up to the 3rd day. Separate dose-response curves are needed for each individual day when using these parameters for biological dosimetry purposes. Administration of the radioprotectors, viz. a combination of 5-hydroxytryptophan (HT) and 2-aminoethylisothiuronium bromide hydrobromide (AET), HT alone, or the optimum radioprotective dose of AET, before 5 Gy whole body γ-irradiation was not of help in reducing creatinuria. (author)

  9. High-Content Screening in hPSC-Neural Progenitors Identifies Drug Candidates that Inhibit Zika Virus Infection in Fetal-like Organoids and Adult Brain.

    Science.gov (United States)

    Zhou, Ting; Tan, Lei; Cederquist, Gustav Y; Fan, Yujie; Hartley, Brigham J; Mukherjee, Suranjit; Tomishima, Mark; Brennand, Kristen J; Zhang, Qisheng; Schwartz, Robert E; Evans, Todd; Studer, Lorenz; Chen, Shuibing

    2017-08-03

    Zika virus (ZIKV) infects fetal and adult human brain and is associated with serious neurological complications. To date, no therapeutic treatment is available to treat ZIKV-infected patients. We performed a high-content chemical screen using human pluripotent stem cell-derived cortical neural progenitor cells (hNPCs) and found that hippeastrine hydrobromide (HH) and amodiaquine dihydrochloride dihydrate (AQ) can inhibit ZIKV infection in hNPCs. Further validation showed that HH also rescues ZIKV-induced growth and differentiation defects in hNPCs and human fetal-like forebrain organoids. Finally, HH and AQ inhibit ZIKV infection in adult mouse brain in vivo. Strikingly, HH suppresses viral propagation when administered to adult mice with active ZIKV infection, highlighting its therapeutic potential. Our approach highlights the power of stem cell-based screens and validation in human forebrain organoids and mouse models in identifying drug candidates for treating ZIKV infection and related neurological complications in fetal and adult patients. Copyright © 2017 Elsevier Inc. All rights reserved.
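A standard quality metric for hit selection in high-content screens of this kind, not quoted in the abstract but widely used, is the Z'-factor, which measures the separation between positive and negative controls. A sketch with hypothetical readouts:

```python
# Z'-factor for assay quality in high-throughput/high-content screening:
# Z' = 1 - 3*(sd_pos + sd_neg) / |mean_pos - mean_neg|.
# Assays with Z' > 0.5 are conventionally considered robust.
from statistics import mean, stdev

def z_prime(pos, neg):
    return 1.0 - 3.0 * (stdev(pos) + stdev(neg)) / abs(mean(pos) - mean(neg))

neg_ctrl = [92.0, 95.0, 90.0, 93.0]   # % infected, ZIKV only (hypothetical)
pos_ctrl = [6.0, 4.0, 8.0, 5.0]       # % infected, protective reference
print(round(z_prime(pos_ctrl, neg_ctrl), 3))
```

A compound well would then be called a hit (e.g. HH or AQ in the screen above) when its readout clears a threshold defined relative to the negative-control distribution.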

  10. Pharmacology of dextromethorphan: Relevance to dextromethorphan/quinidine (Nuedexta®) clinical use.

    Science.gov (United States)

    Taylor, Charles P; Traynelis, Stephen F; Siffert, Joao; Pope, Laura E; Matsumoto, Rae R

    2016-08-01

    Dextromethorphan (DM) has been used for more than 50 years as an over-the-counter antitussive. Studies have revealed a complex pharmacology of DM with mechanisms beyond blockade of N-methyl-d-aspartate (NMDA) receptors and inhibition of glutamate excitotoxicity, likely contributing to its pharmacological activity and clinical potential. DM is rapidly metabolized to dextrorphan, which has hampered the exploration of DM therapy separate from its metabolites. Coadministration of DM with a low dose of quinidine inhibits DM metabolism, yields greater bioavailability and enables more specific testing of the therapeutic properties of DM apart from its metabolites. The development of the drug combination DM hydrobromide and quinidine sulfate (DM/Q), with subsequent approval by the US Food and Drug Administration for pseudobulbar affect, led to renewed interest in understanding DM pharmacology. This review summarizes the interactions of DM with brain receptors and transporters and also considers its metabolic and pharmacokinetic properties. To assess the potential clinical relevance of these interactions, we provide an analysis comparing DM activity from in vitro functional assays with the estimated free drug DM concentrations in the brain following oral DM/Q administration. The findings suggest that DM/Q likely inhibits serotonin and norepinephrine reuptake and also blocks NMDA receptors with rapid kinetics. Use of DM/Q may also antagonize nicotinic acetylcholine receptors, particularly those composed of α3β4 subunits, and cause agonist activity at sigma-1 receptors. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
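The comparison the review describes, free brain drug concentration against in vitro potency, can be sketched with the simple equilibrium occupancy relation for a competitive ligand, occupancy = C/(C + Ki). Both the free DM concentration and the Ki values below are placeholders for illustration, not figures from the review:

```python
# Back-of-envelope fractional receptor occupancy from a free drug
# concentration and a binding affinity (both hypothetical here).
def occupancy(conc_nM, ki_nM):
    """Equilibrium fractional occupancy of a competitive ligand."""
    return conc_nM / (conc_nM + ki_nM)

free_brain_dm_nM = 100.0  # assumed free DM level in brain
for target, ki in [("target A", 40.0), ("target B", 200.0), ("target C", 1000.0)]:
    print(target, round(100.0 * occupancy(free_brain_dm_nM, ki), 1), "%")
```

The point of the exercise is qualitative: targets with Ki well below the free concentration approach saturation, while high-Ki sites are only fractionally engaged at clinical exposures.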

  11. Novel kinetic spectrophotometric method for estimation of certain biologically active phenolic sympathomimetic drugs in their bulk powders and different pharmaceutical formulations

    Science.gov (United States)

    Omar, Mahmoud A.; Badr El-Din, Khalid M.; Salem, Hesham; Abdelmageed, Osama H.

    2018-03-01

    A simple, selective and sensitive kinetic spectrophotometric method is described for the estimation of four phenolic sympathomimetic drugs, namely terbutaline sulfate, fenoterol hydrobromide, isoxsuprine hydrochloride and etilefrine hydrochloride. The method is based on the oxidation of the phenolic drugs with Folin-Ciocalteu reagent in the presence of sodium carbonate. The rate of color development at 747-760 nm was measured spectrophotometrically. The experimental parameters controlling the color development were fully studied and optimized, and a reaction mechanism for the color development was proposed. Calibration graphs for both the initial rate and fixed time methods were constructed; linear correlations were found over the general concentration ranges of 3.65 × 10⁻⁶-2.19 × 10⁻⁵ mol L⁻¹ and 2.0-24.0 μg mL⁻¹, with correlation coefficients in the ranges 0.9992-0.9999 and 0.9991-0.9998, respectively. The limits of detection and quantitation for the initial rate and fixed time methods were found to lie in the general ranges 0.109-0.273 and 0.363-0.910, and 0.210-0.483 and 0.700-1.611 μg mL⁻¹, respectively. The developed method was validated according to ICH and USP 30-NF 25 guidelines. The suggested method was successfully applied to the estimation of these drugs in their commercial pharmaceutical formulations, and the recovery percentages obtained ranged from 97.63% ± 1.37 to 100.17% ± 0.95 and from 97.29% ± 0.74 to 100.14% ± 0.81 for the initial rate and fixed time methods, respectively. The data obtained from the analysis of dosage forms were compared with those obtained by reported methods. Statistical analysis of these results indicated no significant variation in accuracy and precision between the proposed and reported methods.
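The ICH validation the abstract cites defines the detection and quantitation limits from the blank (or intercept) response noise σ and the calibration slope S as LOD = 3.3σ/S and LOQ = 10σ/S. A minimal sketch with hypothetical σ and S values:

```python
# ICH-style limit calculations; sigma and slope below are hypothetical.
def lod(sigma, slope):
    """Limit of detection: 3.3 * sigma / S."""
    return 3.3 * sigma / slope

def loq(sigma, slope):
    """Limit of quantitation: 10 * sigma / S."""
    return 10.0 * sigma / slope

sigma_blank = 0.004  # absorbance units, SD of the blank response
slope = 0.121        # absorbance per (ug/mL), calibration slope
print(round(lod(sigma_blank, slope), 3), round(loq(sigma_blank, slope), 3))
```

Note the fixed LOQ/LOD ratio of 10/3.3 ≈ 3.0 that follows from the definitions, which is roughly the spacing seen between the reported LOD and LOQ ranges.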

  12. Radiation protection of male fertility in mouse and rat by a combination of 5-hydroxyl-L-tryptophan and a thiol compound (AET)

    International Nuclear Information System (INIS)

    George, S.; Chuttani, K.; Basu, S.K.

    1992-01-01

    Sperm abnormalities and the fall in total sperm count following different doses (4 Gy, 5 Gy and 6 Gy) of whole body gamma irradiation (WBGR) were studied in adult male Swiss strain A mice. The protecting ability of a combination of 5-hydroxy-L-tryptophan (5-HTP, 100 mg/kg) and 2-aminoethylisothiuronium bromide hydrobromide (AET, 20 mg/kg) was also investigated. Pretreatment with the 5-HTP + AET formulation i.p., 30 min before irradiation, significantly modified the fall in sperm counts. Exposures to 4 Gy, 5 Gy and 6 Gy WBGR caused a marked increase in sperm abnormalities, which could be significantly reduced by pretreatment with 5-HTP + AET. WBGR with 4 Gy, 5 Gy and 6 Gy produced a short period of sterility associated with oligospermia, but these abnormalities were corrected by pretreatment with 5-HTP + AET. This finding was supported by breeding experiments in pretreated adult male Sprague-Dawley rats, which showed delivery of normal offspring in the drug-protected irradiated groups in contrast to the irradiated controls. (orig.)

  13. Determination and pharmacokinetic studies of arecoline in dog plasma by liquid chromatography-tandem mass spectrometry.

    Science.gov (United States)

    Li, Bing; Zhou, Xu-Zheng; Li, Jian-Yong; Yang, Ya-Jun; Niu, Jian-Rong; Wei, Xiao-Juan; Liu, Xi-Wang; Li, Jin-Shan; Zhang, Ji-Yu

    2014-10-15

    A rapid and sensitive high-performance liquid chromatography-tandem mass spectrometry (LC-MS/MS) method was developed and validated for the determination of arecoline concentration in dog plasma. Plasma samples were prepared by protein precipitation using n-hexane (containing 1% isoamyl alcohol) with β-pinene as an internal standard. Chromatographic separation was achieved on an Agilent C18 column (4.6 × 75 mm, 3.5 μm) using methanol:5 mM ammonium acetate as the mobile phase with isocratic elution. Mass detection was carried out using positive electrospray ionization in multiple reaction monitoring mode. The calibration curve for arecoline was linear over a concentration range of 2-500 ng/mL. The intra-day and inter-day accuracy and precision were within the acceptable limits of ±10% at all concentrations. In summary, the LC-MS/MS method described herein was fully validated and successfully applied to the pharmacokinetic study of arecoline hydrobromide tablets in dogs after oral administration. Copyright © 2014. Published by Elsevier B.V.

  14. Using anti-muscarinic drugs in the management of death rattle: evidence-based guidelines for palliative care.

    Science.gov (United States)

    Bennett, Mike; Lucas, Viv; Brennan, Mary; Hughes, Andrew; O'Donnell, Valerie; Wee, Bee

    2002-09-01

    The management of 'death rattle' was reviewed by a task group on behalf of the Association for Palliative Medicine's Science Committee. Evidence was searched for the effectiveness of various anti-muscarinic drugs in drying oropharyngeal and bronchial secretions in dying patients. Clinical guidelines were constructed based on evidence from volunteer and clinical studies. Death rattle occurs in half of all dying patients and some response occurs in around 80% of treated patients. Clinical studies demonstrate that subcutaneous hyoscine hydrobromide 400 microg is more effective at improving symptoms at 30 min than glycopyrronium 200 microg by the same route. Volunteer studies demonstrate that intramuscular glycopyrronium 400 microg is as effective in drying secretions at 30 min as a dose of 200 microg given intravenously. Duration of response is shortest for hyoscine butylbromide (1 h) and longest for glycopyrronium (more than 6 h). There is insufficient evidence to support the use of one drug over another in a continuous infusion and prescribers should base decisions on different characteristics of each anti-muscarinic drug.

  15. Echinococcus granulosus infections of dogs in the Durazno region of Uruguay.

    Science.gov (United States)

    Parada, L; Cabrera, P; Burges, C; Acuña, A; Barcelona, C; Laurenson, M K; Gulland, F M; Agulla, J; Parietti, S; Paolillo, E

    1995-04-15

    The prevalence and distribution of Echinococcus granulosus in domestic dogs was examined in three dog populations in the Durazno region of Uruguay. The prevalence was 19.7 per cent in 704 dogs successfully purged with arecoline hydrobromide. Higher prevalences were detected in dogs from the rural area (30.0 per cent) and the village of La Paloma (25.9 per cent) than in the town of Sarandi del Yi (7.9 per cent). The frequency distribution of E granulosus was overdispersed (k, the negative binomial parameter = 0.08), with only a few animals harbouring heavy infections. The results of a questionnaire showed that the prevalence was greatest in male dogs, in dogs that were not kennelled, in dogs that had access to fields and in dogs that were not dosed with praziquantel. Dogs that were given raw sheep offal by their owners were no more likely to be parasitised than other dogs; this may reflect the inaccuracy of the owners' replies, or that the dogs were being infected outside their home.
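What the overdispersion parameter k = 0.08 implies can be made concrete through the negative binomial mean-variance relation, variance = m + m²/k: for any moderate mean burden, the variance is enormous, i.e. most dogs carry few worms while a handful carry very heavy infections. The mean below is hypothetical; only k comes from the abstract.

```python
# Negative binomial mean-variance relation for an aggregated
# parasite distribution: var = m + m**2 / k.
def negbin_variance(mean_burden, k):
    return mean_burden + mean_burden ** 2 / k

m = 5.0                                # hypothetical mean worms per dog
print(negbin_variance(m, 0.08))        # k = 0.08 -> highly overdispersed
print(negbin_variance(m, 1e9))         # k -> infinity approaches Poisson
```

As k grows large the m²/k term vanishes and the distribution collapses toward Poisson (variance ≈ mean), which is the benchmark against which k = 0.08 signals strong aggregation.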

  16. The first proton sponge-based amino acids: synthesis, acid-base properties and some reactivity.

    Science.gov (United States)

    Ozeryanskii, Valery A; Gorbacheva, Anastasia Yu; Pozharskii, Alexander F; Vlasenko, Marina P; Tereznikov, Alexander Yu; Chernov'yants, Margarita S

    2015-08-21

    The first hybrid base constructed from 1,8-bis(dimethylamino)naphthalene (proton sponge or DMAN) and glycine, N-methyl-N-(8-dimethylamino-1-naphthyl)aminoacetic acid, was synthesised in high yield, and its hydrobromide was structurally characterised and used to determine the acid-base properties via potentiometric titration. It was found that the basic strength of the DMAN-glycine base (pKa = 11.57, H2O) is on the level of amidine amino acids like arginine and creatine, and its structure, zwitterionic vs. neutral, shows a strong preference for the zwitterionic form on the basis of spectroscopic (IR, NMR, mass) and theoretical (DFT) approaches. Unlike glycine, the DMAN-glycine zwitterion is N-chiral and is hydrolytically cleaved with the loss of glycolic acid on heating in DMSO. This reaction, together with the mild decarboxylative conversion of proton sponge-based amino acids into 2,3-dihydroperimidinium salts under air-oxygen, was monitored with the help of the DMAN-alanine amino acid. The newly devised amino acids are unique as they combine fluorescence, strongly basic and redox-active properties.
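With pKa = 11.57 from the titration, the standard Henderson-Hasselbalch relation gives the protonation state of the basic site at any pH, which illustrates why the protonated (zwitterionic) form dominates near neutral pH:

```python
# Fraction of the basic nitrogen that is protonated at a given pH,
# using the measured pKa = 11.57 of the DMAN-glycine base.
def fraction_protonated(pH, pKa=11.57):
    return 1.0 / (1.0 + 10.0 ** (pH - pKa))

print(round(fraction_protonated(7.0), 6))    # near pH 7: essentially fully protonated
print(round(fraction_protonated(11.57), 2))  # at pH = pKa: half protonated
```

More than four pH units below the pKa, deprotonation is negligible (< 0.01%), consistent with the spectroscopic preference for the zwitterion.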

  17. Animal model of neuropathic tachycardia syndrome

    Science.gov (United States)

    Carson, R. P.; Appalsamy, M.; Diedrich, A.; Davis, T. L.; Robertson, D.

    2001-01-01

    Clinically relevant autonomic dysfunction can result from either complete or partial loss of sympathetic outflow to effector organs. Reported animal models of autonomic neuropathy have aimed to achieve complete lesions of sympathetic nerves, but incomplete lesions might be more relevant to certain clinical entities. We hypothesized that loss of sympathetic innervation would result in a predicted decrease in arterial pressure and a compensatory increase in heart rate. Increased heart rate due to loss of sympathetic innervation is seemingly paradoxical, but it provides a mechanistic explanation for clinical autonomic syndromes such as neuropathic postural tachycardia syndrome. Partially dysautonomic animals were generated by selectively lesioning postganglionic sympathetic neurons with 150 mg/kg 6-hydroxydopamine hydrobromide in male Sprague-Dawley rats. Blood pressure and heart rate were monitored using radiotelemetry. Systolic blood pressure decreased within hours postlesion (Δ > 20 mm Hg). Within 4 days postlesion, heart rate rose and remained elevated above control levels. The severity of the lesion was determined functionally and pharmacologically by spectral analysis and responsiveness to tyramine. Low-frequency spectral power of systolic blood pressure was reduced postlesion and correlated with the diminished tyramine responsiveness (r=0.9572, P=0.0053). The tachycardia was abolished by treatment with the beta-antagonist propranolol, demonstrating that it was mediated by catecholamines acting on cardiac beta-receptors. Partial lesions of the autonomic nervous system have been hypothesized to underlie many disorders, including neuropathic postural tachycardia syndrome. This animal model may help us better understand the pathophysiology of autonomic dysfunction and lead to development of therapeutic interventions.
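
The lesion-severity check quoted above (r = 0.9572 between low-frequency blood-pressure power and tyramine responsiveness) is an ordinary Pearson coefficient, which needs no statistics library. A small sketch with invented illustrative numbers (not the study's data):

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient, computed from raw sums."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)

# Hypothetical per-rat values: LF power of systolic BP vs. pressor
# response to tyramine (mmHg); both axes are illustrative only.
lf_power = [0.8, 1.9, 2.6, 3.8, 5.1, 6.0]
tyramine = [4.0, 9.0, 14.0, 17.0, 24.0, 27.0]
print(f"r = {pearson_r(lf_power, tyramine):.3f}")
```

A strong positive r here would mean, as in the study, that rats retaining more sympathetically driven low-frequency variability also respond more to the indirect sympathomimetic tyramine.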

  18. Effects of SKF-83566 and haloperidol on performance on progressive ratio schedules maintained by sucrose and corn oil reinforcement: quantitative analysis using a new model derived from the Mathematical Principles of Reinforcement (MPR).

    Science.gov (United States)

    Olarte-Sánchez, C M; Valencia-Torres, L; Cassaday, H J; Bradshaw, C M; Szabadi, E

    2013-12-01

    Mathematical models can assist the interpretation of the effects of interventions on schedule-controlled behaviour and help to differentiate between processes that may be confounded in traditional performance measures such as response rate and the breakpoint in progressive ratio (PR) schedules. The effects of a D1-like dopamine receptor antagonist, 8-bromo-2,3,4,5-tetrahydro-3-methyl-5-phenyl-1H-3-benzazepin-7-ol hydrobromide (SKF-83566), and a D2-like receptor antagonist, haloperidol, on rats' performance on PR schedules maintained by sucrose and corn oil reinforcers were assessed using a new model derived from Killeen's (Behav Brain Sci 17:105-172, 1994) Mathematical Principles of Reinforcement. Separate groups of rats were trained under a PR schedule using sucrose or corn oil reinforcers. SKF-83566 (0.015 and 0.03 mg kg⁻¹) and haloperidol (0.05 and 0.1 mg kg⁻¹) were administered intraperitoneally (five administrations of each treatment). Running and overall response rates in successive ratios were analysed using the new model, and estimates of the model's parameters were compared between treatments. Haloperidol reduced a (the parameter expressing incentive value) in the case of both reinforcers, but did not affect the parameters related to response time and post-reinforcement pausing. SKF-83566 reduced a and k (the parameter expressing sensitivity of post-reinforcement pausing to the prior inter-reinforcement interval) in the case of sucrose, but did not affect any of the parameters in the case of corn oil. The results are consistent with the hypothesis that blockade of both D1-like and D2-like receptors reduces the incentive value of sucrose, whereas the incentive value of corn oil is more sensitive to blockade of D2-like than D1-like receptors.

  19. Effects of a compound from the group of substituted thiadiazines with hypothermia inducing properties on brain metabolism in rats, a study in vivo and in vitro.

    Directory of Open Access Journals (Sweden)

    O B Shevelev

    The aim of the present study was to examine how administration of a compound of the 1,3,4-thiadiazine class, 2-morpholino-5-phenyl-6H-1,3,4-thiadiazine hydrobromide (L-17), with hypothermia-inducing properties affects brain metabolism. The mechanism by which L-17 induces hypothermia is unknown; it may involve hypothalamic central thermoregulation or act via inhibition of energy metabolism. We tested the hypothesis that L-17 may induce hypothermia by directly inhibiting energy metabolism. The in vivo study was carried out on adult Sprague-Dawley rats. Two doses of L-17 were administered (190 mg/kg and 760 mg/kg). Brain metabolites were analyzed in control and treated groups using magnetic resonance spectroscopy, along with blood flow rate measurements in the carotid arteries and body temperature measurements. Further in vitro studies on primary cultures from rat hippocampus were carried out to test mitochondrial function after L-17 pre-incubation (100 μM, 30 min). Analysis of brain metabolites showed no significant changes in the 190 mg/kg group, along with a significant reduction in body temperature by 1.5°C. However, administration of L-17 at the higher dose of 760 mg/kg provoked changes in brain metabolites indicative of neurotoxicity, as well as a reduction in carotid artery flow rate. In addition, a shift in the balance of excitatory and inhibitory neurotransmitters was observed. Pre-incubation of rat brain primary cell cultures with L-17 showed no significant changes in mitochondrial function. The results obtained in the study indicate that acute administration of L-17 at 190 mg/kg in rats induces mild hypothermia with no adverse effects on brain metabolism.

  20. Simultaneous determination of dextromethorphan, dextrorphan and doxylamine in human plasma by HPLC coupled to electrospray ionization tandem mass spectrometry: application to a pharmacokinetic study.

    Science.gov (United States)

    Donato, J L; Koizumi, F; Pereira, A S; Mendes, G D; De Nucci, G

    2012-06-15

    In the present study, a fast, sensitive and robust method to quantify dextromethorphan, dextrorphan and doxylamine in human plasma using deuterated internal standards (IS) is described. The analytes and the IS were extracted from plasma by liquid-liquid extraction (LLE) using diethyl ether/hexane (80/20, v/v). Extracted samples were analyzed by high performance liquid chromatography coupled to electrospray ionization tandem mass spectrometry (HPLC-ESI-MS/MS). Chromatographic separation was performed by pumping the mobile phase (acetonitrile/water/formic acid, 90/9/1, v/v/v) for 4.0 min at a flow rate of 1.5 mL min⁻¹ through a Phenomenex Gemini® C18, 5 μm analytical column (150 × 4.6 mm i.d.). The calibration curve was linear over the range 0.2 to 200 ng mL⁻¹ for dextromethorphan and doxylamine and 0.05 to 10 ng mL⁻¹ for dextrorphan. The intra-batch precision (%CV) and accuracy of the method ranged from 2.5 to 9.5% and 88.9 to 105.1%, respectively. The inter-batch precision (%CV) and accuracy ranged from 6.7 to 10.3% and 92.2 to 107.1%, respectively. The analytical procedure described herein was used to assess the pharmacokinetics of dextromethorphan, dextrorphan and doxylamine in healthy volunteers after a single oral dose of a formulation containing 30 mg of dextromethorphan hydrobromide and 12.5 mg of doxylamine succinate. The method has high sensitivity and specificity and allows the high-throughput analysis required for a pharmacokinetic study. Copyright © 2012 Elsevier B.V. All rights reserved.
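
A linear calibration of this kind is an ordinary least-squares fit of detector response against concentration, with unknown samples back-calculated from the fitted line. A hedged sketch of that workflow (the peak areas and the `back_calc` helper are invented for illustration, covering the dextromethorphan range 0.2-200 ng/mL):

```python
def fit_line(x, y):
    """Ordinary least-squares slope and intercept for a calibration curve."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((a - mx) * (b - my) for a, b in zip(x, y))
             / sum((a - mx) ** 2 for a in x))
    return slope, my - slope * mx

# Hypothetical calibration standards: concentration vs. detector response
conc = [0.2, 1, 5, 20, 80, 200]              # ng/mL
area = [41, 205, 1010, 4030, 16100, 40050]   # arbitrary detector counts

slope, intercept = fit_line(conc, area)

def back_calc(peak_area):
    """Invert the fitted line to read a sample concentration off its peak area."""
    return (peak_area - intercept) / slope

print(f"{back_calc(8000):.1f} ng/mL")
```

In validated assays the back-calculated standards themselves are compared with their nominal values to obtain the precision (%CV) and accuracy figures reported in the abstract.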

  1. Theory of Visual Attention (TVA) applied to mice in the 5-choice serial reaction time task.

    Science.gov (United States)

    Fitzpatrick, C M; Caballero-Puntiverio, M; Gether, U; Habekost, T; Bundesen, C; Vangkilde, S; Woldbye, D P D; Andreasen, J T; Petersen, A

    2017-03-01

    The 5-choice serial reaction time task (5-CSRTT) is widely used to measure rodent attentional functions. In humans, many attention studies in healthy and clinical populations have used testing based on Bundesen's Theory of Visual Attention (TVA) to estimate visual processing speeds and other parameters of attentional capacity. We aimed to bridge these research fields by modifying the 5-CSRTT's design and by mathematically modelling data to derive attentional parameters analogous to human TVA-based measures. C57BL/6 mice were tested in two 1-h sessions on consecutive days with a version of the 5-CSRTT where stimulus duration (SD) probe length was varied based on information from previous TVA studies. Thereafter, a scopolamine hydrobromide (HBr; 0.125 or 0.25 mg/kg) pharmacological challenge was undertaken, using a Latin square design. Mean score values were modelled using a new three-parameter version of TVA to obtain estimates of visual processing speeds, visual thresholds and motor response baselines in each mouse. The parameter estimates for each animal were reliable across sessions, showing that the data were stable enough to support analysis on an individual level. Scopolamine HBr dose-dependently reduced 5-CSRTT attentional performance while also increasing reward collection latency at the highest dose. Upon TVA modelling, scopolamine HBr significantly reduced visual processing speed at both doses, while having less pronounced effects on visual thresholds and motor response baselines. This study shows for the first time how 5-CSRTT performance in mice can be mathematically modelled to yield estimates of attentional capacity that are directly comparable to estimates from human studies.
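
A TVA-style analysis of 5-CSRTT data amounts to fitting, for each animal, a curve in which accuracy rises exponentially with stimulus duration at a processing-speed rate v once a visual threshold t0 is exceeded, above a motor/guessing baseline. The paper's exact three-parameter model is not reproduced here; the sketch below fits an assumed curve of that general form to synthetic data by brute-force grid search:

```python
import math

def tva_score(t, v, t0, base=0.1, top=1.0):
    """Assumed TVA-style curve: accuracy climbs exponentially at rate v
    once stimulus duration t exceeds the visual threshold t0, starting
    from a motor/guessing baseline `base`."""
    if t <= t0:
        return base
    return base + (top - base) * (1.0 - math.exp(-v * (t - t0)))

# Synthetic "mean score vs stimulus duration" data (illustrative only)
durations = [0.2, 0.4, 0.6, 1.0, 2.0, 4.0]   # seconds
true_v, true_t0 = 2.5, 0.15
scores = [tva_score(t, true_v, true_t0) for t in durations]

# Crude grid-search fit of processing speed v and threshold t0
best = min(
    ((v / 100, t0 / 1000) for v in range(50, 500) for t0 in range(0, 300, 5)),
    key=lambda p: sum((tva_score(t, *p) - s) ** 2
                      for t, s in zip(durations, scores)),
)
print(f"fitted v = {best[0]:.2f}/s, t0 = {best[1] * 1000:.0f} ms")
```

The point of such a fit, as in the study, is that a drug like scopolamine can lower v (processing speed) without much change to t0 or the baseline, a dissociation that raw accuracy scores conflate.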

  2. Effects of silk fibroin in murine dry eye

    Science.gov (United States)

    Kim, Chae Eun; Lee, Ji Hyun; Yeon, Yeung Kyu; Park, Chan Hum; Yang, Jaewook

    2017-03-01

    The study aimed to investigate the effects of silk fibroin in a mouse model of dry eye. The experimental dry eye mouse model was developed using more than twelve-weeks-old NOD.B10.H2b mice exposing them to 30-40% ambient humidity and injecting them with scopolamine hydrobromide for 10 days. Tear production and corneal irregularity score were measured by the instillation of phosphate buffered saline or silk fibroin. Corneal detachment and conjunctival goblet cell density were observed by hematoxylin and eosin or periodic acid Schiff staining in the cornea or conjunctiva. The expression of inflammatory markers was detected by immunohistochemistry in the lacrimal gland. The silk group tear production was increased, and corneal smoothness was improved. The corneal epithelial cells and conjunctival goblet cells were recovered in the silk groups. The expression of inflammatory factors was inhibited in the lacrimal gland of the silk group. These results show that silk fibroin improved the cornea, conjunctiva, and lacrimal gland in the mouse model of dry eye. These findings suggest that silk fibroin has anti-inflammatory effects in the experimental models of dry eye.

  3. Fluorescent bovine serum albumin interacting with the antitussive quencher dextromethorphan: a spectroscopic insight.

    Science.gov (United States)

    Durgannavar, Amar K; Patgar, Manjanath B; Nandibewoor, Sharanappa T; Chimatadar, Shivamurti A

    2016-05-01

    The interaction of dextromethorphan hydrobromide (DXM) with bovine serum albumin (BSA) is studied by using fluorescence spectra, UV-vis absorption, synchronous fluorescence spectra (SFS), 3D fluorescence spectra, Fourier transform infrared (FTIR) spectroscopy and circular dichroism under simulated physiological conditions. DXM effectively quenched the intrinsic fluorescence of BSA. Values of the binding constant, K_A, are 7.159 × 10³, 9.398 × 10³ and 16.101 × 10³ L/mol; the number of binding sites, n, and the corresponding thermodynamic parameters ΔG°, ΔH° and ΔS° between DXM and BSA were calculated at different temperatures. The interaction between DXM and BSA occurs through dynamic quenching, and the effect of DXM on the conformation of BSA was analyzed using SFS. The average binding distance, r, between the donor (BSA) and acceptor (DXM) was determined based on Förster's theory. The results of fluorescence spectra, UV-vis absorption spectra and SFS show that the secondary structure of the protein is changed in the presence of DXM. Copyright © 2015 John Wiley & Sons, Ltd.
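
Given binding constants at several temperatures, ΔH° and ΔS° follow from a van't Hoff plot (ln K_A against 1/T) and ΔG° from ΔH° − TΔS°. The abstract quotes the three K_A values but not the temperatures, so the 288.15/298.15/308.15 K grid below is an assumption made purely for illustration:

```python
import math

R = 8.314  # gas constant, J mol^-1 K^-1

# K_A values from the abstract; the temperatures are assumed.
temps_K = [288.15, 298.15, 308.15]
K_A = [7.159e3, 9.398e3, 16.101e3]  # L/mol

# van't Hoff: ln K = -dH/(R*T) + dS/R, i.e. linear in 1/T
x = [1.0 / T for T in temps_K]
y = [math.log(k) for k in K_A]
mx, my = sum(x) / 3, sum(y) / 3
slope = (sum((a - mx) * (b - my) for a, b in zip(x, y))
         / sum((a - mx) ** 2 for a in x))
intercept = my - slope * mx

dH = -R * slope          # J/mol; positive here -> endothermic binding
dS = R * intercept       # J/(mol*K)
dG = dH - 298.15 * dS    # J/mol at 25 degrees C
print(f"dH = {dH/1000:.1f} kJ/mol, dS = {dS:.0f} J/(mol*K), dG = {dG/1000:.1f} kJ/mol")
```

Because K_A rises with temperature, ΔH° comes out positive and ΔS° positive; in the standard Ross-Subramanian interpretation that combination points to hydrophobic interactions, while the negative ΔG° confirms spontaneous binding.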

  4. Compatibility and osmolality of inhaled N-acetylcysteine nebulizing solution with fenoterol and ipratropium.

    Science.gov (United States)

    Lee, Tzung-Yi; Chen, Chi-Ming; Lee, Chun-Nin; Chiang, Yi-Chun; Chen, Hsiang-Yin

    2005-04-15

    The compatibility, pH, and osmolality of N-acetylcysteine (NAC) nebulizing solution in the presence of ipratropium bromide or fenoterol hydrobromide were studied. Portions (400 μL) of each mixture were sampled immediately upon mixing and one, two, three, four, five, six, and seven hours after mixing, and assayed by high-performance liquid chromatography. Osmolality was measured by sampling 100 μL from the filling cup at five-minute intervals during nebulization, using the freezing-point-depression method. Adding NAC solution to fenoterol solution raised the pH from 3.20 to 7.90 and the osmolality to a mean ± S.D. of 1400.67 ± 4.51 mOsm/kg. Fenoterol concentrations decreased to 93.71% and NAC concentrations to 92.54% of initial concentrations after seven hours. Mixing ipratropium with NAC solution raised the pH from 3.74 to 7.95 and the osmolality to a mean ± S.D. of 1413 ± 11.79 mOsm/kg. The initial ipratropium concentration declined 7.39% and 10.91% one and two hours after mixing with NAC solution, respectively. NAC and ipratropium were stable in nebulizing solution within one hour of mixing. NAC and fenoterol were compatible for at least seven hours.

  5. Detection of diffuse glomerular lesions in rats: II. Comparison of indium-111 cationic small macromolecules with technetium-99m DTPA

    International Nuclear Information System (INIS)

    McAfee, J.G.; Thomas, F.D.; Subramanian, G.; Schneider, R.D.; Lyons, B.; Roskopf, M.; Zapf-Longo, C.; Whaley, D.

    1986-01-01

    Dextrans with average molecular weights of 5000, 10,000, and 17,500 and inulin were rendered cationic by amination with 2-bromoethylamine hydrobromide. After limited coupling with DTPA cyclic dianhydride, they were labeled with 111In. A good correlation was found between their early renal uptake quantitated by camera-computer techniques and their renal clearance from multiple plasma samples in rats with glomerular damage induced by puromycin aminonucleoside and in controls. However, there was poor correlation between the early renal uptake of these agents and the clearance of simultaneously injected [99mTc]DTPA. The 2-hr organ distribution and urinary excretion of these agents were compared with the corresponding values for DTPA. The differences in clearance between rats with glomerular damage and controls were greater with aminated dextran (mol wt 5000) than with DTPA, confirming previous work with infusions of nonradioactive charged dextrans and neutral inulin. The cationic dextrans appear to reflect the presence or absence of the normal anionic charge of the glomerular membrane as well as changes in filtration rate. Aminated inulin did not differentiate between controls and rats with glomerular disease any better than DTPA, probably because the number of amino groups conjugated was insufficient to produce the charge effect.

  6. Photoactivation of isoflavonoid phytoalexins: involvement of free radicals

    International Nuclear Information System (INIS)

    Bakker, J.; Gommers, F.J.; Smits, L.; Fuchs, A.; Vries, F.W. de

    1983-01-01

    Ultraviolet irradiation of the isoflavonoid phytoalexins phaseollin, 3,6a,9-trihydroxypterocarpan, glyceollin, tuberosin and pisatin, but not medicarpin, brought about inactivation of glucose-6-phosphate dehydrogenase in an in vitro assay system. Photoinactivation of the enzyme by photoactivated pisatin in air-saturated solutions was hardly affected by singlet oxygen quenchers such as NaN3, bovine serum albumin, histidine or methionine. Neither addition of the hydroxyl radical scavengers mannitol, Na-benzoate and ethanol nor the presence of catalase or superoxide dismutase protected the enzyme against photoinactivation, suggesting that the hydroxyl radical, H2O2 and the superoxide radical are not the reactive oxygen species involved. However, the free radical scavenger S-(2-aminoethyl)isothiouronium bromide hydrobromide (AET) protected the enzyme against inactivation by photoactivated pisatin. Direct evidence for the generation of free radicals was obtained by ESR measurements of solutions of phaseollin, pisatin and medicarpin in hexane irradiated with ultraviolet light in the presence or absence of O2. Phaseollin produced the most stable free radicals, whereas medicarpin hardly gave rise to free radical formation; pisatin took a somewhat intermediate position by producing a strong ESR signal which, however, decayed rather quickly. These results indicate free radical formation as the cause of photoinactivation of enzymes by photoactivated isoflavonoid phytoalexins. (author)

  7. Improvement of Learning and Memory Induced by Cordyceps Polypeptide Treatment and the Underlying Mechanism

    Directory of Open Access Journals (Sweden)

    Guangxin Yuan

    2018-01-01

    Our previous research revealed that Cordyceps militaris can improve learning and memory; although its main active ingredient appears to be its polypeptide complexes, the underlying mechanism of its activity remains poorly understood. In this study, we explored the mechanisms by which Cordyceps militaris improves learning and memory in a mouse model. Mice were given scopolamine hydrobromide intraperitoneally to establish a mouse model of learning and memory impairment. The effects of Cordyceps polypeptide in this model were tested using the Morris water maze test; serum superoxide dismutase activity; serum malondialdehyde levels; activities of acetylcholinesterase, Na+-K+-ATPase, and nitric oxide synthase; and gamma-aminobutyric acid and glutamate contents in brain tissue. Moreover, differentially expressed genes and the related cellular signaling pathways were screened using an mRNA expression profile chip. The results showed that the genes Pik3r5, Il-1β, and Slc18a2 were involved in the effects of Cordyceps polypeptide on the nervous system of these mice. Our findings suggest that Cordyceps polypeptide may improve learning and memory in the scopolamine-induced mouse model of learning and memory impairment by scavenging oxygen free radicals, preventing oxidative damage, and protecting the nervous system.

  8. Feasibility of testing DNA repair inhibitors for mutagenicity by a simple method

    International Nuclear Information System (INIS)

    Sideropoulos, A.S.; Specht, S.M.; Jones, M.T.

    1980-01-01

    A simple screening methodology for the determination of mutagenicity of DNA repair inhibitors has been tested in this laboratory. Radiation-resistant E. coli B/r and WP2 hcr⁺ and hcr⁻ are suitable strains for mutagenicity testing. In these strains irradiated with 40-60 ergs/mm², chemicals which interfere with repair of ultraviolet-induced pre-mutational lesions can be shown to enhance significantly the frequency of mutations to streptomycin resistance. This phenomenon is termed 'mutational synergism' [18,20]. We have attempted to apply the procedure for securing data for 'mutational synergism' between ultraviolet (UV) radiation and a number of antimalarial drugs including quinine hydrochloride (50 μg/ml), quinine hydrobromide (50 μg/ml), primaquine diphosphate (50 μg/ml), chloroquine (50 μg/ml), quinine (50 μg/ml) and quinacrine dihydrochloride (25 μg/ml). All drugs tested give synergistic effects with UV light. The synergistic activity ranges from 3- to 35-fold. Quinine and quinacrine dihydrochloride have been found to be much more efficient enhancers of the mutagenic effect of UV than caffeine. In general, we have found that the expression of synergistic action occurs at a concentration well below the minimum inhibitory concentration (MIC) with the drugs tested. The implication of these observations in the establishment of a screening method for the evaluation of the mutagenicity of DNA repair inhibitors is discussed. (orig.)

  9. The synthesis of desired functional groups on PEI microgel particles for biomedical and environmental applications

    Science.gov (United States)

    Sahiner, Nurettin; Demirci, Sahin; Sahiner, Mehtap; Al-Lohedan, Hamad

    2015-11-01

    Polyethyleneimine (PEI) microgels were synthesized by a microemulsion polymerization technique and converted to positively charged forms by chemical treatment with various modifying agents bearing different functional groups, such as 2-bromoethanol (-OH), 4-bromobutyronitrile (-CN), 2-bromoethylamine hydrobromide (-NH2), and glycidol (-OH). The functionalization of the PEI microgels was confirmed by FT-IR, TGA and zeta potential measurements. Furthermore, a second modification was induced on the 4-bromobutyronitrile-modified PEI microgels (PEI-CN) by amidoximation, to generate new functional groups on the modified PEI microgels. The PEI and modified PEI microgels were also tested for their antimicrobial effects against various bacteria such as Bacillus subtilis ATCC 6633, Escherichia coli ATCC 8739 and Staphylococcus aureus ATCC 25323. Moreover, the PEI-based particles were used for the removal of organic dyes such as methyl orange (MO) and congo red (CR). The absorption capacity of the PEI-based microgels for MO increased with modification from 101.8 mg/g to 218.8 mg/g with 2-bromoethylamine, 216.2 mg/g with 2-bromoethanol, and 224.5 mg/g with 4-bromobutyronitrile. The increase in absorption for CR was from 347.3 mg/g to 390.4 mg/g with 2-bromoethanol, 399.6 mg/g with glycidol, and 349.9 mg/g with 4-bromobutyronitrile.
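
Capacities quoted in mg of dye per g of microgel come from the standard batch mass balance q_e = (C0 − Ce)·V/m. A minimal sketch (the batch concentrations, volume and sorbent mass below are invented, chosen only to land near the reported ~220 mg/g range for MO):

```python
def uptake_capacity(c0_mg_per_L, ce_mg_per_L, volume_L, mass_g):
    """Batch dye uptake q_e = (C0 - Ce) * V / m, in mg of dye per g of
    microgel, from initial and equilibrium solution concentrations."""
    return (c0_mg_per_L - ce_mg_per_L) * volume_L / mass_g

# Hypothetical batch: 50 mL of 250 mg/L methyl orange on 50 mg of
# modified microgel, reaching equilibrium at 30 mg/L.
q = uptake_capacity(250.0, 30.0, 0.050, 0.050)
print(f"q_e = {q:.1f} mg/g")
```

Comparing q_e between unmodified and modified particles under identical batch conditions is what supports the "capacity increased with modification" claims above.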

  13. Effects of dual combinations of antifolates with atovaquone or dapsone on nucleotide levels in Plasmodium falciparum.

    Science.gov (United States)

    Yeo, A E; Seymour, K K; Rieckmann, K H; Christopherson, R I

    1997-04-04

    The triazine antifolates, cycloguanil and 4,6-diamino-1,2-dihydro-2,2-dimethyl-1-[(2,4,5-trichlorophenoxy)propyloxy]-1,3,5-triazine hydrobromide (WR99210), and their parent biguanide compounds, proguanil and N-[3-(2,4,5-trichlorophenoxy)propyloxy]-N-(1-methylethyl)-imidodicarbonimidic diamine hydrochloride (PS-15), were tested in combination with a series of antimalarial drugs for synergism against Plasmodium falciparum growing in erythrocytic culture. Four synergistic combinations were found: cycloguanil-dapsone, WR99210-dapsone, proguanil-atovaquone, and PS-15-atovaquone. Cycloguanil-dapsone or WR99210-dapsone had a profound suppressive effect on the concentration of dTTP in parasites while that of dATP increased. Depletion of dTTP is consistent with cycloguanil or WR99210 inhibiting dihydrofolate reductase and dapsone inhibiting dihydropteroate synthase. For the combinations proguanil-atovaquone and PS-15-atovaquone, the levels of nucleoside triphosphates (NTPs) and dNTPs were generally suppressed, suggesting that inhibition occurs not through nucleotide pathways but probably through another metabolic mechanism(s). Combinations of two synergistic pairs of antimalarial drugs, (proguanil-atovaquone)-(cycloguanil-dapsone) and (PS-15-atovaquone)-(WR99210-dapsone), were tested, and it was found that NTPs and dNTPs decreased much more than with a single synergistic combination. Dual synergistic combinations could play an important role in the therapy of multidrug-resistant malaria, just as combination chemotherapy is used to treat cancer.

  14. The Role of Flavonoids as Potential Radioprotectors

    International Nuclear Information System (INIS)

    Benkovic, V.; Djikic, D.; Horvat Knezevic, A.; Lisicic, D.; Orsolic, N.; Kopjar, N.

    2011-01-01

    Investigations of effective and non-toxic compounds with radioprotective capability have led to increasing interest in naturally occurring antioxidants, since most known chemical radioprotectors (AET, WR-2721, WR-1065, etc.) have toxic side effects that limit their use in medical practice. Among the promising compounds are flavonoids, whose antioxidant activity is based on the ability to scavenge free radicals directly or to stabilize reactive oxygen species (ROS) by interacting with the reactive component of the radical. Because of the high reactivity of the hydroxyl substituents of flavonoids, radicals are rendered inactive. Flavonoids can also increase the function of the endogenous antioxidant enzyme systems: superoxide dismutase (SOD), catalase (CAT), glutathione peroxidase (GPx), glutathione reductase (GR) and glutathione. Antioxidant effects may also be the combined result of radical scavenging and interaction with enzyme functions. Flavonoids also induce activities of the immune system; increased hematopoietic activity could account for improved hematopoietic tolerance to radiotherapy. In this study we evaluated the radioprotective effects of selected flavonoids (caffeic acid, chrysin, naringin and quercetin) administered to mice prior to whole-body irradiation with γ-rays (absorbed dose 9 Gy). Survival analysis and the alkaline comet assay on white blood cells were employed in both irradiated and non-irradiated animals. Blood samples were taken 30 min after irradiation. Appropriate negative and positive control groups (administered the chemical radioprotector AET, S-(2-aminoethyl)isothiouronium bromide hydrobromide, i.p. at a dose of 281 mg kg⁻¹ body weight) were also included and handled in the same manner. We observed a statistically significant difference in the survival time of mice pre-treated with the test compounds, and the most effective radioprotector was quercetin. The tested flavonoids were not genotoxic to non-irradiated mice and offered good radioprotection.

  15. Eating high fat chow increases the sensitivity of rats to 8-OH-DPAT-induced lower lip retraction.

    Science.gov (United States)

    Li, Jun-Xu; Ju, Shutian; Baladi, Michelle G; Koek, Wouter; France, Charles P

    2011-12-01

    Eating high fat food can alter sensitivity to drugs acting on dopamine systems; this study examined whether eating high fat food alters sensitivity to a drug acting on serotonin (5-HT) systems. Sensitivity to (+)-8-hydroxy-2-(dipropylamino)tetralin hydrobromide (8-OH-DPAT; a 5-HT1A receptor agonist)-induced lower lip retraction was examined in separate groups (n=8-9) of rats with free access to standard (5.7% fat) or high fat (34.3% fat) chow; sensitivity to quinpirole (a dopamine D3/D2 receptor agonist)-induced yawning was also examined. Rats eating high fat chow gained more body weight than rats eating standard chow and, after 6 weeks of eating high fat chow, they were more sensitive to 8-OH-DPAT (0.01-0.1 mg/kg)-induced lower lip retraction and quinpirole (0.0032-0.32 mg/kg)-induced yawning. These changes were not reversed when rats that previously ate high fat chow were switched to standard chow, and sensitivity to 8-OH-DPAT and quinpirole increased when rats that previously ate standard chow were switched to high fat chow. These data extend previous results showing changes in sensitivity to drugs acting on dopamine systems in animals eating high fat chow to a drug acting at 5-HT1A receptors, and they support the notion that eating certain foods impacts sensitivity to drugs acting on monoamine systems.

  16. Effects of Yizhi Capsule (益智胶囊) on Learning and Memory Disorder and β-amyloid Peptide Induced Neurotoxicity in Rats

    Institute of Scientific and Technical Information of China (English)

    WU Hang-yu; XU Jiang-ping; LI Lin; ZHU Bai-hua

    2006-01-01

    Objective: To explore the effects of Yizhi Capsule (益智胶囊, YZC) on learning and memory disorder and β-amyloid peptide-induced neurotoxicity in rats. Methods: Various doses of YZC were administered to Sprague-Dawley (SD) rats for 8 consecutive days, twice a day. On the 8th day of the experiment, scopolamine hydrobromide was injected intraperitoneally into every rat, and the Morris water maze test and the shuttle dark avoidance test were carried out to assess the changes in learning and memory capacity in the rats. In addition, after cerebral cortical neurons of newborn SD rats aged within 3 days were cultured in vitro for 7 days, drug serum containing YZC was added to the cultured neurons before or after β-amyloid peptide 25-35 (Aβ25-35) intoxication, and the protective effect of YZC against neurotoxicity was assessed by MTT assay and by determining the LDH content in the supernatant. Results: Compared with rats untreated with YZC, rats that received YZC showed shorter platform-seeking times in the Morris water maze test, as well as a prolonged latent period and fewer errors in the shuttle dark avoidance test. In the cultured neurons, YZC drug serum effectively increased the survival rate of Aβ25-35-intoxicated neurons and reduced the LDH content in the culture supernatant. Conclusion: YZC improves learning and memory disorder and has a good protective effect against Aβ25-35-induced neurotoxicity in SD rats.

  17. Acetylcholinesterase inhibitor treatment alleviated cognitive impairment caused by delayed encephalopathy due to carbon monoxide poisoning: Two case reports and a review of the literature.

    Science.gov (United States)

    Yanagiha, Kumi; Ishii, Kazuhiro; Tamaoka, Akira

    2017-02-01

    Delayed encephalopathy due to carbon monoxide (CO) poisoning can occur even in patients with mild symptoms of acute CO poisoning. In some cases conventional hyperbaric oxygen (HBO) therapy or steroid-pulse therapy is insufficient, and acetylcholinesterase inhibitor (AchEI) treatment may be effective. We report two cases of delayed encephalopathy after acute CO poisoning, involving two women aged 69 (Case 1) and 60 years (Case 2), whose cognitive function improved with AchEI treatment. Delayed encephalopathy occurred 25 and 35 days after acute CO poisoning in Case 1 and Case 2, respectively. Both patients demonstrated cognitive impairment, apathy, and hypokinesia on admission. Although hyperbaric oxygen therapy did not yield any significant improvement, cognitive dysfunction improved substantially after administration of an AchEI, as evidenced by an improvement in the Mini-Mental State Examination score from 9 to 28 points in Case 1 and in Hasegawa's dementia rating scale score from 4 to 25 points in Case 2. In Case 1 we administered galantamine hydrobromide, which was associated with improvement of the white matter lesions initially detected on brain magnetic resonance imaging; in Case 2, however, the white matter lesions persisted despite AchEI treatment. AchEI treatment may improve cognitive and frontal lobe function by increasing the low acetylcholine concentrations in the hippocampus and frontal lobe caused by decreased nicotinic acetylcholine receptor levels in delayed encephalopathy after CO poisoning. Physicians should consider AchEIs for patients demonstrating delayed encephalopathy due to CO poisoning.

  18. Liraglutide, a GLP-1 Receptor Agonist, Which Decreases Hypothalamic 5-HT2A Receptor Expression, Reduces Appetite and Body Weight Independently of Serotonin Synthesis in Mice

    Directory of Open Access Journals (Sweden)

    Katsunori Nonogaki

    2018-01-01

    Full Text Available A recent report suggested that brain-derived serotonin (5-HT) is critical for maintaining weight loss induced by glucagon-like peptide-1 (GLP-1) receptor activation in rats and that 5-HT2A receptors mediate the feeding suppression and weight loss induced by GLP-1 receptor activation. Here, we show that changes in daily food intake and body weight induced by intraperitoneal administration of liraglutide, a GLP-1 receptor agonist, over 4 days did not differ between mice treated with the tryptophan hydroxylase (Tph) inhibitor p-chlorophenylalanine (PCPA) for 3 days and mice without PCPA treatment. Treatment with PCPA did not affect hypothalamic 5-HT2A receptor expression. Despite the anorexic effect of liraglutide disappearing after the first day of treatment, the body weight loss induced by liraglutide persisted for 4 days in mice treated with or without PCPA. Intraperitoneal administration of liraglutide significantly decreased the gene expression of hypothalamic 5-HT2A receptors 1 h after injection. Moreover, the acute anorexic effects of liraglutide were blunted in mice treated with the high-affinity 5-HT2A agonist (4-bromo-3,6-dimethoxybenzocyclobuten-1-yl)methylamine hydrobromide 14 h or 24 h before liraglutide injection. These findings suggest that liraglutide reduces appetite and body weight independently of 5-HT synthesis in mice, whereas GLP-1 receptor activation downregulates the gene expression of hypothalamic 5-HT2A receptors.

  19. Decision analysis multicriteria analysis

    International Nuclear Information System (INIS)

    Lombard, J.

    1986-09-01

    The ALARA procedure covers a wide range of decisions, from the simplest to the most complex. For the simplest, engineering judgement is generally sufficient and a decision-aiding technique is not necessary. For some decisions the available protection options may be compared on the basis of two or a few criteria (or attributes), such as protection cost and collective dose, and rather simple decision-aiding techniques, like cost-effectiveness analysis or cost-benefit analysis, are quite adequate. For more complex decisions involving numerous criteria, large uncertainties or qualitative judgement, these techniques, even extended cost-benefit analysis, are not recommended, and appropriate techniques such as multi-attribute decision-aiding techniques are more relevant. There are many such techniques and it is not possible to present all of them; therefore only two broad categories of multi-attribute decision-aiding techniques are presented here: decision analysis and outranking analysis
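    As an illustration of the simplest multi-attribute technique described above, the sketch below ranks protection options by a weighted sum of normalized attributes; all option names, costs, doses and weights are invented for the example, not drawn from the lecture.

```python
# Weighted-sum multi-attribute ranking of radiation protection options.
# Options, costs (k$), collective doses (person-Sv) and weights are
# hypothetical values chosen only to illustrate the technique.
options = {
    "shielding":   {"cost": 120.0, "dose": 0.8},
    "ventilation": {"cost": 60.0,  "dose": 1.5},
    "procedures":  {"cost": 20.0,  "dose": 2.4},
}
weights = {"cost": 0.4, "dose": 0.6}  # assumed decision-maker preferences

def normalized(attr):
    # Lower is better for both attributes: map the best (lowest) value
    # to 1.0 and the worst (highest) to 0.0.
    vals = [o[attr] for o in options.values()]
    lo, hi = min(vals), max(vals)
    return {name: (hi - o[attr]) / (hi - lo) for name, o in options.items()}

norm = {attr: normalized(attr) for attr in weights}
scores = {name: sum(weights[a] * norm[a][name] for a in weights)
          for name in options}
best = max(scores, key=scores.get)
```

    With these weights the dose criterion dominates and the low-dose option wins; shifting weight toward cost would reorder the ranking, which is exactly the sensitivity that a multi-attribute analysis makes explicit.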

  20. Shape analysis in medical image analysis

    CERN Document Server

    Tavares, João

    2014-01-01

    This book contains thirteen contributions from invited experts of international recognition addressing important issues in shape analysis in medical image analysis, including techniques for image segmentation, registration, modelling and classification, and applications in biology, as well as in cardiac, brain, spine, chest, lung and clinical practice. This volume treats topics such as anatomic and functional shape representation and matching; shape-based medical image segmentation; shape registration; statistical shape analysis; shape deformation; shape-based abnormality detection; shape tracking and longitudinal shape analysis; machine learning for shape modeling and analysis; shape-based computer-aided diagnosis; shape-based medical navigation; and benchmarking and validation of shape representation, analysis and modeling algorithms. This work will be of interest to researchers, students, and manufacturers in the fields of artificial intelligence, bioengineering, biomechanics, computational mechanics, computationa...

  1. The Effect of Chronic Administration of Buspirone on 6-Hydroxydopamine-Induced Catalepsy in Rats

    Directory of Open Access Journals (Sweden)

    Hamdolah Sharifi

    2012-06-01

    Full Text Available Purpose: Several lines of evidence show that serotonergic neurons play a role in the regulation of movements executed by the basal ganglia. Recently we reported that a single dose of buspirone improved 6-hydroxydopamine (6-OHDA)- and haloperidol-induced catalepsy. This study aimed to investigate the effect of chronic intraperitoneal (i.p.) administration of buspirone on 6-OHDA-induced catalepsy in male Wistar rats. Methods: Catalepsy was induced by unilateral infusion of 6-OHDA (8 μg/2 μl/rat) into the central region of the SNc and was assayed by the bar-test method 5, 60, 120 and 180 min after drug administration on the 10th day. The effect of buspirone (0.5, 1 and 2 mg/kg, i.p., for 10 days) was assessed in 6-OHDA-lesioned rats. Results: Chronic injection of buspirone (0.5, 1 and 2 mg/kg, i.p., for 10 days) decreased catalepsy compared with the control group. The best anticataleptic effect was observed at the dose of 1 mg/kg. The catalepsy-improving effect of buspirone was reversed by 1-(2-methoxyphenyl)-4-[4-(2-phthalimido)butyl]piperazine hydrobromide (NAN-190, 0.5 mg/kg, i.p.), a 5-HT1A receptor antagonist. Conclusion: Our study indicates that chronic administration of buspirone improves catalepsy in a 6-OHDA-induced animal model of Parkinson's disease (PD). We also suggest that buspirone may be used as an adjuvant therapy to increase the effectiveness of antiparkinsonian drugs. In order to prove this hypothesis, further clinical studies should be done.

  2. The effect of grape seed extract on the pharmacokinetics of dextromethorphan in healthy volunteers.

    Science.gov (United States)

    Goey, Andrew K L; Meijerman, Irma; Beijnen, Jos H; Schellens, Jan H M

    2013-11-01

    Grape seed extract (GSE) has been shown to inhibit the cytochrome P450 (CYP) 2D6 isoenzyme in vitro. To determine the clinical effect of GSE on CYP2D6, the pharmacokinetic interaction between GSE and the sensitive CYP2D6 probe dextromethorphan in healthy adult volunteers was examined. In this open label, randomized, cross-over study, 30 subjects were assigned to cohort A or B. Both cohorts ingested 30 mg dextromethorphan hydrobromide on day 1 and day 10. Cohort A received 100 mg GSE capsules three times daily on days 8, 9 and 10, while cohort B started with GSE on day -1 until day 1. After urine collection (0-8 h) on day 1 and day 10, the urinary dextromethorphan to dextrorphan metabolic ratio was determined. Among 28 evaluable subjects, an increase of the urinary metabolic ratio was observed in 16 subjects (57 %). The mean metabolic ratio (± standard deviation) before and after GSE supplementation was 0.41 (± 0.56) and 0.48 (± 0.59), respectively. This result was neither statistically (P = 0.342) nor clinically [geometric mean ratio 1.10, 90 % CI (0.93-1.30)] significant. Further, the majority (73 %) of the included subjects did not experience any adverse events after intake of dextromethorphan or GSE. Supplementation of GSE did not significantly affect the urinary dextromethorphan to dextrorphan metabolic ratio in healthy volunteers. The results of this clinical study indicate that GSE appears to be safe to combine with drugs extensively metabolized by CYP2D6, such as dextromethorphan and tamoxifen.
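    The study's endpoint, a geometric mean ratio of within-subject metabolic ratios with a 90 % confidence interval, is computed on the log scale. A minimal sketch, using invented paired values and the two-sided 90 % t critical value for df = 7 (≈ 1.895), not the trial's raw data:

```python
import math
import statistics

# Hypothetical paired urinary DM/dextrorphan metabolic ratios before and
# after supplementation (illustrative values, not the study's data).
before = [0.20, 0.35, 0.50, 0.15, 0.60, 0.25, 0.40, 0.30]
after  = [0.22, 0.38, 0.48, 0.18, 0.66, 0.24, 0.44, 0.33]

# Work on the log scale: the geometric mean ratio (GMR) is
# exp(mean of within-subject log differences).
log_diff = [math.log(a) - math.log(b) for a, b in zip(after, before)]
n = len(log_diff)
mean_ld = statistics.mean(log_diff)
se = statistics.stdev(log_diff) / math.sqrt(n)

t90 = 1.895  # two-sided 90% t critical value for df = n - 1 = 7
gmr = math.exp(mean_ld)
ci = (math.exp(mean_ld - t90 * se), math.exp(mean_ld + t90 * se))
```

    A 90 % CI lying entirely inside the conventional 0.80-1.25 no-effect bounds would support the paper's no-interaction conclusion.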

  3. New potentiometric and spectrophotometric methods for the determination of dextromethorphan in pharmaceutical preparations.

    Science.gov (United States)

    Elmosallamy, Mohamed A F; Amin, Alaa S

    2014-01-01

    New, simple and convenient potentiometric and spectrophotometric methods are described for the determination of dextromethorphan hydrobromide (DXM) in pharmaceutical preparations. The potentiometric technique is based on developing a potentiometric sensor incorporating the dextromethorphan tetrakis(p-chlorophenyl)borate ion-pair complex as an electroactive species in a plasticized PVC matrix membrane with o-nitrophenyl octyl ether or dioctyl phthalate. The sensor shows a rapid, near-Nernstian response over the range 1 × 10(-5) - 1 × 10(-2) mol L(-1) dextromethorphan in the pH range of 3.0 - 9.0. The detection limit is 2 × 10(-6) mol L(-1) DXM and the response time is instantaneous (2 s). The proposed spectrophotometric technique involves the reaction of DXM with eriochrome black T (EBT) to form an ion-associate complex. Solvent extraction is used to improve the selectivity of the method. The optimal extraction and reaction conditions have been studied, and the analytical characteristics of the method have been obtained. Linearity is obeyed in the range of 7.37 - 73.7 × 10(-5) mol L(-1) DXM, and the detection limit of the method is 1.29 × 10(-5) mol L(-1). The relative standard deviation (RSD) and relative error for six replicate measurements of 3.685 × 10(-4) mol L(-1) are 0.672 and 0.855%, respectively. The interference effect of some excipients has also been tested. The drug contents in pharmaceutical preparations were successfully determined by the proposed methods by applying the standard-addition technique.
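    For a monovalent cation such as protonated dextromethorphan, "near-Nernstian" means a calibration slope close to (RT ln 10)/F ≈ 59.2 mV per decade at 25 °C. The sketch below checks a least-squares slope against that theoretical value, using hypothetical potential readings rather than the paper's data:

```python
import math

# Theoretical Nernstian slope for a monovalent ion at 25 deg C, in
# mV per decade of activity.
R, F, T = 8.314, 96485.0, 298.15
nernst_slope = 1000 * math.log(10) * R * T / F  # ~59.16 mV/decade

# Hypothetical calibration points: (log10 concentration, potential / mV).
points = [(-5, -10.0), (-4, 47.5), (-3, 105.8), (-2, 163.2)]

# Ordinary least-squares slope of E versus log10(C).
n = len(points)
sx = sum(x for x, _ in points)
sy = sum(y for _, y in points)
sxx = sum(x * x for x, _ in points)
sxy = sum(x * y for x, y in points)
slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
```

    A fitted slope within a few mV/decade of `nernst_slope` is what the abstract's "near-Nernstian response" refers to.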

  4. SSP-002392, a new 5-HT4 receptor agonist, dose-dependently reverses scopolamine-induced learning and memory impairments in C57Bl/6 mice.

    Science.gov (United States)

    Lo, Adrian C; De Maeyer, Joris H; Vermaercke, Ben; Callaerts-Vegh, Zsuzsanna; Schuurkes, Jan A J; D'Hooge, Rudi

    2014-10-01

    5-HT4 receptors (5-HT4R) are suggested to affect learning and memory processes. Earlier studies have shown that animals treated with 5-HT4R agonists, often with limited selectivity, show improved learning and memory, with retention memory often being assessed immediately after or within 24 h after the last training session. In this study, we characterized the effect of pre-training treatment with the selective 5-HT4R agonist SSP-002392 on memory acquisition and the associated long-term memory retrieval in animal models of impaired cognition. Pre-training treatment with SSP-002392 (0.3 mg/kg, 1.5 mg/kg and 7.5 mg/kg p.o.) dose-dependently inhibited the cognitive deficits induced by scopolamine (0.5 mg/kg s.c.) in two different behavioral tasks: passive avoidance and Morris water maze. In the Morris water maze, spatial learning was significantly improved after treatment with SSP-002392, translating into an accelerated and more efficient localization of the hidden platform compared to scopolamine-treated controls. Moreover, retention memory was assessed 24 h (passive avoidance) and 72 h (Morris water maze) after the last training session of cognitively impaired animals, and it was significantly improved in animals treated with SSP-002392 prior to the training sessions. Furthermore, the effects of SSP-002392 were comparable to those of galanthamine hydrobromide. We conclude that SSP-002392 has potential as a memory-enhancing compound. Copyright © 2014 Elsevier Ltd. All rights reserved.

  5. Theoretical numerical analysis a functional analysis framework

    CERN Document Server

    Atkinson, Kendall

    2005-01-01

    This textbook prepares graduate students for research in numerical analysis/computational mathematics by giving them a mathematical framework embedded in functional analysis and focused on numerical analysis, which helps the student move rapidly into a research program. The text covers basic results of functional analysis, approximation theory, Fourier analysis and wavelets, iteration methods for nonlinear equations, finite difference methods, Sobolev spaces and weak formulations of boundary value problems, finite element methods, elliptic variational inequalities and their numerical solu

  6. An example of multidimensional analysis: Discriminant analysis

    International Nuclear Information System (INIS)

    Lutz, P.

    1990-01-01

    Among the approaches to multidimensional data analysis, lectures on discriminant analysis covering theoretical and practical aspects are presented. The discrimination problem, the analysis steps and the discrimination categories are stressed. Examples are given of descriptive historical analysis, discrimination for decision making, and the demonstration and separation of the top quark. In linear discriminant analysis the following subjects are discussed: Huygens' theorem, projection, the discriminant variable, geometrical interpretation, the case g = 2, the classification method, and the separation of top events. Criteria allowing relevant results to be obtained are included
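    For the two-class case (g = 2) mentioned above, a Fisher linear discriminant can be sketched in a few lines; the 2-D points below are made up for illustration (think signal vs. background events), not taken from the lectures.

```python
# Minimal two-class Fisher linear discriminant in 2-D.
class_a = [(1.0, 2.0), (1.5, 1.8), (2.0, 2.5), (1.2, 2.2)]
class_b = [(4.0, 5.0), (4.5, 4.6), (5.0, 5.5), (4.2, 5.2)]

def mean(pts):
    n = len(pts)
    return (sum(p[0] for p in pts) / n, sum(p[1] for p in pts) / n)

def scatter(pts, m):
    # 2x2 within-class scatter matrix: sum of (x - m)(x - m)^T.
    sxx = sum((p[0] - m[0]) ** 2 for p in pts)
    syy = sum((p[1] - m[1]) ** 2 for p in pts)
    sxy = sum((p[0] - m[0]) * (p[1] - m[1]) for p in pts)
    return [[sxx, sxy], [sxy, syy]]

ma, mb = mean(class_a), mean(class_b)
sa, sb = scatter(class_a, ma), scatter(class_b, mb)
sw = [[sa[i][j] + sb[i][j] for j in range(2)] for i in range(2)]

# Discriminant direction w = Sw^-1 (mb - ma), via the explicit 2x2 inverse.
det = sw[0][0] * sw[1][1] - sw[0][1] * sw[1][0]
d = (mb[0] - ma[0], mb[1] - ma[1])
w = ((sw[1][1] * d[0] - sw[0][1] * d[1]) / det,
     (-sw[1][0] * d[0] + sw[0][0] * d[1]) / det)

def project(p):
    # The scalar discriminant variable: projection of a point onto w.
    return w[0] * p[0] + w[1] * p[1]
```

    Projecting a new event onto `w` and comparing against a threshold between the two projected class means gives the classification rule.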

  7. Energy-Water Modeling and Analysis | Energy Analysis | NREL

    Science.gov (United States)

    NREL's energy-water modeling and analysis examines energy sector vulnerabilities to various factors, including water. Example projects include the Renewable Electricity Futures Study, Regional Energy Deployment System (ReEDS) model analysis of generation, and U.S. Energy Sector Vulnerabilities to Climate Change and Extreme Weather.

  8. Instrumental analysis

    International Nuclear Information System (INIS)

    Jae, Myeong Gi; Lee, Won Seong; Kim, Ha Hyeok

    1989-02-01

    This book gives a description of electronic engineering, covering circuit elements and devices, circuit analysis and logic/digital circuits; electrochemical methods such as conductometry, potentiometry and current measurement; spectrochemical analysis with electromagnetic radiation, optical components, absorption spectroscopy, X-ray analysis and atomic absorption spectrometry, with references; chromatography, including gas chromatography and liquid chromatography; and automated analysis, covering control-system evaluation of automated analysis and automated analysis systems, with references.

  10. Analysis of Project Finance | Energy Analysis | NREL

    Science.gov (United States)

    NREL analysis helps potential renewable energy developers and investors gain insights into the complex world of project finance. Renewable energy project finance is complex, requiring knowledge of federal tax credits, state-level incentives, renewable

  11. Synthesis of highly anti-HIV active sulfated poly- and oligo-saccharides and analysis of their action mechanisms by NMR [nuclear magnetic resonance] spectroscopy

    International Nuclear Information System (INIS)

    Uryu, Toshiyuki

    1998-01-01

    . 2. NMR studies on the action mechanisms of curdlan sulfate, chondroitin sulfate, and heparin. In order to elucidate the in vivo interactions of curdlan sulfate with virus proteins, 1 H and 13 C NMR spectra were measured on mixtures of negatively charged curdlan sulfate (CS) and positively charged polylysine (PL) hydrobromide. When CS and PL were mixed in appropriate molar ratios, ion complexes between CS and PL were formed and detected by NMR. Large changes in NMR absorptions appeared in the 20 - 50 ppm region, due to the side chain of polylysine. Similarly, in the mixture of heparin and PL, absorptions in the 55 - 101 ppm region due to the heparin moiety changed to a large extent. Consequently, it is assumed that the anti-HIV activity originates from the interaction between curdlan sulfate and virus proteins containing sequences rich in the basic amino acids lysine and arginine. Full text

  12. Performance analysis

    International Nuclear Information System (INIS)

    2008-05-01

    This book introduces the energy and resource technology development programme together with a performance analysis. It covers the programme's divisions and definitions, an analysis of the current state of support, the substance of the national basic plan for energy and resource technology development, the selection of analysis indices, performance analysis results by index, survey results, and the analysis and appraisal of the energy and resource technology development programme in 2007.

  13. Trial Sequential Analysis in systematic reviews with meta-analysis

    Directory of Open Access Journals (Sweden)

    Jørn Wetterslev

    2017-03-01

    Full Text Available Abstract Background Most meta-analyses in systematic reviews, including Cochrane ones, do not have sufficient statistical power to detect or refute even large intervention effects. This is why a meta-analysis ought to be regarded as an interim analysis on its way towards a required information size. The results of the meta-analyses should relate the total number of randomised participants to the estimated required meta-analytic information size accounting for statistical diversity. When the number of participants and the corresponding number of trials in a meta-analysis are insufficient, the use of the traditional 95% confidence interval or the 5% statistical significance threshold will lead to too many false positive conclusions (type I errors) and too many false negative conclusions (type II errors). Methods We developed a methodology for interpreting meta-analysis results, using generally accepted, valid evidence on how to adjust thresholds for significance in randomised clinical trials when the required sample size has not been reached. Results The Lan-DeMets trial sequential monitoring boundaries in Trial Sequential Analysis offer adjusted confidence intervals and restricted thresholds for statistical significance when the diversity-adjusted required information size and the corresponding number of required trials for the meta-analysis have not been reached. Trial Sequential Analysis provides a frequentistic approach to control both type I and type II errors. We define the required information size and the corresponding number of required trials in a meta-analysis and the diversity (D2) measure of heterogeneity. We explain the reasons for using Trial Sequential Analysis of meta-analysis when the actual information size fails to reach the required information size. We present examples drawn from traditional meta-analyses using unadjusted naïve 95% confidence intervals and 5% thresholds for statistical significance. Spurious conclusions in
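    The required-information-size calculation for a binary outcome can be sketched as below. The event rates, relative risk reduction, and diversity D2 are assumed values for illustration, and this is a sketch of the idea, not the Copenhagen Trial Unit's TSA software:

```python
# Required information size (RIS) for a meta-analysis with a binary
# outcome, with a diversity adjustment. All inputs are assumed values.
z_alpha = 1.96   # two-sided alpha = 0.05
z_beta = 0.84    # power = 80%
p_control = 0.30               # assumed control-group event proportion
rrr = 0.20                     # assumed relative risk reduction
p_exp = p_control * (1 - rrr)  # anticipated experimental-group proportion
p_bar = (p_control + p_exp) / 2
delta = p_control - p_exp      # absolute risk difference to detect

# Fixed-effect total sample size for the two groups combined.
n_fixed = 4 * (z_alpha + z_beta) ** 2 * p_bar * (1 - p_bar) / delta ** 2

# Inflate by 1 / (1 - D^2) for between-trial diversity (assumed 25%).
diversity = 0.25
ris = n_fixed / (1 - diversity)
```

    Until the cumulative number of randomised participants crosses `ris`, the Lan-DeMets monitoring boundaries impose a stricter-than-5% significance threshold on the interim meta-analysis.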

  14. Instrumental analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Seung Jae; Seo, Seong Gyu

    1995-03-15

    This textbook deals with instrumental analysis and consists of nine chapters: an introduction to analytical chemistry, the process of analysis and the types and forms of analysis; electrochemistry, covering basic theory, potentiometry and conductometry; electromagnetic radiation and optical components, their introduction and application; ultraviolet and visible spectrophotometry; atomic absorption spectrophotometry, including flame emission spectrometry and plasma emission spectrometry; and others such as infrared spectrophotometry, X-ray spectrophotometry, mass spectrometry, chromatography, and further instrumental methods like radiochemistry.

  16. Galantamine is a novel post-exposure therapeutic against lethal VX challenge

    International Nuclear Information System (INIS)

    Hilmas, Corey J.; Poole, Melissa J.; Finneran, Kathryn; Clark, Matthew G.; Williams, Patrick T.

    2009-01-01

    The ability of galantamine hydrobromide (GAL HBr) treatment to antagonize O-ethyl-S-(2-diisopropylaminoethyl) methylphosphonothiolate (VX)-induced lethality, impairment of muscle tension, and electroencephalographic (EEG) changes was assessed in guinea pigs. Guinea pigs were challenged with 16.8 μg/kg VX (2LD50). One min after challenge, animals were administered 0.5 mg/kg atropine sulfate (ATR) and 25 mg/kg pyridine-2-aldoxime methochloride (2-PAM). In addition, guinea pigs were given 0, 1, 2, 4, 8 or 10 mg/kg GAL as a post-exposure treatment immediately prior to ATR and 2-PAM. Animals were either monitored for 24-h survival, scheduled for electroencephalography (EEG) recording, or euthanized 60 min later for measurement of indirectly-elicited muscle tension in the hemidiaphragm. Post-exposure GAL therapy produced a dose-dependent increase in survival from lethal VX challenge. Optimal clinical benefits were observed in the presence of 10 mg/kg GAL, which led to 100% survival of VX-challenged guinea pigs. Based on muscle physiology studies, GAL post-exposure treatment protected the guinea pig diaphragm, the major effector muscle of respiration, from fatigue, tetanic fade, and muscular paralysis. Protection against the paralyzing effects of VX was dose-dependent. In EEG studies, GAL did not alter seizure onset for all doses tested. At the highest dose tested (10 mg/kg), GAL decreased seizure duration when administered as a post-exposure treatment 1 min after VX. GAL also reduced the high correlation associated between seizure activity and lethality after 2LD50 VX challenge. GAL may have additional benefits both centrally and peripherally that are unrelated to its established mechanism as a reversible acetylcholinesterase inhibitor (AChEI).

  17. The upright posture improves plantar stepping and alters responses to serotonergic drugs in spinal rats.

    Science.gov (United States)

    Sławińska, Urszula; Majczyński, Henryk; Dai, Yue; Jordan, Larry M

    2012-04-01

    Recent studies on the restoration of locomotion after spinal cord injury have employed robotic means of positioning rats above a treadmill such that the animals are held in an upright posture and engage in bipedal locomotor activity. However, the impact of the upright posture alone, which alters hindlimb loading, an important variable in locomotor control, has not been examined. Here we compared the locomotor capabilities of chronic spinal rats when placed in the horizontal and upright postures. Hindlimb locomotor movements induced by exteroceptive stimulation (tail pinching) were monitored with video and EMG recordings. We found that the upright posture alone significantly improved plantar stepping. Locomotor trials using anaesthesia of the paws and air stepping demonstrated that the cutaneous receptors of the paws are responsible for the improved plantar stepping observed when the animals are placed in the upright posture. We also tested the effectiveness of serotonergic drugs that facilitate locomotor activity in spinal rats in both the horizontal and upright postures. Quipazine and (±)-8-hydroxy-2-(dipropylamino)tetralin hydrobromide (8-OH-DPAT) improved locomotion in the horizontal posture but in the upright posture either interfered with or had no effect on plantar walking. Combined treatment with quipazine and 8-OH-DPAT at lower doses dramatically improved locomotor activity in both postures and mitigated the need to activate the locomotor CPG with exteroceptive stimulation. Our results suggest that afferent input from the paw facilitates the spinal CPG for locomotion. These potent effects of afferent input from the paw should be taken into account when interpreting the results obtained with rats in an upright posture and when designing interventions for restoration of locomotion after spinal cord injury.

  18. Combined effect of hormones and radioprotective substances in case of animal exposure to ionizing radiation

    International Nuclear Information System (INIS)

    Benke, D.; Bodo-Sekejchidinch, K.; Ehanta, A.

    1982-01-01

    The effect of anabolic and other related preparations used in national therapy, in combination with radioprotective compounds tested earlier in animal experiments, was studied. The investigations were carried out on albino male mice of the CFLP line. X-ray exposure was carried out with a TNH-250 type unit for deep irradiation (630 R and 800 R doses); for gamma irradiation, a 60 Co facility was utilized. The radioprotective compound AET (S-(2-aminoethyl)isothiouronium bromide hydrobromide) and ixeprin (bis-alpha-propynyl-glycyl-sodium disulfide) were used. Nerobolyl (norandrostenolone phenylpropionate) and retabolyl (norandrostenolone decanoate) were studied among the anabolic hormones. Experiments were also conducted with retandrol (testosterone phenylpropionate), which does not belong to the anabolics but is used in oncology as a supporting agent. Three days prior to irradiation, groups of 15 animals began receiving intraperitoneal injections of nerobolyl (10 mg/kg) dissolved in oil for injection, retabolyl (50 mg/kg) or retandrol (25 mg/kg). Control groups received only 0.5 ml of oil intraperitoneally. In another series of experiments the hormones were also given after irradiation. The radioprotectors were introduced, as a rule, 20 min prior to the radiation exposure; ixeprin, as an effective radioprotector, was introduced 3 hours after irradiation. In evaluating the results of the experiments, the number of animals surviving 30 days after irradiation and the mortality rate were taken into consideration; gain in weight was also taken into account. A single administration of an anabolic in combination with a radioprotector (AET or ixeprin) usually did not increase the survival rate of irradiated animals

  19. Intravenous dextromethorphan/quinidine inhibits activity of dura-sensitive spinal trigeminal neurons in rats.

    Science.gov (United States)

    Sokolov, A Y; Lyubashina, O A; Berkovich, R R; Panteleev, S S

    2015-09-01

    Migraine is a chronic neurological disorder characterized by episodes of throbbing headaches. Practically all medications currently used in migraine prophylaxis have a number of substantial disadvantages and use limitations. Therefore, the further search for principally new prophylactic antimigraine agents remains an important task. The objective of our study was to evaluate the effects of a fixed combination of dextromethorphan hydrobromide and quinidine sulphate (DM/Q) on activity of the spinal trigeminal neurons in an electrophysiological model of trigemino-durovascular nociception. The study was performed in 15 male Wistar rats, which were anaesthetized with urethane/α-chloralose and paralysed using pipecuronium bromide. The effects of cumulative intravenous infusions of DM/Q (three steps performed 30 min apart, 15/7.5 mg/kg of DM/Q in 0.5 mL of isotonic saline per step) on ongoing and dural electrical stimulation-induced neuronal activities were tested in a group of eight rats over 90 min. Other seven animals received cumulative infusion of equal volumes of saline and served as control. Cumulative administration of DM/Q produced steady suppression of both the ongoing activity of the spinal trigeminal neurons and their responses to electrical stimulation of the dura mater. It is evident that the observed DM/Q-induced suppression of trigeminal neuron excitability can lead to a reduction in nociceptive transmission from meninges to higher centres of the brain. Since the same mechanism is believed to underlie the pharmacodynamics of many well-known antimigraine drugs, results of the present study enable us to anticipate the potential efficacy of DM/Q in migraine. © 2014 European Pain Federation - EFIC®

  20. Functional analysis

    CERN Document Server

    Kantorovich, L V

    1982-01-01

    Functional Analysis examines trends in functional analysis as a mathematical discipline and the ever-increasing role played by its techniques in applications. The theory of topological vector spaces is emphasized, along with the applications of functional analysis to applied analysis. Some topics of functional analysis connected with applications to mathematical economics and control theory are also discussed. Comprised of 18 chapters, this book begins with an introduction to the elements of the theory of topological spaces, the theory of metric spaces, and the theory of abstract measure space

  1. Activation analysis in food analysis. Pt. 9

    International Nuclear Information System (INIS)

    Szabo, S.A.

    1992-01-01

    An overview is presented of the application of activation analysis (AA) techniques to food analysis, as reflected at a recent international conference titled Activation Analysis and its Applications. The most popular analytical techniques include instrumental neutron AA (INAA or NAA), radiochemical NAA (RNAA), X-ray fluorescence analysis and mass spectrometry. Data are presented for the multielemental NAA of instant soups, the elemental composition of drinking water in Iraq, the Na, K and Mn contents of various Indian rices, the determination of As, Hg, Sb and Se in various seafoods, daily microelement intake in China, and the elemental composition of Chinese teas. Expected development trends in AA are outlined. (R.P.) 24 refs.; 8 tabs

  2. Cross-impacts analysis development and energy policy analysis applications

    Energy Technology Data Exchange (ETDEWEB)

    Roop, J.M.; Scheer, R.M.; Stacey, G.S.

    1986-12-01

    The purpose of this report is to describe the cross-impact analysis process and the microcomputer software developed for the Office of Policy, Planning, and Analysis (PPA) of DOE. First introduced in 1968, cross-impact analysis is a technique that produces scenarios of future conditions and possibilities. Cross-impact analysis has several unique attributes that make it a tool worth examining, especially in the current climate when the outlook for the economy and several of the key energy markets is uncertain. Cross-impact analysis complements the econometric, engineering, systems dynamics, or trend approaches already in use at DOE. Cross-impact analysis produces self-consistent scenarios in the broadest sense and can include interaction between the economy, technology, society and the environment. Energy policy analyses that couple broad scenarios of the future with detailed forecasting can produce more powerful results than scenario analysis or forecasts can produce alone.

  3. DTI analysis methods : Voxel-based analysis

    NARCIS (Netherlands)

    Van Hecke, Wim; Leemans, Alexander; Emsell, Louise

    2016-01-01

    Voxel-based analysis (VBA) of diffusion tensor imaging (DTI) data permits the investigation of voxel-wise differences or changes in DTI metrics in every voxel of a brain dataset. It is applied primarily in the exploratory analysis of hypothesized group-level alterations in DTI parameters, as it does

  4. Image analysis

    International Nuclear Information System (INIS)

    Berman, M.; Bischof, L.M.; Breen, E.J.; Peden, G.M.

    1994-01-01

    This paper provides an overview of modern image analysis techniques pertinent to materials science. The usual approach in image analysis contains two basic steps: first, the image is segmented into its constituent components (e.g. individual grains), and second, measurement and quantitative analysis are performed. Usually, the segmentation part of the process is the harder of the two. Consequently, much of the paper concentrates on this aspect, reviewing both fundamental segmentation tools (commonly found in commercial image analysis packages) and more advanced segmentation tools. There is also a review of the most widely used quantitative analysis methods for measuring the size, shape and spatial arrangements of objects. Many of the segmentation and analysis methods are demonstrated using complex real-world examples. Finally, there is a discussion of hardware and software issues. 42 refs., 17 figs

  5. Common pitfalls in statistical analysis: Linear regression analysis

    Directory of Open Access Journals (Sweden)

    Rakesh Aggarwal

    2017-01-01

    In a previous article in this series, we explained correlation analysis, which describes the strength of the relationship between two continuous variables. In this article, we deal with linear regression analysis, which predicts the value of one continuous variable from another. We also discuss the assumptions and pitfalls associated with this analysis.
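The regression technique this record covers can be illustrated with a minimal sketch (all data synthetic and hypothetical): fitting a line by ordinary least squares and running one basic pitfall check on the residuals.

```python
import numpy as np

# Synthetic example of linear regression: predict y from x and inspect the fit.
rng = np.random.default_rng(0)
x = rng.uniform(150, 190, 50)             # e.g. heights in cm
y = 0.9 * x - 100 + rng.normal(0, 5, 50)  # linear signal plus noise

# Fit y = a*x + b by ordinary least squares.
A = np.column_stack([x, np.ones_like(x)])
(a, b), *_ = np.linalg.lstsq(A, y, rcond=None)

# One pitfall check: residuals should show no systematic pattern;
# correlation with x**2 here would suggest unmodelled curvature.
residuals = y - (a * x + b)
curvature = np.corrcoef(residuals, x ** 2)[0, 1]
print(f"slope={a:.3f}, intercept={b:.3f}, curvature corr={curvature:.3f}")
```

With genuinely linear data the recovered slope is close to the true 0.9 and the residual diagnostics show no structure; on curved data the same check would flag the model as inadequate.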

  6. CSF analysis

    Science.gov (United States)

    Cerebrospinal fluid (CSF) analysis can help detect certain conditions and diseases. An abnormal CSF analysis result may be due to many different causes, such as encephalitis (including West Nile and Eastern Equine) or hepatic ...

  7. Semen analysis

    Science.gov (United States)

    Semen analysis (//medlineplus.gov/ency/article/003627.htm) measures the amount and quality of a man's semen and sperm. Semen is ...

  8. Models of Economic Analysis

    OpenAIRE

    Adrian Ioana; Tiberiu Socaciu

    2013-01-01

    The article presents specific aspects of management and models for economic analysis. Thus, we present the main types of economic analysis: statistical analysis, dynamic analysis, static analysis, mathematical analysis, psychological analysis. We also present the main objects of the analysis: the technological activity analysis of a company, the analysis of production costs, the economic activity analysis of a company, the analysis of equipment, the analysis of labor productivity, the anal...

  9. Limestone rocks analysis by X-ray fluorescence analysis

    International Nuclear Information System (INIS)

    Izquierdo M, G.; Ponce R, R.; Vazquez J, J.

    1996-01-01

    At the request of a private company, a fast and accurate method for the analysis of the major elements in limestone rocks was established, based primarily on X-ray fluorescence (XRF) analysis. As complementary analyses, chlorides were determined by ion chromatography and sodium by atomic absorption. Losses on ignition and alpha quartz were determined gravimetrically. (Author)

  10. Sensitivity analysis and related analysis : A survey of statistical techniques

    NARCIS (Netherlands)

    Kleijnen, J.P.C.

    1995-01-01

    This paper reviews the state of the art in five related types of analysis, namely (i) sensitivity or what-if analysis, (ii) uncertainty or risk analysis, (iii) screening, (iv) validation, and (v) optimization. The main question is: when should which type of analysis be applied; which statistical

  11. Emergy-Based Regional Socio-Economic Metabolism Analysis: An Application of Data Envelopment Analysis and Decomposition Analysis

    OpenAIRE

    Zilong Zhang; Xingpeng Chen; Peter Heck

    2014-01-01

    Integrated analysis on socio-economic metabolism could provide a basis for understanding and optimizing regional sustainability. The paper conducted socio-economic metabolism analysis by means of the emergy accounting method coupled with data envelopment analysis and decomposition analysis techniques to assess the sustainability of Qingyang city and its eight sub-region system, as well as to identify the major driving factors of performance change during 2000–2007, to serve as the basis for f...

  12. failure analysis of a uav flight control system using markov analysis

    African Journals Online (AJOL)

    eobe

    2016-01-01

    Jan 1, 2016 ... Fault Tree Analysis (FTA), Dependence Diagram Analysis (DDA) and Markov Analysis (MA) are the most widely used methods of probabilistic safety and reliability analysis for airborne systems [1]. Fault tree analysis is a backward failure-searching ..... [4] Christopher Dabrowski and Fern Hunt, Markov Chain.

  13. Incidents analysis

    International Nuclear Information System (INIS)

    Francois, P.

    1996-01-01

    We undertook a study programme at the end of 1991. To start with, we performed some exploratory studies aimed at learning some preliminary lessons on this type of analysis: assessment of the interest of probabilistic incident analysis; possibility of using PSA scenarios; skills and resources required. At the same time, EPN created a working group whose assignment was to define a new approach for the analysis of incidents on NPPs. This working group gave thought to both aspects of Operating Feedback that EPN wished to improve: analysis of significant incidents; analysis of potential consequences. We took part in the work of this group and, for the second aspect, we proposed a method based on an adaptation of the event-tree method in order to establish a link between existing PSA models and actual incidents. Since PSA provides an exhaustive database of accident scenarios applicable to the two most common types of units in France, these scenarios are obviously of interest for this sort of analysis. With this method we performed some incident analyses, and at the same time explored some methods employed abroad, particularly ASP (Accident Sequence Precursor, a method used by the NRC). Early in 1994 EDF began a systematic analysis programme. The first, transient phase will set up methods and an organizational structure. 7 figs

  14. Incidents analysis

    Energy Technology Data Exchange (ETDEWEB)

    Francois, P

    1997-12-31

    We undertook a study programme at the end of 1991. To start with, we performed some exploratory studies aimed at learning some preliminary lessons on this type of analysis: assessment of the interest of probabilistic incident analysis; possibility of using PSA scenarios; skills and resources required. At the same time, EPN created a working group whose assignment was to define a new approach for the analysis of incidents on NPPs. This working group gave thought to both aspects of Operating Feedback that EPN wished to improve: analysis of significant incidents; analysis of potential consequences. We took part in the work of this group and, for the second aspect, we proposed a method based on an adaptation of the event-tree method in order to establish a link between existing PSA models and actual incidents. Since PSA provides an exhaustive database of accident scenarios applicable to the two most common types of units in France, these scenarios are obviously of interest for this sort of analysis. With this method we performed some incident analyses, and at the same time explored some methods employed abroad, particularly ASP (Accident Sequence Precursor, a method used by the NRC). Early in 1994 EDF began a systematic analysis programme. The first, transient phase will set up methods and an organizational structure. 7 figs.

  15. RELIABILITY ANALYSIS OF BENDING ...

    African Journals Online (AJOL)

    eobe

    Reliability analysis of the safety levels of the criteria slabs has been .... It was also noted [2] that if the risk level (β < 3.1), the ... reliability analysis. A study [6] has shown that all geometric variables, ..... Germany, 1988. 12. Hasofer, A. M and ...

  16. Safety analysis fundamentals

    International Nuclear Information System (INIS)

    Wright, A.C.D.

    2002-01-01

    This paper discusses the fundamentals of safety analysis in reactor design. Safety analysis is done to show that the consequences of postulated accidents are acceptable. It is also used to set the design of special safety systems, and includes design-assist analysis to support conceptual design. Safety analysis is necessary for licensing a reactor and maintaining an operating license, and it supports changes in plant operations

  17. Visual physics analysis-from desktop to physics analysis at your fingertips

    International Nuclear Information System (INIS)

    Bretz, H-P; Erdmann, M; Fischer, R; Hinzmann, A; Klingebiel, D; Komm, M; Lingemann, J; Rieger, M; Müller, G; Steggemann, J; Winchen, T

    2012-01-01

    Visual Physics Analysis (VISPA) is an analysis environment with applications in high energy and astroparticle physics. Based on a data-flow-driven paradigm, it allows users to combine graphical steering with self-written C++ and Python modules. This contribution presents new concepts integrated in VISPA: layers, convenient analysis execution, and web-based physics analysis. While convenient execution offers full flexibility to vary settings for the execution phase of an analysis, layers allow users to create different views of the analysis already during its design phase. Thus, one application of layers is to define different stages of an analysis (e.g. event selection and statistical analysis). There are also other use cases, such as independently optimizing settings for different types of input data in order to guide all data through the same analysis flow. The new execution feature makes job submission to local clusters as well as the LHC Computing Grid possible directly from VISPA. Web-based physics analysis is realized in the VISPA-Web project, which represents a whole new way to design and execute analyses via a standard web browser.

  18. Experimental modal analysis

    Energy Technology Data Exchange (ETDEWEB)

    Ibsen, Lars Bo; Liingaard, M.

    2006-12-15

    This technical report concerns the basic theory and principles for experimental modal analysis. The sections within the report are: Output-only modal analysis software, general digital analysis, basics of structural dynamics and modal analysis and system identification. (au)

  19. Analysis of Precision of Activation Analysis Method

    DEFF Research Database (Denmark)

    Heydorn, Kaj; Nørgaard, K.

    1973-01-01

    The precision of an activation-analysis method prescribes the estimation of the precision of a single analytical result. The adequacy of these estimates to account for the observed variation between duplicate results from the analysis of different samples and materials, is tested by the statistic T...
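The duplicate-based precision test this record refers to (the statistic is truncated in the excerpt) can be illustrated with a chi-square-type comparison of duplicate results against their stated uncertainties. The numbers below are hypothetical and are not taken from the cited paper.

```python
# Hypothetical duplicate results, each given as (value, stated uncertainty),
# from the analysis of three different samples.
duplicates = [
    ((10.2, 0.3), (10.5, 0.3)),
    ((8.1, 0.2), (8.0, 0.25)),
    ((15.4, 0.5), (14.8, 0.4)),
]

# For each pair, the squared difference divided by its expected variance
# should behave like a chi-square variate with one degree of freedom if
# the stated uncertainties fully account for the observed scatter.
T = sum((a - b) ** 2 / (sa ** 2 + sb ** 2)
        for (a, sa), (b, sb) in duplicates)
dof = len(duplicates)
print(f"T = {T:.2f} on {dof} degrees of freedom")
```

A value of T far above the chi-square expectation (here, its mean equals the degrees of freedom) would indicate that the quoted uncertainties underestimate the real variation between duplicates.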

  20. Urban energy consumption: Different insights from energy flow analysis, input–output analysis and ecological network analysis

    International Nuclear Information System (INIS)

    Chen, Shaoqing; Chen, Bin

    2015-01-01

    Highlights: • Urban energy consumption was assessed from three different perspectives. • A new concept called controlled energy was developed from network analysis. • Embodied energy and controlled energy consumption of Beijing were compared. • The integration of all three perspectives will elucidate sustainable energy use. - Abstract: Energy consumption has always been a central issue for sustainable urban assessment and planning. Different forms of energy analysis can provide various insights for energy policy making. This paper brought together three approaches for energy consumption accounting, i.e., energy flow analysis (EFA), input–output analysis (IOA) and ecological network analysis (ENA), and compared their different perspectives and the policy implications for urban energy use. Beijing was used to exemplify the different energy analysis processes, and the 42 economic sectors of the city were aggregated into seven components. It was determined that EFA quantifies both the primary and final energy consumption of the urban components by tracking the different types of fuel used by the urban economy. IOA accounts for the embodied energy consumption (direct and indirect) used to produce goods and services in the city, whereas the control analysis of ENA quantifies the specific embodied energy that is regulated by the activities within the city’s boundary. The network control analysis can also be applied to determining which economic sectors drive the energy consumption and to what extent these sectors are dependent on each other for energy. So-called “controlled energy” is a new concept that adds to the analysis of urban energy consumption, indicating the adjustable energy consumed by sectors. The integration of insights from all three accounting perspectives further our understanding of sustainable energy use in cities
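The input-output accounting (IOA) of embodied energy mentioned above rests on the Leontief inverse: total (direct plus indirect) energy intensities follow from the direct intensities and the inter-sector technology matrix. A minimal sketch with a hypothetical 3-sector economy (the numbers are invented, not Beijing data):

```python
import numpy as np

# Illustrative 3-sector economy (hypothetical coefficients).
# A[i, j] = input from sector i needed per unit output of sector j.
A = np.array([
    [0.10, 0.20, 0.05],
    [0.15, 0.05, 0.10],
    [0.05, 0.10, 0.15],
])
# f[j] = direct energy use per unit output of sector j (e.g. MJ per unit).
f = np.array([2.0, 0.5, 1.0])

# Embodied energy intensities: direct plus all indirect requirements,
# epsilon = f (I - A)^(-1), where (I - A)^(-1) is the Leontief inverse.
epsilon = f @ np.linalg.inv(np.eye(3) - A)
print(np.round(epsilon, 3))
```

Because the Leontief inverse sums all rounds of indirect inputs, each embodied intensity exceeds the corresponding direct intensity, which is exactly the gap between the EFA and IOA perspectives contrasted in the abstract.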

  1. CADDIS Volume 4. Data Analysis: Exploratory Data Analysis

    Science.gov (United States)

    Intro to exploratory data analysis. Overview of variable distributions, scatter plots, correlation analysis, GIS datasets. Use of conditional probability to examine stressor levels and impairment. Exploring correlations among multiple stressors.

  2. Social network analysis applied to team sports analysis

    CERN Document Server

    Clemente, Filipe Manuel; Mendes, Rui Sousa

    2016-01-01

    Explaining how graph theory and social network analysis can be applied to team sports analysis, this book presents useful approaches, models and methods that can be used to characterise the overall properties of team networks and identify the prominence of each team player. Exploring the different possible network metrics that can be utilised in sports analysis, their possible applications and variances from situation to situation, the respective chapters present an array of illustrative case studies. Identifying the general concepts of social network analysis and network centrality metrics, readers are shown how to generate a methodological protocol for data collection. As such, the book provides a valuable resource for students of the sport sciences, sports engineering, applied computation and the social sciences.
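A centrality metric of the kind the book describes can be computed directly from a pass matrix; a small sketch with invented pass counts for a five-player team:

```python
# Hypothetical pass counts between five players (row passes to column).
passes = [
    [0, 12, 3, 5, 1],
    [10, 0, 8, 2, 4],
    [2, 9, 0, 6, 7],
    [4, 1, 5, 0, 3],
    [0, 3, 6, 2, 0],
]

# Weighted degree centrality: total passes sent plus received per player.
n = len(passes)
centrality = [sum(passes[i]) + sum(passes[j][i] for j in range(n))
              for i in range(n)]
for player, c in enumerate(centrality, start=1):
    print(f"player {player}: degree centrality {c}")
```

Here player 2 emerges as the most central passer; richer metrics (betweenness, eigenvector centrality) refine the same basic idea of reading prominence off the team's pass network.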

  3. Qualitative Content Analysis

    OpenAIRE

    Philipp Mayring

    2000-01-01

    The article describes an approach of systematic, rule guided qualitative text analysis, which tries to preserve some methodological strengths of quantitative content analysis and widen them to a concept of qualitative procedure. First the development of content analysis is delineated and the basic principles are explained (units of analysis, step models, working with categories, validity and reliability). Then the central procedures of qualitative content analysis, inductive development of ca...

  4. Quantitative analysis chemistry

    International Nuclear Information System (INIS)

    Ko, Wansuk; Lee, Choongyoung; Jun, Kwangsik; Hwang, Taeksung

    1995-02-01

    This book is about quantitative analytical chemistry. It is divided into ten chapters, which deal with the basic concepts of analytical chemistry and SI units, chemical equilibrium, basic preparations for quantitative analysis, an introduction to volumetric analysis, acid-base titration with an outline and example experiments, chelate titration, oxidation-reduction titration (introduction, titration curves, and diazotization titration), precipitation titration, electrometric titration, and quantitative analysis.
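The volumetric (titration) calculations the book covers reduce to equating moles of acid and base at the equivalence point; a minimal worked sketch with hypothetical numbers:

```python
# Acid-base titration with 1:1 stoichiometry: at the equivalence point,
# moles of acid equal moles of base, so C_a * V_a = C_b * V_b.

def analyte_concentration(c_titrant, v_titrant_ml, v_analyte_ml):
    """Concentration of the analyte from the titrant volume at equivalence."""
    return c_titrant * v_titrant_ml / v_analyte_ml

# Hypothetical run: 25.0 mL of HCl neutralized by 21.4 mL of 0.100 M NaOH.
c_hcl = analyte_concentration(0.100, 21.4, 25.0)
print(f"HCl concentration = {c_hcl:.4f} M")
```

The same relation, adjusted for stoichiometric ratios other than 1:1, underlies chelate, redox and precipitation titrations as well.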

  5. Information security risk analysis

    CERN Document Server

    Peltier, Thomas R

    2001-01-01

    Effective Risk Analysis; Qualitative Risk Analysis; Value Analysis; Other Qualitative Methods; Facilitated Risk Analysis Process (FRAP); Other Uses of Qualitative Risk Analysis; Case Study; Appendix A: Questionnaire; Appendix B: Facilitated Risk Analysis Process Forms; Appendix C: Business Impact Analysis Forms; Appendix D: Sample of Report; Appendix E: Threat Definitions; Appendix F: Other Risk Analysis Opinions; Index

  6. Standardizing Activation Analysis: New Software for Photon Activation Analysis

    Science.gov (United States)

    Sun, Z. J.; Wells, D.; Segebade, C.; Green, J.

    2011-06-01

    Photon Activation Analysis (PAA) of environmental, archaeological and industrial samples requires extensive data analysis that is susceptible to error. For the purpose of saving time and manpower and minimizing error, a computer program was designed, built and implemented using SQL, Access 2007 and asp.net technology to automate this process. Based on the peak information of the spectrum and assisted by its PAA library, the program automatically identifies elements in the samples and calculates their concentrations and respective uncertainties. The software can also be operated in browser/server mode, which makes it usable anywhere the internet is accessible. By switching the underlying nuclide library and the related formulas, the new software can easily be expanded to neutron activation analysis (NAA), charged particle activation analysis (CPAA) or proton-induced X-ray emission (PIXE). Implementation of this would standardize the analysis of nuclear activation data. Results from this software were compared to standard PAA analysis with excellent agreement. With minimum input from the user, the software has proven to be fast, user-friendly and reliable.
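The concentration calculation such activation-analysis software automates is typically a comparator computation: the sample's decay-corrected peak count rate is referenced to a standard of known concentration measured under the same conditions. The sketch below illustrates that idea only; the function names, the shared live time, and all numbers are assumptions for illustration, not the actual PAA program's internals.

```python
import math

def decay_corrected_rate(counts, live_time_s, cooling_time_s, half_life_s):
    """Peak count rate corrected back to the end of irradiation."""
    lam = math.log(2) / half_life_s
    return counts / live_time_s * math.exp(lam * cooling_time_s)

def concentration(sample_counts, std_counts, std_conc_ppm,
                  sample_mass_g, std_mass_g,
                  cooling_sample_s, cooling_std_s,
                  live_time_s, half_life_s):
    """Comparator method: scale the standard's known concentration by the
    ratio of decay-corrected specific count rates."""
    r_sample = decay_corrected_rate(sample_counts, live_time_s,
                                    cooling_sample_s, half_life_s)
    r_std = decay_corrected_rate(std_counts, live_time_s,
                                 cooling_std_s, half_life_s)
    return std_conc_ppm * (r_sample / sample_mass_g) / (r_std / std_mass_g)

# Hypothetical measurement against a 50 ppm standard.
c = concentration(5200, 9800, 50.0, 1.2, 1.0, 3600, 1800, 600, 15 * 3600)
print(f"estimated concentration ~ {c:.1f} ppm")
```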

  7. Standardizing Activation Analysis: New Software for Photon Activation Analysis

    International Nuclear Information System (INIS)

    Sun, Z. J.; Wells, D.; Green, J.; Segebade, C.

    2011-01-01

    Photon Activation Analysis (PAA) of environmental, archaeological and industrial samples requires extensive data analysis that is susceptible to error. For the purpose of saving time, manpower and minimizing error, a computer program was designed, built and implemented using SQL, Access 2007 and asp.net technology to automate this process. Based on the peak information of the spectrum and assisted by its PAA library, the program automatically identifies elements in the samples and calculates their concentrations and respective uncertainties. The software also could be operated in browser/server mode, which gives the possibility to use it anywhere the internet is accessible. By switching the nuclide library and the related formula behind, the new software can be easily expanded to neutron activation analysis (NAA), charged particle activation analysis (CPAA) or proton-induced X-ray emission (PIXE). Implementation of this would standardize the analysis of nuclear activation data. Results from this software were compared to standard PAA analysis with excellent agreement. With minimum input from the user, the software has proven to be fast, user-friendly and reliable.

  8. Radioactivation analysis

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1959-07-15

    Radioactivation analysis is the technique of analysing the constituents of a very small sample of matter by making the sample artificially radioactive. The first stage is to make the sample radioactive by artificial means, e.g. by subjecting it to neutron bombardment. Once the sample has been activated, or made radioactive, the next task is to analyse the radiations given off by it. This analysis indicates the nature and quantities of the various elements present in the sample, because the radiation from a particular radioisotope is characteristic of that isotope. In 1959 a symposium on 'Radioactivation Analysis' was organized in Vienna by the IAEA and the Joint Commission on Applied Radioactivity (ICSU). It was pointed out that certain factors create uncertainties, and ways of overcoming them were elaborated. Attention was drawn to the fact that radioactivation analysis had proven a powerful tool for tackling fundamental problems in geo- and cosmochemistry, and a review was given of the recent work in this field. Because of its extreme sensitivity, radioactivation analysis had principally been employed for trace detection, and its most extensive use had been in the control of semiconductors and very pure metals. An account was given of the experience gained in the USA, where radioactivation analysis was being used by many investigators in various scientific fields as a practical and useful tool for elemental analyses. Much of this work had been concerned with determining sub-microgramme and microgramme concentrations of many different elements in samples of biological materials, drugs, fertilizers, fine chemicals, foods, fuels, glass, ceramic materials, metals, minerals, paints, petroleum products, resinous materials, soils, toxicants, water and other materials. In addition to these studies, radioactivation analysis had been used by other investigators to determine isotopic ratios of the stable isotopes of some of the elements. Another paper dealt with radioactivation
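The activation step described above is governed by the textbook activation equation, A = N·σ·φ·(1 − e^(−λ·t_irr)); a sketch with rounded nuclear data for sodium (values approximate, for illustration only):

```python
import math

AVOGADRO = 6.022e23

def induced_activity(mass_g, molar_mass, abundance,
                     sigma_cm2, flux, half_life_s, t_irr_s):
    """Activity (Bq) at the end of irradiation:
    A = N * sigma * phi * (1 - exp(-lambda * t_irr)),
    with N target atoms, sigma the capture cross-section (cm^2),
    phi the neutron flux (n/cm^2/s), lambda the decay constant."""
    n_atoms = mass_g / molar_mass * AVOGADRO * abundance
    lam = math.log(2) / half_life_s
    return n_atoms * sigma_cm2 * flux * (1.0 - math.exp(-lam * t_irr_s))

# 1 mg of sodium (23Na, ~100 % abundant), sigma ~ 0.53 barn,
# flux 1e13 n/cm^2/s, 24Na half-life ~ 15 h, irradiated for 1 h.
a_bq = induced_activity(1e-3, 23.0, 1.0, 0.53e-24, 1e13, 15 * 3600, 3600)
print(f"induced activity ~ {a_bq:.2e} Bq")
```

Megabecquerel-level activities from a milligram of material are what give the method the extreme trace-level sensitivity the symposium highlighted.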

  9. Radioactivation analysis

    International Nuclear Information System (INIS)

    1959-01-01

    Radioactivation analysis is the technique of analysing the constituents of a very small sample of matter by making the sample artificially radioactive. The first stage is to make the sample radioactive by artificial means, e.g. by subjecting it to neutron bombardment. Once the sample has been activated, or made radioactive, the next task is to analyse the radiations given off by it. This analysis indicates the nature and quantities of the various elements present in the sample, because the radiation from a particular radioisotope is characteristic of that isotope. In 1959 a symposium on 'Radioactivation Analysis' was organized in Vienna by the IAEA and the Joint Commission on Applied Radioactivity (ICSU). It was pointed out that certain factors create uncertainties, and ways of overcoming them were elaborated. Attention was drawn to the fact that radioactivation analysis had proven a powerful tool for tackling fundamental problems in geo- and cosmochemistry, and a review was given of the recent work in this field. Because of its extreme sensitivity, radioactivation analysis had principally been employed for trace detection, and its most extensive use had been in the control of semiconductors and very pure metals. An account was given of the experience gained in the USA, where radioactivation analysis was being used by many investigators in various scientific fields as a practical and useful tool for elemental analyses. Much of this work had been concerned with determining sub-microgramme and microgramme concentrations of many different elements in samples of biological materials, drugs, fertilizers, fine chemicals, foods, fuels, glass, ceramic materials, metals, minerals, paints, petroleum products, resinous materials, soils, toxicants, water and other materials. In addition to these studies, radioactivation analysis had been used by other investigators to determine isotopic ratios of the stable isotopes of some of the elements. Another paper dealt with radioactivation

  10. Uranium Isotopic Analysis with the FRAM Isotopic Analysis Code

    International Nuclear Information System (INIS)

    Vo, D.T.; Sampson, T.E.

    1999-01-01

    FRAM is the acronym for Fixed-Energy Response-Function Analysis with Multiple efficiencies. This software was developed at Los Alamos National Laboratory, originally for plutonium isotopic analysis; later, it was adapted for uranium isotopic analysis in addition to plutonium. It is a code based on self-calibration, using several gamma-ray peaks to determine the isotopic ratios. The versatile parameter-database structure governs all facets of the data analysis. User editing of the parameter sets allows great flexibility in handling data with different isotopic distributions, interfering isotopes, and different acquisition parameters such as energy calibration and detector type

  11. Real analysis a comprehensive course in analysis, part 1

    CERN Document Server

    Simon, Barry

    2015-01-01

    A Comprehensive Course in Analysis by Poincaré Prize winner Barry Simon is a five-volume set that can serve as a graduate-level analysis textbook with a lot of additional bonus information, including hundreds of problems and numerous notes that extend the text and provide important historical background. Depth and breadth of exposition make this set a valuable reference source for almost all areas of classical analysis. Part 1 is devoted to real analysis. From one point of view, it presents the infinitesimal calculus of the twentieth century with the ultimate integral calculus (measure theory)

  12. Text analysis methods, text analysis apparatuses, and articles of manufacture

    Science.gov (United States)

    Whitney, Paul D; Willse, Alan R; Lopresti, Charles A; White, Amanda M

    2014-10-28

    Text analysis methods, text analysis apparatuses, and articles of manufacture are described according to some aspects. In one aspect, a text analysis method includes accessing information indicative of data content of a collection of text comprising a plurality of different topics, using a computing device, analyzing the information indicative of the data content, and using results of the analysis, identifying a presence of a new topic in the collection of text.

  13. NCEP SST Analysis

    Science.gov (United States)

    The Marine Modeling and Analysis Branch (MMAB) real-time global SST (RTG_SST_HR) analysis. For a regional map, click the desired area in the global SST analysis and anomaly maps

  14. Analysis of Heat Transfer

    International Nuclear Information System (INIS)

    2003-08-01

    This book deals with the analysis of heat transfer, including nonlinear analysis examples, radiation heat transfer, heat transfer analysis in ANSYS, verification of analysis results, transient heat transfer analysis with automatic time stepping and output control, heat transfer analysis using ANSYS arrays, thermal contact resistance, coupled-field analysis such as thermal-structural interaction, cases of coupled-field analysis, and phase change.

  15. Combining Conversation Analysis and Nexus Analysis to explore hospital practices

    DEFF Research Database (Denmark)

    Paasch, Bettina Sletten

    , ethnographic observations, interviews, photos and documents were obtained. Inspired by the analytical manoeuvre of zooming in and zooming out proposed by Nicolini (Nicolini, 2009; Nicolini, 2013), the present study uses Conversation Analysis (Sacks, Schegloff, & Jefferson, 1974) and Embodied Interaction...... of interaction. In the interviews, nurses report that mobile work phones disturb interactions with patients when they ring; however, analysing the recorded interactions with tools from Conversation Analysis and Embodied Interaction Analysis displays how nurses demonstrate sophisticated awareness...... interrelationships influencing it. The present study thus showcases how Conversation Analysis and Nexus Analysis can be combined to achieve a multi-layered perspective on interactions between nurses, patients and mobile work phones....

  16. http Log Analysis

    DEFF Research Database (Denmark)

    Bøving, Kristian Billeskov; Simonsen, Jesper

    2004-01-01

    This article documents how log analysis can inform qualitative studies concerning the usage of web-based information systems (WIS). No prior research has used http log files as data to study collaboration between multiple users in organisational settings. We investigate how to perform http log...... analysis; what http log analysis says about the nature of collaborative WIS use; and how results from http log analysis may support other data collection methods such as surveys, interviews, and observation. The analysis of log files initially lends itself to research designs, which serve to test...... hypotheses using a quantitative methodology. We show that http log analysis can also be valuable in qualitative research such as case studies. The results from http log analysis can be triangulated with other data sources and for example serve as a means of supporting the interpretation of interview data...
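A minimal illustration of the http log analysis described here: parsing Common Log Format lines and counting requests per client. The regex and sample lines are invented for illustration, not taken from the study's data.

```python
import re
from collections import Counter

# Common Log Format: host, identity, user, timestamp, request, status, size.
LOG_LINE = re.compile(r'(?P<host>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
                      r'"(?P<request>[^"]*)" (?P<status>\d{3}) (?P<size>\S+)')

sample_log = """\
10.0.0.1 - - [12/Mar/2004:10:01:02 +0100] "GET /projects/index.html HTTP/1.1" 200 5120
10.0.0.2 - - [12/Mar/2004:10:01:09 +0100] "GET /projects/doc.pdf HTTP/1.1" 200 88214
10.0.0.1 - - [12/Mar/2004:10:02:41 +0100] "POST /projects/upload HTTP/1.1" 302 -
"""

# Count requests per client host.
hits = Counter()
for line in sample_log.splitlines():
    m = LOG_LINE.match(line)
    if m:
        hits[m.group("host")] += 1
print(hits.most_common())
```

In a qualitative design of the kind the article advocates, such counts would not stand alone but would be triangulated with interviews and observation of the same users.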

  17. Importance-performance analysis based SWOT analysis

    OpenAIRE

    Phadermrod, Boonyarat; Crowder, Richard M.; Wills, Gary B.

    2016-01-01

    SWOT analysis, a commonly used tool for strategic planning, is traditionally a form of brainstorming. Hence, it has been criticised that it is likely to hold subjective views of the individuals who participate in a brainstorming session and that SWOT factors are not prioritized by their significance thus it may result in an improper strategic action. While most studies of SWOT analysis have only focused on solving these shortcomings separately, this study offers an approach to diminish both s...
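One way to picture the prioritization step is the classic Importance-Performance Analysis (IPA) grid, in which each SWOT factor is scored for importance and performance and then placed in one of four quadrants relative to the scale midpoint. This is a hedged sketch of the general IPA technique, not the authors' specific procedure; the factor names and scores are invented.

```python
# Classify a factor into one of the four classic IPA quadrants,
# using the midpoint of the rating scale as the crosshair.
def ipa_quadrant(importance, performance, midpoint=3.0):
    if importance >= midpoint:
        return "Concentrate here" if performance < midpoint else "Keep up the good work"
    return "Low priority" if performance < midpoint else "Possible overkill"

# Invented (importance, performance) scores on a 1-5 scale.
factors = {
    "brand reputation": (4.5, 4.2),
    "delivery speed": (4.1, 2.3),
    "packaging design": (2.2, 1.9),
}
for name, (imp, perf) in factors.items():
    print(name, "->", ipa_quadrant(imp, perf))
```

Factors landing in "Concentrate here" (high importance, low performance) are the ones a prioritized SWOT would flag first for strategic action.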

  18. Neutron activation analysis of high-purity iron in comparison with chemical analysis

    International Nuclear Information System (INIS)

    Kinomura, Atsushi; Horino, Yuji; Takaki, Seiichi; Abiko, Kenji

    2000-01-01

    Neutron activation analysis of iron samples at three different purity levels has been performed and compared with chemical analysis for 30 metallic and metalloid impurity elements. The concentrations of As, Cl, Cu, Sb and V detected by neutron activation analysis were mostly in agreement with those obtained by chemical analysis. The sensitivity limits of neutron activation analysis for the three kinds of iron samples were calculated and found to be reasonable compared with the measured values or detection limits of chemical analysis; however, most of them were above the detection limits of chemical analysis. Graphite-shielded irradiation to suppress fast-neutron reactions was effective for Mn analysis without decreasing sensitivity to the other impurity elements. (author)

  19. Sensitivity analysis for matched pair analysis of binary data: From worst case to average case analysis.

    Science.gov (United States)

    Hasegawa, Raiden; Small, Dylan

    2017-12-01

    In matched observational studies where treatment assignment is not randomized, sensitivity analysis helps investigators determine how sensitive their estimated treatment effect is to some unmeasured confounder. The standard approach calibrates the sensitivity analysis according to the worst case bias in a pair. This approach will result in a conservative sensitivity analysis if the worst case bias does not hold in every pair. In this paper, we show that for binary data, the standard approach can be calibrated in terms of the average bias in a pair rather than worst case bias. When the worst case bias and average bias differ, the average bias interpretation results in a less conservative sensitivity analysis and more power. In many studies, the average case calibration may also carry a more natural interpretation than the worst case calibration and may also allow researchers to incorporate additional data to establish an empirical basis with which to calibrate a sensitivity analysis. We illustrate this with a study of the effects of cellphone use on the incidence of automobile accidents. Finally, we extend the average case calibration to the sensitivity analysis of confidence intervals for attributable effects. © 2017, The International Biometric Society.
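The worst-case calibration that the paper takes as its starting point can be sketched as follows. This is a Rosenbaum-style worst-case bound for matched binary pairs (not the authors' average-case method): among n discordant pairs, t favor treatment; if the unmeasured bias is at most Gamma, the chance that a discordant pair favors treatment is at most Gamma/(1+Gamma), giving a binomial upper bound on the one-sided p-value. All numbers are illustrative.

```python
from math import comb

def worst_case_pvalue(t, n, gamma):
    """Upper bound on the one-sided p-value for t of n discordant pairs
    favoring treatment, under sensitivity parameter gamma >= 1."""
    p = gamma / (1.0 + gamma)  # worst-case per-pair probability
    return sum(comb(n, k) * p**k * (1 - p) ** (n - k) for k in range(t, n + 1))

print(round(worst_case_pvalue(15, 20, 1.0), 4))  # 0.0207 (no hidden bias: McNemar-type test)
print(round(worst_case_pvalue(15, 20, 2.0), 4))  # larger bound under moderate bias
```

As gamma grows, the bound weakens; the paper's contribution is that calibrating gamma to the average bias in a pair, rather than this worst case, yields a less conservative analysis.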

  20. HAZARD ANALYSIS SOFTWARE

    International Nuclear Information System (INIS)

    Sommer, S; Tinh Tran, T.

    2008-01-01

    Washington Safety Management Solutions, LLC developed web-based software to improve the efficiency and consistency of hazard identification and analysis, control selection and classification, and to standardize analysis reporting at the Savannah River Site. In the new nuclear age, information technology provides methods to improve the efficiency of the documented safety analysis development process, which includes hazard analysis activities. This software provides a web interface that interacts with a relational database to support analysis, record data, and ensure reporting consistency. A team of subject matter experts participated in a series of meetings to review the associated processes and procedures for requirements and standard practices. Through these meetings, a set of software requirements was developed and compiled into a requirements traceability matrix from which the software could be developed. The software was tested to ensure compliance with the requirements. Training was provided to the hazard analysis leads. Hazard analysis teams using the software have verified its operability. The software has been classified as NQA-1, Level D, as it supports the analysis team but does not perform the analysis. The software can be transported to other sites with alternate risk schemes. The software is being used to support the development of 14 hazard analyses. User responses have been positive, with a number of suggestions for improvement, which are being incorporated as time permits. The software has enforced a uniform implementation of the site procedures. The software has significantly improved the efficiency and standardization of the hazard analysis process.

  1. Mathematical analysis fundamentals

    CERN Document Server

    Bashirov, Agamirza

    2014-01-01

    The author's goal is a rigorous presentation of the fundamentals of analysis, starting from the elementary level and moving to advanced coursework. The curricula of all mathematics (pure or applied) and physics programs include a compulsory course in mathematical analysis. This book can serve as the main textbook for such (one-semester) courses. The book can also serve as additional reading for courses such as real analysis, functional analysis and harmonic analysis. For non-math-major students requiring math beyond calculus, this is a more friendly approach than many math-centric o

  2. XML-based analysis interface for particle physics data analysis

    International Nuclear Information System (INIS)

    Hu Jifeng; Lu Xiaorui; Zhang Yangheng

    2011-01-01

    The letter focuses on an XML-based interface and its framework for particle physics data analysis. The interface uses a concise XML syntax to describe the basic tasks of data analysis (event selection, kinematic fitting, particle identification, etc.) and a basic processing logic: the next step goes on if and only if this step succeeds. The framework can perform an analysis without compiling, by loading the XML interface file, setting parameters at run time and running dynamically. Coding an analysis in XML instead of C++ is easy to understand and use, effectively reduces the workload, and enables users to carry out their analyses quickly. The framework has been developed on the BESIII offline software system (BOSS) with object-oriented C++ programming. The functions required by the regular tasks and the basic processing logic are implemented either as standard modules or are inherited from modules in BOSS. The interface and its framework have been tested to perform physics analysis. (authors)
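The idea of an XML-driven analysis chain can be illustrated with a toy dispatcher. This is not the BOSS implementation (the tag names, attributes and cut values below are invented); it only shows the processing logic the abstract describes: each step runs in order, and the next step runs only if the current one succeeds.

```python
import xml.etree.ElementTree as ET

# A tiny, invented XML "analysis" description in the spirit of the abstract.
ANALYSIS_XML = """
<analysis>
  <eventSelection minTracks="2"/>
  <kinematicFit maxChi2="20"/>
  <particleID type="pion"/>
</analysis>
"""

def run(xml_text, event):
    """Run each step on a toy event dict; stop at the first failing step."""
    handlers = {
        "eventSelection": lambda a: event["ntracks"] >= int(a["minTracks"]),
        "kinematicFit":   lambda a: event["chi2"] <= float(a["maxChi2"]),
        "particleID":     lambda a: a["type"] in event["pid"],
    }
    for step in ET.fromstring(xml_text):
        if not handlers[step.tag](step.attrib):
            return step.tag  # report the step that failed
    return "accepted"

print(run(ANALYSIS_XML, {"ntracks": 3, "chi2": 8.2, "pid": {"pion"}}))   # accepted
print(run(ANALYSIS_XML, {"ntracks": 3, "chi2": 35.0, "pid": {"pion"}}))  # kinematicFit
```

Changing the analysis then means editing the XML, not recompiling C++, which is the workload reduction the letter claims.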

  3. Survival analysis using S analysis of time-to-event data

    CERN Document Server

    Tableman, Mara

    2003-01-01

    Survival Analysis Using S: Analysis of Time-to-Event Data is designed as a text for a one-semester or one-quarter course in survival analysis for upper-level or graduate students in statistics, biostatistics, and epidemiology. Prerequisites are a standard pre-calculus first course in probability and statistics, and a course in applied linear regression models. No prior knowledge of S or R is assumed. A wide choice of exercises is included, some intended for more advanced students with a first course in mathematical statistics. The authors emphasize parametric log-linear models, while also detailing nonparametric procedures along with model building and data diagnostics. Medical and public health researchers will find the discussion of cut point analysis with bootstrap validation, competing risks and the cumulative incidence estimator, and the analysis of left-truncated and right-censored data invaluable. The bootstrap procedure checks robustness of cut point analysis and determines cut point(s). In a chapter ...

  4. IAEA Review for Gap Analysis of Safety Analysis Capability

    International Nuclear Information System (INIS)

    Basic, Ivica; Kim, Manwoong; Huges, Peter; Lim, B-K; D'Auria, Francesco; Louis, Vidard Michael

    2014-01-01

    The IAEA Asian Nuclear Safety Network (ANSN) was launched in 2002 in the framework of the Extra Budgetary Programme (EBP) on the Safety of Nuclear Installations in the South East Asia, Pacific and Far East Countries. The main objective is to strengthen and expand a human and advanced Information Technology (IT) network to pool, analyse and share nuclear safety knowledge and practical experience for peaceful uses in this region. Under the ANSN framework, a technical group on Safety Analysis (SATG) was established in 2004, aimed at the following: to provide a forum for an exchange of experience in the area of safety analysis; to maintain and improve knowledge of safety analysis methods; to enhance the utilization of computer codes; to pool and analyse issues related to the safety analysis of research reactors; and to facilitate mutual interest in safety analysis among member countries. A sustainable and successful nuclear energy programme requires a strong technical infrastructure, including a workforce made up of highly specialized and well-educated professionals. A significant portion of this technical capacity must be dedicated to safety, especially to safety analysis, as only then can it serve as the basis for making the right decisions during the planning, licensing, construction and operation of new nuclear facilities. In this regard, the IAEA has provided ANSN member countries with comprehensive training opportunities for capacity building in safety analysis. Nevertheless, the SATG recognizes that it is difficult to achieve harmonization in this area among all member countries because of their different competency levels. Therefore, it is necessary to quickly identify the most obvious gaps in safety analysis capability and then to use existing resources to begin to fill those gaps. The goal of this Expert Mission (EM) for the gap-finding service is to facilitate

  5. Hydroeconomic analysis

    DEFF Research Database (Denmark)

    Bauer-Gottwein, Peter; Riegels, Niels; Pulido-Velazquez, Manuel

    2017-01-01

    Hydroeconomic analysis and modeling provides a consistent and quantitative framework to assess the links between water resources systems and economic activities related to water use, simultaneously modeling water supply and water demand. It supports water managers and decision makers in assessing...... trade-offs between different water uses, different geographic regions, and various economic sectors and between the present and the future. Hydroeconomic analysis provides consistent economic performance criteria for infrastructure development and institutional reform in water policies and management...... organizations. This chapter presents an introduction to hydroeconomic analysis and modeling, and reviews the state of the art in the field. We review available economic water-valuation techniques and summarize the main types of decision problems encountered in hydroeconomic analysis. Popular solution strategies...

  6. Sensitivity analysis

    Science.gov (United States)

    Sensitivity analysis determines the effectiveness of antibiotics against microorganisms (germs). (//medlineplus.gov/ency/article/003741.htm)

  7. The potential for meta-analysis to support decision analysis in ecology.

    Science.gov (United States)

    Mengersen, Kerrie; MacNeil, M Aaron; Caley, M Julian

    2015-06-01

    Meta-analysis and decision analysis are underpinned by well-developed methods that are commonly applied to a variety of problems and disciplines. While these two fields have been closely linked in some disciplines such as medicine, comparatively little attention has been paid to the potential benefits of linking them in ecology, despite reasonable expectations that benefits would be derived from doing so. Meta-analysis combines information from multiple studies to provide more accurate parameter estimates and to reduce the uncertainty surrounding them. Decision analysis involves selecting among alternative choices using statistical information that helps to shed light on the uncertainties involved. By linking meta-analysis to decision analysis, improved decisions can be made, with quantification of the costs and benefits of alternate decisions supported by a greater density of information. Here, we briefly review concepts of both meta-analysis and decision analysis, illustrating the natural linkage between them and the benefits from explicitly linking one to the other. We discuss some examples in which this linkage has been exploited in the medical arena and how improvements in precision and reduction of structural uncertainty inherent in a meta-analysis can provide substantive improvements to decision analysis outcomes by reducing uncertainty in expected loss and maximising information from across studies. We then argue that these significant benefits could be translated to ecology, in particular to the problem of making optimal ecological decisions in the face of uncertainty. Copyright © 2013 John Wiley & Sons, Ltd.
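The linkage the authors describe can be sketched numerically: an inverse-variance fixed-effect meta-analysis pools study-level effect estimates, and the pooled estimate (with its reduced variance) then feeds an expected-loss comparison between alternative actions. This is a minimal illustration of the general technique, not the authors' ecological application; the effects, variances and loss functions are invented.

```python
def pooled_effect(estimates, variances):
    """Inverse-variance fixed-effect pooling: weight each study by 1/variance."""
    weights = [1.0 / v for v in variances]
    est = sum(w * e for w, e in zip(weights, estimates)) / sum(weights)
    var = 1.0 / sum(weights)  # pooled variance is smaller than any single study's
    return est, var

effects = [0.30, 0.10, 0.25]    # illustrative study-level effect estimates
variances = [0.04, 0.02, 0.05]  # their sampling variances
est, var = pooled_effect(effects, variances)
print(round(est, 3), round(var, 4))

# Toy decision-analysis step: choose the action with lower expected loss,
# where the (invented) losses depend on the pooled effect estimate.
losses = {"act": 1.0 - est, "wait": est}
print(min(losses, key=losses.get))
```

The point of the linkage is visible in `var`: pooling across studies shrinks the uncertainty that the decision step must carry.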

  8. Bayesian Mediation Analysis

    OpenAIRE

    Yuan, Ying; MacKinnon, David P.

    2009-01-01

    This article proposes Bayesian analysis of mediation effects. Compared to conventional frequentist mediation analysis, the Bayesian approach has several advantages. First, it allows researchers to incorporate prior information into the mediation analysis, thus potentially improving the efficiency of estimates. Second, under the Bayesian mediation analysis, inference is straightforward and exact, which makes it appealing for studies with small samples. Third, the Bayesian approach is conceptua...
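A hedged sketch of the Bayesian idea in the abstract: given (approximately normal) posteriors for the action path a and the outcome path b, the posterior of the mediated effect a*b, and a credible interval for it, follow directly by simulation. The posterior means and standard deviations below are invented purely for illustration.

```python
import random
import statistics

random.seed(1)  # fixed seed so the sketch is reproducible

# Draw from invented posteriors a ~ N(0.5, 0.1) and b ~ N(0.4, 0.1),
# and form the posterior of the mediated effect a*b.
draws = sorted(random.gauss(0.5, 0.1) * random.gauss(0.4, 0.1) for _ in range(10000))
lo, hi = draws[249], draws[9749]  # central 95% credible interval
print(round(statistics.mean(draws), 3), round(lo, 3), round(hi, 3))
```

Because inference is read straight off the simulated posterior, no asymptotic normality of the a*b product is needed, which is the small-sample advantage the article highlights.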

  9. Economic and Financial Analysis Tools | Energy Analysis | NREL

    Science.gov (United States)

    Economic and Financial Analysis Tools. Job and Economic Development Impact (JEDI) Model: use these easy-to-use, spreadsheet-based tools to analyze the economic impacts of constructing and operating power generation and biofuel plants.

  10. Computational Music Analysis

    DEFF Research Database (Denmark)

    This book provides an in-depth introduction and overview of current research in computational music analysis. Its seventeen chapters, written by leading researchers, collectively represent the diversity as well as the technical and philosophical sophistication of the work being done today...... on well-established theories in music theory and analysis, such as Forte's pitch-class set theory, Schenkerian analysis, the methods of semiotic analysis developed by Ruwet and Nattiez, and Lerdahl and Jackendoff's Generative Theory of Tonal Music. The book is divided into six parts, covering...... music analysis, the book provides an invaluable resource for researchers, teachers and students in music theory and analysis, computer science, music information retrieval and related disciplines. It also provides a state-of-the-art reference for practitioners in the music technology industry....

  11. System analysis and design

    International Nuclear Information System (INIS)

    Son, Seung Hui

    2004-02-01

    This book deals with information technology and business processes; information system architecture; methods of system development; system development planning, such as problem analysis and feasibility analysis; cases of system development; understanding and analysing user demands, both with traditional analysis and with an integrated information system architecture; system design using an integrated information system architecture; system implementation; and system maintenance.

  12. Higher lung deposition with Respimat® Soft Mist™ Inhaler than HFA-MDI in COPD patients with poor technique

    Directory of Open Access Journals (Sweden)

    Peter Brand

    2008-08-01

    Full Text Available. Peter Brand (RWTH, Aachen, Germany), Bettina Hederer (Boehringer Ingelheim, Ingelheim, Germany), George Austen and Helen Dewberry (Boehringer Ingelheim, Bracknell, UK), Thomas Meyer (Inamed Research, Gauting, Germany). Abstract: Aerosols delivered by Respimat® Soft Mist™ Inhaler (SMI) are slower-moving and longer-lasting than those from pressurized metered-dose inhalers (pMDIs), improving the efficiency of pulmonary drug delivery to patients. In this four-way cross-over study, adults with chronic obstructive pulmonary disease (COPD) and with poor pMDI technique received radiolabelled Berodual® (fenoterol hydrobromide 50 µg/ipratropium bromide 20 µg) via Respimat® SMI or hydrofluoroalkane (HFA)-MDI (randomized order) on test days 1 and 2, with no inhaler technique training. The procedure was repeated on test days 3 and 4 after training. Deposition was measured by gamma scintigraphy. All 13 patients entered (9 males; mean age 62 years; FEV1 46% of predicted) inhaled too fast at screening (peak inspiratory flow rate [IF]: 69–161 L/min). Whole-lung deposition was higher with Respimat® SMI than with pMDI for untrained (37% of delivered dose vs 21% of metered dose) and trained patients (53% of delivered vs 21% of metered dose) (p(sign test) = 0.15; p(ANOVA) < 0.05). Training also improved inhalation profiles (slower average and peak IF, as well as longer breath-hold time). Drug delivery to the lungs with Respimat® SMI is more efficient than with pMDI, even with poor inhaler technique. Teaching patients to hold their breath, and to inhale slowly and deeply, further increased lung deposition using Respimat® SMI. Keywords: chronic obstructive pulmonary disease, drug delivery, inhalation, metered-dose inhaler, poor inhalation technique, training

  13. Expression of SIRT1 and oxidative stress in diabetic dry eye.

    Science.gov (United States)

    Liu, Hao; Sheng, Minjie; Liu, Yu; Wang, Peng; Chen, Yihui; Chen, Li; Wang, Weifang; Li, Bing

    2015-01-01

    To explore the expression of SIRT1 in relation to oxidative stress, and to observe physiological and pathological changes in the corneas, as well as the association between SIRT1 and oxidative stress, in diabetic dry eye in mice. Forty-eight C57BL/6J db/db mice at eight weeks of age were divided randomly into two groups: the diabetic dry eye group and the diabetic group. An additional forty-eight C57BL/6J mice at eight weeks of age were divided randomly into two groups: the dry eye group and the control group. Every mouse in the dry eye groups (diabetic and normal) was injected with scopolamine hydrobromide three times daily, combined with low humidity, to establish a dry eye model. After the intervention, phenol red cotton thread tests and corneal fluorescein staining were performed. In addition, HE staining and immunofluorescence were done. Expression of SIRT1 in the cornea was examined by real-time PCR and Western blot, and expression of FOXO3 and MnSOD proteins was detected by Western blot. At one, four, and eight weeks post intervention, all of the groups except the controls showed significant decreases in tear production and increases in the corneal fluorescein stain (P<0.05). The diabetic dry eye group had the least tear production and the highest corneal fluorescein stain score (P<0.05) compared with the dry eye group. In the 1(st) and 4(th) weeks, the expression of SIRT1, FOXO3, and MnSOD was significantly higher in the diabetic DE and DM groups but lower in the DE group compared to the controls (P<0.05). In diabetic dry eye, tear production declined markedly, coupled with seriously wounded corneal epithelium; oxidative stress in the cornea was enhanced significantly and the expression of SIRT1 was decreased.

  14. Salt forms of the pharmaceutical amide dihydrocarbamazepine.

    Science.gov (United States)

    Buist, Amanda R; Kennedy, Alan R

    2016-02-01

    Carbamazepine (CBZ) is well known as a model active pharmaceutical ingredient used in the study of polymorphism and the generation and comparison of cocrystal forms. The pharmaceutical amide dihydrocarbamazepine (DCBZ) is a less well known material and is largely of interest here as a structural congener of CBZ. Reaction of DCBZ with strong acids results in protonation of the amide functionality at the O atom and gives the salt forms dihydrocarbamazepine hydrochloride {systematic name: [(10,11-dihydro-5H-dibenzo[b,f]azepin-5-yl)(hydroxy)methylidene]azanium chloride, C15H15N2O(+)·Cl(-)}, dihydrocarbamazepine hydrochloride monohydrate {systematic name: [(10,11-dihydro-5H-dibenzo[b,f]azepin-5-yl)(hydroxy)methylidene]azanium chloride monohydrate, C15H15N2O(+)·Cl(-)·H2O} and dihydrocarbamazepine hydrobromide monohydrate {systematic name: [(10,11-dihydro-5H-dibenzo[b,f]azepin-5-yl)(hydroxy)methylidene]azanium bromide monohydrate, C15H15N2O(+)·Br(-)·H2O}. The anhydrous hydrochloride has a structure with two crystallographically independent ion pairs (Z' = 2), wherein both cations adopt syn conformations, whilst the two hydrated species are mutually isostructural and have cations with anti conformations. Compared to neutral dihydrocarbamazepine structures, protonation of the amide group is shown to cause changes to both the molecular (C=O bond lengthening and C-N bond shortening) and the supramolecular structures. The amide-to-amide and dimeric hydrogen-bonding motifs seen for neutral polymorphs and cocrystalline species are replaced here by one-dimensional polymeric constructs with no direct amide-to-amide bonds. The structures are also compared with, and shown to be closely related to, those of the salt forms of the structurally similar pharmaceutical carbamazepine.

  15. Pharmacokinetic Effects of Isavuconazole Coadministration With the Cytochrome P450 Enzyme Substrates Bupropion, Repaglinide, Caffeine, Dextromethorphan, and Methadone in Healthy Subjects.

    Science.gov (United States)

    Yamazaki, Takao; Desai, Amit; Goldwater, Ronald; Han, David; Howieson, Corrie; Akhtar, Shahzad; Kowalski, Donna; Lademacher, Christopher; Pearlman, Helene; Rammelsberg, Diane; Townsend, Robert

    2017-01-01

    This report describes phase 1 clinical trials performed to assess interactions of oral isavuconazole at the clinically targeted dose (200 mg, administered as isavuconazonium sulfate 372 mg, 3 times a day for 2 days; 200 mg once daily [QD] thereafter) with single oral doses of the cytochrome P450 (CYP) substrates: bupropion hydrochloride (CYP2B6; 100 mg; n = 24), repaglinide (CYP2C8/CYP3A4; 0.5 mg; n = 24), caffeine (CYP1A2; 200 mg; n = 24), dextromethorphan hydrobromide (CYP2D6/CYP3A4; 30 mg; n = 24), and methadone (CYP2B6/CYP2C19/CYP3A4; 10 mg; n = 23). Compared with each drug alone, coadministration with isavuconazole changed the areas under the concentration-time curves (AUC∞) and maximum concentrations (Cmax) as follows: bupropion, AUC∞ reduced 42%, Cmax reduced 31%; repaglinide, AUC∞ reduced 8%, Cmax reduced 14%; caffeine, AUC∞ increased 4%, Cmax reduced 1%; dextromethorphan, AUC∞ increased 18%, Cmax increased 17%; R-methadone, AUC∞ reduced 10%, Cmax increased 3%; S-methadone, AUC∞ reduced 35%, Cmax increased 1%. In all studies, there were no deaths and 1 serious adverse event (dextromethorphan study; perioral numbness, numbness of the right arm and leg), and adverse events leading to study discontinuation were rare. Thus, isavuconazole is a mild inducer of CYP2B6 but does not appear to affect CYP1A2-, CYP2C8-, or CYP2D6-mediated metabolism. © 2016 The Authors. Clinical Pharmacology in Drug Development Published by Wiley Periodicals, Inc. on behalf of The American College of Clinical Pharmacology.
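A back-of-envelope check of how interaction percentages like those above are derived: a percent change in exposure is (AUC with coadministration / AUC alone - 1) * 100. The AUC values below are invented purely to illustrate the arithmetic, not taken from the trials.

```python
def pct_change(alone, combined):
    """Percent change in a PK parameter when coadministered vs alone."""
    return (combined / alone - 1.0) * 100.0

# Hypothetical exposures (ng*h/mL) chosen to reproduce the arithmetic
# behind a statement like "AUC reduced 42%".
auc_alone, auc_combined = 1000.0, 580.0
print(round(pct_change(auc_alone, auc_combined)))  # -42, i.e. "AUC reduced 42%"
```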

  16. Prevalência de Echinococcus granulosus (Batsch, 1786 em cães urbanos errantes do município de Dom Pedrito (RS, Brasil Prevalence of Echinococcus granulosus (Batsch, 1786 in urban stray dogs from Dom Pedrito in the State of Rio Grande do Sul, Brazil

    Directory of Open Access Journals (Sweden)

    Adriane Nunes Hoffmann

    2001-10-01

    Full Text Available. Echinococcus granulosus is one of the most important parasites involved in zoonoses in the State of Rio Grande do Sul, in municipalities near the Argentinian and Uruguayan borders. Samples from 65 urban stray dogs from the county of Dom Pedrito were analyzed by three techniques: purgation with arecoline hydrobromide, to verify the presence of the parasite; an enzyme-linked immunosorbent assay (ELISA), to detect coproantigens; and an indirect immunofluorescence antibody test (IFA), to identify serum antibodies against E. granulosus. Of the analyzed dogs, 7.7% (5/65) presented the parasite in feces, 10.8% (7/65) had coproantigens and 13.8% (9/65) had serum antibodies against this cestode. It was concluded that canine echinococcosis in the urban zone may represent a serious problem for public health, owing to the risk of human hydatidosis.

  17. Emergence of Serotonergic Neurons After Spinal Cord Injury in Turtles

    Directory of Open Access Journals (Sweden)

    Gabriela Fabbiani

    2018-03-01

    Full Text Available. Plasticity of neural circuits takes many forms and plays a fundamental role in regulating behavior to changing demands while maintaining stability. For example, during spinal cord development neurotransmitter identity in neurons is dynamically adjusted in response to changes in the activity of spinal networks. It is reasonable to speculate that this type of plasticity might occur also in mature spinal circuits in response to injury. Because serotonergic signaling has a central role in spinal cord functions, we hypothesized that spinal cord injury (SCI) in the fresh water turtle Trachemys scripta elegans may trigger homeostatic changes in serotonergic innervation. To test this possibility we performed immunohistochemistry for serotonin (5-HT) and key molecules involved in the determination of the serotonergic phenotype before and after SCI. We found that, as expected, in the acute phase after injury the dense serotonergic innervation was strongly reduced. However, 30 days after SCI the population of serotonergic cells (5-HT+) increased in segments caudal to the lesion site. These cells expressed the neuronal marker HuC/D and the transcription factor Nkx6.1. The new serotonergic neurons did not incorporate the thymidine analog 5-bromo-2′-deoxyuridine (BrdU) and did not express the proliferating cell nuclear antigen (PCNA), indicating that the novel serotonergic neurons were not newborn but post-mitotic cells that had changed their neurochemical identity. Switching towards a serotonergic neurotransmitter phenotype may be a spinal cord homeostatic mechanism to compensate for the loss of descending serotonergic neuromodulation, thereby helping the outstanding functional recovery displayed by turtles. The 5-HT1A receptor agonist (±)-8-Hydroxy-2-dipropylaminotetralin hydrobromide (8-OH-DPAT) blocked the increase in 5-HT+ cells, suggesting 5-HT1A receptors may trigger the respecification process.

  18. Emergence of Serotonergic Neurons After Spinal Cord Injury in Turtles

    Science.gov (United States)

    Fabbiani, Gabriela; Rehermann, María I.; Aldecosea, Carina; Trujillo-Cenóz, Omar; Russo, Raúl E.

    2018-01-01

    Plasticity of neural circuits takes many forms and plays a fundamental role in regulating behavior to changing demands while maintaining stability. For example, during spinal cord development neurotransmitter identity in neurons is dynamically adjusted in response to changes in the activity of spinal networks. It is reasonable to speculate that this type of plasticity might occur also in mature spinal circuits in response to injury. Because serotonergic signaling has a central role in spinal cord functions, we hypothesized that spinal cord injury (SCI) in the fresh water turtle Trachemys scripta elegans may trigger homeostatic changes in serotonergic innervation. To test this possibility we performed immunohistochemistry for serotonin (5-HT) and key molecules involved in the determination of the serotonergic phenotype before and after SCI. We found that as expected, in the acute phase after injury the dense serotonergic innervation was strongly reduced. However, 30 days after SCI the population of serotonergic cells (5-HT+) increased in segments caudal to the lesion site. These cells expressed the neuronal marker HuC/D and the transcription factor Nkx6.1. The new serotonergic neurons did not incorporate the thymidine analog 5-bromo-2′-deoxyuridine (BrdU) and did not express the proliferating cell nuclear antigen (PCNA) indicating that novel serotonergic neurons were not newborn but post-mitotic cells that have changed their neurochemical identity. Switching towards a serotonergic neurotransmitter phenotype may be a spinal cord homeostatic mechanism to compensate for the loss of descending serotonergic neuromodulation, thereby helping the outstanding functional recovery displayed by turtles. The 5-HT1A receptor agonist (±)-8-Hydroxy-2-dipropylaminotetralin hydrobromide (8-OH-DPAT) blocked the increase in 5-HT+ cells suggesting 5-HT1A receptors may trigger the respecification process. PMID:29593503

  19. Semen Analysis Test

    Science.gov (United States)

    Also known as: sperm analysis, sperm count, seminal fluid analysis. Formal name: semen analysis. Parameters measured include viscosity (the consistency or thickness of the semen), sperm count (the total number of sperm) and sperm concentration (density, i.e. the number of sperm per unit volume of semen).

  20. Qualitative Content Analysis

    Directory of Open Access Journals (Sweden)

    Satu Elo

    2014-02-01

    Full Text Available. Qualitative content analysis is commonly used for analyzing qualitative data. However, few articles have examined the trustworthiness of its use in nursing science studies. The trustworthiness of qualitative content analysis is often presented using terms such as credibility, dependability, confirmability, transferability, and authenticity. This article focuses on trustworthiness based on a review of previous studies, our own experiences, and methodological textbooks. Trustworthiness is described for the main qualitative content analysis phases, from data collection to reporting of the results. We concluded that it is important to scrutinize the trustworthiness of every phase of the analysis process, including the preparation, organization, and reporting of results. Together, these phases should give a reader a clear indication of the overall trustworthiness of the study. Based on our findings, we compiled a checklist for researchers attempting to improve the trustworthiness of a content analysis study. The discussion in this article helps to clarify how content analysis should be reported in a valid and understandable manner, which would be of particular benefit to reviewers of scientific articles. Furthermore, we note that it is often difficult to evaluate the trustworthiness of qualitative content analysis studies because of defective descriptions of the data collection method and/or the analysis.

  1. Real analysis

    CERN Document Server

    McShane, Edward James

    2013-01-01

    This text surveys practical elements of real function theory, general topology, and functional analysis. Discusses the maximality principle, the notion of convergence, the Lebesgue-Stieltjes integral, function spaces and harmonic analysis. Includes exercises. 1959 edition.

  2. Content analysis and thematic analysis: Implications for conducting a qualitative descriptive study.

    Science.gov (United States)

    Vaismoradi, Mojtaba; Turunen, Hannele; Bondas, Terese

    2013-09-01

    Qualitative content analysis and thematic analysis are two commonly used approaches in data analysis of nursing research, but boundaries between the two have not been clearly specified. In other words, they are being used interchangeably and it seems difficult for the researcher to choose between them. In this respect, this paper describes and discusses the boundaries between qualitative content analysis and thematic analysis and presents implications to improve the consistency between the purpose of related studies and the method of data analyses. This is a discussion paper, comprising an analytical overview and discussion of the definitions, aims, philosophical background, data gathering, and analysis of content analysis and thematic analysis, and addressing their methodological subtleties. It is concluded that in spite of many similarities between the approaches, including cutting across data and searching for patterns and themes, their main difference lies in the opportunity for quantification of data. It means that measuring the frequency of different categories and themes is possible in content analysis with caution as a proxy for significance. © 2013 Wiley Publishing Asia Pty Ltd.
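    The quantification that the abstract identifies as distinctive of content analysis can be as simple as tallying how often each coded category occurs. A minimal sketch (category codes are invented for illustration; Python standard library only):

    ```python
    from collections import Counter

    # Hypothetical category codes assigned to text units during coding
    coded_units = [
        "coping", "support", "coping", "uncertainty",
        "support", "coping", "uncertainty", "support",
    ]

    category_frequencies = Counter(coded_units)

    # Relative frequency can serve, with caution, as a proxy for salience
    total = sum(category_frequencies.values())
    for category, count in category_frequencies.most_common():
        print(f"{category}: {count} ({count / total:.0%})")
    ```

    The frequencies are only a starting point; as the paper stresses, they should be interpreted cautiously rather than treated as significance tests.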

  3. Numerical analysis

    CERN Document Server

    Khabaza, I M

    1960-01-01

    Numerical Analysis is an elementary introduction to numerical analysis, its applications, limitations, and pitfalls. Methods suitable for digital computers are emphasized, but some desk computations are also described. Topics covered range from the use of digital computers in numerical work to errors in computations using desk machines, finite difference methods, and numerical solution of ordinary differential equations. This book is comprised of eight chapters and begins with an overview of the importance of digital computers in numerical analysis, followed by a discussion on errors in comput

  4. Recursive analysis

    CERN Document Server

    Goodstein, R L

    2010-01-01

    Recursive analysis develops natural number computations into a framework appropriate for real numbers. This text is based upon primary recursive arithmetic and presents a unique combination of classical analysis and intuitional analysis. Written by a master in the field, it is suitable for graduate students of mathematics and computer science and can be read without a detailed knowledge of recursive arithmetic.Introductory chapters on recursive convergence and recursive and relative continuity are succeeded by explorations of recursive and relative differentiability, the relative integral, and

  5. Gap Analysis: Application to Earned Value Analysis

    OpenAIRE

    Langford, Gary O.; Franck, Raymond (Chip)

    2008-01-01

    Sponsored Report (for Acquisition Research Program) Earned Value is regarded as a useful tool to monitor commercial and defense system acquisitions. This paper applies the theoretical foundations and systematics of Gap Analysis to improve Earned Value Management. As currently implemented, Earned Value inaccurately provides a higher value for the work performed. This preliminary research indicates that Earned Value calculations can be corrected. Value Analysis, properly defined and enacted,...

  6. CONTENT ANALYSIS, DISCOURSE ANALYSIS, AND CONVERSATION ANALYSIS: PRELIMINARY STUDY ON CONCEPTUAL AND THEORETICAL METHODOLOGICAL DIFFERENCES

    Directory of Open Access Journals (Sweden)

    Anderson Tiago Peixoto Gonçalves

    2016-08-01

    Full Text Available This theoretical essay aims to reflect on three models of text interpretation used in qualitative research, which is often confused in its concepts and methodologies (Content Analysis, Discourse Analysis, and Conversation Analysis. After the presentation of the concepts, the essay proposes a preliminary discussion on conceptual and theoretical methodological differences perceived between them. A review of the literature was performed to support the conceptual and theoretical methodological discussion. It could be verified that the models have differences related to the type of strategy used in the treatment of texts, the type of approach, and the appropriate theoretical position.

  7. Moyer's method of mixed dentition analysis: a meta-analysis ...

    African Journals Online (AJOL)

    The applicability of tables derived from the data Moyer used to other ethnic groups has ... This implies that Moyer's method of prediction may have population variations. ... Key Words: meta-analysis, mixed dentition analysis, Moyer's method

  8. Bayesian Mediation Analysis

    Science.gov (United States)

    Yuan, Ying; MacKinnon, David P.

    2009-01-01

    In this article, we propose Bayesian analysis of mediation effects. Compared with conventional frequentist mediation analysis, the Bayesian approach has several advantages. First, it allows researchers to incorporate prior information into the mediation analysis, thus potentially improving the efficiency of estimates. Second, under the Bayesian…

  9. Applied longitudinal analysis

    CERN Document Server

    Fitzmaurice, Garrett M; Ware, James H

    2012-01-01

    Praise for the First Edition: ". . . [this book] should be on the shelf of everyone interested in . . . longitudinal data analysis." -Journal of the American Statistical Association. Features newly developed topics and applications of the analysis of longitudinal data. Applied Longitudinal Analysis, Second Edition presents modern methods for analyzing data from longitudinal studies and now features the latest state-of-the-art techniques. The book emphasizes practical, rather than theoretical, aspects of methods for the analysis of diverse types of lo

  10. Computational movement analysis

    CERN Document Server

    Laube, Patrick

    2014-01-01

    This SpringerBrief discusses the characteristics of spatiotemporal movement data, including uncertainty and scale. It investigates three core aspects of Computational Movement Analysis: Conceptual modeling of movement and movement spaces, spatiotemporal analysis methods aiming at a better understanding of movement processes (with a focus on data mining for movement patterns), and using decentralized spatial computing methods in movement analysis. The author presents Computational Movement Analysis as an interdisciplinary umbrella for analyzing movement processes with methods from a range of fi

  11. Trial Sequential Analysis in systematic reviews with meta-analysis

    DEFF Research Database (Denmark)

    Wetterslev, Jørn; Jakobsen, Janus Christian; Gluud, Christian

    2017-01-01

    BACKGROUND: Most meta-analyses in systematic reviews, including Cochrane ones, do not have sufficient statistical power to detect or refute even large intervention effects. This is why a meta-analysis ought to be regarded as an interim analysis on its way towards a required information size...... from traditional meta-analyses using unadjusted naïve 95% confidence intervals and 5% thresholds for statistical significance. Spurious conclusions in systematic reviews with traditional meta-analyses can be reduced using Trial Sequential Analysis. Several empirical studies have demonstrated...
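    The required information size mentioned above is, in its simplest form, the sample size a single adequately powered trial would need. A hedged sketch of the conventional two-sample approximation (illustrative values; the diversity adjustment used in full Trial Sequential Analysis is omitted):

    ```python
    import math

    def required_information_size(delta, sigma, z_alpha=1.96, z_beta=0.8416):
        """Total participants (both groups) for a two-sample mean comparison.

        delta: minimally relevant difference; sigma: standard deviation.
        z_alpha, z_beta: normal quantiles for two-sided alpha=0.05 and 80% power.
        """
        n_per_group = 2 * (z_alpha + z_beta) ** 2 * sigma ** 2 / delta ** 2
        return 2 * math.ceil(n_per_group)

    # e.g. detecting a 0.5-SD difference
    print(required_information_size(delta=0.5, sigma=1.0))
    ```

    A meta-analysis whose accrued participants fall well short of this number is, in the authors' terms, an interim analysis.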

  12. On-line analysis of reactor noise using time-series analysis

    International Nuclear Information System (INIS)

    McGevna, V.G.

    1981-10-01

    A method to allow use of time series analysis for on-line noise analysis has been developed. On-line analysis of noise in nuclear power reactors has been limited primarily to spectral analysis and related frequency domain techniques. Time series analysis has many distinct advantages over spectral analysis in the automated processing of reactor noise. However, fitting an autoregressive-moving average (ARMA) model to time series data involves non-linear least squares estimation. Unless a high speed, general purpose computer is available, the calculations become too time consuming for on-line applications. To eliminate this problem, a special purpose algorithm was developed for fitting ARMA models. While it is based on a combination of steepest descent and Taylor series linearization, properties of the ARMA model are used so that the auto- and cross-correlation functions can be used to eliminate the need for estimating derivatives. The number of calculations per iteration varies linearly ...
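    The autocorrelation shortcut described above is easiest to see for a pure autoregressive model, where the Yule-Walker equations give the coefficients directly from sample autocorrelations with no derivative estimation at all. A minimal AR(2) sketch (standard library only; the ARMA case treated in the report additionally handles the moving-average terms):

    ```python
    def autocorr(x, lag):
        """Biased sample autocorrelation of the series x at the given lag."""
        n = len(x)
        mean = sum(x) / n
        var = sum((v - mean) ** 2 for v in x)
        cov = sum((x[t] - mean) * (x[t + lag] - mean) for t in range(n - lag))
        return cov / var

    def fit_ar2(x):
        """Solve the 2x2 Yule-Walker system for AR(2) coefficients.

        [ 1   r1 ] [phi1]   [r1]
        [ r1  1  ] [phi2] = [r2]
        """
        r1, r2 = autocorr(x, 1), autocorr(x, 2)
        det = 1 - r1 * r1
        phi1 = r1 * (1 - r2) / det
        phi2 = (r2 - r1 * r1) / det
        return phi1, phi2
    ```

    For higher orders the same system grows to a Toeplitz matrix of autocorrelations, which is why the report's algorithm can avoid numerical derivatives entirely.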

  13. The Use of Object-Oriented Analysis Methods in Surety Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Craft, Richard L.; Funkhouser, Donald R.; Wyss, Gregory D.

    1999-05-01

    Object-oriented analysis methods have been used in the computer science arena for a number of years to model the behavior of computer-based systems. This report documents how such methods can be applied to surety analysis. By embodying the causality and behavior of a system in a common object-oriented analysis model, surety analysts can make the assumptions that underlie their models explicit and thus better communicate with system designers. Furthermore, given minor extensions to traditional object-oriented analysis methods, it is possible to automatically derive a wide variety of traditional risk and reliability analysis methods from a single common object model. Automatic model extraction helps ensure consistency among analyses and enables the surety analyst to examine a system from a wider variety of viewpoints in a shorter period of time. Thus it provides a deeper understanding of a system's behaviors and surety requirements. This report documents the underlying philosophy behind the common object model representation, the methods by which such common object models can be constructed, and the rules required to interrogate the common object model for derivation of traditional risk and reliability analysis models. The methodology is demonstrated in an extensive example problem.

  14. Circuit analysis for dummies

    CERN Document Server

    Santiago, John

    2013-01-01

    Circuits overloaded from electric circuit analysis? Many universities require that students pursuing a degree in electrical or computer engineering take an Electric Circuit Analysis course to determine who will "make the cut" and continue in the degree program. Circuit Analysis For Dummies will help these students to better understand electric circuit analysis by presenting the information in an effective and straightforward manner. Circuit Analysis For Dummies gives you clear-cut information about the topics covered in an electric circuit analysis course to help

  15. Hazard Analysis Database Report

    CERN Document Server

    Grams, W H

    2000-01-01

    The Hazard Analysis Database was developed in conjunction with the hazard analysis activities conducted in accordance with DOE-STD-3009-94, Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports, for HNF-SD-WM-SAR-067, Tank Farms Final Safety Analysis Report (FSAR). The FSAR is part of the approved Authorization Basis (AB) for the River Protection Project (RPP). This document describes, identifies, and defines the contents and structure of the Tank Farms FSAR Hazard Analysis Database and documents the configuration control changes made to the database. The Hazard Analysis Database contains the collection of information generated during the initial hazard evaluations and the subsequent hazard and accident analysis activities. The Hazard Analysis Database supports the preparation of Chapters 3, 4, and 5 of the Tank Farms FSAR and the Unreviewed Safety Question (USQ) process and consists of two major, interrelated data sets: (1) Hazard Analysis Database: Data from t...

  16. Descriptive data analysis.

    Science.gov (United States)

    Thompson, Cheryl Bagley

    2009-01-01

    This 13th article of the Basics of Research series is first in a short series on statistical analysis. These articles will discuss creating your statistical analysis plan, levels of measurement, descriptive statistics, probability theory, inferential statistics, and general considerations for interpretation of the results of a statistical analysis.
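    Descriptive statistics of the kind this series introduces can be computed directly with Python's standard library; the sample values below are invented for illustration:

    ```python
    import statistics

    # Hypothetical sample, e.g. participant ages in a small study
    sample = [34, 41, 38, 55, 47, 41, 62, 29]

    print("mean:  ", statistics.mean(sample))
    print("median:", statistics.median(sample))
    print("mode:  ", statistics.mode(sample))
    print("stdev: ", round(statistics.stdev(sample), 2))
    ```

    Note that `statistics.stdev` computes the sample (n-1) standard deviation; `statistics.pstdev` gives the population version.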

  17. International Market Analysis

    DEFF Research Database (Denmark)

    Sørensen, Olav Jull

    2009-01-01

    The review presents the book International Market Analysis: Theories and Methods, written by John Kuada, professor at the Centre of International Business, Department of Business Studies, Aalborg University. The book is refreshingly new in its way of looking at a classical problem. It looks at market...... analysis from the point of view of ways of thinking about markets. Furthermore, the book includes the concept of learning in the analysis of markets and how the way we understand business reality influences our choice of methodology for market analysis....

  18. Improvement of Binary Analysis Components in Automated Malware Analysis Framework

    Science.gov (United States)

    2017-02-21

    AFRL-AFOSR-JP-TR-2017-0018: Improvement of Binary Analysis Components in Automated Malware Analysis Framework. Keiji Takeda, KEIO UNIVERSITY. Final report, 26 May 2015 to 25 Nov 2016. ... system to analyze malicious software (malware) with minimum human interaction. The system autonomously analyzes malware samples by analyzing the malware binary program

  19. Analysis of mineral phases in coal utilizing factor analysis

    International Nuclear Information System (INIS)

    Roscoe, B.A.; Hopke, P.K.

    1982-01-01

    The mineral phase inclusions of coal are discussed. Their contributions to a coal sample are determined using several techniques. Neutron activation analysis in conjunction with coal washability studies has produced some information on the general trends of elemental variation in the mineral phases. These results have been enhanced by the use of various statistical techniques. Target transformation factor analysis (TTFA) is specifically discussed and shown to be able to produce elemental profiles of the mineral phases in coal. A data set consisting of physically fractionated coal samples was generated. These samples were analyzed by neutron activation analysis and their elemental concentrations were then examined using TTFA. Information concerning the mineral phases in coal can thus be acquired from factor analysis even with limited data. Additional data may permit the resolution of additional mineral phases as well as refinement of those already identified

  20. Meta-analysis with R

    CERN Document Server

    Schwarzer, Guido; Rücker, Gerta

    2015-01-01

    This book provides a comprehensive introduction to performing meta-analysis using the statistical software R. It is intended for quantitative researchers and students in the medical and social sciences who wish to learn how to perform meta-analysis with R. As such, the book introduces the key concepts and models used in meta-analysis. It also includes chapters on the following advanced topics: publication bias and small study effects; missing data; multivariate meta-analysis; network meta-analysis; and meta-analysis of diagnostic studies.

  1. Cluster analysis for applications

    CERN Document Server

    Anderberg, Michael R

    1973-01-01

    Cluster Analysis for Applications deals with methods and various applications of cluster analysis. Topics covered range from variables and scales to measures of association among variables and among data units. Conceptual problems in cluster analysis are discussed, along with hierarchical and non-hierarchical clustering methods. The necessary elements of data analysis, statistics, cluster analysis, and computer implementation are integrated vertically to cover the complete path from raw data to a finished analysis.Comprised of 10 chapters, this book begins with an introduction to the subject o

  2. Job Analysis

    OpenAIRE

    Bravená, Helena

    2009-01-01

    This bachelor thesis deals with the importance of job analysis for personnel activities in a company. The aim of this work is to find the most suitable method of job analysis for a particular enterprise, and then to create descriptions and specifications for each job.

  3. Visual physics analysis VISPA

    International Nuclear Information System (INIS)

    Actis, Oxana; Brodski, Michael; Erdmann, Martin; Fischer, Robert; Hinzmann, Andreas; Mueller, Gero; Muenzer, Thomas; Plum, Matthias; Steggemann, Jan; Winchen, Tobias; Klimkovich, Tatsiana

    2010-01-01

    VISPA is a development environment for high energy physics analyses which enables physicists to combine graphical and textual work. A physics analysis cycle consists of prototyping, performing, and verifying the analysis. The main feature of VISPA is a multipurpose window for visual steering of analysis steps, creation of analysis templates, and browsing physics event data at different steps of an analysis. VISPA follows an experiment-independent approach and incorporates various tools for steering and controlling required in a typical analysis. Connection to different frameworks of high energy physics experiments is achieved by using different types of interfaces. We present the look-and-feel for an example physics analysis at the LHC and explain the underlying software concepts of VISPA.

  4. Analysis of the interaction between experimental and applied behavior analysis.

    Science.gov (United States)

    Virues-Ortega, Javier; Hurtado-Parrado, Camilo; Cox, Alison D; Pear, Joseph J

    2014-01-01

    To study the influences between basic and applied research in behavior analysis, we analyzed the coauthorship interactions of authors who published in JABA and JEAB from 1980 to 2010. We paid particular attention to authors who published in both JABA and JEAB (dual authors) as potential agents of cross-field interactions. We present a comprehensive analysis of dual authors' coauthorship interactions using social networks methodology and key word analysis. The number of dual authors more than doubled (26 to 67) and their productivity tripled (7% to 26% of JABA and JEAB articles) between 1980 and 2010. Dual authors stood out in terms of number of collaborators, number of publications, and ability to interact with multiple groups within the field. The steady increase in JEAB and JABA interactions through coauthors and the increasing range of topics covered by dual authors provide a basis for optimism regarding the progressive integration of basic and applied behavior analysis. © Society for the Experimental Analysis of Behavior.
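    A coauthorship network of the kind analyzed above can be represented with nothing more than a dictionary of adjacency sets; the author's degree (number of distinct collaborators) then falls out directly. A minimal sketch (author names are placeholders, standard library only):

    ```python
    from collections import defaultdict
    from itertools import combinations

    # Hypothetical article author lists (names invented for illustration)
    articles = [
        ("Smith", "Jones"),
        ("Smith", "Lee", "Patel"),
        ("Jones", "Lee"),
    ]

    # Build an undirected coauthorship graph: every pair on an article is linked
    collaborators = defaultdict(set)
    for authors in articles:
        for a, b in combinations(authors, 2):
            collaborators[a].add(b)
            collaborators[b].add(a)

    # Degree = number of distinct collaborators per author
    degree = {author: len(peers) for author, peers in collaborators.items()}
    print(degree)
    ```

    Full social-network studies like the one described add centrality and community measures on top of this same graph structure.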

  5. K Basin Hazard Analysis

    Energy Technology Data Exchange (ETDEWEB)

    PECH, S.H.

    2000-08-23

    This report describes the methodology used in conducting the K Basins Hazard Analysis, which provides the foundation for the K Basins Final Safety Analysis Report. This hazard analysis was performed in accordance with guidance provided by DOE-STD-3009-94, Preparation Guide for U. S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports and implements the requirements of DOE Order 5480.23, Nuclear Safety Analysis Report.

  6. K Basin Hazard Analysis

    International Nuclear Information System (INIS)

    PECH, S.H.

    2000-01-01

    This report describes the methodology used in conducting the K Basins Hazard Analysis, which provides the foundation for the K Basins Final Safety Analysis Report. This hazard analysis was performed in accordance with guidance provided by DOE-STD-3009-94, Preparation Guide for U. S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports and implements the requirements of DOE Order 5480.23, Nuclear Safety Analysis Report

  7. Energy analysis handbook. CAC document 214. [Combining process analysis with input-output analysis

    Energy Technology Data Exchange (ETDEWEB)

    Bullard, C. W.; Penner, P. S.; Pilati, D. A.

    1976-10-01

    Methods are presented for calculating the energy required, directly and indirectly, to produce all types of goods and services. Procedures for combining process analysis with input-output analysis are described. This enables the analyst to focus data acquisition cost-effectively, and to achieve a specified degree of accuracy in the results. The report presents sample calculations and provides the tables and charts needed to perform most energy cost calculations, including the cost of systems for producing or conserving energy.
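    The core of combining process data with input-output analysis is the Leontief relation: the total (direct plus indirect) energy intensities eps satisfy eps = e + eps A, i.e. eps = e (I - A)^(-1), where e holds direct energy coefficients and A the inter-industry requirements matrix. A two-sector sketch with assumed illustrative coefficients (not data from the handbook):

    ```python
    def total_energy_intensities(e, A):
        """Solve eps = e + eps @ A for a 2x2 requirements matrix A.

        e: direct energy coefficients per sector (row vector as a list).
        A: inter-industry requirements, A[i][j] = input from i per unit output of j.
        """
        a11, a12 = A[0]
        a21, a22 = A[1]
        # Invert (I - A) explicitly for the two-sector case
        det = (1 - a11) * (1 - a22) - a12 * a21
        inv = [[(1 - a22) / det, a12 / det],
               [a21 / det, (1 - a11) / det]]
        # Row vector times inverse: eps_j = sum_i e_i * inv[i][j]
        return [e[0] * inv[0][0] + e[1] * inv[1][0],
                e[0] * inv[0][1] + e[1] * inv[1][1]]
    ```

    In the hybrid approach the abstract describes, sectors with good process data get measured coefficients while the rest fall back on input-output averages, which is what lets data acquisition be focused cost-effectively.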

  8. Hypothesis analysis methods, hypothesis analysis devices, and articles of manufacture

    Science.gov (United States)

    Sanfilippo, Antonio P [Richland, WA; Cowell, Andrew J [Kennewick, WA; Gregory, Michelle L [Richland, WA; Baddeley, Robert L [Richland, WA; Paulson, Patrick R [Pasco, WA; Tratz, Stephen C [Richland, WA; Hohimer, Ryan E [West Richland, WA

    2012-03-20

    Hypothesis analysis methods, hypothesis analysis devices, and articles of manufacture are described according to some aspects. In one aspect, a hypothesis analysis method includes providing a hypothesis, providing an indicator which at least one of supports and refutes the hypothesis, using the indicator, associating evidence with the hypothesis, weighting the association of the evidence with the hypothesis, and using the weighting, providing information regarding the accuracy of the hypothesis.
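    The weighted association of evidence with a hypothesis described above can be sketched as a signed, weighted sum; the structure and the weights below are illustrative assumptions, not the patented method itself:

    ```python
    from dataclasses import dataclass

    @dataclass
    class Evidence:
        description: str
        supports: bool   # True if the indicator supports the hypothesis
        weight: float    # analyst-assigned strength of the association

    def hypothesis_score(evidence):
        """Net weighted support; positive values favor the hypothesis."""
        return sum(e.weight if e.supports else -e.weight for e in evidence)

    evidence = [
        Evidence("sensor reading consistent", True, 0.6),
        Evidence("witness report conflicts", False, 0.3),
        Evidence("log entry corroborates", True, 0.4),
    ]
    print(hypothesis_score(evidence))
    ```

    The score is only a rough accuracy indicator in the sense of the abstract; a real system would calibrate the weights rather than assign them ad hoc.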

  9. Analysis I

    CERN Document Server

    Tao, Terence

    2016-01-01

    This is part one of a two-volume book on real analysis and is intended for senior undergraduate students of mathematics who have already been exposed to calculus. The emphasis is on rigour and foundations of analysis. Beginning with the construction of the number systems and set theory, the book discusses the basics of analysis (limits, series, continuity, differentiation, Riemann integration), through to power series, several variable calculus and Fourier analysis, and then finally the Lebesgue integral. These are almost entirely set in the concrete setting of the real line and Euclidean spaces, although there is some material on abstract metric and topological spaces. The book also has appendices on mathematical logic and the decimal system. The entire text (omitting some less central topics) can be taught in two quarters of 25–30 lectures each. The course material is deeply intertwined with the exercises, as it is intended that the student actively learn the material (and practice thinking and writing ri...

  10. Analysis II

    CERN Document Server

    Tao, Terence

    2016-01-01

    This is part two of a two-volume book on real analysis and is intended for senior undergraduate students of mathematics who have already been exposed to calculus. The emphasis is on rigour and foundations of analysis. Beginning with the construction of the number systems and set theory, the book discusses the basics of analysis (limits, series, continuity, differentiation, Riemann integration), through to power series, several variable calculus and Fourier analysis, and then finally the Lebesgue integral. These are almost entirely set in the concrete setting of the real line and Euclidean spaces, although there is some material on abstract metric and topological spaces. The book also has appendices on mathematical logic and the decimal system. The entire text (omitting some less central topics) can be taught in two quarters of 25–30 lectures each. The course material is deeply intertwined with the exercises, as it is intended that the student actively learn the material (and practice thinking and writing ri...

  11. CMS analysis operations

    International Nuclear Information System (INIS)

    Andreeva, J; Maier, G; Spiga, D; Calloni, M; Colling, D; Fanzago, F; D'Hondt, J; Maes, J; Van Mulders, P; Villella, I; Klem, J; Letts, J; Padhi, S; Sarkar, S

    2010-01-01

    During normal data taking CMS expects to support potentially as many as 2000 analysis users. Since the beginning of 2008 there have been more than 800 individuals who submitted a remote analysis job to the CMS computing infrastructure. The bulk of these users will be supported at the over 40 CMS Tier-2 centres. Supporting a globally distributed community of users on a globally distributed set of computing clusters is a task that requires reconsidering the normal methods of user support for Analysis Operations. In 2008 CMS formed an Analysis Support Task Force in preparation for large-scale physics analysis activities. The charge of the task force was to evaluate the available support tools, the user support techniques, and the direct feedback of users with the goal of improving the success rate and user experience when utilizing the distributed computing environment. The task force determined the tools needed to assess and reduce the number of non-zero exit code applications submitted through the grid interfaces and worked with the CMS experiment dashboard developers to obtain the necessary information to quickly and proactively identify issues with user jobs and data sets hosted at various sites. Results of the analysis group surveys were compiled. Reference platforms for testing and debugging problems were established in various geographic regions. The task force also assessed the resources needed to make the transition to a permanent Analysis Operations task. In this presentation the results of the task force will be discussed as well as the CMS Analysis Operations plans for the start of data taking.

  12. Harmonic analysis a comprehensive course in analysis, part 3

    CERN Document Server

    Simon, Barry

    2015-01-01

    A Comprehensive Course in Analysis by Poincaré Prize winner Barry Simon is a five-volume set that can serve as a graduate-level analysis textbook with a lot of additional bonus information, including hundreds of problems and numerous notes that extend the text and provide important historical background. Depth and breadth of exposition make this set a valuable reference source for almost all areas of classical analysis. Part 3 returns to the themes of Part 1 by discussing pointwise limits (going beyond the usual focus on the Hardy-Littlewood maximal function by including ergodic theorems and m

  13. Cost benefit analysis cost effectiveness analysis

    International Nuclear Information System (INIS)

    Lombard, J.

    1986-09-01

    The comparison of various protection options in order to determine the best compromise between the cost of protection and the residual risk is the purpose of the ALARA procedure. Decision-aiding techniques are valuable in such selection procedures. The purpose of this study is to introduce two rather simple and well-known decision-aiding techniques: cost-effectiveness analysis and cost-benefit analysis. These two techniques are relevant for the majority of ALARA decisions, which require the use of a quantitative technique. The study is based on a hypothetical case of 10 protection options. Four methods are applied to the data
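    The two techniques can be contrasted on a pair of hypothetical protection options: cost-effectiveness ranks options by cost per unit of collective dose averted, while cost-benefit converts averted dose into money using a reference value alpha and maximizes net benefit. All numbers below are invented for illustration:

    ```python
    # Hypothetical options: (name, cost in k$, collective dose averted in man-Sv)
    options = [
        ("Option A", 50.0, 2.0),
        ("Option B", 120.0, 3.5),
        ("Option C", 200.0, 4.0),
    ]

    ALPHA = 100.0  # assumed monetary value of a man-sievert, k$/man-Sv

    # Cost-effectiveness: cost per man-sievert averted (lower is better)
    for name, cost, averted in options:
        print(f"{name}: {cost / averted:.1f} k$/man-Sv")

    # Cost-benefit: net benefit = alpha * dose averted - protection cost
    best = max(options, key=lambda o: ALPHA * o[2] - o[1])
    print("Best by cost-benefit:", best[0])
    ```

    The two rankings need not agree, which is exactly why the choice of technique matters in ALARA decisions.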

  14. Basic complex analysis a comprehensive course in analysis, part 2a

    CERN Document Server

    Simon, Barry

    2015-01-01

    A Comprehensive Course in Analysis by Poincaré Prize winner Barry Simon is a five-volume set that can serve as a graduate-level analysis textbook with a lot of additional bonus information, including hundreds of problems and numerous notes that extend the text and provide important historical background. Depth and breadth of exposition make this set a valuable reference source for almost all areas of classical analysis. Part 2A is devoted to basic complex analysis. It interweaves three analytic threads associated with Cauchy, Riemann, and Weierstrass, respectively. Cauchy's view focuses on th

  15. pyAudioAnalysis: An Open-Source Python Library for Audio Signal Analysis.

    Science.gov (United States)

    Giannakopoulos, Theodoros

    2015-01-01

    Audio information plays a rather important role in the increasing digital content that is available today, resulting in a need for methodologies that automatically analyze such content: audio event recognition for home automations and surveillance systems, speech recognition, music information retrieval, multimodal analysis (e.g. audio-visual analysis of online videos for content-based recommendation), etc. This paper presents pyAudioAnalysis, an open-source Python library that provides a wide range of audio analysis procedures including: feature extraction, classification of audio signals, supervised and unsupervised segmentation and content visualization. pyAudioAnalysis is licensed under the Apache License and is available at GitHub (https://github.com/tyiannak/pyAudioAnalysis/). Here we present the theoretical background behind the wide range of the implemented methodologies, along with evaluation metrics for some of the methods. pyAudioAnalysis has been already used in several audio analysis research applications: smart-home functionalities through audio event detection, speech emotion recognition, depression classification based on audio-visual features, music segmentation, multimodal content-based movie recommendation and health applications (e.g. monitoring eating habits). The feedback provided from all these particular audio applications has led to practical enhancement of the library.

  16. Introduction of thermal-hydraulic analysis code and system analysis code for HTGR

    International Nuclear Information System (INIS)

    Tanaka, Mitsuhiro; Izaki, Makoto; Koike, Hiroyuki; Tokumitsu, Masashi

    1984-01-01

    Kawasaki Heavy Industries Ltd. has advanced the development and systematization of analysis codes for heat-transfer flow and control characteristics, taking HTGR plants as the main object. To model the flow when shock waves propagate into heating tubes, SALE-3D, which can analyze a complex system, was developed; it is therefore reported in this paper. Concerning the analysis code for control characteristics, a method of sensitivity analysis in a topological space is reported, including an example of its application. The flow analysis code SALE-3D analyzes the flow of a compressible viscous fluid in a three-dimensional system over the velocity range from the incompressible limit to supersonic velocity. The fundamental equations and algorithm of SALE-3D, the calculation of cell volume, the plotting of perspective drawings, and the analysis of the three-dimensional behavior of shock waves propagating in heating tubes after a rupture accident are described. The method of sensitivity analysis was added to the analysis code for control characteristics in a topological space, and blow-down phenomena were analyzed by its application. (Kako, I.)

  17. Investigation on method of elasto-plastic analysis for piping system (benchmark analysis)

    International Nuclear Information System (INIS)

    Kabaya, Takuro; Kojima, Nobuyuki; Arai, Masashi

    2015-01-01

    This paper provides a method of elasto-plastic analysis for the practical seismic design of nuclear piping systems. JSME started a task to establish a method of elasto-plastic analysis for nuclear piping systems, and benchmark analyses have been performed within the task to investigate candidate methods; our company has participated in these benchmark analyses. As a result, we have settled on a method that accurately simulates the results of piping excitation tests. The recommended method of elasto-plastic analysis is as follows: 1) The elasto-plastic analysis is composed of a dynamic analysis of the piping system modeled with beam elements and a static analysis of the deformed elbow modeled with shell elements. 2) A bilinear elasto-plastic property is applied: the yield point is the standardized yield point multiplied by 1.2, the second gradient is 1/100 of Young's modulus, and kinematic hardening is used as the hardening rule. 3) The fatigue life is evaluated on the strain ranges obtained by the elasto-plastic analysis, using the rain-flow method and the fatigue curve of previous studies. (author)
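    The bilinear property described in the abstract (hardening slope of E/100 beyond a yield point scaled by 1.2) can be written as a simple stress-strain function. A sketch with assumed material constants (values chosen for illustration, not taken from the benchmark):

    ```python
    def bilinear_stress(strain, youngs=200e3, yield_stress=250.0, factor=1.2):
        """Bilinear stress (MPa) for a given strain.

        Elastic slope E up to the scaled yield point (1.2 x standardized yield),
        then a second gradient of E/100, per the benchmark recommendation.
        """
        sigma_y = factor * yield_stress   # scaled yield stress
        eps_y = sigma_y / youngs          # strain at the scaled yield point
        if abs(strain) <= eps_y:
            return youngs * strain
        sign = 1.0 if strain > 0 else -1.0
        return sign * (sigma_y + (youngs / 100.0) * (abs(strain) - eps_y))
    ```

    In the recommended method this curve feeds the kinematic-hardening rule, and the resulting strain ranges are what the rain-flow fatigue evaluation consumes.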

  18. K Basins Hazard Analysis

    International Nuclear Information System (INIS)

    WEBB, R.H.

    1999-01-01

    This report describes the methodology used in conducting the K Basins Hazard Analysis, which provides the foundation for the K Basins Safety Analysis Report (HNF-SD-WM-SAR-062/Rev.4). This hazard analysis was performed in accordance with guidance provided by DOE-STD-3009-94, Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports, and implements the requirements of DOE Order 5480.23, Nuclear Safety Analysis Reports.

  19. Functional data analysis

    CERN Document Server

    Ramsay, J O

    1997-01-01

    Scientists today collect samples of curves and other functional observations. This monograph presents many ideas and techniques for such data. Included are expressions in the functional domain of such classics as linear regression, principal components analysis, linear modelling, and canonical correlation analysis, as well as specifically functional techniques such as curve registration and principal differential analysis. Data arising in real applications are used throughout for both motivation and illustration, showing how functional approaches allow us to see new things, especially by exploiting the smoothness of the processes generating the data. The data sets exemplify the wide scope of functional data analysis; they are drawn from growth analysis, meteorology, biomechanics, equine science, economics, and medicine. The book presents novel statistical technology while keeping the mathematical level widely accessible. It is designed to appeal to students, to applied data analysts, and to experienced researc...

  20. Dimensional Analysis

    Indian Academy of Sciences (India)

    Dimensional analysis is a useful tool which finds important applications in physics and engineering. It is most effective when there exist a maximal number of dimensionless quantities constructed out of the relevant physical variables. Though a complete theory of dimensional analysis was developed way back in 1914 in a.
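
    The core mechanics of dimensional analysis can be sketched as arithmetic on exponent vectors over the base dimensions; the Reynolds-number example below is an illustrative assumption, not taken from the article.

```python
# Minimal illustration of dimensional analysis: each quantity is an
# exponent vector over base dimensions (M, L, T); a product of powers of
# quantities is dimensionless iff all exponents sum to zero.

DIMS = ("M", "L", "T")

def dim(**exps):
    return tuple(exps.get(d, 0) for d in DIMS)

def product_dim(quantities, powers):
    return tuple(sum(q[i] * p for q, p in zip(quantities, powers))
                 for i in range(len(DIMS)))

rho = dim(M=1, L=-3)        # density
V   = dim(L=1, T=-1)        # velocity
L_  = dim(L=1)              # characteristic length
mu  = dim(M=1, L=-1, T=-1)  # dynamic viscosity

# Reynolds number Re = rho * V * L / mu
re_dim = product_dim([rho, V, L_, mu], [1, 1, 1, -1])
print(re_dim)  # -> (0, 0, 0), i.e. dimensionless
```

    Finding all such zero-sum exponent combinations is exactly the null-space computation behind the Buckingham pi theorem.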

  1. A sensory analysis of butter cookies: An application of generalized procrustes analysis

    DEFF Research Database (Denmark)

    Juhl, Hans Jørn

    1994-01-01

    Executive Summary: 1. A sensory analysis is one of the first steps in product development in the food industry. A thorough analysis of the results from such an analysis may give important input to the development process. 2. A sensory analysis on butter cookies is conducted in order to evaluate...... if some butter may be replaced by vegetable fat without a significant change in the sensory profile. The conclusion is that the replacement is possible without a considerable change in the sensory profile. 3. Generalized Procrustes Analysis is used to analyze the results. It is a relatively new technique...

  2. Contributions to sensitivity analysis and generalized discriminant analysis

    International Nuclear Information System (INIS)

    Jacques, J.

    2005-12-01

    Two topics are studied in this thesis: sensitivity analysis and generalized discriminant analysis. Global sensitivity analysis of a mathematical model studies how the output variables of the model react to variations of its inputs. The methods based on the study of the variance quantify the part of the variance of the model response due to each input variable and each subset of input variables. The first subject of this thesis is the impact of model uncertainty on the results of a sensitivity analysis. Two particular forms of uncertainty are studied: that due to a change of the reference model, and that due to the use of a simplified model in place of the reference model. A second problem studied in this thesis is that of models with correlated inputs. Since classical sensitivity indices have no meaningful interpretation in the presence of correlated inputs, we propose a multidimensional approach consisting in expressing the sensitivity of the model output to groups of correlated variables. Applications in the field of nuclear engineering illustrate this work. Generalized discriminant analysis consists in classifying the individuals of a test sample into groups by using information contained in a training sample, when these two samples do not come from the same population. This work extends existing methods in a Gaussian context to the case of binary data. An application in public health illustrates the utility of the generalized discrimination models thus defined. (author)
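
    The variance-based indices mentioned above can be estimated by Monte Carlo. The following hedged sketch uses a pick-freeze estimator on a toy additive model; the model, sample size, and seed are assumptions for illustration, not material from the thesis.

```python
# Pick-freeze Monte Carlo estimate of first-order Sobol indices for the
# toy model Y = X1 + 2*X2 with independent uniform(0,1) inputs.
# Analytically S1 = 0.2 and S2 = 0.8.
import random

random.seed(0)

def model(x1, x2):
    return x1 + 2.0 * x2   # assumed toy model

N = 100000
y, y1, y2 = [], [], []
for _ in range(N):
    a1, a2 = random.random(), random.random()
    b1, b2 = random.random(), random.random()
    y.append(model(a1, a2))
    y1.append(model(a1, b2))  # freeze X1, resample X2
    y2.append(model(b1, a2))  # freeze X2, resample X1

mean = sum(y) / N
var = sum((v - mean) ** 2 for v in y) / N

def first_order(y, yi):
    # S_i = Cov(Y, Y_i) / Var(Y): correlation survives only through X_i
    m = sum(y) / len(y)
    cov = sum(a * b for a, b in zip(y, yi)) / len(y) - m * m
    return cov / var

S1, S2 = first_order(y, y1), first_order(y, y2)
print(round(S1, 2), round(S2, 2))
```

    The frozen input is the only source of correlation between the two model evaluations, which is why the covariance isolates that input's share of the output variance.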

  3. Chemical analysis of carbonates and carbonate rocks by atomic absorption analysis

    Energy Technology Data Exchange (ETDEWEB)

    Tardon, S

    1981-01-01

    Evaluates methods of determining the chemical composition of rocks surrounding black coal seams. Carbonate rock samples were collected in the Ostrava-Karvina coal mines. Sampling methods are described. Determination of the following elements and compounds in carbonate rocks is discussed: calcium, magnesium, iron, manganese, barium, silicon, aluminium, titanium, sodium, potassium, sulfur trioxide, phosphorus pentoxide, water and carbon dioxide. The proportion of water-insoluble compounds in the investigated rocks is also determined. Most of the elements, including phosphorus, are determined by means of atomic absorption analysis; the other compounds are determined gravimetrically. The described procedure permits the weight of a rock sample to be reduced to 0.5 g without reducing analysis accuracy. The results of determining carbonate rock components by X-ray analysis and by chemical analysis are compared. The equipment used for atomic absorption analysis is characterized (the 503 Perkin-Elmer and the CF-4 Optica-Milano spectrophotometers). The analyzed method permits more accurate classification of rocks surrounding coal seams and of rock impurities in run-of-mine coal. (22 refs.) (In Czech)

  4. Practical data analysis

    CERN Document Server

    Cuesta, Hector

    2013-01-01

    Each chapter of the book quickly introduces a key 'theme' of Data Analysis, before immersing you in the practical aspects of each theme. You'll learn quickly how to perform all aspects of Data Analysis. Practical Data Analysis is a book ideal for home and small business users who want to slice & dice the data they have on hand with minimum hassle.

  5. Left ventricular wall motion abnormalities evaluated by factor analysis as compared with Fourier analysis

    International Nuclear Information System (INIS)

    Hirota, Kazuyoshi; Ikuno, Yoshiyasu; Nishikimi, Toshio

    1986-01-01

    Factor analysis was applied to multigated cardiac pool scintigraphy to evaluate its ability to detect left ventricular wall motion abnormalities in 35 patients with old myocardial infarction (MI) and in 12 control cases with normal left ventriculography. All cases were also evaluated by conventional Fourier analysis. In most cases with normal left ventriculography, the ventricular and atrial factors were extracted by factor analysis. In cases with MI, a third factor was obtained in the left ventricle corresponding to the wall motion abnormality. Each case was scored according to the agreement between the findings of ventriculography and those of factor analysis or Fourier analysis. Scores were recorded for three items: the existence, location, and degree of asynergy. In cases of MI, the detection rate of asynergy was 94 % by factor analysis and 83 % by Fourier analysis, and the agreement with respect to location was 71 % and 66 %, respectively. Factor analysis had higher scores than Fourier analysis, but the difference was not significant. The interobserver error of factor analysis was less than that of Fourier analysis. Factor analysis can display locations and dynamic motion curves of asynergy, and it is regarded as a useful method for detecting and evaluating left ventricular wall motion abnormalities. (author)
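
    The Fourier analysis used as the comparison method typically reduces each time-activity curve to the amplitude and phase of its first harmonic. A minimal sketch on a synthetic curve follows; the curve itself (16 frames, known amplitude and phase) is an assumption for illustration.

```python
import math

# First-harmonic Fourier analysis of a gated time-activity curve:
# fit A*cos(2*pi*t/T - phi) over one cycle, report amplitude A and phase phi.

def first_harmonic(samples):
    n = len(samples)
    a = sum(s * math.cos(2 * math.pi * k / n) for k, s in enumerate(samples)) * 2 / n
    b = sum(s * math.sin(2 * math.pi * k / n) for k, s in enumerate(samples)) * 2 / n
    amplitude = math.hypot(a, b)
    phase = math.atan2(b, a)   # radians; maps wall-motion timing
    return amplitude, phase

# synthetic 16-frame curve with known amplitude 5 and phase 1.0 rad
curve = [10 + 5 * math.cos(2 * math.pi * k / 16 - 1.0) for k in range(16)]
amp, ph = first_harmonic(curve)
print(round(amp, 3), round(ph, 3))
```

    Applied pixel by pixel, the phase image is what localizes asynergy in the Fourier approach, which is the quantity the factor-analysis method is being compared against.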

  6. Time-frequency analysis : mathematical analysis of the empirical mode decomposition.

    Science.gov (United States)

    2009-01-01

    Invented over 10 years ago, empirical mode decomposition (EMD) provides a nonlinear time-frequency analysis with the ability to successfully analyze nonstationary signals. Mathematical Analysis of the Empirical Mode Decomposition is a...

  7. QUANTITATIVE ANALYSIS OF FLUX REGULATION THROUGH HIERARCHICAL REGULATION ANALYSIS

    NARCIS (Netherlands)

    van Eunen, Karen; Rossell, Sergio; Bouwman, Jildau; Westerhoff, Hans V.; Bakker, Barbara M.; Jameson, D; Verma, M; Westerhoff, HV

    2011-01-01

    Regulation analysis is a methodology that quantifies to what extent a change in the flux through a metabolic pathway is regulated by either gene expression or metabolism. Two extensions to regulation analysis were developed over the past years: (i) the regulation of V(max) can be dissected into the

  8. Quantitative analysis of flux regulation through hierarchical regulation analysis

    NARCIS (Netherlands)

    Eunen, K. van; Rossell, S.; Bouwman, J.; Westerhoff, H.V.; Bakker, B.M.

    2011-01-01

    Regulation analysis is a methodology that quantifies to what extent a change in the flux through a metabolic pathway is regulated by either gene expression or metabolism. Two extensions to regulation analysis were developed over the past years: (i) the regulation of Vmax can be dissected into the
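
    The quantification underlying regulation analysis can be sketched numerically: the hierarchical coefficient rho_h measures how much of a flux change is carried by a change in Vmax (gene expression), and the summation law forces the metabolic coefficient rho_m to make up the remainder. The flux and Vmax values below are invented for illustration, not data from the study.

```python
import math

# Hierarchical regulation coefficient rho_h = d(ln Vmax)/d(ln J);
# the summation law of regulation analysis gives rho_h + rho_m = 1.

def regulation_coefficients(J_ref, J_new, Vmax_ref, Vmax_new):
    dlnJ = math.log(J_new / J_ref)
    rho_h = math.log(Vmax_new / Vmax_ref) / dlnJ
    rho_m = 1.0 - rho_h      # metabolic regulation (summation law)
    return rho_h, rho_m

# invented example: flux doubles while Vmax rises 50% -> regulation is shared
rho_h, rho_m = regulation_coefficients(1.0, 2.0, 10.0, 15.0)
print(round(rho_h, 3), round(rho_m, 3))
```

    A value of rho_h = 1 would mean the flux change is purely hierarchical (gene expression), rho_h = 0 purely metabolic; intermediate values, as here, indicate shared regulation.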

  9. Object-Oriented Analysis, Structured Analysis, and Jackson System Development

    NARCIS (Netherlands)

    Van Assche, F.; Wieringa, Roelf J.; Moulin, B.; Rolland, C

    1991-01-01

    Conceptual modeling is the activity of producing a conceptual model of an actual or desired version of a universe of discourse (UoD). In this paper, two methods of conceptual modeling are compared, structured analysis (SA) and object-oriented analysis (OOA). This is done by transforming a model

  10. Foundations of mathematical analysis

    CERN Document Server

    Johnsonbaugh, Richard

    2010-01-01

    This classroom-tested volume offers a definitive look at modern analysis, with views of applications to statistics, numerical analysis, Fourier series, differential equations, mathematical analysis, and functional analysis. Upper-level undergraduate students with a background in calculus will benefit from its teachings, along with beginning graduate students seeking a firm grounding in modern analysis. A self-contained text, it presents the necessary background on the limit concept, and the first seven chapters could constitute a one-semester introduction to limits. Subsequent chapters discuss

  11. Advanced complex analysis a comprehensive course in analysis, part 2b

    CERN Document Server

    Simon, Barry

    2015-01-01

    A Comprehensive Course in Analysis by Poincaré Prize winner Barry Simon is a five-volume set that can serve as a graduate-level analysis textbook with a lot of additional bonus information, including hundreds of problems and numerous notes that extend the text and provide important historical background. Depth and breadth of exposition make this set a valuable reference source for almost all areas of classical analysis. Part 2B provides a comprehensive look at a number of subjects of complex analysis not included in Part 2A. Presented in this volume are the theory of conformal metrics (includ

  12. Development of Performance Analysis Program for an Axial Compressor with Meanline Analysis

    International Nuclear Information System (INIS)

    Park, Jun Young; Park, Moo Ryong; Choi, Bum Suk; Song, Je Wook

    2009-01-01

    The axial-flow compressor is one of the most important parts of a gas turbine unit, together with the axial turbine and combustor. Therefore, precise prediction of performance is very important for the development of a new compressor or the modification of an existing one. Meanline analysis is a simple, fast and powerful method for performance prediction of axial-flow compressors with different geometries, so it is frequently used in the preliminary design stage and for performance analysis of a given geometry. Many correlations for meanline analysis have been developed theoretically and experimentally over a long period for estimating various types of losses and the flow deviation angle. In the present study, a meanline analysis program was developed to estimate compressor losses, incidence angles, deviation angles, and stall and surge conditions with many correlations. Performance prediction of single-stage axial compressors was conducted with this meanline analysis program, and the comparison between experimental and numerical results shows a good agreement. This meanline analysis program can be used for various types of single-stage axial-flow compressors with different geometries, as well as for multistage axial-flow compressors.
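
    As an illustration of the kind of correlation a meanline code chains together, the following sketch applies Carter's rule for exit-flow deviation, delta = m * theta * sqrt(s/c). The coefficient m (0.26 here) and the cascade geometry are assumed values for illustration, not taken from the paper.

```python
import math

# Carter's rule for cascade exit-flow deviation: delta = m * theta * sqrt(s/c),
# one of the empirical correlations a meanline analysis evaluates per blade row.
# The coefficient m and all geometry values are illustrative assumptions.

def carter_deviation(camber_deg, pitch, chord, m=0.26):
    """Deviation angle [deg] for camber theta [deg] and pitch/chord ratio s/c."""
    return m * camber_deg * math.sqrt(pitch / chord)

def exit_flow_angle(metal_angle_deg, camber_deg, pitch, chord):
    """Actual exit flow angle = blade metal (exit) angle + deviation."""
    return metal_angle_deg + carter_deviation(camber_deg, pitch, chord)

dev = carter_deviation(camber_deg=30.0, pitch=0.05, chord=0.08)
angle = exit_flow_angle(-10.0, 30.0, 0.05, 0.08)
print(round(dev, 2), round(angle, 2))
```

    A full meanline program evaluates such deviation and loss correlations stage by stage at the mean radius, then iterates on velocity triangles to build the compressor map.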

  13. Nonactivation interaction analysis. Chapter 5

    International Nuclear Information System (INIS)

    1976-01-01

    Analyses are described, including alpha scattering analysis, beta absorption and scattering analysis, gamma and X-ray absorption and scattering analysis, X-ray fluorescence analysis, neutron absorption and scattering analysis, applications of the Moessbauer effect, and an analysis based on the application of the ionizing effects of radiation. (J.P.)

  14. Goal-oriented failure analysis - a systems analysis approach to hazard identification

    International Nuclear Information System (INIS)

    Reeves, A.B.; Davies, J.; Foster, J.; Wells, G.L.

    1990-01-01

    Goal-Oriented Failure Analysis, GOFA, is a methodology being developed to identify and analyse the potential failure modes of a hazardous plant or process. The technique adopts a structured top-down approach, with a particular failure goal being systematically analysed. A systems analysis approach is used, with the analysis organised around a systems diagram of the plant or process under study. GOFA will also use checklists to supplement the analysis - these checklists will be prepared in advance of a group session and will help guide the analysis, avoiding unnecessary time being spent on identifying obvious failure modes or failing to identify certain hazards or failures. GOFA is being developed with the aim of providing a hazard identification methodology which is more efficient and stimulating than the conventional approach to HAZOP. The top-down approach should ensure that the analysis is more focused, and the use of a systems diagram will help to pull the analysis together at an early stage whilst also helping to structure the sessions in a more stimulating way than the conventional techniques. GOFA will be, essentially, an extension of the HAZOP methodology. GOFA is currently being computerised using a knowledge-based systems approach; the Goldworks II expert systems development tool is being used. (author)

  15. Trace analysis

    International Nuclear Information System (INIS)

    Warner, M.

    1987-01-01

    What is the current state of quantitative trace analytical chemistry? What are today's research efforts? And what challenges does the future hold? These are some of the questions addressed at a recent four-day symposium sponsored by the National Bureau of Standards (NBS) entitled Accuracy in Trace Analysis - Accomplishments, Goals, Challenges. The two plenary sessions held on the first day of the symposium reviewed the history of quantitative trace analysis, discussed the present situation from academic and industrial perspectives, and summarized future needs. The remaining three days of the symposium consisted of parallel sessions dealing with the measurement process; quantitation in materials; environmental, clinical, and nutrient analysis; and advances in analytical techniques

  16. Analysis in usability evaluations

    DEFF Research Database (Denmark)

    Følstad, Asbjørn; Lai-Chong Law, Effie; Hornbæk, Kasper

    2010-01-01

    While the planning and implementation of usability evaluations are well described in the literature, the analysis of the evaluation data is not. We present interviews with 11 usability professionals on how they conduct analysis, describing the resources, collaboration, creation of recommendations......, and prioritization involved. The interviews indicate a lack of structure in the analysis process and suggest activities, such as generating recommendations, that are unsupported by existing methods. We discuss how to better support analysis, and propose four themes for future research on analysis in usability...

  17. What Is Public Agency Strategic Analysis (PASA) and How Does It Differ from Public Policy Analysis and Firm Strategy Analysis?

    Directory of Open Access Journals (Sweden)

    Aidan R. Vining

    2016-12-01

    Public agency strategic analysis (PASA) is different from public policy analysis because public agency executives face numerous constraints that those performing “unconstrained” policy analysis do not. It is also different from private-sector strategic analysis. But because of similar constraints and realities, some generic and private-sector strategic analysis techniques can be useful to those carrying out PASA, if appropriately modified. Analysis of the external agency environment (external forces) and of internal value-creation processes (“value chains”, “modular assembly” processes or “multi-sided intermediation platforms”) are the most important components of PASA. Also, agency executives must focus on feasible alternatives. In sum, PASA must be practical. But public executives need to take public value, and specifically social efficiency, seriously when engaging in PASA. Unless they do so, their strategic analyses will not have normative legitimacy, because enhancing public value is not the same as agency “profit maximization”. Although similarly constrained, normatively appropriate public agency strategic analysis is not “giving clients what they want” or “making the public sector business case”. PASA must be both practical and principled.

  18. Foundations of factor analysis

    CERN Document Server

    Mulaik, Stanley A

    2009-01-01

    Introduction; Factor Analysis and Structural Theories; Brief History of Factor Analysis as a Linear Model; Example of Factor Analysis; Mathematical Foundations for Factor Analysis; Introduction; Scalar Algebra; Vectors; Matrix Algebra; Determinants; Treatment of Variables as Vectors; Maxima and Minima of Functions; Composite Variables and Linear Transformations; Introduction; Composite Variables; Unweighted Composite Variables; Differentially Weighted Composites; Matrix Equations; Multi...

  19. Is activation analysis still active?

    International Nuclear Information System (INIS)

    Chai Zhifang

    2001-01-01

    This paper reviews some aspects of neutron activation analysis (NAA), covering instrumental neutron activation analysis (INAA), the k0 method, prompt gamma-ray neutron activation analysis (PGNAA), radiochemical neutron activation analysis (RNAA) and molecular activation analysis (MAA). A comparison of neutron activation analysis with other analytical techniques is also made. (author)

  20. Semiotic Analysis.

    Science.gov (United States)

    Thiemann, Francis C.

    Semiotic analysis is a method of analyzing signs (e.g., words) to reduce non-numeric data to their component parts without losing essential meanings. Semiotics dates back to Aristotle's analysis of language; it was much advanced by nineteenth-century analyses of style and logic and by Whitehead and Russell's description in this century of the role…

  1. Multivariate analysis with LISREL

    CERN Document Server

    Jöreskog, Karl G; Y Wallentin, Fan

    2016-01-01

    This book traces the theory and methodology of multivariate statistical analysis and shows how it can be conducted in practice using the LISREL computer program. It presents not only the typical uses of LISREL, such as confirmatory factor analysis and structural equation models, but also several other multivariate analysis topics, including regression (univariate, multivariate, censored, logistic, and probit), generalized linear models, multilevel analysis, and principal component analysis. It provides numerous examples from several disciplines and discusses and interprets the results, illustrated with sections of output from the LISREL program, in the context of the example. The book is intended for masters and PhD students and researchers in the social, behavioral, economic and many other sciences who require a basic understanding of multivariate statistical theory and methods for their analysis of multivariate data. It can also be used as a textbook on various topics of multivariate statistical analysis.

  2. An Array of Qualitative Data Analysis Tools: A Call for Data Analysis Triangulation

    Science.gov (United States)

    Leech, Nancy L.; Onwuegbuzie, Anthony J.

    2007-01-01

    One of the most important steps in the qualitative research process is analysis of data. The purpose of this article is to provide elements for understanding multiple types of qualitative data analysis techniques available and the importance of utilizing more than one type of analysis, thus utilizing data analysis triangulation, in order to…

  3. Essential real analysis

    CERN Document Server

    Field, Michael

    2017-01-01

    This book provides a rigorous introduction to the techniques and results of real analysis, metric spaces and multivariate differentiation, suitable for undergraduate courses. Starting from the very foundations of analysis, it offers a complete first course in real analysis, including topics rarely found in such detail in an undergraduate textbook such as the construction of non-analytic smooth functions, applications of the Euler-Maclaurin formula to estimates, and fractal geometry.  Drawing on the author’s extensive teaching and research experience, the exposition is guided by carefully chosen examples and counter-examples, with the emphasis placed on the key ideas underlying the theory. Much of the content is informed by its applicability: Fourier analysis is developed to the point where it can be rigorously applied to partial differential equations or computation, and the theory of metric spaces includes applications to ordinary differential equations and fractals. Essential Real Analysis will appeal t...

  4. Robust multivariate analysis

    CERN Document Server

    J Olive, David

    2017-01-01

    This text presents methods that are robust to the assumption of a multivariate normal distribution or methods that are robust to certain types of outliers. Instead of using exact theory based on the multivariate normal distribution, the simpler and more applicable large sample theory is given.  The text develops among the first practical robust regression and robust multivariate location and dispersion estimators backed by theory.   The robust techniques  are illustrated for methods such as principal component analysis, canonical correlation analysis, and factor analysis.  A simple way to bootstrap confidence regions is also provided. Much of the research on robust multivariate analysis in this book is being published for the first time. The text is suitable for a first course in Multivariate Statistical Analysis or a first course in Robust Statistics. This graduate text is also useful for people who are familiar with the traditional multivariate topics, but want to know more about handling data sets with...

  5. Data analysis workbench

    International Nuclear Information System (INIS)

    Goetz, A.; Gerring, M.; Svensson, O.; Brockhauser, S.

    2012-01-01

    Data Analysis Workbench (DAWB) is a new software tool being developed at the ESRF. Its goal is to provide a tool both for online data analysis, which can be used on the beamlines, and for offline data analysis, which users can use during experiments or take home. The tool includes support for data visualization and workflows. Workflows allow algorithms which exploit parallel architectures to be designed from existing high-level modules for data analysis in combination with data collection. The workbench uses Passerelle as the workflow engine and EDNA plug-ins for data analysis. Actors talking to Tango are used for sending commands to a limited set of hardware to start existing data collection algorithms. A Tango server allows workflows to be executed from existing applications. There are scripting interfaces to Python, Javascript and SPEC. Currently the workbench is in test at the ESRF on a selected number of beamlines. (authors)

  6. Data-variant kernel analysis

    CERN Document Server

    Motai, Yuichi

    2015-01-01

    Describes and discusses the variants of kernel analysis methods for data types that have been intensely studied in recent years This book covers kernel analysis topics ranging from the fundamental theory of kernel functions to its applications. The book surveys the current status, popular trends, and developments in kernel analysis studies. The author discusses multiple kernel learning algorithms and how to choose the appropriate kernels during the learning phase. Data-Variant Kernel Analysis is a new pattern analysis framework for different types of data configurations. The chapters include

  7. Human reliability analysis

    International Nuclear Information System (INIS)

    Dougherty, E.M.; Fragola, J.R.

    1988-01-01

    The authors present a treatment of human reliability analysis incorporating an introduction to probabilistic risk assessment for nuclear power generating stations, treating the subject according to the framework established for general systems theory. The book draws upon reliability analysis, psychology, human factors engineering, and statistics, integrating elements of these fields within a systems framework. It provides a history of human reliability analysis and includes examples of the application of the systems approach.

  8. Emission spectrochemical analysis

    International Nuclear Information System (INIS)

    Rives, R.D.; Bruks, R.R.

    1983-01-01

    The emission spectrochemical method of analysis, based on the fact that atoms of elements can be excited in an electric arc or in a laser beam and will emit radiation with characteristic wavelengths, is considered. The review contains data on the spectrochemical analysis of liquids and geological materials and a scheme of a laser microprobe. The main characteristics of emission spectroscopy, atomic absorption spectroscopy and X-ray fluorescence analysis are generalized.

  9. Comparative risk analysis

    International Nuclear Information System (INIS)

    Niehaus, F.

    1988-01-01

    In this paper, the risks of various energy systems are discussed, considering severe accident analysis, particularly probabilistic safety analysis and probabilistic safety criteria, and the applications of these criteria and analyses. Comparative risk analysis has demonstrated that the largest source of risk in every society is daily small accidents. Nevertheless, we have to be more concerned about severe accidents. The comparative risk analysis of five different energy systems (coal, oil, gas, LWR and STEC (solar)) for the public has shown that the main sources of risk are coal and oil. The latest comparative study of various energy risks, conducted in the USA, revealed that the number of victims from coal is 42 times as many as the victims from nuclear. A study of severe accidents from hydro dams in the United States has estimated the probability of dam failure at 1 in 10,000 years and the number of victims between 11,000 and 260,000. The average occupational risk from coal is one fatal accident per 1,000 workers/year. Probabilistic safety analysis is a method that can be used to assess nuclear energy risks, to analyze severe accidents, and to model all possible accident sequences and consequences. Fault tree analysis is used to find the probability of failure of the different systems at each point of the accident sequences and to calculate the probability of risks. After calculating the probability of failure, criteria for judging the numerical results have to be developed, that is, quantitative and qualitative goals. To achieve these goals, several systems have been devised by various IAEA member countries. The probabilistic safety analysis method has been developed by establishing a computer program permitting different categories of safety-related information to be obtained. 19 tabs. (author)

  10. Fault tree analysis

    International Nuclear Information System (INIS)

    1981-09-01

    Suggestions are made concerning the method of fault tree analysis and the use of certain symbols in the examination of system failures. The purpose of the fault tree analysis is to find logical connections of component or subsystem failures leading to undesirable occurrences. The results of these examinations are part of the system assessment concerning operation and safety. The objectives of the analysis are: systematic identification of all possible failure combinations (causes) leading to a specific undesirable occurrence, and determination of reliability parameters such as the frequency of failure combinations, the frequency of the undesirable occurrence, or the non-availability of the system when required. The fault tree analysis provides a clear and reconstructable documentation of the examination. (orig./HP) [de
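
    The quantification step described above can be sketched as follows: independent basic-event probabilities are combined through AND and OR gates up to the undesirable top event. The gate structure and event probabilities below are invented for illustration.

```python
# Fault tree quantification sketch: combine independent basic-event
# probabilities through AND/OR gates to obtain the top-event probability.
# All probabilities and the tree structure are illustrative assumptions.

def gate_and(*ps):
    """All inputs must fail (probabilities multiply under independence)."""
    out = 1.0
    for p in ps:
        out *= p
    return out

def gate_or(*ps):
    """At least one input fails: 1 - product of survival probabilities."""
    out = 1.0
    for p in ps:
        out *= (1.0 - p)
    return 1.0 - out

pump_a, pump_b = 1e-2, 1e-2   # redundant pumps (invented values)
power_supply = 1e-3
operator_error = 5e-3

# top event: loss of cooling = (both pumps fail) OR power loss OR operator error
top = gate_or(gate_and(pump_a, pump_b), power_supply, operator_error)
print(f"{top:.6f}")
```

    The redundancy shows up directly in the numbers: the AND gate reduces the pump contribution to 1e-4, so the single-point failures dominate the top-event probability.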

  11. Social Set Analysis

    DEFF Research Database (Denmark)

    Vatrapu, Ravi; Mukkamala, Raghava Rao; Hussain, Abid

    2016-01-01

    , conceptual and formal models of social data, and an analytical framework for combining big social data sets with organizational and societal data sets. Three empirical studies of big social data are presented to illustrate and demonstrate social set analysis in terms of fuzzy set-theoretical sentiment...... automata and agent-based modeling). However, when it comes to organizational and societal units of analysis, there exists no approach to conceptualize, model, analyze, explain, and predict social media interactions as individuals' associations with ideas, values, identities, and so on. To address...... analysis, crisp set-theoretical interaction analysis, and event-studies-oriented set-theoretical visualizations. Implications for big data analytics, current limitations of the set-theoretical approach, and future directions are outlined....

  12. Hermeneutic phenomenological analysis: the 'possibility' beyond 'actuality' in thematic analysis.

    Science.gov (United States)

    Ho, Ken H M; Chiang, Vico C L; Leung, Doris

    2017-07-01

    This article discusses the ways researchers may become open to manifold interpretations of lived experience through thematic analysis that follows the tradition of hermeneutic phenomenology. Martin Heidegger's thinking about historical contexts of understandings and the notions of 'alētheia' and 'techne' disclose what he called the meaning of lived experience, as the 'unchanging Being of changing beings'. While these notions remain central to hermeneutic phenomenological research, novice phenomenologists usually face the problem of how to incorporate these philosophical tenets into thematic analysis. Discussion paper. This discussion paper is based on our experiences of hermeneutic analysis supported by the writings of Heidegger. Literature reviewed for this paper ranges from 1927 to 2014. We draw on data from a study of foreign domestic helpers in Hong Kong to demonstrate how 'dwelling' in the language of participants' 'ek-sistence' supported us in a process of thematic analysis. Data were collected from December 2013 to February 2016. Nurses doing hermeneutic phenomenology have to develop self-awareness of their own 'taken-for-granted' thinking to disclose the unspoken meanings hidden in the language of participants. Understanding the philosophical tenets of hermeneutic phenomenology allows nurses to preserve possibilities of interpretations in thinking. In so doing, methods of thematic analysis can uncover and present the structure of the meaning of lived experience. We provide our readers with vicarious experience of how to begin cultivating thinking that is aligned with hermeneutic phenomenological philosophical tenets to conduct thematic analysis. © 2017 John Wiley & Sons Ltd.

  13. Physicochemical properties of different corn varieties by principal components analysis and cluster analysis

    International Nuclear Information System (INIS)

    Zeng, J.; Li, G.; Sun, J.

    2013-01-01

    Principal components analysis and cluster analysis were used to investigate the properties of different corn varieties. The chemical composition and selected properties of corn flour produced by dry milling were determined. The results showed that the chemical compositions and physicochemical properties differed significantly among the twenty-six corn varieties. Five principal components captured the quality of the corn flour, and the contribution of the starch pasting properties was the most important, accounting for 48.90% of the variance. The twenty-six corn varieties could be classified into four groups by cluster analysis. The consistency between principal components analysis and cluster analysis indicated that multivariate analyses were feasible in the study of corn variety properties. (author)
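    The PCA-plus-cluster-analysis workflow this record describes can be sketched with plain NumPy; the data below are a simulated stand-in for the twenty-six varieties' measurements, not the study's data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated stand-in: 26 corn varieties x 8 physicochemical properties
# (starch, protein, pasting viscosity, ...); half get a distinct profile.
X = rng.normal(size=(26, 8))
X[13:] += 3.0

# --- Principal components analysis via eigen-decomposition ---
Xs = (X - X.mean(axis=0)) / X.std(axis=0)     # standardize each property
eigval, eigvec = np.linalg.eigh(np.cov(Xs, rowvar=False))
order = np.argsort(eigval)[::-1]              # sort descending
eigval, eigvec = eigval[order], eigvec[:, order]
contribution = eigval / eigval.sum()          # share of variance per PC
scores = Xs @ eigvec[:, :5]                   # first five principal components

# --- k-means cluster analysis on the PC scores ---
def kmeans(data, k, iters=50):
    centers = data[rng.choice(len(data), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((data[:, None] - centers) ** 2).sum(-1), axis=1)
        centers = np.array([data[labels == j].mean(axis=0)
                            if np.any(labels == j) else centers[j]
                            for j in range(k)])
    return labels

labels = kmeans(scores, k=2)
print("variance explained by first 5 PCs:", contribution[:5].round(3))
print("cluster labels:", labels)
```

    In the study's terms, `contribution` plays the role of the variance explained by each principal component, and the cluster labels correspond to the variety groups.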

  14. Compatibility analysis of DUPIC fuel(4) - thermal hydraulic analysis

    Energy Technology Data Exchange (ETDEWEB)

    Park, Jee Won; Chae, Kyung Myung; Choi, Hang Bok

    2000-07-01

    Thermal-hydraulic compatibility of the DUPIC fuel bundle in the CANDU reactor has been studied. The critical channel power, the critical power ratio, the channel exit quality and the channel flow are calculated for the DUPIC and the standard fuels by using the NUCIRC code. The physical models and associated parametric values for the NUCIRC analysis of the fuels are also presented. Based upon the slave channel analysis, the critical channel power and the critical power ratios have been found to be very similar for the two fuel types. The same dryout model is used in this study for the standard and the DUPIC fuel bundles. To assess the dryout characteristics of the DUPIC fuel bundle, the ASSERT-PV code has been used for the subchannel analysis. Based upon the results of the subchannel analysis, it is found that the dryout location and the power for the two fuel types are indeed very similar. This study shows that thermal performance of the DUPIC fuel is not significantly different from that of the standard fuel.

  15. The ATLAS Analysis Model

    CERN Multimedia

    Amir Farbin

    The ATLAS Analysis Model is a continually developing vision of how to reconcile physics analysis requirements with the ATLAS offline software and computing model constraints. In the past year this vision has influenced the evolution of the ATLAS Event Data Model, the Athena software framework, and physics analysis tools. These developments, along with the October Analysis Model Workshop and the planning for CSC analyses have led to a rapid refinement of the ATLAS Analysis Model in the past few months. This article introduces some of the relevant issues and presents the current vision of the future ATLAS Analysis Model. Event Data Model The ATLAS Event Data Model (EDM) consists of several levels of details, each targeted for a specific set of tasks. For example the Event Summary Data (ESD) stores calorimeter cells and tracking system hits thereby permitting many calibration and alignment tasks, but will be only accessible at particular computing sites with potentially large latency. In contrast, the Analysis...

  16. Hazard Analysis Database Report

    Energy Technology Data Exchange (ETDEWEB)

    GAULT, G.W.

    1999-10-13

    The Hazard Analysis Database was developed in conjunction with the hazard analysis activities conducted in accordance with DOE-STD-3009-94, Preparation Guide for US Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports, for the Tank Waste Remediation System (TWRS) Final Safety Analysis Report (FSAR). The FSAR is part of the approved TWRS Authorization Basis (AB). This document describes, identifies, and defines the contents and structure of the TWRS FSAR Hazard Analysis Database and documents the configuration control changes made to the database. The TWRS Hazard Analysis Database contains the collection of information generated during the initial hazard evaluations and the subsequent hazard and accident analysis activities. The database supports the preparation of Chapters 3, 4, and 5 of the TWRS FSAR and the USQ process and consists of two major, interrelated data sets: (1) Hazard Evaluation Database--Data from the results of the hazard evaluations; and (2) Hazard Topography Database--Data from the system familiarization and hazard identification.

  17. A sensory analysis of butter cookies: An application of generalized procrustes analysis

    OpenAIRE

    Juhl, Hans Jørn

    1994-01-01

    Executive Summary: 1. A sensory analysis is one of the first steps in product development in the food industry. A thorough analysis of the results from such an analysis may give important input to the development process. 2. A sensory analysis of butter cookies is conducted in order to evaluate whether some butter may be replaced by vegetable fat without a significant change in the sensory profile. The conclusion is that the replacement is possible without a considerable change in the sensory profile.

  18. Data analysis and approximate models model choice, location-scale, analysis of variance, nonparametric regression and image analysis

    CERN Document Server

    Davies, Patrick Laurie

    2014-01-01

    Introduction Introduction Approximate Models Notation Two Modes of Statistical Analysis Towards One Mode of Analysis Approximation, Randomness, Chaos, Determinism Approximation A Concept of Approximation Approximation Approximating a Data Set by a Model Approximation Regions Functionals and Equivariance Regularization and Optimality Metrics and Discrepancies Strong and Weak Topologies On Being (almost) Honest Simulations and Tables Degree of Approximation and p-values Scales Stability of Analysis The Choice of En(α, P) Independence Procedures, Approximation and Vagueness Discrete Models The Empirical Density Metrics and Discrepancies The Total Variation Metric The Kullback-Leibler and Chi-Squared Discrepancies The Po(λ) Model The b(k, p) and nb(k, p) Models The Flying Bomb Data The Student Study Times Data Outliers Outliers, Data Analysis and Models Breakdown Points and Equivariance Identifying Outliers and Breakdown Outliers in Multivariate Data Outliers in Linear Regression Outliers in Structured Data The Location...

  19. Frontier Assignment for Sensitivity Analysis of Data Envelopment Analysis

    Science.gov (United States)

    Naito, Akio; Aoki, Shingo; Tsuji, Hiroshi

    To extend the sensitivity analysis capability of DEA (Data Envelopment Analysis), this paper proposes frontier assignment based DEA (FA-DEA). The basic idea of FA-DEA is to allow a decision maker to choose the frontier intentionally, whereas the traditional DEA and Super-DEA determine the frontier computationally. The features of FA-DEA are as follows: (1) it provides the chance to exclude an extra-influential DMU (Decision Making Unit) and to find extra-ordinal DMUs, and (2) it includes the functions of the traditional DEA and Super-DEA so that it can deal with sensitivity analysis more flexibly. A simple numerical study has shown the effectiveness of the proposed FA-DEA and its difference from the traditional DEA.
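    The frontier-exclusion idea behind FA-DEA can be illustrated without a linear-programming solver in the special case of one input and one output, where CCR efficiency reduces to each DMU's output/input ratio divided by the best ratio among the DMUs allowed to define the frontier. The DMU data below are made up for illustration.

```python
# Single-input, single-output sketch of the frontier-assignment idea.

def efficiencies(dmus, frontier=None):
    """dmus: {name: (input, output)}; frontier: names allowed to define it."""
    names = list(frontier) if frontier is not None else list(dmus)
    best = max(dmus[n][1] / dmus[n][0] for n in names)
    return {n: (y / x) / best for n, (x, y) in dmus.items()}

dmus = {"A": (2.0, 4.0), "B": (3.0, 5.0), "C": (4.0, 4.0), "D": (1.0, 3.0)}

full = efficiencies(dmus)                       # traditional DEA: D is efficient
reduced = efficiencies(dmus, frontier="ABC")    # FA-DEA-style: exclude DMU "D"

print(full)
print(reduced)    # D now scores above 1, flagging it as extra-influential
```

    Excluding a DMU from the frontier while still scoring it reproduces, in miniature, the Super-DEA-like behavior the paper describes: the excluded unit's score above 1 measures how strongly it shaped the original frontier.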

  20. Overview of methods for uncertainty analysis and sensitivity analysis in probabilistic risk assessment

    International Nuclear Information System (INIS)

    Iman, R.L.; Helton, J.C.

    1985-01-01

    Probabilistic Risk Assessment (PRA) is playing an increasingly important role in the nuclear reactor regulatory process. The assessment of uncertainties associated with PRA results is widely recognized as an important part of the analysis process. One of the major criticisms of the Reactor Safety Study was that its representation of uncertainty was inadequate. The desire for the capability to treat uncertainties with the MELCOR risk code being developed at Sandia National Laboratories is indicative of the current interest in this topic. However, as yet, uncertainty analysis and sensitivity analysis in the context of PRA is a relatively immature field. In this paper, available methods for uncertainty analysis and sensitivity analysis in a PRA are reviewed. This review first treats methods for use with individual components of a PRA and then considers how these methods could be combined in the performance of a complete PRA. In the context of this paper, the goal of uncertainty analysis is to measure the imprecision in PRA outcomes of interest, and the goal of sensitivity analysis is to identify the major contributors to this imprecision. There are a number of areas that must be considered in uncertainty analysis and sensitivity analysis for a PRA: (1) information, (2) systems analysis, (3) thermal-hydraulic phenomena/fission product behavior, (4) health and economic consequences, and (5) display of results. Each of these areas and the synthesis of them into a complete PRA are discussed
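    The two goals named here, measuring the imprecision of an outcome (uncertainty analysis) and identifying the inputs driving it (sensitivity analysis), can be sketched with a Monte Carlo sample; the three-factor risk model and its distributions below are illustrative, not from any PRA.

```python
import numpy as np

rng = np.random.default_rng(42)
N = 10_000

# Illustrative three-factor risk model (not from any actual PRA):
failure_rate = rng.lognormal(mean=-6.0, sigma=0.5, size=N)
demand_prob  = rng.uniform(0.01, 0.05, size=N)
exposure     = rng.normal(1.0, 0.1, size=N)
risk = failure_rate * demand_prob * exposure    # outcome of interest

# Uncertainty analysis: imprecision of the outcome.
lo, hi = np.percentile(risk, [5, 95])

# Sensitivity analysis: rank correlation of each input with the outcome.
def rank_corr(a, b):
    ra, rb = a.argsort().argsort(), b.argsort().argsort()
    return np.corrcoef(ra, rb)[0, 1]

for name, x in [("failure_rate", failure_rate),
                ("demand_prob", demand_prob),
                ("exposure", exposure)]:
    print(f"{name:13s} rank correlation with risk: {rank_corr(x, risk):+.2f}")
print(f"90% uncertainty interval: [{lo:.2e}, {hi:.2e}]")
```

    The percentile interval quantifies the imprecision in the outcome; the rank correlations identify which input contributes most of it.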

  1. Meta-Analysis for Primary and Secondary Data Analysis: The Super-Experiment Metaphor.

    Science.gov (United States)

    Jackson, Sally

    1991-01-01

    Considers the relation between meta-analysis statistics and analysis of variance statistics. Discusses advantages and disadvantages as a primary data analysis tool. Argues that the two approaches are partial paraphrases of one another. Advocates an integrative approach that introduces the best of meta-analytic thinking into primary analysis…

  2. Canonical Information Analysis

    DEFF Research Database (Denmark)

    Vestergaard, Jacob Schack; Nielsen, Allan Aasbjerg

    2015-01-01

    Canonical correlation analysis is an established multivariate statistical method in which correlation between linear combinations of multivariate sets of variables is maximized. In canonical information analysis, introduced here, linear correlation as a measure of association between variables is replaced by the information theoretical, entropy based measure mutual information, which is a much more general measure of association. We make canonical information analysis feasible for large sample problems, including for example multispectral images, due to the use of a fast kernel density estimator for entropy estimation. Canonical information analysis is applied successfully to (1) simple simulated data to illustrate the basic idea and evaluate performance, (2) fusion of weather radar and optical geostationary satellite data in a situation with heavy precipitation, and (3) change detection in optical...
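    The substitution the record describes, mutual information in place of linear correlation, can be demonstrated with a simple histogram (plug-in) estimator; the paper itself uses a fast kernel density estimator, and the data here are simulated.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 50_000
x = rng.normal(size=n)
y_linear = 2 * x + rng.normal(scale=0.5, size=n)   # linear association
y_quad = x ** 2 + rng.normal(scale=0.1, size=n)    # nonlinear association

def mutual_information(a, b, bins=64):
    """Histogram (plug-in) estimate of I(a; b) in nats."""
    pxy, _, _ = np.histogram2d(a, b, bins=bins)
    pxy = pxy / pxy.sum()
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)
    nz = pxy > 0
    return float((pxy[nz] * np.log(pxy[nz] / np.outer(px, py)[nz])).sum())

# Linear correlation misses the quadratic dependence; mutual information does not.
print("corr(x, y_quad) =", round(float(np.corrcoef(x, y_quad)[0, 1]), 3))
print("MI(x, y_quad)   =", round(mutual_information(x, y_quad), 3))
print("MI(x, y_linear) =", round(mutual_information(x, y_linear), 3))
```

    The quadratic pair has near-zero correlation but clearly positive mutual information, which is why the entropy-based measure is "much more general" as a criterion for the canonical projections.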

  3. Biorefinery Sustainability Analysis

    DEFF Research Database (Denmark)

    J. S. M. Silva, Carla; Prunescu, Remus Mihail; Gernaey, Krist

    2017-01-01

    This chapter deals with sustainability analysis of biorefinery systems in terms of environmental and socio-economic indicators. Life cycle analysis has methodological issues related to the functional unit (FU), allocation, land use and biogenic carbon neutrality of the reference system and of the biorefinery-based system. Socio-economic criteria and indicators used in sustainability frameworks assessment are presented and discussed. There is not one single methodology that can aptly cover the synergies of environmental, economic, social and governance issues required to assess the sustainable...

  4. Application of optical deformation analysis system on wedge splitting test and its inverse analysis

    DEFF Research Database (Denmark)

    Skocek, Jan; Stang, Henrik

    2010-01-01

    ...Results of the inverse analysis are compared with traditional inverse analysis based on clip gauge data. Then the optically measured crack profile and crack tip position are compared with predictions done by the non-linear hinge model and a finite element analysis. It is shown that the inverse analysis based on the optically measured data can provide material parameters of the fictitious crack model matching favorably those obtained by classical inverse analysis based on the clip gauge data. Further advantages of using the optical deformation analysis lie in identification of such effects...

  5. Handbook of Applied Analysis

    CERN Document Server

    Papageorgiou, Nikolaos S

    2009-01-01

    Offers an examination of important theoretical methods and procedures in applied analysis. This book details the important theoretical trends in nonlinear analysis and applications to different fields. It is suitable for those working on nonlinear analysis.

  6. How Content Analysis may Complement and Extend the Insights of Discourse Analysis

    Directory of Open Access Journals (Sweden)

    Tracey Feltham-King

    2016-02-01

    Although discourse analysis is a well-established qualitative research methodology, little attention has been paid to how discourse analysis may be enhanced through careful supplementation with the quantification allowed in content analysis. In this article, we report on a research study that involved the use of both Foucauldian discourse analysis (FDA) and directed content analysis based on social constructionist theory and our qualitative research findings. The research focused on the discourses deployed, and the ways in which women were discursively positioned, in relation to abortion in 300 newspaper articles, published in 25 national and regional South African newspapers over 28 years, from 1978 to 2005. While the FDA was able to illuminate the constitutive network of power relations constructing women as subjects of a particular kind, questions emerged that were beyond the scope of the FDA. These questions concerned understanding the relative weightings of various discourses and tracing historical changes in the deployment of these discourses. In this article, we show how the decision to combine FDA and content analysis affected our sampling methodology. Using specific examples, we illustrate the contribution of the FDA to the study. Then, we indicate how subject positioning formed the link between the FDA and the content analysis. Drawing on the same examples, we demonstrate how the content analysis supplemented the FDA through tracking changes over time and providing empirical evidence of the extent to which subject positionings were deployed.
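    The supplementation the authors describe, counting how often each discourse appears and how the counts shift over time, amounts to simple frequency tabulation; the coded (year, discourse) pairs below are invented for illustration.

```python
from collections import Counter

# Hypothetical coded corpus: (year, discourse) pairs produced by a
# directed content analysis of newspaper articles.
coded = [
    (1980, "medical"), (1980, "moral"), (1985, "moral"),
    (1995, "rights"), (1995, "medical"), (2000, "rights"),
    (2005, "rights"),
]

# Relative weighting of each discourse across the whole corpus ...
overall = Counter(d for _, d in coded)

# ... and its change over time, by decade.
by_decade = Counter((y // 10 * 10, d) for y, d in coded)

print(overall.most_common())
print(sorted(by_decade.items()))
```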

  7. Abstract interfaces for data analysis - component architecture for data analysis tools

    International Nuclear Information System (INIS)

    Barrand, G.; Binko, P.; Doenszelmann, M.; Pfeiffer, A.; Johnson, A.

    2001-01-01

    The fast turnover of software technologies, in particular in the domain of interactivity (covering user interface and visualisation), makes it difficult for a small group of people to produce complete and polished software-tools before the underlying technologies make them obsolete. At the HepVis'99 workshop, a working group has been formed to improve the production of software tools for data analysis in HENP. Beside promoting a distributed development organisation, one goal of the group is to systematically design a set of abstract interfaces based on using modern OO analysis and OO design techniques. An initial domain analysis has come up with several categories (components) found in typical data analysis tools: Histograms, Ntuples, Functions, Vectors, Fitter, Plotter, analyzer and Controller. Special emphasis was put on reducing the couplings between the categories to a minimum, thus optimising re-use and maintainability of any component individually. The interfaces have been defined in Java and C++ and implementations exist in the form of libraries and tools using C++ (Anaphe/Lizard, OpenScientist) and Java (Java Analysis Studio). A special implementation aims at accessing the Java libraries (through their Abstract Interfaces) from C++. The authors give an overview of the architecture and design of the various components for data analysis as discussed in AIDA
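    The component idea, client code written against abstract interfaces so implementations can be swapped, can be sketched in Python; the interface and class names below are illustrative, not the actual AIDA definitions.

```python
from abc import ABC, abstractmethod

class IHistogram1D(ABC):
    """Illustrative AIDA-style abstract interface (not the real one)."""
    @abstractmethod
    def fill(self, x: float, weight: float = 1.0) -> None: ...
    @abstractmethod
    def entries(self) -> int: ...

class ListHistogram1D(IHistogram1D):
    """One concrete implementation; a differently backed one could
    implement the same interface without touching client code."""
    def __init__(self, edges):
        self.edges = list(edges)
        self.counts = [0.0] * (len(self.edges) - 1)
        self._n = 0
    def fill(self, x, weight=1.0):
        for i in range(len(self.edges) - 1):
            if self.edges[i] <= x < self.edges[i + 1]:
                self.counts[i] += weight
                break
        self._n += 1
    def entries(self):
        return self._n

def analyze(hist: IHistogram1D, data):   # client sees only the interface
    for x in data:
        hist.fill(x)
    return hist.entries()

h = ListHistogram1D([0, 1, 2, 3])
print(analyze(h, [0.5, 1.5, 1.7, 2.2]), h.counts)
```

    Keeping `analyze` coupled only to `IHistogram1D` is exactly the low-coupling property the working group aimed for: each component can be re-implemented or reused individually.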

  8. LULU analysis program

    International Nuclear Information System (INIS)

    Crawford, H.J.; Lindstrom, P.J.

    1983-06-01

    Our analysis program LULU has proven very useful in all stages of experiment analysis, from prerun detector debugging through final data reduction. It has solved our problem of having arbitrary word length events and is easy enough to use that many separate experimenters are now analyzing with LULU. The ability to use the same software for all stages of experiment analysis greatly eases the programming burden. We may even get around to making the graphics elegant someday

  9. Stochastic Analysis 2010

    CERN Document Server

    Crisan, Dan

    2011-01-01

    "Stochastic Analysis" aims to provide mathematical tools to describe and model high dimensional random systems. Such tools arise in the study of Stochastic Differential Equations and Stochastic Partial Differential Equations, Infinite Dimensional Stochastic Geometry, Random Media and Interacting Particle Systems, Super-processes, Stochastic Filtering, Mathematical Finance, etc. Stochastic Analysis has emerged as a core area of late 20th century Mathematics and is currently undergoing a rapid scientific development. The special volume "Stochastic Analysis 2010" provides a sa

  10. Ecosystem Analysis Program

    International Nuclear Information System (INIS)

    Burgess, R.L.

    1978-01-01

    Progress is reported on the following research programs: analysis and modeling of ecosystems; EDFB/IBP data center; biome analysis studies; land/water interaction studies; and computer programs for development of models

  11. Trend analysis

    International Nuclear Information System (INIS)

    Smith, M.; Jones, D.R.

    1991-01-01

    The goal of exploration is to find reserves that will earn an adequate rate of return on the capital invested. Neither exploration nor economics is an exact science. The authors must therefore explore in those trends (plays) that have the highest probability of achieving this goal. Trend analysis is a technique for organizing the available data to make these strategic exploration decisions objectively and is in conformance with their goals and risk attitudes. Trend analysis differs from resource estimation in its purpose. It seeks to determine the probability of economic success for an exploration program, not the ultimate results of the total industry effort. Thus the recent past is assumed to be the best estimate of the exploration probabilities for the near future. This information is combined with economic forecasts. The computer software tools necessary for trend analysis are (1) Information data base - requirements and sources. (2) Data conditioning program - assignment to trends, correction of errors, and conversion into usable form. (3) Statistical processing program - calculation of probability of success and discovery size probability distribution. (4) Analytical processing - Monte Carlo simulation to develop the probability distribution of the economic return/investment ratio for a trend. Limited capital (short-run) effects are analyzed using the Gambler's Ruin concept in the Monte Carlo simulation and by a short-cut method. Multiple trend analysis is concerned with comparing and ranking trends, allocating funds among acceptable trends, and characterizing program risk by using risk profiles. In summary, trend analysis is a reality check for long-range exploration planning
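    The Gambler's Ruin effect mentioned for limited-capital (short-run) analysis can be sketched with a small Monte Carlo simulation; the success probability, well cost and payoff below are invented for illustration.

```python
import random

random.seed(3)

def ruin_probability(capital, well_cost, p_success, payoff, n_wells, trials=20_000):
    """Fraction of simulated exploration programs that run out of working
    capital during a program of n_wells wells."""
    ruined = 0
    for _ in range(trials):
        cash = capital
        for _ in range(n_wells):
            cash -= well_cost
            if random.random() < p_success:
                cash += payoff
            if cash < well_cost:            # cannot fund the next well
                ruined += 1
                break
    return ruined / trials

# A 15% chance per well of a 10x payoff has positive expected value,
# yet a thinly capitalized program is usually ruined first.
p_ruin = ruin_probability(capital=30, well_cost=10, p_success=0.15,
                          payoff=100, n_wells=20)
print(f"probability of ruin: {p_ruin:.2f}")
```

    Despite a +5 expected value per well, a program that can fund only three dry holes is ruined most of the time, which is the short-run capital effect the record refers to.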

  12. Radiation analysis devices, radiation analysis methods, and articles of manufacture

    Science.gov (United States)

    Roybal, Lyle Gene

    2010-06-08

    Radiation analysis devices include circuitry configured to determine respective radiation count data for a plurality of sections of an area of interest and combine the radiation count data of individual of sections to determine whether a selected radioactive material is present in the area of interest. An amount of the radiation count data for an individual section is insufficient to determine whether the selected radioactive material is present in the individual section. An article of manufacture includes media comprising programming configured to cause processing circuitry to perform processing comprising determining one or more correction factors based on a calibration of a radiation analysis device, measuring radiation received by the radiation analysis device using the one or more correction factors, and presenting information relating to an amount of radiation measured by the radiation analysis device having one of a plurality of specified radiation energy levels of a range of interest.
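    The claim in this record, that counts from a single section are insufficient but the combined sections support a decision, is the usual counting-statistics situation; the sketch below uses a simple Poisson excess-over-background test with made-up numbers.

```python
# Illustrative numbers only: background of ~5 counts per section, and a
# weak source spread across eight sections of the area of interest.
background_rate = 5.0        # expected background counts per section
threshold_sigma = 3.0        # detection criterion

sections = [7, 8, 6, 9, 7, 8, 7, 10]   # measured counts per section

def significant(counts, expected_per_section):
    """Poisson counting statistics: is the excess over background more
    than threshold_sigma standard deviations?"""
    expected = expected_per_section * len(counts)
    excess = sum(counts) - expected
    return excess / expected ** 0.5 > threshold_sigma

per_section = [significant([c], background_rate) for c in sections]
print("any single section significant:", any(per_section))
print("combined area significant:     ", significant(sections, background_rate))
```

    No individual section crosses the threshold, but the summed counts do, which is the behavior the claims describe.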

  13. Incorporation of advanced accident analysis methodology into safety analysis reports

    International Nuclear Information System (INIS)

    2003-05-01

    The IAEA Safety Guide on Safety Assessment and Verification defines that the aim of the safety analysis should be by means of appropriate analytical tools to establish and confirm the design basis for the items important to safety, and to ensure that the overall plant design is capable of meeting the prescribed and acceptable limits for radiation doses and releases for each plant condition category. Practical guidance on how to perform accident analyses of nuclear power plants (NPPs) is provided by the IAEA Safety Report on Accident Analysis for Nuclear Power Plants. The safety analyses are performed both in the form of deterministic and probabilistic analyses for NPPs. It is customary to refer to deterministic safety analyses as accident analyses. This report discusses the aspects of using the advanced accident analysis methods to carry out accident analyses in order to introduce them into the Safety Analysis Reports (SARs). In relation to the SAR, purposes of deterministic safety analysis can be further specified as (1) to demonstrate compliance with specific regulatory acceptance criteria; (2) to complement other analyses and evaluations in defining a complete set of design and operating requirements; (3) to identify and quantify limiting safety system set points and limiting conditions for operation to be used in the NPP limits and conditions; (4) to justify appropriateness of the technical solutions employed in the fulfillment of predetermined safety requirements. The essential parts of accident analyses are performed by applying sophisticated computer code packages, which have been specifically developed for this purpose. These code packages include mainly thermal-hydraulic system codes and reactor dynamics codes meant for the transient and accident analyses. There are also specific codes such as those for the containment thermal-hydraulics, for the radiological consequences and for severe accident analyses. 
In some cases, codes of a more general nature such

  14. Professionalizing Intelligence Analysis

    Directory of Open Access Journals (Sweden)

    James B. Bruce

    2015-09-01

    This article examines the current state of professionalism in national security intelligence analysis in the U.S. Government. Since the introduction of major intelligence reforms directed by the Intelligence Reform and Terrorism Prevention Act (IRTPA) in December 2004, we have seen notable strides in many aspects of intelligence professionalization, including in analysis. But progress is halting, uneven, and by no means permanent. To consolidate its gains, and if it is to continue improving, the U.S. intelligence community (IC) should commit itself to accomplishing a new program of further professionalization of analysis to ensure that it will develop an analytic cadre that is fully prepared to deal with the complexities of an emerging multipolar and highly dynamic world that the IC itself is forecasting. Some recent reforms in intelligence analysis can be assessed against established standards of more fully developed professions; these may well fall short of moving the IC closer to the more fully professionalized analytical capability required for producing the kind of analysis needed now by the United States.

  15. Dynamic Chest Image Analysis: Model-Based Perfusion Analysis in Dynamic Pulmonary Imaging

    Directory of Open Access Journals (Sweden)

    Kiuru Aaro

    2003-01-01

    The "Dynamic Chest Image Analysis" project aims to develop model-based computer analysis and visualization methods for showing focal and general abnormalities of lung ventilation and perfusion based on a sequence of digital chest fluoroscopy frames collected with the dynamic pulmonary imaging technique. We have proposed and evaluated a multiresolutional method with an explicit ventilation model for ventilation analysis. This paper presents a new model-based method for pulmonary perfusion analysis. According to perfusion properties, we first devise a novel mathematical function to form a perfusion model. A simple yet accurate approach is further introduced to extract cardiac systolic and diastolic phases from the heart, so that this cardiac information may be utilized to accelerate the perfusion analysis and improve its sensitivity in detecting pulmonary perfusion abnormalities. This makes perfusion analysis not only fast but also robust in computation; consequently, perfusion analysis becomes computationally feasible without using contrast media. Our clinical case studies with 52 patients show that this technique is effective for detecting pulmonary embolism even without using contrast media, demonstrating consistent correlations with computed tomography (CT) and nuclear medicine (NM) studies. This fluoroscopic examination takes only about 2 seconds for a perfusion study, delivers a low radiation dose to the patient, and involves no preparation, no radioactive isotopes, and no contrast media.

  16. Electronic Circuit Analysis Language (ECAL)

    Science.gov (United States)

    Chenghang, C.

    1983-03-01

    Computer-aided design is an important development in computer applications and an important component of computer science. A special language for electronic circuit analysis is the foundation of computer-aided design or computer-aided circuit analysis (abbreviated as CACD and CACA) of simulated circuits. The Electronic Circuit Analysis Language (ECAL) is a comparatively simple and easy-to-use special-purpose circuit analysis language that uses FORTRAN to carry out interpretive execution. It is capable of conducting dc analysis, ac analysis, and transient analysis of a circuit. Furthermore, the results of the dc analysis can be used directly as the initial conditions for the ac and transient analyses.

  17. NeoAnalysis: a Python-based toolbox for quick electrophysiological data processing and analysis.

    Science.gov (United States)

    Zhang, Bo; Dai, Ji; Zhang, Tao

    2017-11-13

    In a typical electrophysiological experiment, especially one that includes studying animal behavior, the data collected normally contain spikes, local field potentials, behavioral responses and other associated data. In order to obtain informative results, the data must be analyzed simultaneously with the experimental settings. However, most open-source toolboxes currently available for data analysis were developed to handle only a portion of the data and did not take into account the sorting of experimental conditions. Additionally, these toolboxes require that the input data be in a specific format, which can be inconvenient to users. Therefore, the development of a highly integrated toolbox that can process multiple types of data regardless of input data format and perform basic analysis for general electrophysiological experiments is incredibly useful. Here, we report the development of a Python based open-source toolbox, referred to as NeoAnalysis, to be used for quick electrophysiological data processing and analysis. The toolbox can import data from different data acquisition systems regardless of their formats and automatically combine different types of data into a single file with a standardized format. In cases where additional spike sorting is needed, NeoAnalysis provides a module to perform efficient offline sorting with a user-friendly interface. Then, NeoAnalysis can perform regular analog signal processing, spike train, and local field potentials analysis, behavioral response (e.g. saccade) detection and extraction, with several options available for data plotting and statistics. Particularly, it can automatically generate sorted results without requiring users to manually sort data beforehand. In addition, NeoAnalysis can organize all of the relevant data into an informative table on a trial-by-trial basis for data visualization. Finally, NeoAnalysis supports analysis at the population level. 
With the multitude of general-purpose functions provided
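    One representative step such a toolbox automates, building a peri-stimulus time histogram (PSTH) from trial-sorted spike times, can be sketched as follows; the spike data are simulated, and NeoAnalysis's own API is not assumed.

```python
import numpy as np

rng = np.random.default_rng(7)
n_trials = 20
t0, t1 = -0.5, 1.0            # analysis window (s) around stimulus onset

def simulate_trial():
    background = rng.uniform(t0, t1, size=rng.poisson(8))
    evoked = rng.uniform(0.0, 0.2, size=rng.poisson(6))   # stimulus response
    return np.sort(np.concatenate([background, evoked]))

trials = [simulate_trial() for _ in range(n_trials)]

bin_width = 0.05
edges = np.linspace(t0, t1, int(round((t1 - t0) / bin_width)) + 1)
counts = sum(np.histogram(spikes, bins=edges)[0] for spikes in trials)
rate = counts / (n_trials * bin_width)          # firing rate in spikes/s

peak_bin = edges[np.argmax(rate)]
print(f"peak rate {rate.max():.1f} spikes/s in bin starting at {peak_bin:.2f} s")
```

    A toolbox like the one described would add the parts this sketch omits: format-independent import, spike sorting, condition sorting, and trial-by-trial tables.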

  18. Bearing defect signature analysis using advanced nonlinear signal analysis in a controlled environment

    Science.gov (United States)

    Zoladz, T.; Earhart, E.; Fiorucci, T.

    1995-01-01

    Utilizing high-frequency data from a highly instrumented rotor assembly, seeded bearing defect signatures are characterized using both conventional linear approaches, such as power spectral density analysis, and recently developed nonlinear techniques such as bicoherence analysis. Traditional low-frequency (less than 20 kHz) analysis and high-frequency envelope analysis of both accelerometer and acoustic emission data are used to recover characteristic bearing distress information buried deeply in acquired data. The successful coupling of newly developed nonlinear signal analysis with recovered wideband envelope data from accelerometers and acoustic emission sensors is the innovative focus of this research.
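    High-frequency envelope analysis, one of the techniques named here, can be sketched with an FFT-based Hilbert transform: a low defect frequency amplitude-modulates a high-frequency carrier, and the envelope spectrum recovers it. The frequencies below are illustrative, not actual bearing fault frequencies.

```python
import numpy as np

fs, T = 20_000, 2.0
t = np.arange(0, T, 1 / fs)
f_defect, f_carrier = 97.0, 4_000.0     # illustrative frequencies
x = (1 + 0.8 * np.cos(2 * np.pi * f_defect * t)) * np.sin(2 * np.pi * f_carrier * t)

def envelope(sig):
    """Analytic-signal envelope via an FFT-based Hilbert transform."""
    n = len(sig)
    X = np.fft.fft(sig)
    h = np.zeros(n)
    h[0] = 1.0
    h[1:(n + 1) // 2] = 2.0
    if n % 2 == 0:
        h[n // 2] = 1.0
    return np.abs(np.fft.ifft(X * h))

env = envelope(x)
env = env - env.mean()                  # drop the DC component
spec = np.abs(np.fft.rfft(env))
freqs = np.fft.rfftfreq(len(env), 1 / fs)
print(f"dominant envelope frequency: {freqs[np.argmax(spec)]:.1f} Hz")
```

    The raw spectrum of `x` peaks near the 4 kHz carrier; the envelope spectrum instead peaks at the modulation (defect) frequency, which is what makes the technique useful for distress information buried in wideband data.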

  19. New trends in applied harmonic analysis sparse representations, compressed sensing, and multifractal analysis

    CERN Document Server

    Cabrelli, Carlos; Jaffard, Stephane; Molter, Ursula

    2016-01-01

    This volume is a selection of written notes corresponding to courses taught at the CIMPA School: "New Trends in Applied Harmonic Analysis: Sparse Representations, Compressed Sensing and Multifractal Analysis". New interactions between harmonic analysis and signal and image processing have seen striking development in the last 10 years, and several technological deadlocks have been solved through the resolution of deep theoretical problems in harmonic analysis. New Trends in Applied Harmonic Analysis focuses on two particularly active areas that are representative of such advances: multifractal analysis, and sparse representation and compressed sensing. The contributions are written by leaders in these areas, and covers both theoretical aspects and applications. This work should prove useful not only to PhD students and postdocs in mathematics and signal and image processing, but also to researchers working in related topics.

  20. Introductory numerical analysis

    CERN Document Server

    Pettofrezzo, Anthony J

    2006-01-01

    Written for undergraduates who require a familiarity with the principles behind numerical analysis, this classical treatment encompasses finite differences, least squares theory, and harmonic analysis. Over 70 examples and 280 exercises. 1967 edition.

  1. Regression analysis by example

    CERN Document Server

    Chatterjee, Samprit

    2012-01-01

    Praise for the Fourth Edition: ""This book is . . . an excellent source of examples for regression analysis. It has been and still is readily readable and understandable."" -Journal of the American Statistical Association Regression analysis is a conceptually simple method for investigating relationships among variables. Carrying out a successful application of regression analysis, however, requires a balance of theoretical results, empirical rules, and subjective judgment. Regression Analysis by Example, Fifth Edition has been expanded
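    Regression by example, in miniature: an ordinary least-squares fit of a straight line with numpy, on made-up data.

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])

X = np.column_stack([np.ones_like(x), x])      # design matrix with intercept
beta, *_ = np.linalg.lstsq(X, y, rcond=None)   # [intercept, slope]

resid = y - X @ beta
r2 = 1 - (resid ** 2).sum() / ((y - y.mean()) ** 2).sum()
print(f"intercept={beta[0]:.3f} slope={beta[1]:.3f} R^2={r2:.4f}")
```

    The empirical rules and judgment the book stresses enter after this step: examining residuals, checking influence, and deciding whether the linear form is adequate.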

  2. A novel bi-level meta-analysis approach: applied to biological pathway analysis.

    Science.gov (United States)

    Nguyen, Tin; Tagett, Rebecca; Donato, Michele; Mitrea, Cristina; Draghici, Sorin

    2016-02-01

    The accumulation of high-throughput data in public repositories creates a pressing need for integrative analysis of multiple datasets from independent experiments. However, study heterogeneity, study bias, outliers and the lack of power of available methods present a real challenge in integrating genomic data. One practical drawback of many P-value-based meta-analysis methods, including Fisher's, Stouffer's, minP and maxP, is that they are sensitive to outliers. Another drawback is that, because they perform just one statistical test for each individual experiment, they may not fully exploit the potentially large number of samples within each study. We propose a novel bi-level meta-analysis approach that employs the additive method and the Central Limit Theorem within each individual experiment and also across multiple experiments. We prove that the bi-level framework is robust against bias, less sensitive to outliers than other methods, and more sensitive to small changes in signal. For comparative analysis, we demonstrate that the intra-experiment analysis has more power than the equivalent statistical test performed on a single large experiment. For pathway analysis, we compare the proposed framework versus classical meta-analysis approaches (Fisher's, Stouffer's and the additive method) as well as against a dedicated pathway meta-analysis package (MetaPath), using 1252 samples from 21 datasets related to three human diseases: acute myeloid leukemia (9 datasets), type II diabetes (5 datasets) and Alzheimer's disease (7 datasets). Our framework outperforms its competitors in correctly identifying pathways relevant to the phenotypes, and is sufficiently general to be applied to any type of statistical meta-analysis. The R scripts are available on demand from the authors (sorin@wayne.edu). Supplementary data are available at Bioinformatics online.
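    The additive combination step described above, in which p-values are summed and normalized via the Central Limit Theorem, can be sketched in a few lines. This is a minimal illustration under the standard uniform-null assumption, not the authors' R implementation; the function names are hypothetical.

```python
from math import erf, sqrt

def norm_cdf(x):
    # Standard normal CDF via the error function.
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def additive_combine(pvals):
    # Additive p-value combination: under the null each p ~ Uniform(0, 1)
    # with mean 1/2 and variance 1/12, so by the Central Limit Theorem the
    # sum of n p-values is approximately Normal(n/2, n/12). A consistently
    # small set of p-values yields a small combined p-value, and a single
    # outlier shifts the sum far less than it shifts Fisher's product.
    n = len(pvals)
    z = (sum(pvals) - n / 2.0) / sqrt(n / 12.0)
    return norm_cdf(z)

print(additive_combine([0.01, 0.03, 0.2, 0.4]))   # small combined p
print(additive_combine([0.01, 0.03, 0.2, 0.99]))  # one outlier: p grows, but moderately
```

    The outlier robustness claimed in the abstract shows up here directly: replacing one small p-value by 0.99 changes the sum by less than one unit, whereas it multiplies Fisher's product statistic by orders of magnitude.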

  3. Synovial fluid analysis

    Science.gov (United States)

    Joint fluid analysis; Joint fluid aspiration ... El-Gabalawy HS. Synovial fluid analysis, synovial biopsy, and synovial pathology. In: Firestein GS, Budd RC, Gabriel SE, McInnes IB, O'Dell JR, eds. Kelly's Textbook of ...

  4. Study on mixed analysis method for fatigue analysis of oblique safety injection nozzle on main piping

    International Nuclear Information System (INIS)

    Lu Xifeng; Zhang Yixiong; Ai Honglei; Wang Xinjun; He Feng

    2014-01-01

    The simplified analysis method and the detailed analysis method are both used for the fatigue analysis of nozzles on the main piping. Because the structure of the oblique safety injection nozzle is complex and it is subjected to some severe transients, the results obtained with the simplified analysis method are overly conservative and cannot be validated. The detailed analysis method is less conservative, but it is more complex and time-consuming. To reduce the conservatism and save time, a mixed analysis method combining the simplified analysis method with the detailed analysis method is used for the fatigue analysis. The heat transfer parameters between the fluid and the structure used in the analysis were obtained from a heat transfer property experiment. The results show that the mixed analysis, in which the heat transfer properties are considered, reduces the conservatism effectively, and that the mixed analysis method is an effective and practical method for the fatigue analysis of the oblique safety injection nozzle. (authors)

  5. Factor analysis

    CERN Document Server

    Gorsuch, Richard L

    2013-01-01

    Comprehensive and comprehensible, this classic covers the basic and advanced topics essential for using factor analysis as a scientific tool in psychology, education, sociology, and related areas. Emphasizing the usefulness of the techniques, it presents sufficient mathematical background for understanding and sufficient discussion of applications for effective use. This includes not only theory but also the empirical evaluations of the importance of mathematical distinctions for applied scientific analysis.

  6. Cask crush pad analysis using detailed and simplified analysis methods

    International Nuclear Information System (INIS)

    Uldrich, E.D.; Hawkes, B.D.

    1997-01-01

    A crush pad has been designed and analyzed to absorb the kinetic energy of a hypothetically dropped spent nuclear fuel shipping cask into a 44-ft. deep cask unloading pool at the Fluorinel and Storage Facility (FAST). This facility, located at the Idaho Chemical Processing Plant (ICPP) at the Idaho National Engineering and Environmental Laboratory (INEEL), is a US Department of Energy site. The basis for this study is an analysis by Uldrich and Hawkes. The purpose of this analysis was to evaluate various hypothetical cask drop orientations to ensure that the crush pad design was adequate and the cask deceleration at impact was less than 100 g. It is demonstrated herein that a large spent fuel shipping cask, when dropped onto a foam crush pad, can be analyzed by either hand methods or by sophisticated dynamic finite element analysis using computer codes such as ABAQUS. Results from the two methods are compared to evaluate the accuracy of the simplified hand analysis approach

  7. Plug-in Based Analysis Framework for LHC Post-Mortem Analysis

    CERN Document Server

    Gorbonosov, R; Zerlauth, M; Baggiolini, V

    2014-01-01

    Plug-in based software architectures [1] are extensible, enforce modularity and allow several teams to work in parallel. But they have certain technical and organizational challenges, which we discuss in this paper. We gained our experience when developing the Post-Mortem Analysis (PMA) system, which is a mission-critical system for the Large Hadron Collider (LHC). We used a plug-in based architecture with a general-purpose analysis engine, for which physicists and equipment experts code plugins containing the analysis algorithms. We have over 45 analysis plugins developed by a dozen domain experts. This paper focuses on the design challenges we faced in order to mitigate the risks of executing third-party code: assurance that even a badly written plugin doesn't perturb the work of the overall application; plugin execution control which allows plugin misbehaviour to be detected and reacted to; a robust communication mechanism between plugins; diagnostics facilitation in case of plugin failure; testing of the plugins be...

  8. Functional analysis and applications

    CERN Document Server

    Siddiqi, Abul Hasan

    2018-01-01

    This self-contained textbook discusses all major topics in functional analysis. Combining classical materials with new methods, it supplies numerous relevant solved examples and problems and discusses the applications of functional analysis in diverse fields. The book is unique in its scope, and a variety of applications of functional analysis and operator-theoretic methods are devoted to each area of application. Each chapter includes a set of problems, some of which are routine and elementary, and some of which are more advanced. The book is primarily intended as a textbook for graduate and advanced undergraduate students in applied mathematics and engineering. It offers several attractive features making it ideally suited for courses on functional analysis intended to provide a basic introduction to the subject and the impact of functional analysis on applied and computational mathematics, nonlinear functional analysis and optimization. It introduces emerging topics like wavelets, Gabor system, inverse pro...

  9. Easy instrumental analysis

    International Nuclear Information System (INIS)

    Ko, Myeong Su; Kim, Tae Hwa; Park, Gyu Hyeon; Yang, Jong Beom; Oh, Chang Hwan; Lee, Kyoung Hye

    2010-04-01

    This textbook describes instrumental analysis in an accessible way across twelve chapters. The contents of the book are: pH measurement (principle, pH meter, measurement procedure and example experiments), centrifugation, absorptiometry, the fluorescence method, atomic absorption analysis, gas chromatography, gas chromatography-mass spectrometry, high performance liquid chromatography and liquid chromatograph-mass spectrometry, electrophoresis (practical cases, analysis of results and examples), PCR (principle, device, applications and examples) and enzyme-linked immunosorbent assay (indirect ELISA, sandwich ELISA and the ELISA reader).

  10. Easy instrumental analysis

    Energy Technology Data Exchange (ETDEWEB)

    Ko, Myeong Su; Kim, Tae Hwa; Park, Gyu Hyeon; Yang, Jong Beom; Oh, Chang Hwan; Lee, Kyoung Hye

    2010-04-15

    This textbook describes instrumental analysis in an accessible way across twelve chapters. The contents of the book are: pH measurement (principle, pH meter, measurement procedure and example experiments), centrifugation, absorptiometry, the fluorescence method, atomic absorption analysis, gas chromatography, gas chromatography-mass spectrometry, high performance liquid chromatography and liquid chromatograph-mass spectrometry, electrophoresis (practical cases, analysis of results and examples), PCR (principle, device, applications and examples) and enzyme-linked immunosorbent assay (indirect ELISA, sandwich ELISA and the ELISA reader).

  11. Fundamentals of PIXE analysis

    International Nuclear Information System (INIS)

    Ishii, Keizo

    1997-01-01

    Elemental analysis based on particle induced x-ray emission (PIXE) is a novel technique for analyzing trace elements. It is a very simple method, its sensitivity is very high, multiple elements in a sample can be analyzed simultaneously, and a few tens of μg of a sample are enough for analysis. Owing to these characteristics, PIXE analysis is now used in many fields (e.g. biology, medicine, dentistry, environmental pollution, archaeology, cultural assets etc.). Fundamentals of the PIXE analysis are described here: the production of characteristic x-rays and inner-shell ionization by heavy charged particles, the continuous background in the PIXE spectrum, quantitative formulae of the PIXE analysis, the detection limit of PIXE analysis, etc. (author)

  12. msBiodat analysis tool, big data analysis for high-throughput experiments.

    Science.gov (United States)

    Muñoz-Torres, Pau M; Rokć, Filip; Belužic, Robert; Grbeša, Ivana; Vugrek, Oliver

    2016-01-01

    Mass spectrometry (MS) encompasses a group of high-throughput techniques used to increase knowledge about biomolecules. These techniques produce a large amount of data, typically presented as a list of hundreds or thousands of proteins. Filtering those data efficiently is the first step in extracting biologically relevant information. The filtering can be improved by merging experimental data with data obtained from public databases, resulting in an accurate list of proteins that meet the predetermined conditions. In this article we present msBiodat Analysis Tool, a web-based application designed to bring big data analysis to proteomics. With this tool, researchers can easily select the most relevant information from their MS experiments using an easy-to-use web interface. An interesting feature of msBiodat Analysis Tool is the possibility of selecting proteins by their annotation in Gene Ontology using Gene Id, ensembl or UniProt codes. The msBiodat Analysis Tool allows researchers of any programming skill level to take advantage of efficient database querying. Its versatility and user-friendly interface make it easy to perform fast and accurate data screening using complex queries. Once the analysis is finished, the result is delivered by e-mail. msBiodat Analysis Tool is freely available at http://msbiodata.irb.hr.

  13. Instrumental neutron activation analysis as a routine method for rock analysis

    International Nuclear Information System (INIS)

    Rosenberg, R.J.

    1977-06-01

    Instrumental neutron activation methods for the analysis of geological samples have been developed. Special emphasis has been laid on the improvement of sensitivity and accuracy in order to maximize the quality of the analyses. Furthermore, the procedures have been automated as far as possible in order to minimize the cost of the analysis. A short review of the basic literature is given, followed by a description of the principles of the method. All aspects concerning the sensitivity are discussed thoroughly in view of the analyst's ability to influence them. Experimentally determined detection limits for Na, Al, K, Ca, Sc, Cr, Ti, V, Mn, Fe, Ni, Co, Rb, Zr, Sb, Cs, Ba, La, Ce, Nd, Sm, Eu, Gd, Tb, Dy, Yb, Lu, Hf, Ta, Th and U are given. The errors of the method are discussed, followed by actions taken to avoid them. The most significant error was caused by flux deviation, but this was avoided by building a rotating sample holder for rotating the samples during irradiation. A scheme for the INAA of 32 elements is proposed. The method has been automated as far as possible, and an automatic γ-spectrometer and a computer program for the automatic calculation of the results are described. Furthermore, a completely automated uranium analyzer based on delayed neutron counting is described. The methods are discussed in view of their applicability to rock analysis. It is stated that the sensitivity varies considerably from element to element; instrumental activation analysis is an excellent method for the analysis of some specific elements like the lanthanides, thorium and uranium, but less so for many other elements. The accuracy is good, varying from 2% to 10% for most elements. Instrumental activation analysis is for most elements a rather expensive method, there being, however, a few exceptions. The most important of these is uranium. The analysis of uranium by delayed neutron counting is an inexpensive means for the analysis of large numbers of samples needed for

  14. Numerical analysis

    CERN Document Server

    Rao, G Shanker

    2006-01-01

    About the Book: This book provides an introduction to Numerical Analysis for the students of Mathematics and Engineering. The book is designed in accordance with the common core syllabus of Numerical Analysis of Universities of Andhra Pradesh and also the syllabus prescribed in most of the Indian Universities. Salient features: Approximate and Numerical Solutions of Algebraic and Transcendental Equation Interpolation of Functions Numerical Differentiation and Integration and Numerical Solution of Ordinary Differential Equations The last three chapters deal with Curve Fitting, Eigen Values and Eigen Vectors of a Matrix and Regression Analysis. Each chapter is supplemented with a number of worked-out examples as well as number of problems to be solved by the students. This would help in the better understanding of the subject. Contents: Errors Solution of Algebraic and Transcendental Equations Finite Differences Interpolation with Equal Intervals Interpolation with Unequal Int...

  15. Wind energy analysis system

    OpenAIRE

    2014-01-01

    M.Ing. (Electrical & Electronic Engineering) One of the most important steps to be taken before a site is selected for the extraction of wind energy is the analysis of the energy within the wind at that particular site. No wind energy analysis system exists for the measurement and analysis of wind power. This dissertation documents the design and development of a Wind Energy Analysis System (WEAS). Using a micro-controller based design in conjunction with sensors, WEAS measures, calcu...

  16. Discourse analysis and Foucault's

    Directory of Open Access Journals (Sweden)

    Jansen I.

    2008-01-01

    Full Text Available Discourse analysis is a method which up to now has been less recognized in nursing science, although more recently nursing scientists are discovering it for their purposes. However, several authors have criticized that discourse analysis is often misinterpreted because of a lack of understanding of its theoretical background. In this article, I reconstruct Foucault's writings in his "Archaeology of Knowledge" to provide a theoretical base for future archaeological discourse analysis, which can be categorized as a socio-linguistic discourse analysis.

  17. Slice hyperholomorphic Schur analysis

    CERN Document Server

    Alpay, Daniel; Sabadini, Irene

    2016-01-01

    This book defines and examines the counterpart of Schur functions and Schur analysis in the slice hyperholomorphic setting. It is organized into three parts: the first introduces readers to classical Schur analysis, while the second offers background material on quaternions, slice hyperholomorphic functions, and quaternionic functional analysis. The third part represents the core of the book and explores quaternionic Schur analysis and its various applications. The book includes previously unpublished results and provides the basis for new directions of research.

  18. Sensitivity and uncertainty analysis

    CERN Document Server

    Cacuci, Dan G; Navon, Ionel Michael

    2005-01-01

    As computer-assisted modeling and analysis of physical processes have continued to grow and diversify, sensitivity and uncertainty analyses have become indispensable scientific tools. Sensitivity and Uncertainty Analysis. Volume I: Theory focused on the mathematical underpinnings of two important methods for such analyses: the Adjoint Sensitivity Analysis Procedure and the Global Adjoint Sensitivity Analysis Procedure. This volume concentrates on the practical aspects of performing these analyses for large-scale systems. The applications addressed include two-phase flow problems, a radiative c

  19. WHAT IF (Sensitivity Analysis)

    Directory of Open Access Journals (Sweden)

    Iulian N. BUJOREANU

    2011-01-01

    Full Text Available Sensitivity analysis represents such a well known and deeply analyzed subject that anyone entering the field feels unable to add anything new. Still, there are so many facets to be taken into consideration. The paper introduces the reader to the various ways sensitivity analysis is implemented and the reasons for which it has to be implemented in most analyses in decision-making processes. Risk analysis is of utmost importance in dealing with resource allocation and is presented at the beginning of the paper as the initial reason to implement sensitivity analysis. Different views and approaches are added during the discussion of sensitivity analysis so that the reader develops as thorough an opinion as possible on the use and utility of sensitivity analysis. Finally, a round-up conclusion brings us to the question of whether it is possible to generate the future and analyze it before it unfolds so that, when it happens, it brings less uncertainty.

  20. Fundamentals of functional analysis

    CERN Document Server

    Farenick, Douglas

    2016-01-01

    This book provides a unique path for graduate or advanced undergraduate students to begin studying the rich subject of functional analysis with fewer prerequisites than is normally required. The text begins with a self-contained and highly efficient introduction to topology and measure theory, which focuses on the essential notions required for the study of functional analysis, and which are often buried within full-length overviews of the subjects. This is particularly useful for those in applied mathematics, engineering, or physics who need to have a firm grasp of functional analysis, but not necessarily some of the more abstruse aspects of topology and measure theory normally encountered. The reader is assumed to only have knowledge of basic real analysis, complex analysis, and algebra. The latter part of the text provides an outstanding treatment of Banach space theory and operator theory, covering topics not usually found together in other books on functional analysis. Written in a clear, concise manner,...

  1. From analysis to surface

    DEFF Research Database (Denmark)

    Bemman, Brian; Meredith, David

    In recent years, a significant body of research has focused on developing algorithms for computing analyses of musical works automatically from encodings of these works' surfaces [3,4,7,10,11]. The quality of the output of such analysis algorithms is typically evaluated by comparing it with a “ground truth” analysis of the same music produced by a human expert (see, in particular, [5]). In this paper, we explore the problem of generating an encoding of the musical surface of a work automatically from a systematic encoding of an analysis. The ability to do this depends on one having an effective (i.e., computable), correct and complete description of some aspect of the structure of the music. Generating the surface structure of a piece from an analysis in this manner serves as a proof of the analysis' correctness, effectiveness and completeness. We present a reductive analysis...

  2. Functional Object Analysis

    DEFF Research Database (Denmark)

    Raket, Lars Lau

    We propose a direction in the field of statistics which we will call functional object analysis. This subfield considers the analysis of functional objects defined on continuous domains. In this setting we will focus on model-based statistics, with a particular emphasis on mixed-effect formulations, where the observed functional signal is assumed to consist of both fixed and random functional effects. This thesis takes the initial steps toward the development of likelihood-based methodology for functional objects. We first consider analysis of functional data defined on high...

  3. Basic stress analysis

    CERN Document Server

    Iremonger, M J

    1982-01-01

    BASIC Stress Analysis aims to help students to become proficient at BASIC programming by actually using it in an important engineering subject. It also enables the student to use computing as a means of learning stress analysis because writing a program is analogous to teaching-it is necessary to understand the subject matter. The book begins by introducing the BASIC approach and the concept of stress analysis at first- and second-year undergraduate level. Subsequent chapters contain a summary of relevant theory, worked examples containing computer programs, and a set of problems. Topics c

  4. Biological sequence analysis

    DEFF Research Database (Denmark)

    Durbin, Richard; Eddy, Sean; Krogh, Anders Stærmose

    This book provides an up-to-date and tutorial-level overview of sequence analysis methods, with particular emphasis on probabilistic modelling. Discussed methods include pairwise alignment, hidden Markov models, multiple alignment, profile searches, RNA secondary structure analysis, and phylogene...

  5. Confirmatory Composite Analysis

    NARCIS (Netherlands)

    Schuberth, Florian; Henseler, Jörg; Dijkstra, Theo K.

    2018-01-01

    We introduce confirmatory composite analysis (CCA) as a structural equation modeling technique that aims at testing composite models. CCA entails the same steps as confirmatory factor analysis: model specification, model identification, model estimation, and model testing. Composite models are

  6. Evaluation of pavement life cycle cost analysis: Review and analysis

    Directory of Open Access Journals (Sweden)

    Peyman Babashamsi

    2016-07-01

    Full Text Available The cost of road construction consists of design expenses, material extraction, construction equipment, maintenance and rehabilitation strategies, and operations over the entire service life. An economic analysis process known as Life-Cycle Cost Analysis (LCCA) is used to evaluate the cost-efficiency of alternatives based on the Net Present Value (NPV) concept. It is essential to evaluate the above-mentioned cost aspects in order to obtain optimum pavement life-cycle costs. However, pavement managers are often unable to consider each important element that may be required for performing future maintenance tasks. Over the last few decades, several approaches have been developed by agencies and institutions for pavement Life-Cycle Cost Analysis (LCCA). While the transportation community has increasingly been utilising LCCA as an essential practice, several organisations have even designed computer programs for their LCCA approaches in order to assist with the analysis. Current LCCA methods are analysed and LCCA software is introduced in this article. Subsequently, a list of economic indicators is provided along with their substantial components. Collecting previous literature will help highlight and study the weakest aspects so as to mitigate the shortcomings of existing LCCA methods and processes. LCCA research will become more robust if improvements are made, facilitating private industries and government agencies to accomplish their economic aims. Keywords: Life-Cycle Cost Analysis (LCCA), Pavement management, LCCA software, Net Present Value (NPV)
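    The NPV comparison at the heart of LCCA can be sketched in a few lines. The discount rate and cost streams below are hypothetical, chosen only to show how a higher initial cost can be offset by cheaper maintenance over the analysis period.

```python
def npv(cash_flows, rate):
    # cash_flows[t] is the cost incurred in year t (year 0 = construction).
    # Each cost is discounted back to present value at the given rate.
    return sum(c / (1.0 + rate) ** t for t, c in enumerate(cash_flows))

# Hypothetical alternatives: A is cheaper to build but needs costlier
# resurfacing in years 5 and 10; B costs more up front but less to maintain.
alt_a = [100.0, 0, 0, 0, 0, 20.0, 0, 0, 0, 0, 20.0]
alt_b = [120.0, 0, 0, 0, 0, 5.0, 0, 0, 0, 0, 5.0]
rate = 0.04  # 4% discount rate, chosen arbitrarily for the example

print(f"Alternative A NPV: {npv(alt_a, rate):.2f}")  # ~129.95
print(f"Alternative B NPV: {npv(alt_b, rate):.2f}")  # ~127.49
```

    Note how the ranking depends on the discount rate: a higher rate shrinks the present value of future maintenance, which favours the cheap-to-build alternative.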

  7. Real time analysis under EDS

    International Nuclear Information System (INIS)

    Schneberk, D.

    1985-07-01

    This paper describes the analysis component of the Enrichment Diagnostic System (EDS) developed for the Atomic Vapor Laser Isotope Separation Program (AVLIS) at Lawrence Livermore National Laboratory (LLNL). Four different types of analysis are performed on data acquired through EDS: (1) absorption spectroscopy on laser-generated spectral lines, (2) mass spectrometer analysis, (3) general purpose waveform analysis, and (4) separation performance calculations. The information produced from this data includes: measures of particle density and velocity, partial pressures of residual gases, and overall measures of isotope enrichment. The analysis component supports a variety of real-time modeling tasks, a means for broadcasting data to other nodes, and a great degree of flexibility for tailoring computations to the exact needs of the process. A particular data base structure and program flow is common to all types of analysis. Key elements of the analysis component are: (1) a fast access data base which can configure all types of analysis, (2) a selected set of analysis routines, (3) a general purpose data manipulation and graphics package for the results of real time analysis. Each of these components are described with an emphasis upon how each contributes to overall system capability. 3 figs

  8. Trend Analysis Using Microcomputers.

    Science.gov (United States)

    Berger, Carl F.

    A trend analysis statistical package and additional programs for the Apple microcomputer are presented. They illustrate strategies of data analysis suitable to the graphics and processing capabilities of the microcomputer. The programs analyze data sets using examples of: (1) analysis of variance with multiple linear regression; (2) exponential…
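    In its simplest form, the trend analysis such a package performs reduces to an ordinary least-squares fit of a line against the time index. A minimal sketch of that core computation (in Python rather than the package's Apple BASIC, and with hypothetical data):

```python
def linear_trend(ys):
    # Ordinary least-squares fit of y = a + b*t against the time index t:
    # b is the slope (trend per time step), a the intercept.
    n = len(ys)
    ts = range(n)
    mt, my = sum(ts) / n, sum(ys) / n
    b = sum((t - mt) * (y - my) for t, y in zip(ts, ys)) / \
        sum((t - mt) ** 2 for t in ts)
    a = my - b * mt
    return a, b

a, b = linear_trend([2.0, 2.9, 4.1, 5.0, 6.1])
print(round(a, 2), round(b, 2))  # → 1.96 1.03
```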

  9. Life-Cycle Cost-Benefit Analysis

    DEFF Research Database (Denmark)

    Thoft-Christensen, Palle

    2010-01-01

    The future use of Life-Cycle Cost-Benefit (LCCB) analysis is discussed in this paper. A more complete analysis including not only the traditional factors and user costs, but also factors which are difficult to include in the analysis, is needed in the future.

  10. Evaluating Style Analysis

    NARCIS (Netherlands)

    F.A. de Roon (Frans); T.E. Nijman (Theo); B.J.M. Werker

    2000-01-01

    In this paper we evaluate applications of (return based) style analysis. The portfolio and positivity constraints imposed by style analysis are useful in constructing mimicking portfolios without short positions. Such mimicking portfolios can be used e.g. to construct efficient

  11. Strictness Analysis for Attribute Grammars

    DEFF Research Database (Denmark)

    Rosendahl, Mads

    1992-01-01

    interpretation of attribute grammars. The framework is used to construct a strictness analysis for attribute grammars. Results of the analysis enable us to transform an attribute grammar such that attributes are evaluated during parsing, if possible. The analysis is proved correct by relating it to a fixpoint semantics for attribute grammars. An implementation of the analysis is discussed and some extensions to the analysis are mentioned.

  12. Earth analysis methods, subsurface feature detection methods, earth analysis devices, and articles of manufacture

    Science.gov (United States)

    West, Phillip B [Idaho Falls, ID; Novascone, Stephen R [Idaho Falls, ID; Wright, Jerry P [Idaho Falls, ID

    2011-09-27

    Earth analysis methods, subsurface feature detection methods, earth analysis devices, and articles of manufacture are described. According to one embodiment, an earth analysis method includes engaging a device with the earth, analyzing the earth in a single substantially lineal direction using the device during the engaging, and providing information regarding a subsurface feature of the earth using the analysis.

  13. Rescaled Range Analysis and Detrended Fluctuation Analysis: Finite Sample Properties and Confidence Intervals

    Czech Academy of Sciences Publication Activity Database

    Krištoufek, Ladislav

    4/2010, č. 3 (2010), s. 236-250 ISSN 1802-4696 R&D Projects: GA ČR GD402/09/H045; GA ČR GA402/09/0965 Grant - others:GA UK(CZ) 118310 Institutional research plan: CEZ:AV0Z10750506 Keywords : rescaled range analysis * detrended fluctuation analysis * Hurst exponent * long-range dependence Subject RIV: AH - Economics http://library.utia.cas.cz/separaty/2010/E/kristoufek-rescaled range analysis and detrended fluctuation analysis finite sample properties and confidence intervals.pdf
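    The rescaled range statistic studied in this paper can be estimated with a short script. This is a bare-bones sketch of R/S analysis with no Anis-Lloyd small-sample correction, so for an independent series the estimate drifts above the asymptotic H = 0.5, illustrating exactly the finite-sample bias the paper examines; the window sizes are arbitrary.

```python
import math
import random

def rescaled_range(window):
    # R/S of one window: range of the cumulative mean-adjusted series
    # divided by the window's standard deviation.
    n = len(window)
    mean = sum(window) / n
    cum, cums = 0.0, []
    for x in window:
        cum += x - mean
        cums.append(cum)
    std = math.sqrt(sum((x - mean) ** 2 for x in window) / n)
    return (max(cums) - min(cums)) / std

def hurst_rs(series, sizes=(16, 32, 64, 128, 256)):
    # Average R/S over non-overlapping windows of each size, then estimate
    # H as the least-squares slope of log(R/S) against log(window size).
    xs, ys = [], []
    for s in sizes:
        chunks = [series[i:i + s] for i in range(0, len(series) - s + 1, s)]
        avg = sum(rescaled_range(c) for c in chunks) / len(chunks)
        xs.append(math.log(s))
        ys.append(math.log(avg))
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)

random.seed(0)
noise = [random.gauss(0.0, 1.0) for _ in range(2048)]
# For an i.i.d. series H is asymptotically 0.5; the uncorrected estimator
# is biased upward at these window sizes.
print(hurst_rs(noise))
```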

  14. SMART performance analysis methodology

    International Nuclear Information System (INIS)

    Lim, H. S.; Kim, H. C.; Lee, D. J.

    2001-04-01

    To ensure the required and desired operation over the plant lifetime, the performance analysis for the SMART NSSS design is done by means of the specified analysis methodologies for the performance related design basis events (PRDBE). A PRDBE is an occurrence (event) that shall be accommodated in the design of the plant and whose consequence would be no more severe than the normal service effects of the plant equipment. The performance analysis methodology which systematizes the methods and procedures to analyze the PRDBEs is as follows. Based on the operation mode suitable to the characteristics of the SMART NSSS, the corresponding PRDBEs and the allowable range of process parameters for these events are deduced. With the developed control logic for each operation mode, the system thermal-hydraulics are analyzed for the chosen PRDBEs using the system analysis code. In particular, because the system characteristics of SMART differ from those of existing commercial nuclear power plants, the operation mode, PRDBEs, control logic, and analysis code should be consistent with the SMART design. This report presents the categories of the PRDBEs chosen based on each operation mode, the transitions among these, and the acceptance criteria for each PRDBE. It also includes the analysis methods and procedures for each PRDBE and the concept of the control logic for each operation mode. Therefore this report, in which the overall details for SMART performance analysis are specified based on the current SMART design, would be utilized as a guide for the detailed performance analysis

  15. Isogeometric failure analysis

    NARCIS (Netherlands)

    Verhoosel, C.V.; Scott, M.A.; Borden, M.J.; Borst, de R.; Hughes, T.J.R.; Mueller-Hoeppe, D.; Loehnert, S.; Reese, S.

    2011-01-01

    Isogeometric analysis is a versatile tool for failure analysis. On the one hand, the excellent control over the inter-element continuity conditions enables a natural incorporation of continuum constitutive relations that incorporate higher-order strain gradients, as in gradient plasticity or damage.

  16. Cognitive task analysis

    NARCIS (Netherlands)

    Schraagen, J.M.C.

    2000-01-01

    Cognitive task analysis is defined as the extension of traditional task analysis techniques to yield information about the knowledge, thought processes and goal structures that underlie observable task performance. Cognitive task analyses are conducted for a wide variety of purposes, including the

  17. Evaluating Style Analysis

    NARCIS (Netherlands)

    de Roon, F.A.; Nijman, T.E.; Ter Horst, J.R.

    2000-01-01

In this paper we evaluate applications of (return based) style analysis. The portfolio and positivity constraints imposed by style analysis are useful in constructing mimicking portfolios without short positions. Such mimicking portfolios can be used, e.g., to construct efficient portfolios of mutual

  18. Circuit analysis with Multisim

    CERN Document Server

    Baez-Lopez, David

    2011-01-01

This book is concerned with circuit simulation using National Instruments Multisim. It focuses on the use and comprehension of the working techniques for electrical and electronic circuit simulation. The first chapters are devoted to basic circuit analysis. It starts by describing in detail how to perform a DC analysis using only resistors and independent and controlled sources. Then, it introduces capacitors and inductors to make a transient analysis. In the case of transient analysis, it is possible to have an initial condition either in the capacitor voltage or in the inductor current, or both

  19. Extending and automating a Systems-Theoretic hazard analysis for requirements generation and analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Thomas, John (Massachusetts Institute of Technology)

    2012-05-01

Systems Theoretic Process Analysis (STPA) is a powerful new hazard analysis method designed to go beyond traditional safety techniques - such as Fault Tree Analysis (FTA) - that overlook important causes of accidents like flawed requirements, dysfunctional component interactions, and software errors. While STPA has proven to be very effective on real systems, no formal structure has been defined for it, and its application has been ad hoc, with no rigorous procedures or model-based design tools. This report defines a formal mathematical structure underlying STPA and describes a procedure for systematically performing an STPA analysis based on that structure. A method for using the results of the hazard analysis to generate formal safety-critical, model-based system and software requirements is also presented. Techniques to automate both the analysis and the requirements generation are introduced, as well as a method to detect conflicts between the safety requirements and other functional model-based requirements during early development of the system.

  20. Panel Analysis

    DEFF Research Database (Denmark)

    Brænder, Morten; Andersen, Lotte Bøgh

    2014-01-01

Based on our 2013-article, ”Does Deployment to War Affect Soldiers' Public Service Motivation – A Panel Study of Soldiers Before and After their Service in Afghanistan”, we present Panel Analysis as a methodological discipline. Panels consist of multiple units of analysis, observed at two or more points in time. In comparison with traditional cross-sectional studies, the advantage of using panel studies is that the time dimension enables us to study effects. Whereas experimental designs may have a clear advantage in regard to causal inference, the strength of panel studies is difficult to match in research settings where it is not possible to distribute units of analysis randomly or where the independent variables cannot be manipulated. The greatest disadvantage in regard to using panel studies is that data may be difficult to obtain. This is most clearly vivid in regard to the use of panel surveys...

  1. Real analysis

    CERN Document Server

    DiBenedetto, Emmanuele

    2016-01-01

    The second edition of this classic textbook presents a rigorous and self-contained introduction to real analysis with the goal of providing a solid foundation for future coursework and research in applied mathematics. Written in a clear and concise style, it covers all of the necessary subjects as well as those often absent from standard introductory texts. Each chapter features a “Problems and Complements” section that includes additional material that briefly expands on certain topics within the chapter and numerous exercises for practicing the key concepts. The first eight chapters explore all of the basic topics for training in real analysis, beginning with a review of countable sets before moving on to detailed discussions of measure theory, Lebesgue integration, Banach spaces, functional analysis, and weakly differentiable functions. More topical applications are discussed in the remaining chapters, such as maximal functions, functions of bounded mean oscillation, rearrangements, potential theory, a...

  2. Numerical analysis

    CERN Document Server

    Scott, L Ridgway

    2011-01-01

    Computational science is fundamentally changing how technological questions are addressed. The design of aircraft, automobiles, and even racing sailboats is now done by computational simulation. The mathematical foundation of this new approach is numerical analysis, which studies algorithms for computing expressions defined with real numbers. Emphasizing the theory behind the computation, this book provides a rigorous and self-contained introduction to numerical analysis and presents the advanced mathematics that underpin industrial software, including complete details that are missing from most textbooks. Using an inquiry-based learning approach, Numerical Analysis is written in a narrative style, provides historical background, and includes many of the proofs and technical details in exercises. Students will be able to go beyond an elementary understanding of numerical simulation and develop deep insights into the foundations of the subject. They will no longer have to accept the mathematical gaps that ex...

  3. Modern real analysis

    CERN Document Server

    Ziemer, William P

    2017-01-01

    This first year graduate text is a comprehensive resource in real analysis based on a modern treatment of measure and integration. Presented in a definitive and self-contained manner, it features a natural progression of concepts from simple to difficult. Several innovative topics are featured, including differentiation of measures, elements of Functional Analysis, the Riesz Representation Theorem, Schwartz distributions, the area formula, Sobolev functions and applications to harmonic functions. Together, the selection of topics forms a sound foundation in real analysis that is particularly suited to students going on to further study in partial differential equations. This second edition of Modern Real Analysis contains many substantial improvements, including the addition of problems for practicing techniques, and an entirely new section devoted to the relationship between Lebesgue and improper integrals. Aimed at graduate students with an understanding of advanced calculus, the text will also appeal to mo...

  4. A PROOF Analysis Framework

    International Nuclear Information System (INIS)

    González Caballero, I; Cuesta Noriega, A; Rodríguez Marrero, A; Fernández del Castillo, E

    2012-01-01

The analysis of the complex LHC data usually follows a standard path that aims at minimizing not only the amount of data but also the number of observables used. After a number of steps of slimming and skimming the data, the remaining few terabytes of ROOT files hold a selection of the events and a flat structure for the variables needed that can be more easily inspected and traversed in the final stages of the analysis. PROOF arises at this point as an efficient mechanism to distribute the analysis load by taking advantage of all the cores in modern CPUs through PROOF Lite, or by using PROOF Cluster or PROOF on Demand tools to build dynamic PROOF clusters on computing facilities with spare CPUs. However, using PROOF at the level required for a serious analysis introduces some difficulties that may scare new adopters. We have developed the PROOF Analysis Framework (PAF) to facilitate the development of new analyses by uniformly exposing the PROOF-related configurations across technologies and by taking care of the routine tasks as much as possible. We describe the details of the PAF implementation as well as how we succeeded in engaging a group of CMS physicists to use PAF as their daily analysis framework.

  5. Marketing Mix Formulation for Higher Education: An Integrated Analysis Employing Analytic Hierarchy Process, Cluster Analysis and Correspondence Analysis

    Science.gov (United States)

    Ho, Hsuan-Fu; Hung, Chia-Chi

    2008-01-01

    Purpose: The purpose of this paper is to examine how a graduate institute at National Chiayi University (NCYU), by using a model that integrates analytic hierarchy process, cluster analysis and correspondence analysis, can develop effective marketing strategies. Design/methodology/approach: This is primarily a quantitative study aimed at…

  6. Manual versus Automated Narrative Analysis of Agrammatic Production Patterns: The Northwestern Narrative Language Analysis and Computerized Language Analysis

    Science.gov (United States)

    Hsu, Chien-Ju; Thompson, Cynthia K.

    2018-01-01

    Purpose: The purpose of this study is to compare the outcomes of the manually coded Northwestern Narrative Language Analysis (NNLA) system, which was developed for characterizing agrammatic production patterns, and the automated Computerized Language Analysis (CLAN) system, which has recently been adopted to analyze speech samples of individuals…

  7. Cost analysis guidelines

    International Nuclear Information System (INIS)

    Strait, R.S.

    1996-01-01

The first phase of the Depleted Uranium Hexafluoride Management Program (Program)--management strategy selection--consists of several program elements: Technology Assessment, Engineering Analysis, Cost Analysis, and preparation of an Environmental Impact Statement (EIS). Cost Analysis will estimate the life-cycle costs associated with each of the long-term management strategy alternatives for depleted uranium hexafluoride (UF6). The scope of Cost Analysis will include all major expenditures, from the planning and design stages through decontamination and decommissioning. The costs will be estimated at a scoping or preconceptual design level and are intended to assist decision makers in comparing alternatives for further consideration. They will not be absolute costs or bid-document costs. The purpose of the Cost Analysis Guidelines is to establish a consistent approach to analyzing cost alternatives for managing the Department of Energy's (DOE's) stocks of depleted uranium hexafluoride (DUF6). The component modules that make up the DUF6 management program differ substantially in process options, requirements for R and D, equipment, facilities, regulatory compliance, operations and maintenance (O and M), and operations risk. To facilitate a consistent and equitable comparison of costs, the guidelines offer common definitions, assumptions, bases, and limitations integrated with a standard approach to the analysis. Further, the goal is to evaluate total net life-cycle costs and display them in a way that gives DOE the capability to evaluate a variety of overall DUF6 management strategies, including commercial potential. The cost estimates reflect the preconceptual level of the designs and will be appropriate for distinguishing among management strategies.

  8. Radiation and environmental data analysis computer (REDAC) hardware, software and analysis procedures

    International Nuclear Information System (INIS)

    Hendricks, T.J.

    1985-01-01

    The REDAC was conceived originally as a tape verifier for the Radiation and Environmental Data Acquisition Recorder (REDAR). From that simple beginning in 1971, the REDAC has evolved into a family of systems used for complete analysis of data obtained by the REDAR and other acquisition systems. Portable or mobile REDACs are deployed to support checkout and analysis tasks in the field. Laboratory systems are additionally used for software development, physics investigations, data base management and graphics. System configurations range from man-portable systems to a large laboratory-based system which supports time-shared analysis and development tasks. Custom operating software allows the analyst to process data either interactively or by batch procedures. Analysis packages are provided for numerous necessary functions. All these analysis procedures can be performed even on the smallest man-portable REDAC. Examples of the multi-isotope stripping and radiation isopleth mapping are presented. Techniques utilized for these operations are also presented

  9. Systems engineering and analysis

    CERN Document Server

    Blanchard, Benjamin S

    2010-01-01

    For senior-level undergraduate and first and second year graduate systems engineering and related courses. A total life-cycle approach to systems and their analysis. This practical introduction to systems engineering and analysis provides the concepts, methodologies, models, and tools needed to understand and implement a total life-cycle approach to systems and their analysis. The authors focus first on the process of bringing systems into being--beginning with the identification of a need and extending that need through requirements determination, functional analysis and allocation, design synthesis, evaluation, and validation, operation and support, phase-out, and disposal. Next, the authors discuss the improvement of systems currently in being, showing that by employing the iterative process of analysis, evaluation, feedback, and modification, most systems in existence can be improved in their affordability, effectiveness, and stakeholder satisfaction.

  10. Numerical analysis

    CERN Document Server

    Brezinski, C

    2012-01-01

Numerical analysis has witnessed many significant developments in the 20th century. This book brings together 16 papers dealing with historical developments, survey papers and papers on recent trends in selected areas of numerical analysis, such as: approximation and interpolation, solution of linear systems and eigenvalue problems, iterative methods, quadrature rules, solution of ordinary-, partial- and integral equations. The papers are reprinted from the 7-volume project of the Journal of Computational and Applied Mathematics (/homepage/sac/cam/na2000/index.html)

  11. Automation of activation analysis

    International Nuclear Information System (INIS)

    Ivanov, I.N.; Ivanets, V.N.; Filippov, V.V.

    1985-01-01

The basic data on the methods and equipment of activation analysis are presented. Recommendations on the selection of activation analysis techniques, especially techniques employing short-lived isotopes, are given. The possibilities of the equipment for increasing data-channel throughput by using modern computers to automate the analysis and data-processing procedures are shown

  12. Intelligent audio analysis

    CERN Document Server

    Schuller, Björn W

    2013-01-01

This book provides the reader with the knowledge necessary for comprehension of the field of Intelligent Audio Analysis. It firstly introduces standard methods and discusses the typical Intelligent Audio Analysis chain going from audio data to audio features to audio recognition. Further, introductions to audio source separation, enhancement, and robustness are given. After the introductory parts, the book shows several applications for the three types of audio: speech, music, and general sound. Each task is shortly introduced, followed by a description of the specific data and methods applied, experiments and results, and a conclusion for this specific task. The book provides benchmark results and standardized test-beds for a broader range of audio analysis tasks. The main focus thereby lies on the parallel advancement of realism in audio analysis, as too often today’s results are overly optimistic owing to idealized testing conditions, and it serves to stimulate synergies arising from transfer of ...

  13. Advantages of Wavelet analysis compared to Fourier analysis for the interpretation of electrochemical noise

    International Nuclear Information System (INIS)

    Espada, L.; Sanjurjo, M.; Urrejola, S.; Bouzada, F.; Rey, G.; Sanchez, A.

    2003-01-01

Given its simplicity and low cost compared to other types of methodologies, the measurement and interpretation of electrochemical noise is consolidating itself as one of the analysis methods most frequently used for the interpretation of corrosion. As the technique is still evolving, standard methodologies for treating the data retrieved in experiments do not exist yet. To date, statistical analysis and Fourier analysis are commonly used to establish the parameters that may characterize the recording of potential and current electrochemical noise. This study introduces a new methodology based on wavelet analysis and presents its advantages over Fourier analysis: wavelet analysis distinguishes periodic and non-periodic variations of the signal power in both time and frequency, whereas Fourier analysis considers only the frequency. (Author) 15 refs
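To make the time-localization contrast concrete, here is a minimal sketch on a synthetic noise record (the signal values and transient position are assumptions for illustration, not data from the paper): a one-level Haar wavelet decomposition indicates *when* a transient occurred, while the Fourier power spectrum only reports frequency content.

```python
import numpy as np

# Simulated noise record with one brief transient (all values assumed).
rng = np.random.default_rng(0)
n = 1024
signal = 0.05 * rng.standard_normal(n)
signal[501:517] += 1.0  # short transient, e.g. a metastable pitting event

# One-level Haar transform: pairwise sums (approximation) and
# differences (detail), orthonormally scaled.
pairs = signal.reshape(-1, 2)
approx = (pairs[:, 0] + pairs[:, 1]) / np.sqrt(2)
detail = (pairs[:, 0] - pairs[:, 1]) / np.sqrt(2)

# Energy is conserved (Parseval), just as for the Fourier transform...
assert np.isclose(np.sum(approx**2) + np.sum(detail**2), np.sum(signal**2))

# ...but the largest detail coefficients also localize the event in time,
# whereas the Fourier power spectrum discards all timing information.
event_sample = 2 * int(np.argmax(np.abs(detail)))
fourier_power = np.abs(np.fft.rfft(signal)) ** 2
```

A full wavelet analysis would iterate this split on the approximation coefficients to obtain energies per scale and time, which is the kind of decomposition the abstract contrasts with the purely frequency-domain Fourier picture.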

  14. 21 CFR 1230.34 - Analysis.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Analysis. 1230.34 Section 1230.34 Food and Drugs... POISON ACT Administrative Procedures § 1230.34 Analysis. (a) The methods of examination or analysis..., provided, however, that any method of analysis or examination satisfactory to the Food and Drug...

  15. Real analysis

    CERN Document Server

    Loeb, Peter A

    2016-01-01

    This textbook is designed for a year-long course in real analysis taken by beginning graduate and advanced undergraduate students in mathematics and other areas such as statistics, engineering, and economics. Written by one of the leading scholars in the field, it elegantly explores the core concepts in real analysis and introduces new, accessible methods for both students and instructors. The first half of the book develops both Lebesgue measure and, with essentially no additional work for the student, general Borel measures for the real line. Notation indicates when a result holds only for Lebesgue measure. Differentiation and absolute continuity are presented using a local maximal function, resulting in an exposition that is both simpler and more general than the traditional approach. The second half deals with general measures and functional analysis, including Hilbert spaces, Fourier series, and the Riesz representation theorem for positive linear functionals on continuous functions with compact support....

  16. Numerical analysis

    CERN Document Server

    Jacques, Ian

    1987-01-01

    This book is primarily intended for undergraduates in mathematics, the physical sciences and engineering. It introduces students to most of the techniques forming the core component of courses in numerical analysis. The text is divided into eight chapters which are largely self-contained. However, with a subject as intricately woven as mathematics, there is inevitably some interdependence between them. The level of difficulty varies and, although emphasis is firmly placed on the methods themselves rather than their analysis, we have not hesitated to include theoretical material when we consider it to be sufficiently interesting. However, it should be possible to omit those parts that do seem daunting while still being able to follow the worked examples and to tackle the exercises accompanying each section. Familiarity with the basic results of analysis and linear algebra is assumed since these are normally taught in first courses on mathematical methods. For reference purposes a list of theorems used in the t...

  17. Physics and Video Analysis

    Science.gov (United States)

    Allain, Rhett

    2016-05-01

    We currently live in a world filled with videos. There are videos on YouTube, feature movies and even videos recorded with our own cameras and smartphones. These videos present an excellent opportunity to not only explore physical concepts, but also inspire others to investigate physics ideas. With video analysis, we can explore the fantasy world in science-fiction films. We can also look at online videos to determine if they are genuine or fake. Video analysis can be used in the introductory physics lab and it can even be used to explore the make-believe physics embedded in video games. This book covers the basic ideas behind video analysis along with the fundamental physics principles used in video analysis. The book also includes several examples of the unique situations in which video analysis can be used.

  18. Real analysis and applications

    CERN Document Server

    Botelho, Fabio Silva

    2018-01-01

    This textbook introduces readers to real analysis in one and n dimensions. It is divided into two parts: Part I explores real analysis in one variable, starting with key concepts such as the construction of the real number system, metric spaces, and real sequences and series. In turn, Part II addresses the multi-variable aspects of real analysis. Further, the book presents detailed, rigorous proofs of the implicit theorem for the vectorial case by applying the Banach fixed-point theorem and the differential forms concept to surfaces in Rn. It also provides a brief introduction to Riemannian geometry. With its rigorous, elegant proofs, this self-contained work is easy to read, making it suitable for undergraduate and beginning graduate students seeking a deeper understanding of real analysis and applications, and for all those looking for a well-founded, detailed approach to real analysis.

  19. Comparison of descriptive sensory analysis and chemical analysis for oxidative changes in milk

    DEFF Research Database (Denmark)

    Hedegaard, R V; Kristensen, D; Nielsen, Jacob Holm

    2006-01-01

Oxidation in 3 types of bovine milk with different fatty acid profiles obtained through manipulation of feed was evaluated by analytical methods quantifying the content of potential antioxidants, the tendency of formation of free radicals, and the accumulation of primary and secondary oxidation products. The milk samples were evaluated in parallel by descriptive sensory analysis by a trained panel, and the correlation between the chemical analysis and the descriptive sensory analysis was evaluated. The fatty acid composition of the 3 types of milk was found to influence the oxidative and lipolytic changes occurring in the milk during chill storage for 4 d. Sensory analysis and chemical analysis showed high correlation between the typical descriptors for oxidation such as cardboard, metallic taste, and boiled milk and specific chemical markers for oxidation such as hexanal. Notably, primary oxidation products (i.e., lipid hydroperoxides) and even the tendency of formation of radicals as measured by electron spin resonance spectroscopy were also highly correlated to the sensory descriptors for oxidation...

  20. Comparison of descriptive sensory analysis and chemical analysis for oxidative changes in milk

    DEFF Research Database (Denmark)

    Hedegaard, Rikke Susanne Vingborg; Kristensen, D.; Nielsen, J. H.

    2006-01-01

Oxidation in 3 types of bovine milk with different fatty acid profiles obtained through manipulation of feed was evaluated by analytical methods quantifying the content of potential antioxidants, the tendency of formation of free radicals, and the accumulation of primary and secondary oxidation products. The milk samples were evaluated in parallel by descriptive sensory analysis by a trained panel, and the correlation between the chemical analysis and the descriptive sensory analysis was evaluated. The fatty acid composition of the 3 types of milk was found to influence the oxidative and lipolytic changes occurring in the milk during chill storage for 4 d. Sensory analysis and chemical analysis showed high correlation between the typical descriptors for oxidation such as cardboard, metallic taste, and boiled milk and specific chemical markers for oxidation such as hexanal. Notably, primary oxidation products (i.e., lipid hydroperoxides) and even the tendency of formation of radicals as measured by electron spin resonance spectroscopy were also highly correlated to the sensory descriptors for oxidation. Electron spin resonance spectroscopy should accordingly be further explored as a routine...

  1. Containment vessel stability analysis

    International Nuclear Information System (INIS)

    Harstead, G.A.; Morris, N.F.; Unsal, A.I.

    1983-01-01

The stability analysis for a steel containment shell is presented herein. The containment is a freestanding shell consisting of a vertical cylinder with a hemispherical dome. It is stiffened by large ring stiffeners and relatively small longitudinal stiffeners. The containment vessel is subjected to both static and dynamic loads which can cause buckling. These loads must be combined prior to their use in a stability analysis. The buckling loads were computed with the aid of the ASME Code case N-284 used in conjunction with general purpose computer codes and in-house programs. The equations contained in the Code case were used to compute the knockdown factors due to shell imperfections. After these knockdown factors were applied to the critical stress states determined by freezing the maximum dynamic stresses and combining them with other static stresses, a linear bifurcation analysis was carried out with the aid of the BOSOR4 program. Since the containment shell contained large penetrations, the Code case had to be supplemented by a local buckling analysis of the shell area surrounding the largest penetration. This analysis was carried out with the aid of the NASTRAN program. Although the factor of safety against buckling obtained in this analysis was satisfactory, it is claimed that the use of the Code case knockdown factors is unduly conservative when applied to the analysis of buckling around penetrations. (orig.)

  2. Statistical data analysis

    International Nuclear Information System (INIS)

    Hahn, A.A.

    1994-11-01

    The complexity of instrumentation sometimes requires data analysis to be done before the result is presented to the control room. This tutorial reviews some of the theoretical assumptions underlying the more popular forms of data analysis and presents simple examples to illuminate the advantages and hazards of different techniques

  3. Elemental analysis of granite by instrumental neutron activation analysis (INAA) and X-ray fluorescence analysis (XRF)

    International Nuclear Information System (INIS)

    El-Taher, A.

    2012-01-01

The instrumental neutron activation analysis technique (INAA) was used for qualitative and quantitative analysis of granite samples collected from four locations in the Aswan area in South Egypt. The samples were prepared together with their standards and simultaneously irradiated in a neutron flux of 7×10¹¹ n/cm²·s in the TRIGA Mainz research reactor. Gamma-ray spectra from a hyper-pure germanium detector were analyzed. The present study provides the basic data of elemental concentrations of granite rocks. The following elements have been determined: Na, Mg, K, Fe, Mn, Sc, Cr, Ti, Co, Zn, Ga, Rb, Zr, Nb, Sn, Ba, Cs, La, Ce, Nd, Sm, Eu, Yb, Lu, Hf, Ta, Th and U. X-ray fluorescence (XRF) was used for comparison and to detect elements which can be detected only by XRF, such as F, S, Cl, Co, Cu, Mo, Ni, Pb, Se and V. The data presented here are our contribution to understanding the elemental composition of the granite rocks. Because there are no existing databases for the elemental analysis of granite, our results are a start to establishing a database for the Egyptian granite. It is hoped that the data presented here will be useful to those dealing with geochemistry, granite chemistry and related fields. - Highlights: ► Instrumental neutron activation analysis technique (INAA) was used for qualitative and quantitative analysis of granite. ► The samples were prepared together with their standards and simultaneously irradiated in a neutron flux of 7×10¹¹ n/cm²·s in the TRIGA Mainz research reactor. ► The following elements have been determined: Na, Mg, K, Fe, Mn, Sc, Cr, Ti, Co, Zn, Ga, Rb, Zr, Nb, Sn, Ba, Cs, La, Ce, Nd, Sm, Eu, Yb, Lu, Hf, Ta, Th and U.
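Quantification in INAA rests on the standard activation relation A = Nσφ(1 − e^(−λt)). The sketch below plugs in textbook values for Na-23 capture to Na-24; only the neutron flux comes from the abstract, and every other number is an illustrative assumption, not data from the study.

```python
import math

# Induced activity after irradiation time t: A = N * sigma * phi * (1 - exp(-lambda*t))
# Illustrative values (assumptions; only the flux is taken from the abstract).
N_A = 6.022e23                 # Avogadro's number, atoms/mol
mass_g = 1.0e-3                # 1 mg of sodium in the sample (assumed)
molar_mass = 23.0              # Na-23, g/mol
sigma_cm2 = 0.53e-24           # thermal (n,gamma) cross-section, ~0.53 barn
phi = 7.0e11                   # neutron flux, n/cm^2/s (from the abstract)
half_life_s = 15.0 * 3600.0    # Na-24 half-life, ~15 h
lam = math.log(2.0) / half_life_s
t_irr = 6.0 * 3600.0           # 6 h irradiation (assumed)

atoms = mass_g / molar_mass * N_A
activity_bq = atoms * sigma_cm2 * phi * (1.0 - math.exp(-lam * t_irr))
```

In the relative method used here, the same relation cancels out when the sample is irradiated alongside a standard of known composition, which is why samples and standards are irradiated simultaneously.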

  4. Regional energy facility siting analysis

    International Nuclear Information System (INIS)

    Eberhart, R.C.; Eagles, T.W.

    1976-01-01

Results of the energy facility siting analysis portion of a regional pilot study performed for the anticipated National Energy Siting and Facility Report are presented. The question of cell analysis versus site-specific analysis is explored, including an evaluation of the difference in depth between the two approaches. A discussion of the possible accomplishments of regional analysis is presented. It is concluded that regional siting analysis could be of use in a national siting study, if its inherent limits are recognized

  5. Analysis of Lead and Zinc by Mercury-Free Potentiometric Stripping Analysis

    DEFF Research Database (Denmark)

    Andersen, Jens Enevold Thaulov

    1997-01-01

A method is presented for trace-element analysis of lead and zinc by potentiometric stripping analysis (PSA) where both the glassy-carbon working electrode and the electrolyte are free of mercury. Analysis of zinc requires an activation procedure of the glassy-carbon electrode. The activation is performed by pre-concentrating zinc on glassy carbon at -1400 mV(SCE) in a mercury-free electrolyte containing 0.1 M HCl and 2 ppm Zn2+, followed by stripping at approx. -1050 mV. A linear relationship between stripping peak areas, recorded in the derivative mode, and concentration was found...
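Since the abstract reports a linear relationship between derivative-mode stripping peak areas and concentration, quantification reduces to a straight-line calibration. A minimal sketch with invented calibration numbers (illustrative only, not the paper's data):

```python
import numpy as np

# Hypothetical calibration: derivative-mode stripping peak areas recorded
# for a series of Zn2+ standards (all numbers invented for illustration).
conc_ppm = np.array([0.5, 1.0, 2.0, 4.0, 8.0])        # standard concentrations
peak_area = np.array([0.42, 0.81, 1.63, 3.19, 6.44])  # peak areas (arb. units)

# Least-squares calibration line through the standards.
slope, intercept = np.polyfit(conc_ppm, peak_area, 1)

# Concentration of an unknown sample from its measured peak area.
unknown_area = 2.40
unknown_conc_ppm = (unknown_area - intercept) / slope
```

In practice one would also check the correlation coefficient of the fit and restrict quantification to the concentration range covered by the standards.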

  6. Evaluation and presentation of analysis methods for reception analysis in reprocessing

    International Nuclear Information System (INIS)

    Mainka, E.

    1985-01-01

    The fissile material content in the dissolving or balancing tank of a reprocessing plant has special significance in nuclear fuel balancing. This is the first opportunity for destructive analysis of the fuel content of the material after burn-up of fuel elements in the reactor. In the current state-of-the-art, all balancing methods are based directly or indirectly on data obtained by chemical analysis. The following methods are evaluated: Mass-spectroscopic isotope dilution analysis, X-ray fluorescence spectroscopy, Isotopic correlation, Gamma absorptiometry, Redox titration, Emission spectroscopy after plasma excitation, Alpha spectroscopy, and Laser Raman spectroscopy

  7. Analysis apparatus and method of analysis

    International Nuclear Information System (INIS)

    1976-01-01

A continuous streaming method developed for the execution of immunoassays is described in this patent. In addition, a suitable apparatus for the method was developed whereby magnetic particles are automatically employed for the consecutive analysis of a series of liquid samples via the RIA technique

  8. Defining disease beyond conceptual analysis: an analysis of conceptual analysis in philosophy of medicine.

    Science.gov (United States)

    Lemoine, Maël

    2013-08-01

    Conceptual analysis of health and disease is portrayed as consisting in the confrontation of a set of criteria--a "definition"--with a set of cases, called instances of either "health" or "disease." Apart from logical counter-arguments, there is no other way to refute an opponent's definition than by providing counter-cases. As resorting to intensional stipulation (stipulation of meaning) is not forbidden, several contenders can therefore be deemed to have succeeded. This implies that conceptual analysis alone is not likely to decide between naturalism and normativism. An alternative to this approach would be to examine whether the concept of disease can be naturalized.

  9. Mastering Clojure data analysis

    CERN Document Server

    Rochester, Eric

    2014-01-01

This book takes a practical, example-oriented approach that aims to help you learn how to use Clojure for data analysis quickly and efficiently. This book is great for those who have experience with Clojure and who need to use it to perform data analysis. It will also be hugely beneficial for readers with basic experience in data analysis and statistics.

  10. Evaluation of Analysis by Cross-Validation, Part II: Diagnostic and Optimization of Analysis Error Covariance

    Directory of Open Access Journals (Sweden)

    Richard Ménard

    2018-02-01

    Full Text Available We present a general theory of estimation of analysis error covariances based on cross-validation as well as a geometric interpretation of the method. In particular, we use the variance of passive observation-minus-analysis residuals and show that the true analysis error variance can be estimated, without relying on the optimality assumption. This approach is used to obtain near optimal analyses that are then used to evaluate the air quality analysis error using several different methods at active and passive observation sites. We compare the estimates according to the method of Hollingsworth-Lönnberg, Desroziers et al., a new diagnostic we developed, and the perceived analysis error computed from the analysis scheme, to conclude that, as long as the analysis is near optimal, all estimates agree within a certain error margin.
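The key identity the abstract relies on can be sketched numerically: at passive (withheld) observation sites, if observation and analysis errors are independent, the variance of the observation-minus-analysis residuals is the sum of the two error variances, so subtracting the known observation-error variance recovers the analysis-error variance without any optimality assumption. A minimal synthetic illustration (all values hypothetical, not from the paper):

```python
import random

random.seed(0)
n = 10000
sigma_o, sigma_a = 1.0, 0.5          # true obs- and analysis-error std devs
truth = [20.0] * n
obs      = [t + random.gauss(0, sigma_o) for t in truth]  # passive observations
analysis = [t + random.gauss(0, sigma_a) for t in truth]  # analysis at those sites

resid = [o - a for o, a in zip(obs, analysis)]
mean = sum(resid) / n
var_oma = sum((r - mean) ** 2 for r in resid) / n         # var(O - A)
est_sigma_a2 = var_oma - sigma_o ** 2                     # subtract known obs-error variance
print(est_sigma_a2)                                       # close to sigma_a**2 = 0.25
```

With enough passive sites the estimate converges on the true analysis-error variance even though the "analysis" here was never constructed optimally.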

  11. Convex analysis

    CERN Document Server

    Rockafellar, Ralph Tyrell

    2015-01-01

    Available for the first time in paperback, R. Tyrrell Rockafellar's classic study presents readers with a coherent branch of nonlinear mathematical analysis that is especially suited to the study of optimization problems. Rockafellar's theory differs from classical analysis in that differentiability assumptions are replaced by convexity assumptions. The topics treated in this volume include: systems of inequalities, the minimum or maximum of a convex function over a convex set, Lagrange multipliers, minimax theorems and duality, as well as basic results about the structure of convex sets and

  12. Outlier analysis

    CERN Document Server

    Aggarwal, Charu C

    2013-01-01

With the increasing advances in hardware technology for data collection, and advances in software technology (databases) for data organization, computer scientists have increasingly participated in the latest advancements of the outlier analysis field. Computer scientists, specifically, approach this field based on their practical experiences in managing large amounts of data, and with far fewer assumptions: the data can be of any type, structured or unstructured, and may be extremely large. Outlier Analysis is a comprehensive exposition, as understood by data mining experts, statisticians and

  13. Elementary analysis

    CERN Document Server

    Snell, K S; Langford, W J; Maxwell, E A

    1966-01-01

    Elementary Analysis, Volume 2 introduces several of the ideas of modern mathematics in a casual manner and provides the practical experience in algebraic and analytic operations that lays a sound foundation of basic skills. This book focuses on the nature of number, algebraic and logical structure, groups, rings, fields, vector spaces, matrices, sequences, limits, functions and inverse functions, complex numbers, and probability. The logical structure of analysis given through the treatment of differentiation and integration, with applications to the trigonometric and logarithmic functions, is

  14. Task analysis and computer aid development for human reliability analysis in nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Yoon, W. C.; Kim, H.; Park, H. S.; Choi, H. H.; Moon, J. M.; Heo, J. Y.; Ham, D. H.; Lee, K. K.; Han, B. T. [Korea Advanced Institute of Science and Technology, Taejeon (Korea)

    2001-04-01

The importance of human reliability analysis (HRA), which predicts the possibility of human error occurrence in quantitative and qualitative manners, is gradually increasing because of the effects of human errors on system safety. HRA requires a task analysis as a prerequisite step, but extant task analysis techniques have the problem that the collection of information about the situation in which the human error occurs depends entirely on the HRA analysts. This problem makes the results of the task analysis inconsistent and unreliable. To remedy this, KAERI developed the structured information analysis (SIA) method, which helps to analyze a task's structure and situations systematically. In this study, the SIA method was evaluated by HRA experts, and a prototype computerized supporting system named CASIA (Computer Aid for SIA) was developed to support performing HRA with the SIA method. Additionally, by applying the SIA method to emergency operating procedures, we derived generic task types used in emergencies and accumulated the analysis results in the database of the CASIA. The CASIA is expected to help HRA analysts perform the analysis more easily and consistently. As more analyses are performed and more data are accumulated in the CASIA database, HRA analysts will be able to share and disseminate their analysis experience freely, thereby improving the quality of HRA. 35 refs., 38 figs., 25 tabs. (Author)

  15. Structured information analysis for human reliability analysis of emergency tasks in nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Won Dea; Kim, Jae Whan; Park, Jin Kyun; Ha, Jae Joo [Korea Atomic Energy Research Institute, Taejeon (Korea)

    2000-02-01

More than twenty HRA (Human Reliability Analysis) methodologies have been developed and used for safety analysis in the nuclear field during the past two decades. However, no methodology appears to have been universally accepted, as various limitations have been raised for the more widely used ones. One of the most important limitations of conventional HRA is insufficient analysis of the task structure and problem space. To resolve this problem, we suggest SIA (Structured Information Analysis) for HRA. The proposed SIA consists of three parts. The first part is the scenario analysis that investigates the contextual information related to the given task on the basis of selected scenarios. The second is the goals-means analysis to define the relations between the cognitive goal and task steps. The third is the cognitive function analysis module that identifies the cognitive patterns and information flows involved in the task. Through the three-part analysis, systematic investigation is made possible from the macroscopic information on the tasks to the microscopic information on the specific cognitive processes. It is expected that analysts can attain a structured set of information that helps to predict the types and possibility of human error in the given task. 48 refs., 12 figs., 11 tabs. (Author)

  16. Maritime transportation risk analysis: Review and analysis in light of some foundational issues

    International Nuclear Information System (INIS)

    Goerlandt, Floris; Montewka, Jakub

    2015-01-01

    Many methods and applications for maritime transportation risk analysis have been presented in the literature. In parallel, there is a recent focus on foundational issues in risk analysis, with calls for intensified research on fundamental concepts and principles underlying the scientific field. This paper presents a review and analysis of risk definitions, perspectives and scientific approaches to risk analysis found in the maritime transportation application area, focusing on applications addressing accidental risk of shipping in a sea area. For this purpose, a classification of risk definitions, an overview of elements in risk perspectives and a classification of approaches to risk analysis science are applied. Results reveal that in the application area, risk is strongly tied to probability, both in definitions and perspectives, while alternative views exist. A diffuse situation is also found concerning the scientific approach to risk analysis, with realist, proceduralist and constructivist foundations co-existing. Realist approaches dominate the application area. Very few applications systematically account for uncertainty, neither concerning the evidence base nor in relation to the limitations of the risk model in relation to the space of possible outcomes. Some suggestions are made to improve the current situation, aiming to strengthen the scientific basis for risk analysis. - Highlights: • Risk analyses in maritime transportation analysed in light of foundational issues. • Focus on definitions, perspectives and scientific approaches to risk analysis. • Probability-based definitions and realist approaches dominate the field. • Findings support calls for increased focus on foundational issues in risk research. • Some suggestions are made to improve the current situation

  17. Analysis of a Braking System on the Basis of Structured Analysis Methods

    OpenAIRE

    Ben Salem J.; Lakhoua M.N.; El Amraoui L.

    2016-01-01

In this paper, we present the general context of research in the domain of analysis and modeling of mechatronic systems. In fact, we present a bibliographic review of some research works on the systemic analysis of mechatronic systems. To better understand their characteristics, we start with an introduction to mechatronic systems and the various fields related to these systems, after which we present a few analysis and design methods applied to mechatronic systems. Finally, we apply the two...

  18. Integrated piping structural analysis system

    International Nuclear Information System (INIS)

    Motoi, Toshio; Yamadera, Masao; Horino, Satoshi; Idehata, Takamasa

    1979-01-01

Structural analysis of the piping system for nuclear power plants has become larger in scale and in quantity. In addition, higher quality analysis is regarded as of major importance nowadays from the point of view of nuclear plant safety. In order to fulfill the above requirements, an integrated piping structural analysis system (ISAP-II) has been developed. The basic philosophy of this system is as follows: 1. To apply the data base system: all information is concentrated. 2. To minimize the manual process in analysis, evaluation and documentation, especially by applying the graphic system as much as possible. On the basis of the above philosophy, four subsystems were made: 1. Data control subsystem. 2. Analysis subsystem. 3. Plotting subsystem. 4. Report subsystem. The function of the data control subsystem is to control all information of the data base. Piping structural analysis can be performed by using the analysis subsystem. Isometric piping drawings and mode shapes, etc. can be plotted by using the plotting subsystem. A total analysis report can be made without manual processing through the report subsystem. (author)

  19. Longitudinal Meta-analysis

    NARCIS (Netherlands)

    Hox, J.J.; Maas, C.J.M.; Lensvelt-Mulders, G.J.L.M.

    2004-01-01

    The goal of meta-analysis is to integrate the research results of a number of studies on a specific topic. Characteristic for meta-analysis is that in general only the summary statistics of the studies are used and not the original data. When the published research results to be integrated
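The characteristic property noted above (only summary statistics are needed, not the original data) can be illustrated with the standard fixed-effect inverse-variance method. This is a generic sketch with hypothetical study numbers, not material from the chapter:

```python
import math

def fixed_effect_meta(effects, ses):
    """Pool per-study effect sizes using inverse-variance weights.
    effects: study effect estimates; ses: their standard errors."""
    weights = [1 / se ** 2 for se in ses]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    pooled_se = math.sqrt(1 / sum(weights))
    return pooled, pooled_se

# Three hypothetical studies, each summarized only by an effect size and its SE.
pooled, se = fixed_effect_meta([0.30, 0.50, 0.40], [0.10, 0.20, 0.15])
print(f"pooled effect = {pooled:.3f}, 95% CI half-width = {1.96 * se:.3f}")
```

More precise studies (smaller standard errors) receive larger weights, which is exactly why the original per-subject data are not required.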

  20. Enabling interdisciplinary analysis

    Science.gov (United States)

    L. M. Reid

    1996-01-01

New requirements for evaluating environmental conditions in the Pacific Northwest have led to increased demands for interdisciplinary analysis of complex environmental problems. Procedures for watershed analysis have been developed for use on public and private lands in Washington State (Washington Forest Practices Board 1993) and for federal lands in the Pacific...

  1. Incorporating technical analysis in undergraduate curricula

    Directory of Open Access Journals (Sweden)

    Michael R. Melton

    2017-11-01

Full Text Available Purpose – The purpose of this paper is to introduce instruction of technical analysis at the undergraduate level that can coincide with traditional teachings of fundamental analysis. Design/methodology/approach – Through examples using the latest in security analysis technology, this paper illustrates the importance of technical security analysis. Findings – This research illustrates how technical analysis techniques may be used to make more significant investment decisions. Originality/value – Kirkpatrick and Dahlquist define technical analysis as a security analysis discipline for forecasting the future direction of prices through the study of past market data, primarily price and volume. This form of analysis has stood in direct contrast to the fundamental analysis approach, whereby actual facts about the company, its industry and sector may be ignored. Understanding this contrast, much of academia has chosen to continue to focus its finance curricula on fundamental analysis techniques. As more universities implement trading rooms to reflect those of industry, they must recognize that any large brokerage trading group or financial institution will typically have both a technical analysis and a fundamental analysis team. Thus arises the need to incorporate technical analysis into undergraduate finance curricula.
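As an illustration of the kind of technical-analysis technique such a course might teach alongside fundamental analysis, here is a minimal simple-moving-average crossover indicator (a generic textbook indicator, not code from the paper; the price series is made up):

```python
def sma(prices, window):
    """Simple moving average; None until enough history exists."""
    return [
        sum(prices[i - window + 1:i + 1]) / window if i >= window - 1 else None
        for i in range(len(prices))
    ]

def crossover_signals(prices, fast=3, slow=5):
    """'buy' when the fast SMA crosses above the slow SMA, 'sell' on the reverse."""
    f, s = sma(prices, fast), sma(prices, slow)
    signals = []
    for i in range(1, len(prices)):
        if None in (f[i], s[i], f[i - 1], s[i - 1]):
            continue  # not enough history yet for both averages
        if f[i - 1] <= s[i - 1] and f[i] > s[i]:
            signals.append((i, "buy"))
        elif f[i - 1] >= s[i - 1] and f[i] < s[i]:
            signals.append((i, "sell"))
    return signals

prices = [10, 10, 10, 10, 10, 11, 12, 13, 12, 11, 10, 9]
print(crossover_signals(prices))  # → [(5, 'buy'), (10, 'sell')]
```

The signal depends only on past price data, which is exactly the contrast with fundamental analysis that the paper draws.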

  2. Survival analysis models and applications

    CERN Document Server

    Liu, Xian

    2012-01-01

Survival analysis concerns sequential occurrences of events governed by probabilistic laws. Recent decades have witnessed many applications of survival analysis in various disciplines. This book introduces both classic survival models and theories along with newly developed techniques. Readers will learn how to perform analysis of survival data by following numerous empirical illustrations in SAS. Survival Analysis: Models and Applications: Presents basic techniques before leading onto some of the most advanced topics in survival analysis. Assumes only a minimal knowledge of SAS whilst enablin
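As a concrete instance of the classic techniques such a book covers (its own examples are in SAS; this Python sketch of the Kaplan-Meier product-limit estimator is only an illustration, with made-up data):

```python
def kaplan_meier(times, events):
    """times: observed durations; events: 1 = event occurred, 0 = censored.
    Returns [(t, S(t))] at each distinct event time."""
    surv, curve = 1.0, []
    # Walk through distinct times in order, updating the survival probability.
    for t in sorted(set(times)):
        d = sum(1 for ti, ei in zip(times, events) if ti == t and ei == 1)  # events at t
        n = sum(1 for ti in times if ti >= t)                               # at risk just before t
        if d > 0:
            surv *= (1 - d / n)
            curve.append((t, surv))
    return curve

# Hypothetical follow-up data: durations with event/censoring indicators.
times  = [2, 3, 3, 5, 8, 8, 9]
events = [1, 1, 0, 1, 1, 0, 0]
print(kaplan_meier(times, events))
```

Censored subjects still count as "at risk" up to their censoring time, which is how the estimator uses incomplete follow-up without discarding it.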

  3. WWW-based remote analysis framework for UniSampo and Shaman analysis software

    International Nuclear Information System (INIS)

    Aarnio, P.A.; Ala-Heikkilae, J.J.; Routti, J.T.; Nikkinen, M.T.

    2005-01-01

UniSampo and Shaman are well-established analytical tools for gamma-ray spectrum analysis and the subsequent radionuclide identification. These tools are normally run locally on a Unix or Linux workstation in interactive mode. However, it is also possible to run them in batch/non-interactive mode by starting them with the correct parameters. This is how they are used in the standard analysis pipeline operation. This functionality also makes it possible to use them for remote operation over the network. A framework for running UniSampo and Shaman analysis using the standard WWW protocol has been developed. A WWW server receives requests from the client WWW browser and runs the analysis software via a set of CGI scripts. Authentication, input data transfer, and the output and display of the final analysis results are all carried out using standard WWW mechanisms. This WWW framework can be utilized, for example, by organizations that have radioactivity surveillance stations in a wide area. A computer with a standard internet/intranet connection suffices for on-site analyses. (author)
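The request flow described (browser request, server-side batch run via CGI-style scripts, results returned over HTTP) can be sketched as follows. The handler and command are hypothetical placeholders; the real UniSampo/Shaman command lines are not given in the abstract:

```python
import subprocess
from http.server import BaseHTTPRequestHandler, HTTPServer

def run_analysis(args):
    """Launch an analysis program in batch mode and capture its text output."""
    return subprocess.run(args, capture_output=True, text=True).stdout

class AnalysisHandler(BaseHTTPRequestHandler):
    """Answers each GET by running one batch analysis, CGI-style."""
    def do_GET(self):
        # Placeholder command; a real deployment would invoke the analysis pipeline.
        body = run_analysis(["echo", "spectrum analysed"]).encode()
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(body)

# To serve for real: HTTPServer(("localhost", 8080), AnalysisHandler).serve_forever()
print(run_analysis(["echo", "spectrum analysed"]).strip())  # → spectrum analysed
```

Because the heavy analysis runs server-side in batch mode, the client needs nothing beyond a standard browser, which matches the framework's design goal.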

  4. Integrated genetic analysis microsystems

    International Nuclear Information System (INIS)

    Lagally, Eric T; Mathies, Richard A

    2004-01-01

    With the completion of the Human Genome Project and the ongoing DNA sequencing of the genomes of other animals, bacteria, plants and others, a wealth of new information about the genetic composition of organisms has become available. However, as the demand for sequence information grows, so does the workload required both to generate this sequence and to use it for targeted genetic analysis. Microfabricated genetic analysis systems are well poised to assist in the collection and use of these data through increased analysis speed, lower analysis cost and higher parallelism leading to increased assay throughput. In addition, such integrated microsystems may point the way to targeted genetic experiments on single cells and in other areas that are otherwise very difficult. Concomitant with these advantages, such systems, when fully integrated, should be capable of forming portable systems for high-speed in situ analyses, enabling a new standard in disciplines such as clinical chemistry, forensics, biowarfare detection and epidemiology. This review will discuss the various technologies available for genetic analysis on the microscale, and efforts to integrate them to form fully functional robust analysis devices. (topical review)

  5. Physics Metacognition Inventory Part II: Confirmatory factor analysis and Rasch analysis

    Science.gov (United States)

    Taasoobshirazi, Gita; Bailey, MarLynn; Farley, John

    2015-11-01

    The Physics Metacognition Inventory was developed to measure physics students' metacognition for problem solving. In one of our earlier studies, an exploratory factor analysis provided evidence of preliminary construct validity, revealing six components of students' metacognition when solving physics problems including knowledge of cognition, planning, monitoring, evaluation, debugging, and information management. The college students' scores on the inventory were found to be reliable and related to students' physics motivation and physics grade. However, the results of the exploratory factor analysis indicated that the questionnaire could be revised to improve its construct validity. The goal of this study was to revise the questionnaire and establish its construct validity through a confirmatory factor analysis. In addition, a Rasch analysis was applied to the data to better understand the psychometric properties of the inventory and to further evaluate the construct validity. Results indicated that the final, revised inventory is a valid, reliable, and efficient tool for assessing student metacognition for physics problem solving.

  6. Abstract Interfaces for Data Analysis Component Architecture for Data Analysis Tools

    CERN Document Server

    Barrand, G; Dönszelmann, M; Johnson, A; Pfeiffer, A

    2001-01-01

    The fast turnover of software technologies, in particular in the domain of interactivity (covering user interface and visualisation), makes it difficult for a small group of people to produce complete and polished software-tools before the underlying technologies make them obsolete. At the HepVis '99 workshop, a working group has been formed to improve the production of software tools for data analysis in HENP. Beside promoting a distributed development organisation, one goal of the group is to systematically design a set of abstract interfaces based on using modern OO analysis and OO design techniques. An initial domain analysis has come up with several categories (components) found in typical data analysis tools: Histograms, Ntuples, Functions, Vectors, Fitter, Plotter, Analyzer and Controller. Special emphasis was put on reducing the couplings between the categories to a minimum, thus optimising re-use and maintainability of any component individually. The interfaces have been defined in Java and C++ and i...

  7. Virtual data in CMS analysis

    International Nuclear Information System (INIS)

    Arbree, A.

    2003-01-01

The use of virtual data for enhancing the collaboration between large groups of scientists is explored in several ways: by defining "virtual" parameter spaces which can be searched and shared in an organized way by a collaboration of scientists in the course of their analysis; by providing a mechanism to log the provenance of results and the ability to trace them back to the various stages in the analysis of real or simulated data; by creating "check points" in the course of an analysis to permit collaborators to explore their own analysis branches by refining selections, improving the signal to background ratio, varying the estimation of parameters, etc.; by facilitating the audit of an analysis and the reproduction of its results by a different group, or in a peer review context. We describe a prototype for the analysis of data from the CMS experiment based on the virtual data system Chimera and the object-oriented data analysis framework ROOT. The Chimera system is used to chain together several steps in the analysis process including the Monte Carlo generation of data, the simulation of detector response, the reconstruction of physics objects and their subsequent analysis, histogramming and visualization using the ROOT framework

  8. Contrast and Critique of Two Approaches to Discourse Analysis: Conversation Analysis and Speech Act Theory

    Directory of Open Access Journals (Sweden)

    Nguyen Van Han

    2014-08-01

Full Text Available Discourse analysis, as Murcia and Olshtain (2000) assume, is a vast study of language in use that extends beyond sentence level, and it involves a more cognitive and social perspective on language use and communication exchanges. Covering a wide range of phenomena relating language to society, culture and thought, discourse analysis contains various approaches: speech act theory, pragmatics, conversation analysis, variation analysis, and critical discourse analysis. Each approach works in its own domain of discourse. In one dimension, it shares the same assumptions or general problems in discourse analysis with the other approaches: for instance, the explanation of how we organize language into units beyond sentence boundaries, or how language is used to convey information about the world, ourselves and human relationships (Schiffrin 1994: viii). In other dimensions, each approach holds distinctive characteristics contributing to the vastness of discourse analysis. This paper will mainly discuss two approaches to discourse analysis, conversation analysis and speech act theory, and will attempt to point out some similarities as well as contrasting features between the two approaches, followed by a short reflection on their strengths and weaknesses in the essence of each approach. The organizational and discourse features in the exchanges among three teachers at the College of Finance and Customs in Vietnam will be analysed in terms of conversation analysis and speech act theory.

  9. ADAGE signature analysis: differential expression analysis with data-defined gene sets.

    Science.gov (United States)

    Tan, Jie; Huyck, Matthew; Hu, Dongbo; Zelaya, René A; Hogan, Deborah A; Greene, Casey S

    2017-11-22

Gene set enrichment analysis and overrepresentation analyses are commonly used methods to determine the biological processes affected by a differential expression experiment. This approach requires biologically relevant gene sets, which are currently curated manually, limiting their availability and accuracy in many organisms without extensively curated resources. New feature learning approaches can now be paired with existing data collections to directly extract functional gene sets from big data. Here we introduce a method to identify perturbed processes. In contrast with methods that use curated gene sets, this approach uses signatures extracted from public expression data. We first extract expression signatures from public data using ADAGE, a neural network-based feature extraction approach. We next identify signatures that are differentially active under a given treatment. Our results demonstrate that these signatures represent biological processes that are perturbed by the experiment. Because these signatures are directly learned from data without supervision, they can identify uncurated or novel biological processes. We implemented ADAGE signature analysis for the bacterial pathogen Pseudomonas aeruginosa. For the convenience of different user groups, we implemented both an R package (ADAGEpath) and a web server (http://adage.greenelab.com) to run these analyses. Both are open-source to allow easy expansion to other organisms or signature generation methods. We applied ADAGE signature analysis to an example dataset in which wild-type and ∆anr mutant cells were grown as biofilms on the Cystic Fibrosis genotype bronchial epithelial cells. We mapped active signatures in the dataset to KEGG pathways and compared with pathways identified using GSEA. The two approaches generally return consistent results; however, ADAGE signature analysis also identified a signature that revealed the molecularly supported link between the MexT regulon and Anr. We designed

  10. Gait Analysis Using Wearable Sensors

    Directory of Open Access Journals (Sweden)

    Hutian Feng

    2012-02-01

Full Text Available Gait analysis using wearable sensors is an inexpensive, convenient, and efficient manner of providing useful information for multiple health-related applications. As a clinical tool applied in the rehabilitation and diagnosis of medical conditions and sport activities, gait analysis using wearable sensors shows great prospects. The current paper reviews available wearable sensors and ambulatory gait analysis methods based on the various wearable sensors. After an introduction of the gait phases, the principles and features of wearable sensors used in gait analysis are provided. The gait analysis methods based on wearable sensors are divided into gait kinematics, gait kinetics, and electromyography. Studies on the current methods are reviewed, and applications in sports, rehabilitation, and clinical diagnosis are summarized separately. With the development of sensor technology and the analysis method, gait analysis using wearable sensors is expected to play an increasingly important role in clinical applications.

  11. Gait Analysis Using Wearable Sensors

    Science.gov (United States)

    Tao, Weijun; Liu, Tao; Zheng, Rencheng; Feng, Hutian

    2012-01-01

Gait analysis using wearable sensors is an inexpensive, convenient, and efficient manner of providing useful information for multiple health-related applications. As a clinical tool applied in the rehabilitation and diagnosis of medical conditions and sport activities, gait analysis using wearable sensors shows great prospects. The current paper reviews available wearable sensors and ambulatory gait analysis methods based on the various wearable sensors. After an introduction of the gait phases, the principles and features of wearable sensors used in gait analysis are provided. The gait analysis methods based on wearable sensors are divided into gait kinematics, gait kinetics, and electromyography. Studies on the current methods are reviewed, and applications in sports, rehabilitation, and clinical diagnosis are summarized separately. With the development of sensor technology and the analysis method, gait analysis using wearable sensors is expected to play an increasingly important role in clinical applications. PMID:22438763

  12. Safety analysis for research reactors

    International Nuclear Information System (INIS)

    2008-01-01

    The aim of safety analysis for research reactors is to establish and confirm the design basis for items important to safety using appropriate analytical tools. The design, manufacture, construction and commissioning should be integrated with the safety analysis to ensure that the design intent has been incorporated into the as-built reactor. Safety analysis assesses the performance of the reactor against a broad range of operating conditions, postulated initiating events and other circumstances, in order to obtain a complete understanding of how the reactor is expected to perform in these situations. Safety analysis demonstrates that the reactor can be kept within the safety operating regimes established by the designer and approved by the regulatory body. This analysis can also be used as appropriate in the development of operating procedures, periodic testing and inspection programmes, proposals for modifications and experiments and emergency planning. The IAEA Safety Requirements publication on the Safety of Research Reactors states that the scope of safety analysis is required to include analysis of event sequences and evaluation of the consequences of the postulated initiating events and comparison of the results of the analysis with radiological acceptance criteria and design limits. This Safety Report elaborates on the requirements established in IAEA Safety Standards Series No. NS-R-4 on the Safety of Research Reactors, and the guidance given in IAEA Safety Series No. 35-G1, Safety Assessment of Research Reactors and Preparation of the Safety Analysis Report, providing detailed discussion and examples of related topics. Guidance is given in this report for carrying out safety analyses of research reactors, based on current international good practices. The report covers all the various steps required for a safety analysis; that is, selection of initiating events and acceptance criteria, rules and conventions, types of safety analysis, selection of

  13. Fast neutron activation analysis

    International Nuclear Information System (INIS)

    Pepelnik, R.

    1986-01-01

Since 1981, numerous 14 MeV neutron activation analyses have been performed at Korona. On the basis of that work, the advantages of this analysis technique and the results obtained with it are compared with those of other analytical methods. The procedure of activation analysis, the characteristics of Korona, some analytical investigations in environmental research and material physics, as well as sources of systematic errors in trace analysis are described. (orig.) [de

  14. The ATLAS Analysis Architecture

    International Nuclear Information System (INIS)

    Cranmer, K.S.

    2008-01-01

    We present an overview of the ATLAS analysis architecture including the relevant aspects of the computing model and the major architectural aspects of the Athena framework. Emphasis will be given to the interplay between the analysis use cases and the technical aspects of the architecture including the design of the event data model, transient-persistent separation, data reduction strategies, analysis tools, and ROOT interoperability

  15. Dynamical coupled-channel analysis at EBAC. (Excited Baryon Analysis Center)

    International Nuclear Information System (INIS)

    Lee, T.-S.H.; Thomas Jefferson National Accelerator Facility, Newport News, VA

    2008-01-01

    In this contribution, the author reports on the dynamical coupled-channels analysis being pursued at the Excited Baryon Analysis Center (EBAC) of Jefferson Laboratory. EBAC was established in January 2006. Its objective is to extract the parameters associated with the excited states (N*) of the nucleon from the world data of meson production reactions, and to also develop theoretical interpretations of the extracted N* parameters

  16. Citation analysis of meta-analysis articles on posttraumatic stress disorder.

    Science.gov (United States)

    Liao, Xi-Ming; Chen, Ping-Yan

    2011-04-01

In the past two decades, an enormous amount of scientific research on posttraumatic stress disorder (PTSD) has been undertaken and many related meta-analyses have been published. Citation analysis was used to obtain a comprehensive perspective on meta-analysis articles (MA articles) on PTSD, for the purpose of helping researchers, physicians and policy-makers understand PTSD. MA articles on PTSD in any language from January 1980 to March 2009 were included if they presented meta-analytical methods and received at least one citation recorded in the Web of Science (WoS). Studies in which effect sizes for PTSD were not distinguished from those for other psychological disorders were excluded. Citations to and by the identified MA articles were documented on the basis of records in WoS. Citation analysis was used to examine distribution patterns of the characteristics and citation impact of MA articles on PTSD. Canonical analysis was used to explore the relationship between the characteristics of MA articles and their citation impact. Thirty-four MA articles published between 1998 and 2008 were identified, covering multiple study topics on PTSD: 10 (29.4%) were about epidemiology, 13 (38.2%) about treatment or intervention, 6 (17.6%) about pathophysiology, neurophysiology or neuroendocrine factors, 3 (8.8%) about childhood and 2 (5.9%) about psychosocial adversity. The two articles cited most frequently, with 456 and 145 counts, were published in the Journal of Consulting and Clinical Psychology by Brewin (2000) and in Psychological Bulletin by Ozer (2003), respectively. The mean cited count was 7.48 ± 10.56 and the mean age (year 2009 minus article publication year) was (4.24 ± 2.91) years. The articles had been cited by approximately 67 disciplines and by authors from 42 countries or territories. Characteristics of the meta-analyses correlated highly with citation impact, reflected by a canonical correlation of 0.899 (P < 0.000 01). The age of MA articles predicted their citation impact. Citation analysis would

  17. Clustering analysis

    International Nuclear Information System (INIS)

    Romli

    1997-01-01

    Cluster analysis is the name of a group of multivariate techniques whose principal purpose is to group similar entities according to the characteristics they possess. Several algorithms can be used for this analysis, so this topic focuses on discussing them: similarity measures and hierarchical clustering, which includes the single linkage, complete linkage and average linkage methods. The non-hierarchical clustering method popularly known as the K-means method is also discussed. Finally, this paper describes the advantages and disadvantages of each method
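
    The hierarchical and K-means methods discussed above differ in mechanics, but the K-means idea in particular fits in a few lines. A minimal pure-Python sketch (illustrative data, not from the paper):

```python
import math
import random

def kmeans(points, k, iters=100, seed=0):
    """Plain K-means: assign each point to its nearest centroid,
    then move each centroid to the mean of its cluster."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            i = min(range(k), key=lambda j: math.dist(p, centroids[j]))
            clusters[i].append(p)
        new = [tuple(sum(c) / len(c) for c in zip(*cl)) if cl else centroids[i]
               for i, cl in enumerate(clusters)]
        if new == centroids:  # converged: assignments can no longer change
            break
        centroids = new
    return centroids, clusters

# Two well-separated groups of three points each; K-means recovers them.
pts = [(0.0, 0.1), (0.2, 0.0), (0.1, 0.2), (5.0, 5.1), (5.2, 5.0), (5.1, 5.2)]
cents, cls = kmeans(pts, 2)
print(sorted(len(c) for c in cls))  # → [3, 3]
```

A hierarchical method such as single linkage would instead start from one cluster per point and repeatedly merge the two closest clusters; the choice of linkage (single, complete or average) is exactly the choice of how "closest" is measured between clusters.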

  18. Reentry analysis

    International Nuclear Information System (INIS)

    Biehl, F.A.

    1984-05-01

    This paper presents the criteria, previous nuclear experience in space, analysis techniques, and possible breakup enhancement devices applicable to an acceptable SP-100 reentry from space. Reactor operation in nuclear-safe orbit will minimize the radiological risk; the remaining safeguards criteria need to be defined. A simple analytical point mass reentry technique and a more comprehensive analysis method that considers vehicle dynamics and orbit insertion malfunctions are presented. Vehicle trajectory, attitude, and possible breakup enhancement devices will be integrated in the simulation as required to ensure an adequate representation of the reentry process

  19. Cluster analysis

    CERN Document Server

    Everitt, Brian S; Leese, Morven; Stahl, Daniel

    2011-01-01

    Cluster analysis comprises a range of methods for classifying multivariate data into subgroups. By organizing multivariate data into such subgroups, clustering can help reveal the characteristics of any structure or patterns present. These techniques have proven useful in a wide range of areas such as medicine, psychology, market research and bioinformatics.This fifth edition of the highly successful Cluster Analysis includes coverage of the latest developments in the field and a new chapter dealing with finite mixture models for structured data.Real life examples are used throughout to demons

  20. NGNP Data Management and Analysis System Analysis and Web Delivery Capabilities

    Energy Technology Data Exchange (ETDEWEB)

    Cynthia D. Gentillon

    2011-09-01

    Projects for the Very High Temperature Reactor (VHTR) Technology Development Office provide data in support of Nuclear Regulatory Commission licensing of the very high temperature reactor. Fuel and materials to be used in the reactor are tested and characterized to quantify performance in high-temperature and high-fluence environments. The NGNP Data Management and Analysis System (NDMAS) at the Idaho National Laboratory has been established to ensure that VHTR data are (1) qualified for use, (2) stored in a readily accessible electronic form, and (3) analyzed to extract useful results. This document focuses on the third NDMAS objective. It describes capabilities for displaying the data in meaningful ways and for data analysis to identify useful relationships among the measured quantities. The capabilities are described from the perspective of NDMAS users, starting with those who just view experimental data and analytical results on the INL NDMAS web portal. Web display and delivery capabilities are described in detail. Also the current web pages that show Advanced Gas Reactor, Advanced Graphite Capsule, and High Temperature Materials test results are itemized. Capabilities available to NDMAS developers are more extensive, and are described using a second series of examples. Much of the data analysis efforts focus on understanding how thermocouple measurements relate to simulated temperatures and other experimental parameters. Statistical control charts and correlation monitoring provide an ongoing assessment of instrument accuracy. Data analysis capabilities are virtually unlimited for those who use the NDMAS web data download capabilities and the analysis software of their choice. Overall, the NDMAS provides convenient data analysis and web delivery capabilities for studying a very large and rapidly increasing database of well-documented, pedigreed data.
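
    The statistical control charts mentioned above reduce, in their simplest Shewhart form, to flagging readings that fall outside mean ± 3 standard deviations of an in-control baseline. A sketch with hypothetical thermocouple values (not NDMAS data):

```python
from statistics import mean, stdev

def control_limits(baseline):
    """Shewhart-style 3-sigma limits from an in-control baseline sample."""
    m, s = mean(baseline), stdev(baseline)
    return m - 3 * s, m + 3 * s

def out_of_control(readings, lo, hi):
    """Return the readings that fall outside the control limits."""
    return [x for x in readings if not lo <= x <= hi]

baseline = [700.1, 699.8, 700.3, 700.0, 699.9, 700.2, 700.1, 699.7]
lo, hi = control_limits(baseline)
print(out_of_control([700.0, 700.2, 712.5], lo, hi))  # → [712.5]
```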

  1. Blind Analysis in Particle Physics

    International Nuclear Information System (INIS)

    Roodman, A

    2003-01-01

    A review of the blind analysis technique, as used in particle physics measurements, is presented. The history of blind analyses in physics is briefly discussed. Next, the dangers of, and the advantages of, a blind analysis are described. Three distinct kinds of blind analysis in particle physics are presented in detail. Finally, the BABAR collaboration's experience with the blind analysis technique is discussed

  2. PIXE analysis of thin samples

    International Nuclear Information System (INIS)

    Kiss, Ildiko; Koltay, Ede; Szabo, Gyula; Laszlo, S.; Meszaros, A.

    1985-01-01

    Particle-induced X-ray emission (PIXE) multielemental analysis of thin film samples is reported. Calibration methods for the K and L X-ray lines are discussed. The application of PIXE to aerosol monitoring and multielement aerosol analysis is described. Results of PIXE analysis of samples from two locations in Hungary are compared with results for aerosol samples from Scandinavia and the USA. (D.Gy.)

  3. Proton exciting X ray analysis

    International Nuclear Information System (INIS)

    Ma Xinpei

    1986-04-01

    The capability of proton-induced X-ray analysis for determining different elements in organisms is discussed, with examples of trace element analysis in the human body and in animal materials such as blood serum, urine, and hair. The sensitivity, accuracy, and multielement analysis capability are discussed, and the method's strengths for trace element analysis in biomedicine are explained

  4. COMPARATIVE ANALYSIS BETWEEN THE FUNDAMENTAL AND TECHNICAL ANALYSIS OF STOCKS

    Directory of Open Access Journals (Sweden)

    Nada Petrusheva

    2016-04-01

    In the world of investing and trading, in order to have a definite advantage and constantly create profit, you need to have a strategic approach. Generally speaking, the two main schools of thought and strategies in financial markets are fundamental and technical analysis. Fundamental and technical analysis differ in several aspects, such as the way of functioning and execution, the time horizon used, the tools used and their objective. These differences lead to certain advantages and disadvantages of each of the analyses. Fundamental and technical analysis are also a subject of critical reviews by the academic and scientific community and many of these reviews concern the methods of their application, i.e. the possibility of combining the two analyses and using them complementarily to fully utilize their strengths and advantages.

  5. Canister storage building hazard analysis report

    International Nuclear Information System (INIS)

    POWERS, T.B.

    1999-01-01

    This report describes the methodology used in conducting the Canister Storage Building (CSB) hazard analysis to support the CSB final safety analysis report (FSAR) and documents the results. The hazard analysis was performed in accordance with DOE-STD-3009-94, ''Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports'', and meets the intent of HNF-PRO-704, ''Hazard and Accident Analysis Process''. This hazard analysis implements the requirements of DOE Order 5480.23, ''Nuclear Safety Analysis Reports''

  6. Drift Degradation Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Dwayne C. Kicker

    2001-09-28

    A statistical description of the probable block sizes formed by fractures around the emplacement drifts has been developed for each of the lithologic units of the repository host horizon. A range of drift orientations with the drift azimuth varied in 15° increments has been considered in the static analysis. For the quasi-static seismic analysis, and the time-dependent and thermal effects analysis, two drift orientations have been considered: a drift azimuth of 105° and the current emplacement drift azimuth of 75°. The change in drift profile resulting from progressive deterioration of the emplacement drifts has been assessed both with and without backfill. Drift profiles have been determined for four different time increments, including static (i.e., upon excavation), 200 years, 2,000 years, and 10,000 years. The effect of seismic events on rock fall has been analyzed. Block size distributions and drift profiles have been determined for three seismic levels, including a 1,000-year event, a 5,000-year event, and a 10,000-year event. Data developed in this modeling and analysis activity have been entered into the TDMS (DTN: MO0109RDDAAMRR.003). The following conclusions have resulted from this drift degradation analysis: (1) The available fracture data are suitable for supporting a detailed key block analysis of the repository host horizon rock mass. The available data from the north-south Main Drift and the east-west Cross Drift provide a sufficient representative fracture sample of the repository emplacement drift horizon. However, the Tptpln fracture data are only available from a relatively small section of the Cross Drift, resulting in a smaller fracture sample size compared to the other lithologic units. This results in a lower degree of confidence that the key block data based on the Tptpln data set is actually representative of the overall Tptpln key block population. 
(2) The seismic effect on the rock fall size distribution for all events

  7. Longitudinal analysis of meta-analysis literatures in the database of ISI Web of Science.

    Science.gov (United States)

    Zhu, Changtai; Jiang, Ting; Cao, Hao; Sun, Wenguang; Chen, Zhong; Liu, Jinming

    2015-01-01

    Meta-analysis is regarded as an important form of evidence for scientific decision-making. The ISI Web of Science database collects a great number of high-quality literatures, including meta-analysis literatures, and understanding the general characteristics of these literatures helps outline the perspective of meta-analysis. In the present study, we summarized and clarified some features of meta-analysis literatures in the ISI Web of Science database, retrieving records from SCI-E, SSCI, A&HCI, CPCI-S, CPCI-SSH, CCR-E, and IC. The annual growth rate, literature category, language, funding, index citation, agencies and countries/territories of the meta-analysis literatures were analyzed, respectively. A total of 95,719 records, accounting for 0.38% (99% CI: 0.38%-0.39%) of all literatures, were found in the database. From 1997 to 2012, the annual growth rate of meta-analysis literatures was 18.18%. The literatures spanned many categories, languages, funding sources, citations, publication agencies, and countries/territories. Interestingly, the citation frequencies of meta-analyses were significantly higher than those of other literature types such as multi-centre studies, randomized controlled trials, cohort studies, case-control studies, and case reports. Meta-analysis has become more and more prominent in recent years. In future, to promote the validity of meta-analysis, the CONSORT and PRISMA standards should be continuously popularized in the field of evidence-based medicine.
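
    An annual growth rate like the 18.18% above is a compound rate; given record counts in a start and end year it is recovered as follows (illustrative counts, not the study's data):

```python
def annual_growth_rate(n_start, n_end, years):
    """Compound annual growth rate implied by two counts `years` apart."""
    return (n_end / n_start) ** (1.0 / years) - 1.0

# Hypothetical: 1,000 meta-analysis records growing to 12,266 over 15 years
rate = annual_growth_rate(1000, 12266, 15)
print(f"{rate:.2%}")
```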

  8. IPAD: the Integrated Pathway Analysis Database for Systematic Enrichment Analysis.

    Science.gov (United States)

    Zhang, Fan; Drabier, Renee

    2012-01-01

    Next-Generation Sequencing (NGS) technologies and Genome-Wide Association Studies (GWAS) generate millions of reads and hundreds of datasets, and there is an urgent need for a better way to accurately interpret and distill such large amounts of data. Extensive pathway and network analysis allow for the discovery of highly significant pathways from a set of disease vs. healthy samples in the NGS and GWAS. Knowledge of activation of these processes will lead to elucidation of the complex biological pathways affected by drug treatment, to patient stratification studies of new and existing drug treatments, and to understanding the underlying anti-cancer drug effects. There are approximately 141 biological human pathway resources as of Jan 2012 according to the Pathguide database. However, most currently available resources do not contain disease, drug or organ specificity information such as disease-pathway, drug-pathway, and organ-pathway associations. Systematically integrating pathway, disease, drug and organ specificity together becomes increasingly crucial for understanding the interrelationships between signaling, metabolic and regulatory pathway, drug action, disease susceptibility, and organ specificity from high-throughput omics data (genomics, transcriptomics, proteomics and metabolomics). We designed the Integrated Pathway Analysis Database for Systematic Enrichment Analysis (IPAD, http://bioinfo.hsc.unt.edu/ipad), defining inter-association between pathway, disease, drug and organ specificity, based on six criteria: 1) comprehensive pathway coverage; 2) gene/protein to pathway/disease/drug/organ association; 3) inter-association between pathway, disease, drug, and organ; 4) multiple and quantitative measurement of enrichment and inter-association; 5) assessment of enrichment and inter-association analysis with the context of the existing biological knowledge and a "gold standard" constructed from reputable and reliable sources; and 6) cross-linking of
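
    Enrichment of a gene list in a pathway, the quantity measured in criterion 4 above, is conventionally quantified with a hypergeometric (Fisher's exact) test. A self-contained sketch with made-up counts (not IPAD's own implementation):

```python
from math import comb

def hypergeom_enrichment_p(N, K, n, k):
    """Upper-tail P(X >= k) for X ~ Hypergeometric(N, K, n):
    N genes in the background, K in the pathway, n in the query list,
    k in the overlap."""
    total = comb(N, n)
    return sum(comb(K, i) * comb(N - K, n - i)
               for i in range(k, min(K, n) + 1)) / total

# Background of 20,000 genes, pathway of 100, query list of 50, overlap of 5
p = hypergeom_enrichment_p(20000, 100, 50, 5)
print(p < 0.001)  # → True (far more overlap than the ~0.25 genes expected by chance)
```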

  9. Traffic analysis toolbox volume XI : weather and traffic analysis, modeling and simulation.

    Science.gov (United States)

    2010-12-01

    This document presents a weather module for the traffic analysis tools program. It provides traffic engineers, transportation modelers and decisions makers with a guide that can incorporate weather impacts into transportation system analysis and mode...

  10. Reliability on intra-laboratory and inter-laboratory data of hair mineral analysis comparing with blood analysis.

    Science.gov (United States)

    Namkoong, Sun; Hong, Seung Phil; Kim, Myung Hwa; Park, Byung Cheol

    2013-02-01

    Nowadays, although its clinical value remains controversial, institutions utilize hair mineral analysis. Arguments about the reliability of hair mineral analysis persist, and there have been evaluations of the commercial laboratories performing it. The objective of this study was to assess the reliability of intra-laboratory and inter-laboratory data at three commercial laboratories conducting hair mineral analysis, compared to serum mineral analysis. Two divided hair samples taken from near the scalp of one healthy volunteer were submitted to all laboratories at the same time for analysis. Each laboratory sent a report consisting of quantitative results and their interpretation of the health implications. Differences among intra-laboratory and inter-laboratory data were analyzed using SPSS version 12.0 (SPSS Inc., USA). All the laboratories used identical methods for quantitative analysis, and they generated consistent numerical results according to Friedman analysis of variance. However, the normal reference ranges of each laboratory varied, so each laboratory interpreted the patient's health differently. For intra-laboratory data, Wilcoxon analysis suggested the laboratories generated relatively coherent data, but laboratory B did not for one element, so its reliability was doubtful. In comparison with the blood test, laboratory C generated consistent results, but laboratories A and B did not. Hair mineral analysis has its limitations, considering the reliability of inter- and intra-laboratory analysis compared with blood analysis. As such, clinicians should be cautious when applying hair mineral analysis as an ancillary tool, and each laboratory included in this study requires continuous refinement toward standardized normal reference levels.
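
    The Friedman analysis of variance used to compare the laboratories ranks each element's concentration across the laboratories and tests whether the mean ranks differ. A pure-Python sketch of the statistic with hypothetical values (no tie correction):

```python
def friedman_statistic(blocks):
    """Friedman chi-square statistic. `blocks` is a list of rows,
    one row per subject (element), one value per treatment (laboratory).
    Ties are not corrected for."""
    n, k = len(blocks), len(blocks[0])
    rank_sums = [0.0] * k
    for row in blocks:
        order = sorted(range(k), key=lambda j: row[j])
        for rank, j in enumerate(order, start=1):
            rank_sums[j] += rank
    return 12.0 / (n * k * (k + 1)) * sum(r * r for r in rank_sums) \
        - 3.0 * n * (k + 1)

# Hypothetical concentrations reported by labs A, B, C for four elements;
# the labs are consistently ordered A < C < B, the most extreme case.
data = [(180, 182, 181), (12, 13, 12.5), (9, 11, 10), (95, 99, 97)]
print(friedman_statistic(data))  # → 8.0
```

The value 8.0 is the maximum for four blocks and three treatments, i.e. the laboratories disagree as systematically as possible in this toy example.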

  11. Query-Driven Visualization and Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Ruebel, Oliver; Bethel, E. Wes; Prabhat, Mr.; Wu, Kesheng

    2012-11-01

    This report focuses on an approach to high performance visualization and analysis, termed query-driven visualization and analysis (QDV). QDV aims to reduce the amount of data that needs to be processed by the visualization, analysis, and rendering pipelines. The goal of the data reduction process is to separate out data that is "scientifically interesting" and to focus visualization, analysis, and rendering on that interesting subset. The premise is that for any given visualization or analysis task, the data subset of interest is much smaller than the larger, complete data set. This strategy---extracting smaller data subsets of interest and focusing the visualization processing on these subsets---is complementary to the approach of increasing the capacity of the visualization, analysis, and rendering pipelines through parallelism. This report discusses the fundamental concepts in QDV, their relationship to different stages in the visualization and analysis pipelines, and presents QDV's application to problems in diverse areas, ranging from forensic cybersecurity to high energy physics.

  12. Bayesian data analysis for newcomers.

    Science.gov (United States)

    Kruschke, John K; Liddell, Torrin M

    2018-02-01

    This article explains the foundational concepts of Bayesian data analysis using virtually no mathematical notation. Bayesian ideas already match your intuitions from everyday reasoning and from traditional data analysis. Simple examples of Bayesian data analysis are presented that illustrate how the information delivered by a Bayesian analysis can be directly interpreted. Bayesian approaches to null-value assessment are discussed. The article clarifies misconceptions about Bayesian methods that newcomers might have acquired elsewhere. We discuss prior distributions and explain how they are not a liability but an important asset. We discuss the relation of Bayesian data analysis to Bayesian models of mind, and we briefly discuss what methodological problems Bayesian data analysis is not meant to solve. After you have read this article, you should have a clear sense of how Bayesian data analysis works and the sort of information it delivers, and why that information is so intuitive and useful for drawing conclusions from data.
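
    A concrete instance of the directly interpretable output the article describes is the conjugate Beta-binomial update, where the posterior can be read off immediately (numbers assumed here for illustration):

```python
def beta_binomial_posterior(a, b, k, n):
    """Prior Beta(a, b) plus k successes in n trials gives the
    posterior Beta(a + k, b + n - k); also return the posterior mean."""
    a_post, b_post = a + k, b + n - k
    return a_post, b_post, a_post / (a_post + b_post)

# Uniform prior Beta(1, 1); observe 7 successes in 10 trials.
a_post, b_post, post_mean = beta_binomial_posterior(1, 1, 7, 10)
print(a_post, b_post, round(post_mean, 3))  # → 8 4 0.667
```

The posterior Beta(8, 4) is itself the answer: its mean, spread and credible intervals are direct statements about the unknown proportion, which is the sense in which Bayesian output is "directly interpreted."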

  13. Computer system for environmental sample analysis and data storage and analysis

    International Nuclear Information System (INIS)

    Brauer, F.P.; Fager, J.E.

    1976-01-01

    A mini-computer-based environmental sample analysis and data storage system has been developed. The system is used for analytical data acquisition, computation, storage of analytical results, and tabulation of selected or derived results for data analysis, interpretation and reporting. This paper discusses the structure, performance and applications of the system

  14. Distortion Analysis Toolkit—A Software Tool for Easy Analysis of Nonlinear Audio Systems

    Directory of Open Access Journals (Sweden)

    Jyri Pakarinen

    2010-01-01

    Several audio effects devices deliberately add nonlinear distortion to the processed signal in order to create a desired sound. When creating virtual analog models of nonlinearly distorting devices, it would be very useful to carefully analyze the type of distortion, so that the model could be made as realistic as possible. While traditional system analysis tools such as the frequency response give detailed information on the operation of linear and time-invariant systems, they are less useful for analyzing nonlinear devices. Furthermore, although there do exist separate algorithms for nonlinear distortion analysis, there is currently no unified, easy-to-use tool for rapid analysis of distorting audio systems. This paper offers a remedy by introducing a new software tool for easy analysis of distorting effects. A comparison between a well-known guitar tube amplifier and two commercial software simulations is presented as a case study. This freely available software is written in Matlab language, but the analysis tool can also run as a standalone program, so the user does not need to have Matlab installed in order to perform the analysis.

  15. Principal Angle Enrichment Analysis (PAEA): Dimensionally Reduced Multivariate Gene Set Enrichment Analysis Tool.

    Science.gov (United States)

    Clark, Neil R; Szymkiewicz, Maciej; Wang, Zichen; Monteiro, Caroline D; Jones, Matthew R; Ma'ayan, Avi

    2015-11-01

    Gene set analysis of differential expression, which identifies collectively differentially expressed gene sets, has become an important tool for biology. The power of this approach lies in its reduction of the dimensionality of the statistical problem and its incorporation of biological interpretation by construction. Many approaches to gene set analysis have been proposed, but benchmarking their performance in the setting of real biological data is difficult due to the lack of a gold standard. In a previously published work we proposed a geometrical approach to differential expression which performed highly in benchmarking tests and compared well to the most popular methods of differential gene expression. As reported, this approach has a natural extension to gene set analysis which we call Principal Angle Enrichment Analysis (PAEA). PAEA employs dimensionality reduction and a multivariate approach for gene set enrichment analysis. However, the performance of this method has not been assessed nor its implementation as a web-based tool. Here we describe new benchmarking protocols for gene set analysis methods and find that PAEA performs highly. The PAEA method is implemented as a user-friendly web-based tool, which contains 70 gene set libraries and is freely available to the community.
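
    The principal angles at the heart of PAEA can be computed with the standard Björck-Golub procedure, an SVD of the product of orthonormal bases. A small sketch of that computation (not the PAEA implementation itself):

```python
import numpy as np

def principal_angles(A, B):
    """Principal angles (radians) between the column spaces of A and B,
    via the SVD of Qa.T @ Qb (Bjorck-Golub)."""
    Qa, _ = np.linalg.qr(A)
    Qb, _ = np.linalg.qr(B)
    s = np.linalg.svd(Qa.T @ Qb, compute_uv=False)
    return np.arccos(np.clip(s, -1.0, 1.0))  # clip guards float round-off

# The xy-plane versus the plane spanned by x and y+z (tilted 45 degrees):
A = np.array([[1.0, 0.0], [0.0, 1.0], [0.0, 0.0]])  # span{x, y}
B = np.array([[1.0, 0.0], [0.0, 1.0], [0.0, 1.0]])  # span{x, y+z}
print(np.degrees(principal_angles(A, B)).round(1))  # → [ 0. 45.]
```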

  16. Harmonic and geometric analysis

    CERN Document Server

    Citti, Giovanna; Pérez, Carlos; Sarti, Alessandro; Zhong, Xiao

    2015-01-01

    This book presents an expanded version of four series of lectures delivered by the authors at the CRM. Harmonic analysis, understood in a broad sense, has a very wide interplay with partial differential equations and in particular with the theory of quasiconformal mappings and its applications. Some areas in which real analysis has been extremely influential are PDE's and geometric analysis. Their foundations and subsequent developments made extensive use of the Calderón–Zygmund theory, especially the Lp inequalities for Calderón–Zygmund operators (Beurling transform and Riesz transform, among others) and the theory of Muckenhoupt weights.  The first chapter is an application of harmonic analysis and the Heisenberg group to understanding human vision, while the second and third chapters cover some of the main topics on linear and multilinear harmonic analysis. The last serves as a comprehensive introduction to a deep result from De Giorgi, Moser and Nash on the regularity of elliptic partial differen...

  17. Biosensors for Cell Analysis.

    Science.gov (United States)

    Zhou, Qing; Son, Kyungjin; Liu, Ying; Revzin, Alexander

    2015-01-01

    Biosensors first appeared several decades ago to address the need for monitoring physiological parameters such as oxygen or glucose in biological fluids such as blood. More recently, a new wave of biosensors has emerged in order to provide more nuanced and granular information about the composition and function of living cells. Such biosensors exist at the confluence of technology and medicine and often strive to connect cell phenotype or function to physiological or pathophysiological processes. Our review aims to describe some of the key technological aspects of biosensors being developed for cell analysis. The technological aspects covered in our review include biorecognition elements used for biosensor construction, methods for integrating cells with biosensors, approaches to single-cell analysis, and the use of nanostructured biosensors for cell analysis. Our hope is that the spectrum of possibilities for cell analysis described in this review may pique the interest of biomedical scientists and engineers and may spur new collaborations in the area of using biosensors for cell analysis.

  18. K Basin safety analysis

    International Nuclear Information System (INIS)

    Porten, D.R.; Crowe, R.D.

    1994-01-01

    The purpose of this accident safety analysis is to document in detail, analyses whose results were reported in summary form in the K Basins Safety Analysis Report WHC-SD-SNF-SAR-001. The safety analysis addressed the potential for release of radioactive and non-radioactive hazardous material located in the K Basins and their supporting facilities. The safety analysis covers the hazards associated with normal K Basin fuel storage and handling operations, fuel encapsulation, sludge encapsulation, and canister clean-up and disposal. After a review of the Criticality Safety Evaluation of the K Basin activities, the following postulated events were evaluated: Crane failure and casks dropped into loadout pit; Design basis earthquake; Hypothetical loss of basin water accident analysis; Combustion of uranium fuel following dryout; Crane failure and cask dropped onto floor of transfer area; Spent ion exchange shipment for burial; Hydrogen deflagration in ion exchange modules and filters; Release of Chlorine; Power availability and reliability; and Ashfall

  19. Dual-use tools and systematics-aware analysis workflows in the ATLAS Run-II analysis model

    CERN Document Server

    FARRELL, Steven; The ATLAS collaboration

    2015-01-01

    The ATLAS analysis model has been overhauled for the upcoming run of data collection in 2015 at 13 TeV. One key component of this upgrade was the Event Data Model (EDM), which now allows for greater flexibility in the choice of analysis software framework and provides powerful new features that can be exploited by analysis software tools. A second key component of the upgrade is the introduction of a dual-use tool technology, which provides abstract interfaces for analysis software tools to run in either the Athena framework or a ROOT-based framework. The tool interfaces, including a new interface for handling systematic uncertainties, have been standardized for the development of improved analysis workflows and consolidation of high-level analysis tools. This presentation will cover the details of the dual-use tool functionality, the systematics interface, and how these features fit into a centrally supported analysis environment.

  20. Dual-use tools and systematics-aware analysis workflows in the ATLAS Run-2 analysis model

    CERN Document Server

    FARRELL, Steven; The ATLAS collaboration; Calafiura, Paolo; Delsart, Pierre-Antoine; Elsing, Markus; Koeneke, Karsten; Krasznahorkay, Attila; Krumnack, Nils; Lancon, Eric; Lavrijsen, Wim; Laycock, Paul; Lei, Xiaowen; Strandberg, Sara Kristina; Verkerke, Wouter; Vivarelli, Iacopo; Woudstra, Martin

    2015-01-01

    The ATLAS analysis model has been overhauled for the upcoming run of data collection in 2015 at 13 TeV. One key component of this upgrade was the Event Data Model (EDM), which now allows for greater flexibility in the choice of analysis software framework and provides powerful new features that can be exploited by analysis software tools. A second key component of the upgrade is the introduction of a dual-use tool technology, which provides abstract interfaces for analysis software tools to run in either the Athena framework or a ROOT-based framework. The tool interfaces, including a new interface for handling systematic uncertainties, have been standardized for the development of improved analysis workflows and consolidation of high-level analysis tools. This paper will cover the details of the dual-use tool functionality, the systematics interface, and how these features fit into a centrally supported analysis environment.

  1. Systems analysis-independent analysis and verification

    Energy Technology Data Exchange (ETDEWEB)

    Badin, J.S.; DiPietro, J.P. [Energetics, Inc., Columbia, MD (United States)

    1995-09-01

    The DOE Hydrogen Program is supporting research, development, and demonstration activities to overcome the barriers to the integration of hydrogen into the Nation's energy infrastructure. Much work is required to gain acceptance of hydrogen energy system concepts and to develop them for implementation. A systems analysis database has been created that includes a formal documentation of technology characterization profiles and cost and performance information. Through a systematic and quantitative approach, system developers can understand and address important issues and thereby assure effective and timely commercial implementation. This project builds upon and expands the previously developed and tested pathway model and provides the basis for a consistent and objective analysis of all hydrogen energy concepts considered by the DOE Hydrogen Program Manager. This project can greatly accelerate the development of a system by minimizing the risk of costly design evolutions and by stimulating discussions, feedback, and coordination among key players, allowing them to assess the analysis, evaluate the trade-offs, and address any emerging problem areas. Specific analytical studies will result in the validation of the competitive feasibility of the proposed system and identify system development needs. Systems that are investigated include hydrogen bromine electrolysis, municipal solid waste gasification, electro-farming (biomass gasifier and PEM fuel cell), a wind/hydrogen hybrid system for remote sites, home electrolysis and alternate infrastructure options, renewables-based electrolysis to fuel a PEM fuel cell vehicle fleet, and geothermal energy used to produce hydrogen. These systems are compared to conventional and benchmark technologies. Interim results and findings are presented. 
Independent analyses emphasize quality, integrity, objectivity, a long-term perspective, corporate memory, and the merging of technical, economic, operational, and programmatic expertise.

  2. Probabilistic Design and Analysis Framework

    Science.gov (United States)

    Strack, William C.; Nagpal, Vinod K.

    2010-01-01

    PRODAF is a software package designed to aid analysts and designers in conducting probabilistic analysis of components and systems. PRODAF can integrate multiple analysis programs to ease the tedious process of conducting a complex analysis that requires the use of multiple software packages. The work uses a commercial finite element analysis (FEA) program with modules from NESSUS to conduct a probabilistic analysis of a hypothetical turbine blade, disk, and shaft model. PRODAF applies the response surface method at the component level and extrapolates the component-level responses to the system level. Hypothetical components of a gas turbine engine are first deterministically modeled using FEA. Variations in selected geometrical dimensions and loading conditions are analyzed to determine the effects on the stress state within each component. Geometric variations include the chord length and height for the blade, and the inner radius, outer radius, and thickness for the disk. Probabilistic analysis is carried out using software packages under development such as System Uncertainty Analysis (SUA) and PRODAF. PRODAF was used with a commercial deterministic FEA program in conjunction with modules from the probabilistic analysis program NESTEM to perturb loads and geometries and provide a reliability and sensitivity analysis. PRODAF simplified the handling of data among the various programs involved, and will work with many commercial and open-source deterministic programs, probabilistic programs, or modules.

  3. Ca analysis: an Excel based program for the analysis of intracellular calcium transients including multiple, simultaneous regression analysis.

    Science.gov (United States)

    Greensmith, David J

    2014-01-01

    Here I present an Excel-based program for the analysis of intracellular Ca transients recorded using fluorescent indicators. The program can perform all the necessary steps that convert recorded raw voltage changes into meaningful physiological information. The program performs two fundamental processes. (1) It can prepare the raw signal by several methods. (2) It can then be used to analyze the prepared data to provide information such as absolute intracellular Ca levels. Also, the rates of change of Ca can be measured using multiple, simultaneous regression analysis. I demonstrate that this program performs as well as commercially available software, but has numerous advantages, namely creating a simplified, self-contained analysis workflow. Copyright © 2013 The Author. Published by Elsevier Ireland Ltd. All rights reserved.
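
The "rates of change by regression" step can be sketched in pure Python: fit least-squares slopes over consecutive windows of a synthetic transient. The transient shape and window size below are assumptions for illustration, not the program's actual algorithm.

```python
import math

def linear_slope(xs, ys):
    """Ordinary least-squares slope of ys against xs."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

def piecewise_slopes(t, f, window):
    """Slopes of consecutive windows -> local rates of change of the signal,
    mimicking 'multiple, simultaneous regression' along a transient."""
    return [linear_slope(t[i:i + window], f[i:i + window])
            for i in range(0, len(t) - window + 1, window)]

# Synthetic transient: fast rise at t = 0.1 s, then exponential-like decay.
t = [i * 0.01 for i in range(100)]  # s
f = [1.0 + (2.0 * math.exp(-(x - 0.1) / 0.2) if x >= 0.1 else 0.0) for x in t]
rates = piecewise_slopes(t, f, window=20)
print(rates)  # positive slope during the rise, negative during decay
```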

  4. Assessment of non-linear analysis finite element program (NONSAP) for inelastic analysis

    International Nuclear Information System (INIS)

    Chang, T.Y.; Prachuktam, S.; Reich, M.

    1976-11-01

    An assessment of a nonlinear structural analysis finite element program called NONSAP is given with respect to its inelastic analysis capability for pressure vessels and components. The assessment was made from a review of its theoretical basis and benchmark problem runs. It was found that NONSAP has only limited capability for inelastic analysis. However, the program was written flexibly enough that it can be easily extended or modified to suit the user's need. Moreover, some of the numerical difficulties in using NONSAP are pointed out.

  5. Insight and Evidence Motivating the Simplification of Dual-Analysis Hybrid Systems into Single-Analysis Hybrid Systems

    Science.gov (United States)

    Todling, Ricardo; Diniz, F. L. R.; Takacs, L. L.; Suarez, M. J.

    2018-01-01

    Many hybrid data assimilation systems currently used for NWP employ some form of dual-analysis approach. Typically a hybrid variational analysis is responsible for creating initial conditions for high-resolution forecasts, and an ensemble analysis system is responsible for creating the sample perturbations used to form the flow-dependent part of the background error covariance required in the hybrid analysis component. In many of these, the two analysis components employ different methodologies, e.g., variational and ensemble Kalman filter. In such cases, it is not uncommon to have observations treated rather differently between the two analysis components; recentering of the ensemble analysis around the hybrid analysis is used to compensate for such differences. Furthermore, in many cases, the hybrid variational high-resolution system implements some type of four-dimensional approach, whereas the underlying ensemble system relies on a three-dimensional approach, which again introduces discrepancies in the overall system. Connected to these is the expectation that one can reliably estimate observation impact on forecasts issued from hybrid analyses by using an ensemble approach based on the underlying ensemble strategy of dual-analysis systems. Just the realization that the ensemble analysis makes substantially different use of observations as compared to its hybrid counterpart should serve as enough evidence of the implausibility of such an expectation. This presentation assembles numerous pieces of anecdotal evidence to illustrate the fact that hybrid dual-analysis systems must, at the very minimum, strive for consistent use of the observations in both analysis sub-components. Simpler than that, this work suggests that hybrid systems can reliably be constructed without the need to employ a dual-analysis approach. In practice, the idea of relying on a single analysis system is appealing from a cost-maintenance perspective. More generally, single-analysis systems avoid

  6. Distributed Analysis in CMS

    CERN Document Server

    Fanfani, Alessandra; Sanches, Jose Afonso; Andreeva, Julia; Bagliesi, Giusepppe; Bauerdick, Lothar; Belforte, Stefano; Bittencourt Sampaio, Patricia; Bloom, Ken; Blumenfeld, Barry; Bonacorsi, Daniele; Brew, Chris; Calloni, Marco; Cesini, Daniele; Cinquilli, Mattia; Codispoti, Giuseppe; D'Hondt, Jorgen; Dong, Liang; Dongiovanni, Danilo; Donvito, Giacinto; Dykstra, David; Edelmann, Erik; Egeland, Ricky; Elmer, Peter; Eulisse, Giulio; Evans, Dave; Fanzago, Federica; Farina, Fabio; Feichtinger, Derek; Fisk, Ian; Flix, Josep; Grandi, Claudio; Guo, Yuyi; Happonen, Kalle; Hernandez, Jose M; Huang, Chih-Hao; Kang, Kejing; Karavakis, Edward; Kasemann, Matthias; Kavka, Carlos; Khan, Akram; Kim, Bockjoo; Klem, Jukka; Koivumaki, Jesper; Kress, Thomas; Kreuzer, Peter; Kurca, Tibor; Kuznetsov, Valentin; Lacaprara, Stefano; Lassila-Perini, Kati; Letts, James; Linden, Tomas; Lueking, Lee; Maes, Joris; Magini, Nicolo; Maier, Gerhild; McBride, Patricia; Metson, Simon; Miccio, Vincenzo; Padhi, Sanjay; Pi, Haifeng; Riahi, Hassen; Riley, Daniel; Rossman, Paul; Saiz, Pablo; Sartirana, Andrea; Sciaba, Andrea; Sekhri, Vijay; Spiga, Daniele; Tuura, Lassi; Vaandering, Eric; Vanelderen, Lukas; Van Mulders, Petra; Vedaee, Aresh; Villella, Ilaria; Wicklund, Eric; Wildish, Tony; Wissing, Christoph; Wurthwein, Frank

    2009-01-01

    The CMS experiment expects to manage several PBytes of data each year during the LHC programme, distributing them over many computing sites around the world and enabling data access at those centers for analysis. CMS has identified the distributed sites as the primary location for physics analysis to support a wide community of thousands of potential users. This represents an unprecedented experimental challenge in terms of the scale of distributed computing resources and the number of users. An overview of the computing architecture, the software tools and the distributed infrastructure is reported. Summaries of the experience in establishing efficient and scalable operations to get prepared for CMS distributed analysis are presented, followed by the user experience in their current analysis activities.

  7. Energy Sector Market Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Arent, D.; Benioff, R.; Mosey, G.; Bird, L.; Brown, J.; Brown, E.; Vimmerstedt, L.; Aabakken, J.; Parks, K.; Lapsa, M.; Davis, S.; Olszewski, M.; Cox, D.; McElhaney, K.; Hadley, S.; Hostick, D.; Nicholls, A.; McDonald, S.; Holloman, B.

    2006-10-01

    This paper presents the results of energy market analysis sponsored by the Department of Energy's (DOE) Weatherization and Intergovernmental Program (WIP) within the Office of Energy Efficiency and Renewable Energy (EERE). The analysis was conducted by a team of DOE laboratory experts from the National Renewable Energy Laboratory (NREL), Oak Ridge National Laboratory (ORNL), and Pacific Northwest National Laboratory (PNNL), with additional input from Lawrence Berkeley National Laboratory (LBNL). The analysis was structured to identify those markets and niches where government can create the biggest impact by informing management decisions in the private and public sectors. The analysis identifies those markets and niches where opportunities exist for increasing energy efficiency and renewable energy use.

  8. Design-for-analysis or the unintended role of analysis in the design of piping systems

    International Nuclear Information System (INIS)

    Antaki, G.A.

    1991-01-01

    The paper discusses the evolution of piping design in the nuclear industry with its increasing reliance on dynamic analysis. While it is well recognized that the practice has evolved from ''design-by-rule'' to ''design-by-analysis,'' examples are provided of cases where the choice of analysis technique has determined the hardware configuration, which could be called ''design-for-analysis.'' The paper presents practical solutions to some of these cases and summarizes the important recent industry and regulatory developments which, if successful, will reverse the trend towards ''design-for-analysis.'' 14 refs

  9. Numerical Limit Analysis:

    DEFF Research Database (Denmark)

    Damkilde, Lars

    2007-01-01

    Limit State analysis has a long history and many prominent researchers have contributed. The theoretical foundation is based on the upper- and lower-bound theorems, which give a very comprehensive and elegant formulation of complicated physical problems. In the pre-computer age, Limit State analysis also enabled engineers to solve practical problems within reinforced concrete, steel structures and geotechnics.

  10. Reactor Safety Analysis

    International Nuclear Information System (INIS)

    Arien, B.

    2000-01-01

    The objective of SCK-CEN's programme on reactor safety is to develop expertise in probabilistic and deterministic reactor safety analysis. The research programme consists of two main activities, in particular the development of software for reliability analysis of large systems and participation in the international PHEBUS-FP programme for severe accidents. Main achievements in 1999 are reported

  11. Qualitative Content Analysis

    OpenAIRE

    Satu Elo; Maria Kääriäinen; Outi Kanste; Tarja Pölkki; Kati Utriainen; Helvi Kyngäs

    2014-01-01

    Qualitative content analysis is commonly used for analyzing qualitative data. However, few articles have examined the trustworthiness of its use in nursing science studies. The trustworthiness of qualitative content analysis is often presented by using terms such as credibility, dependability, confirmability, transferability, and authenticity. This article focuses on trustworthiness based on a review of previous studies...

  12. High resolution melting (HRM) analysis of DNA--its role and potential in food analysis.

    Science.gov (United States)

    Druml, Barbara; Cichna-Markl, Margit

    2014-09-01

    DNA based methods play an increasing role in food safety control and food adulteration detection. Recent papers show that high resolution melting (HRM) analysis is an interesting approach. It involves amplification of the target of interest in the presence of a saturation dye by the polymerase chain reaction (PCR) and subsequent melting of the amplicons by gradually increasing the temperature. Since the melting profile depends on the GC content, length, sequence and strand complementarity of the product, HRM analysis is highly suitable for the detection of single-base variants and small insertions or deletions. The review gives an introduction to HRM analysis, covers important aspects in the development of an HRM analysis method and describes how HRM data are analysed and interpreted. Then we discuss the potential of HRM analysis based methods in food analysis, i.e. for the identification of closely related species and cultivars and the identification of pathogenic microorganisms. Copyright © 2014 Elsevier Ltd. All rights reserved.
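
The core of HRM data interpretation, locating the melting temperature Tm as the peak of the negative derivative of the fluorescence-temperature curve, can be sketched as follows. The sigmoidal melt curve and the Tm of 82 °C below are synthetic illustrations, not data from the review.

```python
import math

def melt_peak(temps, fluor):
    """Estimate Tm as the temperature of the maximum of -dF/dT,
    using central differences on the raw melt curve."""
    neg_deriv = [-(fluor[i + 1] - fluor[i - 1]) / (temps[i + 1] - temps[i - 1])
                 for i in range(1, len(temps) - 1)]
    i_max = max(range(len(neg_deriv)), key=neg_deriv.__getitem__)
    return temps[i_max + 1], neg_deriv

temps = [70.0 + 0.1 * i for i in range(200)]  # deg C
# Synthetic sigmoidal melt curve centred at an assumed Tm of 82 C.
fluor = [1.0 / (1.0 + math.exp((t - 82.0) / 0.5)) for t in temps]
tm, _ = melt_peak(temps, fluor)
print(f"estimated Tm = {tm:.1f} C")
```

Real HRM software additionally normalizes the pre- and post-melt baselines before differentiation; that step is omitted here.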

  13. failure analysis of a uav flight control system using markov analysis

    African Journals Online (AJOL)

    Failure analysis of a flight control system proposed for the Air Force Institute of Technology (AFIT) Unmanned Aerial Vehicle (UAV) was studied using Markov Analysis (MA). It was perceived that understanding the number of failure states and the probability of being in those states is of paramount importance in order to ...
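
The computation Markov Analysis performs, propagating the probability of occupying each failure state over time, can be sketched with a hypothetical three-state model. The states and transition probabilities below are illustrative, not the AFIT UAV figures.

```python
def step(dist, P):
    """One discrete time step of a Markov chain: dist' = dist * P."""
    n = len(P)
    return [sum(dist[i] * P[i][j] for i in range(n)) for j in range(n)]

# Hypothetical 3-state model: 0 = healthy, 1 = degraded, 2 = failed (absorbing).
P = [
    [0.95, 0.04, 0.01],
    [0.00, 0.90, 0.10],
    [0.00, 0.00, 1.00],
]
dist = [1.0, 0.0, 0.0]
for _ in range(50):  # propagate over 50 mission time steps
    dist = step(dist, P)
print([round(p, 4) for p in dist])  # mass accumulates in the failed state
```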

  14. The Potential for Meta-Analysis to Support Decision Analysis in Ecology

    Science.gov (United States)

    Mengersen, Kerrie; MacNeil, M. Aaron; Caley, M. Julian

    2015-01-01

    Meta-analysis and decision analysis are underpinned by well-developed methods that are commonly applied to a variety of problems and disciplines. While these two fields have been closely linked in some disciplines such as medicine, comparatively little attention has been paid to the potential benefits of linking them in ecology, despite reasonable…

  15. Elements of stock market analysis

    Directory of Open Access Journals (Sweden)

    Suciu, T.

    2013-12-01

    Full Text Available The paper represents a starting point in the presentation of the two types of stock market analysis: the fundamental analysis and the technical analysis. The fundamental analysis consists of the assessment of the financial and economic status of the company, together with the context and macroeconomic environment in which it operates. The technical analysis deals with the demand and supply of securities and the evolution of their trend on the market, using a range of graphics and charts to illustrate market tendencies for the quick identification of the best moments to buy or sell.
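
A minimal example of the technical-analysis side is a moving-average comparison on a price series; the window lengths and toy prices below are arbitrary choices for illustration.

```python
def sma(prices, window):
    """Simple moving average; one value per complete window."""
    return [sum(prices[i - window + 1:i + 1]) / window
            for i in range(window - 1, len(prices))]

prices = [10.0, 10.5, 10.2, 10.8, 11.0, 11.4, 11.1, 11.8, 12.0, 12.5]
short, long_ = sma(prices, 3), sma(prices, 5)
# The short average starts 2 values earlier than the long one; drop those
# leading values to align the two series, then compare day by day.
signal = ["buy" if s > l else "hold" for s, l in zip(short[2:], long_)]
print(signal)
```

On this steadily rising toy series the short average stays above the long one, so every aligned day signals "buy".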

  16. An easy guide to factor analysis

    CERN Document Server

    Kline, Paul

    2014-01-01

    Factor analysis is a statistical technique widely used in psychology and the social sciences. With the advent of powerful computers, factor analysis and other multivariate methods are now available to many more people. An Easy Guide to Factor Analysis presents and explains factor analysis as clearly and simply as possible. The author, Paul Kline, carefully defines all statistical terms and demonstrates step-by-step how to work out a simple example of principal components analysis and rotation. He further explains other methods of factor analysis, including confirmatory and path analysis, a

  17. Project delay analysis of HRSG

    Science.gov (United States)

    Silvianita; Novega, A. S.; Rosyid, D. M.; Suntoyo

    2017-08-01

    Completion of an HRSG (Heat Recovery Steam Generator) fabrication project sometimes does not meet the target date written in the contract. A delay in the fabrication process can cause several disadvantages for the fabricator, including penalty payments, delay of the HRSG construction process, and even delay of HRSG trials. In this paper, the authors apply a semi-quantitative analysis to the HRSG pressure part fabrication delay, with plant configuration 1 GT (Gas Turbine) + 1 HRSG + 1 STG (Steam Turbine Generator), using the bow-tie analysis method. Bow-tie analysis is a combination of FTA (fault tree analysis) and ETA (event tree analysis) used to develop the risk matrix of the HRSG. The results of the FTA are used to identify threats for preventive measures, and the results of the ETA are used to characterize the impacts of fabrication delay.
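
The FTA/ETA combination behind a bow-tie can be sketched numerically: an OR gate aggregates basic causes into a top-event probability, and an event tree splits that probability across barrier success/failure sequences. All probabilities below are invented for illustration, not the paper's figures.

```python
def or_gate(probs):
    """Top-event probability: at least one independent basic event occurs."""
    p = 1.0
    for q in probs:
        p *= 1.0 - q
    return 1.0 - p

def event_tree(initiator_p, barrier_success_ps):
    """Probability of each barrier success (S) / failure (F) sequence."""
    outcomes = {}
    def walk(prefix, p, barriers):
        if not barriers:
            outcomes[prefix] = p
            return
        s = barriers[0]
        walk(prefix + "S", p * s, barriers[1:])
        walk(prefix + "F", p * (1.0 - s), barriers[1:])
    walk("", initiator_p, barrier_success_ps)
    return outcomes

# Three hypothetical basic causes of delay feed an OR gate.
top = or_gate([0.05, 0.02, 0.01])
# Two mitigation barriers with assumed success rates of 0.8 and 0.9.
outcomes = event_tree(top, [0.8, 0.9])
print(round(top, 4), {k: round(v, 4) for k, v in outcomes.items()})
```

The worst outcome ("FF", both barriers fail) would map to the highest-impact cell of the risk matrix.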

  18. Networks and Bargaining in Policy Analysis

    DEFF Research Database (Denmark)

    Bogason, Peter

    2006-01-01

    A discussion of the fight between proponents of rationalistic policy analysis and more political interaction models for policy analysis. The latter group is the foundation for the many network models of policy analysis of today.

  19. Analysis

    DEFF Research Database (Denmark)

    Mathiesen, Brian Vad; Liu, Wen; Zhang, Xiliang

    2014-01-01

    three major technological changes: energy savings on the demand side, efficiency improvements in energy production, and the replacement of fossil fuels by various sources of renewable energy. Consequently, the analysis of these systems must include strategies for integrating renewable sources...

  20. COMPUTER METHODS OF GENETIC ANALYSIS.

    Directory of Open Access Journals (Sweden)

    A. L. Osipov

    2017-02-01

    Full Text Available The article presents the basic statistical methods used in the genetic analysis of human traits: segregation analysis, linkage analysis and the analysis of allelic associations. Software supporting the implementation of these methods was developed.
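
One of the named methods, segregation analysis, often reduces to a goodness-of-fit test of observed offspring counts against a Mendelian ratio. A minimal sketch, with made-up counts:

```python
def chi_square(observed, expected):
    """Pearson chi-square goodness-of-fit statistic."""
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

# Do 80 affected vs 220 unaffected offspring fit a 1:3 Mendelian ratio?
obs = [80, 220]
total = sum(obs)
exp = [total * 0.25, total * 0.75]
chi2 = chi_square(obs, exp)
# 3.841 is the 5% critical value of chi-square with 1 degree of freedom.
verdict = "consistent" if chi2 < 3.841 else "rejected"
print(round(chi2, 3), verdict)
```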

  1. The Application of Structured Job Analysis Information Based on the Position Analysis Questionnaire (PAQ).

    Science.gov (United States)

    Position Analysis Questionnaire (PAQ). This job analysis instrument consists of 187 job elements organized into six divisions. In the analysis of a job...with the PAQ the relevance of the individual elements to the job is rated using any of several rating scales, such as importance or time.

  2. ROCKS & MINERALS DETERMINATION AND ANALYSIS

    Institute of Scientific and Technical Information of China (English)

    2015-01-01

    20150204 Abaydulla Alimjan(Department of Chemistry and Environmental Sciences,Kashgar Teachers College,Kashgar 844006,China);Cheng Chunying Non-Metallic Element Composition Analysis of Non-Ferrous Metal Ores from Oytagh Town,Xinjiang(Rock and Mineral Analysis,ISSN0254-5357,CN11-2131/TD,33(1),2014,p.44-50,5illus.,4tables,28refs.)Key words:nonferrous metals ore,nonmetals,chemical analysis,thermogravimetric analysis Anions in non-ferrous ore materials

  3. A study of environmental analysis of urban river sediments using activation analysis

    International Nuclear Information System (INIS)

    Tanaka, Y.; Kuno, A.; Matsuo, M.

    2003-01-01

    Sediments of the Kitajukkengawa River (Sumida-ku, Tokyo, Japan) were analyzed by activation analyses. Concentrations of 36 elements for each sample were determined by instrumental neutron activation analysis (INAA) and neutron-induced prompt gamma-ray analysis (PGA). Based on the correlation matrix between the elements in the vertical distribution, principal component analysis (PCA) was performed. The degree of chemical weathering of silicate minerals was highest in the middle layer of the Kitajukkengawa River sediment, and the adsorbed amount of trace metals such as Cd and Cr increased with chemical weathering. (author)

  4. Conducting Qualitative Data Analysis: Qualitative Data Analysis as a Metaphoric Process

    Science.gov (United States)

    Chenail, Ronald J.

    2012-01-01

    In the second of a series of "how-to" essays on conducting qualitative data analysis, Ron Chenail argues the process can best be understood as a metaphoric process. From this orientation he suggests researchers follow Kenneth Burke's notion of metaphor and see qualitative data analysis as the analyst systematically considering the "this-ness" of…

  5. Automatic analysis of the micronucleus test in primary human lymphocytes using image analysis.

    Science.gov (United States)

    Frieauff, W; Martus, H J; Suter, W; Elhajouji, A

    2013-01-01

    The in vitro micronucleus test (MNT) is a well-established test for early screening of new chemical entities in industrial toxicology. For assessing the clastogenic or aneugenic potential of a test compound, micronucleus induction in cells has been shown repeatedly to be a sensitive and specific parameter. Various automated systems to replace the tedious and time-consuming visual slide analysis procedure, as well as flow cytometric approaches, have been discussed. The ROBIAS (Robotic Image Analysis System) for both automatic cytotoxicity assessment and micronucleus detection in human lymphocytes was developed at Novartis, where the assay has been used to validate positive results obtained in the MNT in TK6 cells, which serves as the primary screening system for genotoxicity profiling in early drug development. In addition, the in vitro MNT has become an accepted alternative to support clinical studies and will be used for regulatory purposes as well. The comparison of visual with automatic analysis results showed a high degree of concordance for 25 independent experiments conducted for the profiling of 12 compounds. For concentration series of cyclophosphamide and carbendazim, a very good correlation between automatic and visual analysis by two examiners could be established, both for the relative division index used as the cytotoxicity parameter and for micronuclei scoring in mono- and binucleated cells. Generally, false-positive micronucleus decisions could be controlled by fast and simple relocation of the automatically detected patterns. The ability to analyse 24 slides within 65 h by automatic analysis over the weekend and the high reproducibility of the results make automatic image processing a powerful tool for micronucleus analysis in primary human lymphocytes. The automated slide analysis for the MNT in human lymphocytes complements the portfolio of image analysis applications on ROBIAS, which supports various assays at Novartis.

  6. Bridge Diagnosis by Using Nonlinear Independent Component Analysis and Displacement Analysis

    Science.gov (United States)

    Zheng, Juanqing; Yeh, Yichun; Ogai, Harutoshi

    A daily diagnosis system for bridge monitoring and maintenance is developed based on wireless sensors, signal processing, structure analysis, and displacement analysis. The vibration acceleration data of a bridge are first collected through the wireless sensor network. Nonlinear independent component analysis (ICA) and spectral analysis are used to extract the vibration frequencies of the bridge. After that, the vibration displacement is calculated through a band pass filter and Simpson's rule, and the vibration model is obtained to diagnose the bridge. Since linear ICA algorithms work efficiently only in linear mixing environments, a nonlinear ICA model, which is more complicated, is more practical for bridge diagnosis systems. In this paper, we first use the post-nonlinear method to transform the signal data, then perform linear separation by FastICA, and calculate the vibration displacement of the bridge. The processed data can be used to understand phenomena like corrosion and cracking, and to evaluate the health condition of the bridge. We apply this system to the Nakajima Bridge in Yahata, Kitakyushu, Japan.
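
The displacement step, double integration of acceleration using Simpson's rule after suppressing low-frequency drift, can be sketched on a synthetic single-tone vibration. The mean-subtraction below is a crude stand-in for the paper's band-pass filter, and the 2 Hz / 1 kHz figures are assumptions.

```python
import math

def cumulative_simpson(y, h):
    """Running integral at even sample indices, Simpson's rule per
    pair of sampling intervals."""
    out = [0.0]
    for k in range(0, len(y) - 2, 2):
        out.append(out[-1] + h * (y[k] + 4.0 * y[k + 1] + y[k + 2]) / 3.0)
    return out  # values at t = 0, 2h, 4h, ...

# Synthetic 2 Hz vibration acceleration sampled at 1 kHz over one period.
A, f, h = 1.0, 2.0, 0.001
w = 2.0 * math.pi * f
a = [A * math.sin(w * i * h) for i in range(501)]

v = cumulative_simpson(a, h)          # velocity, sampled at 2h spacing
v_mean = sum(v) / len(v)
v = [x - v_mean for x in v]           # crude drift removal (stand-in filter)
x = cumulative_simpson(v, 2.0 * h)    # displacement, sampled at 4h spacing

amp = max(abs(val) for val in x)
print(f"displacement amplitude ~ {amp:.5f} m (exact A/w^2 = {A / w**2:.5f})")
```

For a pure tone the recovered amplitude should approach the analytic value A/w², which is why a single-frequency test signal is the standard sanity check for this pipeline.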

  7. Effect of Dextromethorphan-Quinidine on Agitation in Patients With Alzheimer Disease Dementia: A Randomized Clinical Trial.

    Science.gov (United States)

    Cummings, Jeffrey L; Lyketsos, Constantine G; Peskind, Elaine R; Porsteinsson, Anton P; Mintzer, Jacobo E; Scharre, Douglas W; De La Gandara, Jose E; Agronin, Marc; Davis, Charles S; Nguyen, Uyen; Shin, Paul; Tariot, Pierre N; Siffert, João

    Agitation is common among patients with Alzheimer disease; safe, effective treatments are lacking. To assess the efficacy, safety, and tolerability of dextromethorphan hydrobromide-quinidine sulfate for Alzheimer disease-related agitation. Phase 2 randomized, multicenter, double-blind, placebo-controlled trial using a sequential parallel comparison design with 2 consecutive 5-week treatment stages conducted August 2012-August 2014. Patients with probable Alzheimer disease, clinically significant agitation (Clinical Global Impressions-Severity agitation score ≥4), and a Mini-Mental State Examination score of 8 to 28 participated at 42 US study sites. Stable dosages of antidepressants, antipsychotics, hypnotics, and antidementia medications were allowed. In stage 1, 220 patients were randomized in a 3:4 ratio to receive dextromethorphan-quinidine (n = 93) or placebo (n = 127). In stage 2, patients receiving dextromethorphan-quinidine continued; those receiving placebo were stratified by response and rerandomized in a 1:1 ratio to dextromethorphan-quinidine (n = 59) or placebo (n = 60). The primary end point was change from baseline on the Neuropsychiatric Inventory (NPI) Agitation/Aggression domain (scale range, 0 [absence of symptoms] to 12 [symptoms occur daily and with marked severity]). A total of 194 patients (88.2%) completed the study. With the sequential parallel comparison design, 152 patients received dextromethorphan-quinidine and 127 received placebo during the study. Analysis combining stages 1 (all patients) and 2 (rerandomized placebo nonresponders) showed significantly reduced NPI Agitation/Aggression scores for dextromethorphan-quinidine vs placebo (ordinary least squares z statistic, -3.95; P dextromethorphan-quinidine and from 7.0 to 5.3 with placebo. Between-group treatment differences were significant in stage 1 (least squares mean, -1.5; 95% CI, -2.3 to -0.7; Pdextromethorphan-quinidine and from 6.7 to 5.8 with placebo

  8. Loss of coolant accident analysis (thermal hydraulic analysis) - Japanese industries experience

    International Nuclear Information System (INIS)

    Okabe, K.

    1995-01-01

    An overview of LOCA analysis in Japanese industry is presented. The BASH-M code, developed for large-scale LOCA reflooding analysis, is given as an example of the verification and improvement of US computer programs. The code's application to operational safety analysis concerns the following main areas: the 1D drift-flux-model-based computer program CANAC; a CANAC-based advanced training simulator; and emergency operating procedures. The author also considers the application of the code to the following new PWR safety design concepts: use of steam generators for decay heat removal under LOCA conditions; and use of a horizontal steam generator for maintaining two-phase natural circulation with the reactor coolant system submerged. 9 figs

  9. Risk analysis

    International Nuclear Information System (INIS)

    Baron, J.H.; Nunez McLeod, J.; Rivera, S.S.

    1997-01-01

    This book contains a selection of research works performed in the CEDIAC Institute (Cuyo National University) in the area of risk analysis, with specific orientation to the subjects of uncertainty and sensitivity studies, software reliability, severe accident modeling, etc. This volume presents important material for all those researchers who want insight into the risk analysis field as a tool for solving several problems frequently found in engineering and the applied sciences, as well as for academic teachers who want to keep up to date with the new developments and improvements continuously arising in this field [es

  10. METHODOLOGICAL ELEMENTS OF SITUATIONAL ANALYSIS

    Directory of Open Access Journals (Sweden)

    Tetyana KOVALCHUK

    2016-07-01

    Full Text Available The article deals with the investigation of the theoretical and methodological principles of situational analysis. The necessity of situational analysis under modern conditions is demonstrated, and the notion of "situational analysis" is defined. We conclude that situational analysis is a continuous system study whose purpose is to identify signs of dangerous situations, to evaluate such signs comprehensively as influenced by a system of objective and subjective factors, to search for motivated, targeted actions to eliminate the adverse effects of the system's exposure to the situation now and in the future, and to develop the managerial actions needed to bring the system back to normal. A methodological approach to situational analysis is developed, its goal is substantiated, and the expediency of its diagnostic, evaluative and search functions in the process of situational analysis is shown. The basic methodological elements of situational analysis are grounded. The substantiation of these principal methodological elements will enable the analyst to develop adaptive methods able to take into account the peculiar features of a unique object, namely a situation that has emerged in a complex system, to diagnose such a situation and subject it to systematic and in-depth analysis, to identify risks and opportunities, and to make timely management decisions as required by a particular period.

  11. Virtual Data in CMS Analysis

    CERN Document Server

    Arbree, A; Bourilkov, D; Cavanaugh, R J; Graham, G; Rodríguez, J; Wilde, M; Zhao, Y

    2003-01-01

    The use of virtual data for enhancing the collaboration between large groups of scientists is explored in several ways: - by defining ``virtual'' parameter spaces which can be searched and shared in an organized way by a collaboration of scientists in the course of their analysis - by providing a mechanism to log the provenance of results and the ability to trace them back to the various stages in the analysis of real or simulated data - by creating ``check points'' in the course of an analysis to permit collaborators to explore their own analysis branches by refining selections, improving the signal to background ratio, varying the estimation of parameters, etc. - by facilitating the audit of an analysis and the reproduction of its results by a different group, or in a peer review context. We describe a prototype for the analysis of data from the CMS experiment based on the virtual data system Chimera and the object-oriented data analysis framework ROOT. The Chimera system is used to chain together several s...

  12. Real analysis with economic applications

    CERN Document Server

    Ok, Efe A

    2011-01-01

    There are many mathematics textbooks on real analysis, but they focus on topics not readily helpful for studying economic theory or they are inaccessible to most graduate students of economics. Real Analysis with Economic Applications aims to fill this gap by providing an ideal textbook and reference on real analysis tailored specifically to the concerns of such students. The emphasis throughout is on topics directly relevant to economic theory. In addition to addressing the usual topics of real analysis, this book discusses the elements of order theory, convex analysis, optimization, correspondences, linear and nonlinear functional analysis, fixed-point theory, dynamic programming, and calculus of variations. Efe Ok complements the mathematical development with applications that provide concise introductions to various topics from economic theory, including individual decision theory and games, welfare economics, information theory, general equilibrium and finance, and intertemporal economics. Moreover, a...

  13. Analysis from concepts to applications

    CERN Document Server

    Penot, Jean-Paul

    2016-01-01

    This textbook covers the main results and methods of real analysis in a single volume. Taking a progressive approach to equations and transformations, this book starts with the very foundations of real analysis (set theory, order, convergence, and measure theory) before presenting powerful results that can be applied to concrete problems. In addition to classical results of functional analysis, differential calculus and integration, Analysis discusses topics such as convex analysis, dissipative operators and semigroups which are often absent from classical treatises. Acknowledging that analysis has significantly contributed to the understanding and development of the present world, the book further elaborates on techniques which pervade modern civilization, including wavelets in information theory, the Radon transform in medical imaging and partial differential equations in various mechanical and physical phenomena. Advanced undergraduate and graduate students, engineers as well as practitioners wishing to fa...

  14. Pathway analysis of IMC

    DEFF Research Database (Denmark)

    Skrypnyuk, Nataliya; Nielson, Flemming; Pilegaard, Henrik

    2009-01-01

    We present ongoing work on the pathway analysis of a stochastic calculus. First we present the particular stochastic calculus that we have chosen for our modeling, the Interactive Markov Chains calculus (IMC for short). After that we specify a few restrictions that we have introduced into the syntax of IMC in order to make our analysis feasible. Finally we describe the analysis itself, together with several theoretical results that we have proved for it.

  15. INTERNAL ENVIRONMENT ANALYSIS TECHNIQUES

    Directory of Open Access Journals (Sweden)

    Caescu Stefan Claudiu

    2011-12-01

    Full Text Available Theme The situation analysis, as a separate component of strategic planning, involves collecting and analysing relevant types of information on the components of the marketing environment and their evolution on the one hand, and on the organization's resources and capabilities on the other. Objectives of the Research The main purpose of the study of the analysis techniques of the internal environment is to provide insight into those aspects that are of strategic importance to the organization. Literature Review The marketing environment consists of two distinct components: the internal environment, made up of specific variables within the organization, and the external environment, made up of variables external to the organization. Although analysing the external environment is essential for corporate success, it is not enough unless it is backed by a detailed analysis of the internal environment of the organization. The internal environment includes all elements that are endogenous to the organization, which are influenced to a great extent and totally controlled by it. The study of the internal environment must answer all resource-related questions and solve all resource management issues, and represents the first step in drawing up the marketing strategy. Research Methodology The present paper carried out a documentary study of the main techniques used for the analysis of the internal environment. Results The literature emphasizes that differences in performance from one organization to another depend primarily not on differences between fields of activity, but especially on differences between resources and capabilities and the ways these are capitalized on. The main methods of analysing the internal environment addressed in this paper are: the analysis of organizational resources, the performance analysis, the value chain analysis and the functional analysis.

  16. B plant mission analysis report

    International Nuclear Information System (INIS)

    Lund, D.P.

    1995-01-01

    This report further develops the mission for B Plant originally defined in WHC-EP-0722, ''System Engineering Functions and Requirements for the Hanford Cleanup Mission: First Issue.'' The B Plant mission analysis will be the basis for a functional analysis that breaks down the B Plant mission statement into the necessary activities to accomplish the mission. These activities are the product of the functional analysis and will then be used in subsequent steps of the systems engineering process, such as identifying requirements and allocating those requirements to B Plant functions. The information in this mission analysis and the functional and requirements analysis are a part of the B Plant technical baseline

  17. Non-commutative analysis

    CERN Document Server

    Jorgensen, Palle

    2017-01-01

    The book features new directions in analysis, with an emphasis on Hilbert space, mathematical physics, and stochastic processes. We interpret 'non-commutative analysis' broadly to include representations of non-Abelian groups and non-Abelian algebras, with emphasis on Lie groups and operator algebras (C* algebras and von Neumann algebras). A second theme is commutative and non-commutative harmonic analysis, spectral theory, operator theory and their applications. The list of topics includes shift-invariant spaces, group actions in differential geometry, and frame theory (over-complete bases) and their applications to engineering (signal processing and multiplexing), projective multi-resolutions, and free probability algebras. The book serves as an accessible introduction, offering a timeless presentation, attractive and accessible to students, both in mathematics and in neighboring fields.

  18. Reactor safety analysis

    International Nuclear Information System (INIS)

    Arien, B.

    1998-01-01

    Risk assessments of nuclear installations require accurate safety and reliability analyses to estimate the consequences of accidental events and their probability of occurrence. The objective of the work performed in this field at the Belgian Nuclear Research Centre SCK-CEN is to develop expertise in probabilistic and deterministic reactor safety analysis. The four main activities of the research project on reactor safety analysis are: (1) the development of software for the reliable analysis of large systems; (2) the development of an expert system for the aid to diagnosis; (3) the development and the application of a probabilistic reactor-dynamics method, and (4) to participate in the international PHEBUS-FP programme for severe accidents. Progress in research during 1997 is described

  19. Fuzzy data analysis

    CERN Document Server

    Bandemer, Hans

    1992-01-01

    Fuzzy data such as marks, scores, verbal evaluations, imprecise observations, experts' opinions and grey tone pictures, are quite common. In Fuzzy Data Analysis the authors collect their recent results providing the reader with ideas, approaches and methods for processing such data when looking for sub-structures in knowledge bases for an evaluation of functional relationship, e.g. in order to specify diagnostic or control systems. The modelling presented uses ideas from fuzzy set theory and the suggested methods solve problems usually tackled by data analysis if the data are real numbers. Fuzzy Data Analysis is self-contained and is addressed to mathematicians oriented towards applications and to practitioners in any field of application who have some background in mathematics and statistics.

  20. Physics analysis workstation

    International Nuclear Information System (INIS)

    Johnstad, H.

    1989-06-01

    The Physics Analysis Workstation (PAW) is a high-level program providing data presentation and statistical or mathematical analysis. PAW has been developed at CERN as an instrument to assist physicists in the analysis and presentation of their data. The program is interfaced to a high level graphics package, based on basic underlying graphics. 3-D graphics capabilities are being implemented. The major objects in PAW are 1 or 2 dimensional binned event data with fixed number of entries per event, vectors, functions, graphics pictures, and macros. Command input is handled by an integrated user interface package, which allows for a variety of choices for input, either with typed commands, or in a tree structure menu driven mode. 6 refs., 1 fig

  1. Malware analysis and reverse engineering

    OpenAIRE

    Šváb, Martin

    2014-01-01

    The focus of this thesis is reverse engineering in information technology, closely linked with malware analysis. It explains the fundamentals of the IA-32 processor architecture and the basics of the Microsoft Windows operating system. The main part of the thesis is dedicated to malware analysis, including a description of the creation of a tool to simplify the static part of the analysis. In the conclusion, the various approaches to malware analysis described in the previous parts of the thesis are practic...

  2. [Cluster analysis in biomedical researches].

    Science.gov (United States)

    Akopov, A S; Moskovtsev, A A; Dolenko, S A; Savina, G D

    2013-01-01

    Cluster analysis is one of the most popular methods for the analysis of multi-parameter data. Cluster analysis reveals the internal structure of the data, grouping separate observations by the degree of their similarity. The review provides definitions of the basic concepts of cluster analysis and discusses the most popular clustering algorithms: k-means, hierarchical algorithms, and Kohonen network algorithms. Examples of the use of these algorithms in biomedical research are given.
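
As an illustration of the first algorithm the review mentions, here is a minimal k-means sketch in plain Python; the two-cluster data set and parameter choices are invented for illustration:

```python
import random

def kmeans(points, k, iters=50, seed=0):
    """Minimal k-means: assign each point to its nearest centroid,
    then recompute each centroid as the mean of its cluster."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            j = min(range(k),
                    key=lambda c: sum((a - b) ** 2
                                      for a, b in zip(p, centroids[c])))
            clusters[j].append(p)
        centroids = [
            tuple(sum(xs) / len(xs) for xs in zip(*cl)) if cl else centroids[j]
            for j, cl in enumerate(clusters)
        ]
    return centroids, clusters

# Two well-separated groups of 2-D observations (synthetic data)
data = [(0.0, 0.1), (0.2, 0.0), (0.1, 0.2),
        (5.0, 5.1), (5.2, 4.9), (4.9, 5.0)]
centroids, clusters = kmeans(data, k=2)
```

On this toy data the centroids converge to the two group means regardless of the random initialization.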

  3. Semi-classical signal analysis

    KAUST Repository

    Laleg-Kirati, Taous-Meriem

    2012-09-30

    This study introduces a new signal analysis method, based on a semi-classical approach. The main idea in this method is to interpret a pulse-shaped signal as a potential of a Schrödinger operator and then to use the discrete spectrum of this operator for the analysis of the signal. We present some numerical examples and the first results obtained with this method on the analysis of arterial blood pressure waveforms. © 2012 Springer-Verlag London Limited.
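
The idea can be sketched numerically: discretize the operator H = -h² d²/dx² - y(x), with the signal y playing the role of a potential well, and keep the negative part of its discrete spectrum. This is a rough illustration of the approach, not the authors' implementation; the Gaussian pulse, the grid and the value of h are invented, and NumPy is assumed to be available:

```python
import numpy as np

def scsa_negative_eigenvalues(signal, h=1.0, dx=1.0):
    """Discretize H = -h^2 d^2/dx^2 - y(x) with second differences
    (Dirichlet ends) and return its negative eigenvalues, which the
    semi-classical method uses to represent the signal y."""
    n = len(signal)
    lap = (np.diag(np.full(n, 2.0))
           - np.diag(np.ones(n - 1), 1)
           - np.diag(np.ones(n - 1), -1)) / dx**2   # approximates -d^2/dx^2
    H = h**2 * lap - np.diag(signal)
    evals = np.linalg.eigvalsh(H)
    return evals[evals < 0.0]

# A pulse-shaped test signal (Gaussian bump, invented for illustration)
x = np.linspace(-5.0, 5.0, 201)
y = 4.0 * np.exp(-x**2)
neg = scsa_negative_eigenvalues(y, h=0.5, dx=x[1] - x[0])
```

Since the discrete Laplacian is positive semi-definite, all eigenvalues of H lie above -max(y); the handful of negative ones encode the pulse.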

  4. A Large Dimensional Analysis of Regularized Discriminant Analysis Classifiers

    KAUST Repository

    Elkhalil, Khalil

    2017-11-01

    This article carries out a large dimensional analysis of standard regularized discriminant analysis classifiers designed on the assumption that data arise from a Gaussian mixture model with different means and covariances. The analysis relies on fundamental results from random matrix theory (RMT) when both the number of features and the cardinality of the training data within each class grow large at the same pace. Under mild assumptions, we show that the asymptotic classification error approaches a deterministic quantity that depends only on the means and covariances associated with each class as well as the problem dimensions. Such a result permits a better understanding of the performance of regularized discriminant analysis in practical, large but finite dimensions, and can be used to determine and pre-estimate the optimal regularization parameter that minimizes the misclassification error probability. Despite being theoretically valid only for Gaussian data, our findings are shown to yield high accuracy in predicting the performance achieved with real data sets drawn from the popular USPS database, thereby making an interesting connection between theory and practice.

  5. Glucocorticosteroids for sepsis : systematic review with meta-analysis and trial sequential analysis

    NARCIS (Netherlands)

    Volbeda, M.; Wetterslev, J.; Gluud, C.; Zijlstra, J. G.; van der Horst, I. C. C.; Keus, F.

    Glucocorticosteroids (steroids) are widely used for sepsis patients. However, the potential benefits and harms of both high and low dose steroids remain unclear. A systematic review of randomised clinical trials with meta-analysis and trial sequential analysis (TSA) might shed light on this

  6. Functional Principal Component Analysis and Randomized Sparse Clustering Algorithm for Medical Image Analysis

    Science.gov (United States)

    Lin, Nan; Jiang, Junhai; Guo, Shicheng; Xiong, Momiao

    2015-01-01

    Due to advances in sensor technology, growing large medical image data sets make it possible to visualize anatomical changes in biological tissues. As a consequence, medical images have the potential to enhance the diagnosis of disease, the prediction of clinical outcomes and the characterization of disease progression. At the same time, the growing data dimensions pose great methodological and computational challenges for the representation and selection of features in image cluster analysis. To address these challenges, we first extend functional principal component analysis (FPCA) from one dimension to two dimensions to fully capture the spatial variation of the image signals. The image signals contain a large number of redundant features which provide no additional information for clustering analysis. The widely used methods for removing the irrelevant features are sparse clustering algorithms using a lasso-type penalty to select the features. However, the accuracy of clustering using a lasso-type penalty depends on the selection of the penalty parameters and the threshold value, which in practice are difficult to determine. Recently, randomized algorithms have received a great deal of attention in big data analysis. This paper presents a randomized algorithm for accurate feature selection in image clustering analysis. The proposed method is applied to both the liver and kidney cancer histology image data from the TCGA database. The results demonstrate that the randomized feature selection method coupled with functional principal component analysis substantially outperforms the current sparse clustering algorithms in image cluster analysis. PMID:26196383

  7. Probabilistic Structural Analysis Program

    Science.gov (United States)

    Pai, Shantaram S.; Chamis, Christos C.; Murthy, Pappu L. N.; Stefko, George L.; Riha, David S.; Thacker, Ben H.; Nagpal, Vinod K.; Mital, Subodh K.

    2010-01-01

    NASA/NESSUS 6.2c is a general-purpose, probabilistic analysis program that computes probability of failure and probabilistic sensitivity measures of engineered systems. Because NASA/NESSUS uses highly computationally efficient and accurate analysis techniques, probabilistic solutions can be obtained even for extremely large and complex models. Once the probabilistic response is quantified, the results can be used to support risk-informed decisions regarding reliability for safety-critical and one-of-a-kind systems, as well as for maintaining a level of quality while reducing manufacturing costs for larger-quantity products. NASA/NESSUS has been successfully applied to a diverse range of problems in aerospace, gas turbine engines, biomechanics, pipelines, defense, weaponry, and infrastructure. This program combines state-of-the-art probabilistic algorithms with general-purpose structural analysis and lifting methods to compute the probabilistic response and reliability of engineered structures. Uncertainties in load, material properties, geometry, boundary conditions, and initial conditions can be simulated. The structural analysis methods include non-linear finite-element methods, heat-transfer analysis, polymer/ceramic matrix composite analysis, monolithic (conventional metallic) materials life-prediction methodologies, boundary element methods, and user-written subroutines. Several probabilistic algorithms are available such as the advanced mean value method and the adaptive importance sampling method. NASA/NESSUS 6.2c is structured in a modular format with 15 elements.
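
A crude Monte Carlo version of the probability-of-failure computation such a program performs can be sketched as follows. This is a generic stress-strength illustration, not NASA/NESSUS code; the normal distributions and their parameters are invented:

```python
import random

def prob_failure_mc(n=100_000, seed=1):
    """Crude Monte Carlo estimate of failure probability for a
    stress-strength model: failure occurs when the random load L
    exceeds the random capacity C. Parameters are illustrative."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n):
        load = rng.gauss(50.0, 5.0)       # applied load (invented units)
        capacity = rng.gauss(70.0, 5.0)   # structural capacity
        if load > capacity:
            failures += 1
    return failures / n

pf = prob_failure_mc()
```

For these parameters the exact answer is P(Z > 20/√50) ≈ 0.0023; methods such as the advanced mean value or adaptive importance sampling mentioned in the record exist precisely to reach such small probabilities with far fewer samples.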

  8. Correspondence analysis of longitudinal data

    NARCIS (Netherlands)

    Van der Heijden, P.G.M.|info:eu-repo/dai/nl/073087998

    2005-01-01

    Correspondence analysis is an exploratory tool for the analysis of associations between categorical variables, the results of which may be displayed graphically. For longitudinal data with two time points, an analysis of the transition matrix (showing the relative frequencies for pairs of

  9. Critical Analysis of Multimodal Discourse

    DEFF Research Database (Denmark)

    van Leeuwen, Theo

    2013-01-01

    This is an encyclopaedia article which defines the fields of critical discourse analysis and multimodality studies, argues that within critical discourse analysis more attention should be paid to multimodality, and within multimodality to critical analysis, and ends reviewing a few examples of re...

  10. Wavelet analysis for nonstationary signals

    International Nuclear Information System (INIS)

    Penha, Rosani Maria Libardi da

    1999-01-01

    Mechanical vibration signals play an important role in identifying anomalies resulting from equipment malfunction. Traditionally, Fourier spectral analysis is used, where the signals are assumed to be stationary. However, occasional transient impulses and start-up processes are examples of nonstationary signals that can be found in mechanical vibrations. These signals can provide important information about the equipment condition, such as early fault detection. Fourier analysis cannot adequately be applied to nonstationary signals because the results provide data about the frequency composition averaged over the duration of the signal. In this work, two methods for nonstationary signal analysis are used: the Short Time Fourier Transform (STFT) and the wavelet transform. The STFT is a method of adapting Fourier spectral analysis for nonstationary application to the time-frequency domain; its main limitation is a single resolution throughout the entire time-frequency domain. The wavelet transform is a newer analysis technique suitable for nonstationary signals, which handles the STFT drawbacks by providing multi-resolution frequency analysis and time localization in a single time-scale graphic. The multiple frequency resolutions are obtained by scaling (dilation/compression) the wavelet function. A comparison of the conventional Fourier transform, the STFT and the wavelet transform is made by applying these techniques to simulated signals, the vibration signal of a rotor rig arrangement, and the vibration signal of a rotating machine. A Hanning window was used for the STFT analysis. Daubechies and harmonic wavelets were used for the continuous, discrete and multi-resolution wavelet analyses. The results show that the Fourier analysis was not able to detect changes in the signal frequencies or discontinuities. The STFT analysis detected the changes in the signal frequencies, but with time-frequency resolution problems. The continuous and discrete wavelet transforms demonstrated to be highly efficient tools to detect
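
The contrast between Fourier averaging and wavelet time localization can be seen with the simplest wavelet. A one-level Haar transform (used here in place of the Daubechies wavelets of the study because it fits in a few lines) pinpoints a step discontinuity through its detail coefficients; the step signal is invented:

```python
def haar_dwt(signal):
    """One level of the Haar discrete wavelet transform:
    approximation = pairwise averages, detail = pairwise half-differences.
    Large detail coefficients localize abrupt changes in the signal."""
    approx = [(signal[2 * i] + signal[2 * i + 1]) / 2
              for i in range(len(signal) // 2)]
    detail = [(signal[2 * i] - signal[2 * i + 1]) / 2
              for i in range(len(signal) // 2)]
    return approx, detail

# A nonstationary test signal: constant level with a step at sample 7
sig = [1.0] * 7 + [5.0] * 9
approx, detail = haar_dwt(sig)
```

All detail coefficients are zero except the one straddling the step, so the transform tells not only that a change occurred but where, which a global Fourier spectrum cannot.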

  11. Analysis and Application of Reliability

    International Nuclear Information System (INIS)

    Jeong, Hae Seong; Park, Dong Ho; Kim, Jae Ju

    1999-05-01

    This book deals with the analysis and application of reliability. It covers the definition, importance and historical background of reliability; the reliability function and failure rate; life distributions and reliability assumptions; the reliability of non-repairable systems; the reliability of repairable systems; reliability sampling tests; failure analysis, such as analysis by FMEA and FTA, with cases; accelerated life testing, including its basic concepts, acceleration and acceleration factors, and the analysis of accelerated life test data; and maintenance policies concerning replacement and inspection.
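
A few of the basic quantities such a text covers can be sketched in Python under the usual independence assumptions; the failure rate and reliability values are invented for illustration:

```python
import math

def reliability_exp(lam, t):
    """Reliability R(t) = exp(-lambda * t) for an exponential life
    distribution with constant failure rate lambda."""
    return math.exp(-lam * t)

def series(rs):
    """A series system works only if every unit works."""
    out = 1.0
    for r in rs:
        out *= r
    return out

def parallel(rs):
    """A parallel (redundant) system fails only if every unit fails."""
    out = 1.0
    for r in rs:
        out *= (1.0 - r)
    return 1.0 - out

# A unit with failure rate 0.001 per hour, evaluated at t = 1000 h
r = reliability_exp(0.001, 1000.0)   # exp(-1), roughly 0.368
```

Two such units in series multiply their reliabilities, while a parallel pair survives unless both fail, which is why redundancy raises system reliability.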

  12. Comparing methods of classifying life courses: Sequence analysis and latent class analysis

    NARCIS (Netherlands)

    Elzinga, C.H.; Liefbroer, Aart C.; Han, Sapphire

    2017-01-01

    We compare life course typology solutions generated by sequence analysis (SA) and latent class analysis (LCA). First, we construct an analytic protocol to arrive at typology solutions for both methodologies and present methods to compare the empirical quality of alternative typologies. We apply this

  13. Comparing methods of classifying life courses: sequence analysis and latent class analysis

    NARCIS (Netherlands)

    Han, Y.; Liefbroer, A.C.; Elzinga, C.

    2017-01-01

    We compare life course typology solutions generated by sequence analysis (SA) and latent class analysis (LCA). First, we construct an analytic protocol to arrive at typology solutions for both methodologies and present methods to compare the empirical quality of alternative typologies. We apply this

  14. Kosice meteorite analysis

    International Nuclear Information System (INIS)

    Sitek, J.; Degmova, J.; Dekan, J.

    2011-01-01

    The Kosice meteorite fell on 28 February 2010 near Kosice and represents a unique find, because the last observed meteorite fall in Slovakia was in 1895. It is expected that the orbit of this meteorite in space can be calculated; this is particularly important because, until now, only 13 meteorite finds worldwide have had their orbits calculated. Slovakia is a member of the international bolide network dealing with meteorite analysis in Central Europe. The analysis of the Kosice meteorite will also cover long-lived and short-lived nuclides; the results should contribute to the determination of radiation and formation ages. Structural analysis will make it possible to compare the meteorite with similar types. In this work, Moessbauer spectroscopy is used for phase analysis of the iron-containing components, with the aim of identifying the magnetic and non-magnetic fractions. From the analysis of the magnetic part we find that the first sextet, with a hyperfine magnetic field of 33.5 T, corresponds to a bcc Fe-Ni alloy (kamacite), and the second, with a field of 31.5 T, to FeS (troilite). Meteorites with this composition belong to the mineral group of chondrites. Comparing our parameters with measurements on similar meteorites, we conclude that the Kosice meteorite contains the same components; according to all Moessbauer parameters, this meteorite can likewise be classified in the mineral group of chondrites. (authors)

  15. Control system design and analysis using the INteractive Controls Analysis (INCA) program

    Science.gov (United States)

    Bauer, Frank H.; Downing, John P.

    1987-01-01

    The INteractive Controls Analysis (INCA) program was developed at the Goddard Space Flight Center to provide a user friendly efficient environment for the design and analysis of linear control systems. Since its inception, INCA has found extensive use in the design, development, and analysis of control systems for spacecraft, instruments, robotics, and pointing systems. Moreover, the results of the analytic tools imbedded in INCA have been flight proven with at least three currently orbiting spacecraft. This paper describes the INCA program and illustrates, using a flight proven example, how the package can perform complex design analyses with relative ease.

  16. A Beginner’s Guide to Factor Analysis: Focusing on Exploratory Factor Analysis

    Directory of Open Access Journals (Sweden)

    An Gie Yong

    2013-10-01

    Full Text Available The following paper discusses exploratory factor analysis and gives an overview of the statistical technique and how it is used in various research designs and applications. A basic outline of how the technique works and its criteria, including its main assumptions are discussed as well as when it should be used. Mathematical theories are explored to enlighten students on how exploratory factor analysis works, an example of how to run an exploratory factor analysis on SPSS is given, and finally a section on how to write up the results is provided. This will allow readers to develop a better understanding of when to employ factor analysis and how to interpret the tables and graphs in the output.
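
The extraction step of such an analysis can be sketched numerically. The following principal-component extraction of loadings from a correlation matrix is a simplified stand-in for the SPSS workflow the paper describes; the correlation matrix is invented and NumPy is assumed to be available:

```python
import numpy as np

def extract_loadings(corr, n_factors):
    """Principal-component extraction step of an exploratory factor
    analysis: loadings are eigenvectors of the correlation matrix
    scaled by the square roots of their eigenvalues."""
    evals, evecs = np.linalg.eigh(corr)          # ascending order
    order = np.argsort(evals)[::-1]              # sort descending
    evals, evecs = evals[order], evecs[:, order]
    loadings = evecs[:, :n_factors] * np.sqrt(evals[:n_factors])
    return evals, loadings

# Correlation matrix for 3 items sharing one underlying factor (invented)
R = np.array([[1.0, 0.6, 0.6],
              [0.6, 1.0, 0.6],
              [0.6, 0.6, 1.0]])
evals, loadings = extract_loadings(R, n_factors=1)
```

Here only one eigenvalue exceeds 1 (the Kaiser criterion mentioned in factor-analysis texts), so a single factor is retained, and each item loads equally on it.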

  17. Oncological image analysis.

    Science.gov (United States)

    Brady, Sir Michael; Highnam, Ralph; Irving, Benjamin; Schnabel, Julia A

    2016-10-01

    Cancer is one of the world's major healthcare challenges and, as such, an important application of medical image analysis. After a brief introduction to cancer, we summarise some of the major developments in oncological image analysis over the past 20 years, concentrating on those in the authors' laboratories, and then outline opportunities and challenges for the next decade. Copyright © 2016 Elsevier B.V. All rights reserved.

  18. Medical decision making tools: Bayesian analysis and ROC analysis

    International Nuclear Information System (INIS)

    Lee, Byung Do

    2006-01-01

    During the diagnostic process for various oral and maxillofacial lesions, we should consider the following: When should we order diagnostic tests? Which tests should be ordered? How should we interpret the results clinically? And how should we use this frequently imperfect information to make optimal medical decisions? To help clinicians make proper judgements, several decision-making tools are suggested. This article discusses the concept of diagnostic accuracy (sensitivity and specificity values) together with several decision-making tools, such as the decision matrix, ROC analysis and Bayesian analysis. The article also explains the introductory concepts of the ORAD program.
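
The Bayesian part of this toolkit reduces to a short calculation: combining a test's sensitivity and specificity with the pre-test (prior) probability to obtain the post-test probability of disease. A minimal sketch, with invented test characteristics:

```python
def post_test_probability(sensitivity, specificity, prevalence):
    """Bayes' theorem applied to a diagnostic test: probability of
    disease given a positive result (positive predictive value)."""
    true_pos = sensitivity * prevalence
    false_pos = (1.0 - specificity) * (1.0 - prevalence)
    return true_pos / (true_pos + false_pos)

# A test with 90% sensitivity and 95% specificity,
# applied where the pre-test probability is 10%
ppv = post_test_probability(0.90, 0.95, 0.10)   # about 0.67
```

Even a fairly accurate test yields only a two-thirds post-test probability at 10% prevalence, which is exactly the kind of "imperfect information" the article argues these tools make explicit.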

  19. Compatibility analysis of DUPIC fuel (Part II) - Reactor physics design and analysis

    Energy Technology Data Exchange (ETDEWEB)

    Jeong, Chang Joon; Choi, Hang Bok; Rhee, Bo Wook; Roh, Gyu Hong; Kim, Do Hun [Korea Atomic Energy Research Institute, Taejeon (Korea)

    2000-03-01

    The compatibility of DUPIC fuel in a CANDU reactor has been assessed. This study includes the fuel composition adjustment, a comparison of lattice properties, a performance analysis of reactivity devices, the determination of the regional over-power (ROP) trip setpoint, and an uncertainty estimation of core performance parameters. For the DUPIC fuel composition adjustment, three options have been proposed that can produce uniform neutronic characteristics of the DUPIC fuel. The lattice analysis has shown that the characteristics of the DUPIC fuel are compatible with those of natural uranium fuel. The reactivity devices of the CANDU-6 reactor maintain their functional requirements even for the DUPIC fuel system. The ROP analysis has shown that the trip setpoint is not sacrificed for the DUPIC fuel system, owing to a power shape that provides more thermal margin. The uncertainty analysis of the core performance parameters has shown that the uncertainty associated with the fuel composition variation is reduced appreciably, primarily because of the fuel composition adjustment and secondarily because of the on-power refuelling feature and spatial control function of the CANDU reactor. The reactor physics calculations have also shown that it is feasible to use spent PWR fuel directly in CANDU reactors without compromising the CANDU-6 core physics design requirements. 29 refs., 67 figs., 60 tabs. (Author)

  20. Analysis of some Egyptian cosmetic samples by fast neutron activation analysis

    International Nuclear Information System (INIS)

    Medhat, M.E.; Ali, M.A.; Hassan, M.F.

    2001-01-01

    A description of D-T neutron generator (NG) is presented. This generator can be used for fast neutron activation analysis applied to determine some selected elements, especially light elements, in different materials. The concentrations of the elements Na, Mg, Al, Si, K, Cl, Ca and Fe were determined in two domestic brands of face powder by using 14 MeV neutron activation analysis

  1. Building energy analysis tool

    Science.gov (United States)

    Brackney, Larry; Parker, Andrew; Long, Nicholas; Metzger, Ian; Dean, Jesse; Lisell, Lars

    2016-04-12

    A building energy analysis system includes a building component library configured to store a plurality of building components, a modeling tool configured to access the building component library and create a building model of a building under analysis using building spatial data and using selected building components of the plurality of building components stored in the building component library, a building analysis engine configured to operate the building model and generate a baseline energy model of the building under analysis and further configured to apply one or more energy conservation measures to the baseline energy model in order to generate one or more corresponding optimized energy models, and a recommendation tool configured to assess the one or more optimized energy models against the baseline energy model and generate recommendations for substitute building components or modifications.

  2. Liquid Effluents Program mission analysis

    International Nuclear Information System (INIS)

    Lowe, S.S.

    1994-01-01

    Systems engineering is being used to identify work to cleanup the Hanford Site. The systems engineering process transforms an identified mission need into a set of performance parameters and a preferred system configuration. Mission analysis is the first step in the process. Mission analysis supports early decision-making by clearly defining the program objectives, and evaluating the feasibility and risks associated with achieving those objectives. The results of the mission analysis provide a consistent basis for subsequent systems engineering work. A mission analysis was performed earlier for the overall Hanford Site. This work was continued by a ''capstone'' team which developed a top-level functional analysis. Continuing in a top-down manner, systems engineering is now being applied at the program and project levels. A mission analysis was conducted for the Liquid Effluents Program. The results are described herein. This report identifies the initial conditions and acceptable final conditions, defines the programmatic and physical interfaces and sources of constraints, estimates the resources to carry out the mission, and establishes measures of success. The mission analysis reflects current program planning for the Liquid Effluents Program as described in Liquid Effluents FY 1995 Multi-Year Program Plan

  3. Fault tree analysis: concepts and techniques

    International Nuclear Information System (INIS)

    Fussell, J.B.

    1976-01-01

    Concepts and techniques of fault tree analysis have been developed over the past decade, and predictions from this type of analysis are now important considerations in the design of many systems, such as aircraft, ships and their electronic systems, missiles, and nuclear reactor systems. Routine, hardware-oriented fault tree construction can be automated; however, considerable effort is needed in this area to bring the methodology to production status. When this status is achieved, the entire analysis of hardware systems will be automated except for the system definition step. Automated analysis is not undesirable; on the contrary, once verified on adequately complex systems, automated analysis could well become routine. It could also provide an excellent start for a more in-depth fault tree analysis that includes environmental effects, common mode failures, and human errors. Automated analysis is extremely fast and frees the analyst from routine hardware-oriented fault tree construction, as well as eliminating logic errors and errors of oversight in this part of the analysis. Automated analysis thus affords the analyst a powerful tool, allowing his prime efforts to be devoted to unearthing the more subtle aspects of the modes of failure of the system.
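
Under the usual assumption of independent basic events, the gate calculations at the heart of a quantitative fault tree evaluation are simple products. A minimal sketch (the event probabilities and the example tree are invented):

```python
def and_gate(probs):
    """Top event occurs only if ALL input events occur (independent)."""
    p = 1.0
    for q in probs:
        p *= q
    return p

def or_gate(probs):
    """Top event occurs if ANY input event occurs (independent)."""
    p = 1.0
    for q in probs:
        p *= (1.0 - q)
    return 1.0 - p

# Illustrative tree: the system fails if the power supply fails (0.01)
# OR both redundant valves fail (0.05 each)
p_top = or_gate([0.01, and_gate([0.05, 0.05])])
```

The redundant valve pair contributes only 0.0025 to the top event, showing how an AND gate suppresses the contribution of its inputs.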

  4. Dynamic analysis program for frame structure

    International Nuclear Information System (INIS)

    Ando, Kozo; Chiba, Toshio

    1975-01-01

    A general-purpose computer program named ISTRAN/FD (Isub(HI) STRucture ANalysis/Frame structure, Dynamic analysis) has been developed for the dynamic analysis of three-dimensional frame structures. This program has functions for free vibration analysis, seismic response analysis, graphic display by plotter and CRT, etc. This paper introduces ISTRAN/FD; examples of its application are shown for various problems: idealization of a cantilever, dynamic analysis of the main tower of a suspension bridge, three-dimensional vibration in a plate girder bridge, seismic response in a boiler steel structure, and the dynamic properties of an underground LNG tank. In this last example, solid elements, in addition to beam elements, are used for the analysis. (auth.)

  5. Bridging ImmunoGenomic Data Analysis Workflow Gaps (BIGDAWG): An integrated case-control analysis pipeline.

    Science.gov (United States)

    Pappas, Derek J; Marin, Wesley; Hollenbach, Jill A; Mack, Steven J

    2016-03-01

    Bridging ImmunoGenomic Data-Analysis Workflow Gaps (BIGDAWG) is an integrated data-analysis pipeline designed for the standardized analysis of highly-polymorphic genetic data, specifically for the HLA and KIR genetic systems. Most modern genetic analysis programs are designed for the analysis of single nucleotide polymorphisms, but the highly polymorphic nature of HLA and KIR data require specialized methods of data analysis. BIGDAWG performs case-control data analyses of highly polymorphic genotype data characteristic of the HLA and KIR loci. BIGDAWG performs tests for Hardy-Weinberg equilibrium, calculates allele frequencies and bins low-frequency alleles for k×2 and 2×2 chi-squared tests, and calculates odds ratios, confidence intervals and p-values for each allele. When multi-locus genotype data are available, BIGDAWG estimates user-specified haplotypes and performs the same binning and statistical calculations for each haplotype. For the HLA loci, BIGDAWG performs the same analyses at the individual amino-acid level. Finally, BIGDAWG generates figures and tables for each of these comparisons. BIGDAWG obviates the error-prone reformatting needed to traffic data between multiple programs, and streamlines and standardizes the data-analysis process for case-control studies of highly polymorphic data. BIGDAWG has been implemented as the bigdawg R package and as a free web application at bigdawg.immunogenomics.org. Copyright © 2015 American Society for Histocompatibility and Immunogenetics. Published by Elsevier Inc. All rights reserved.
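
The per-allele statistics such a pipeline reports can be illustrated with a minimal odds-ratio calculation for a 2x2 case-control table (Wald confidence interval). The allele counts are invented, and this is a sketch of the statistic itself, not BIGDAWG's R code:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI for a 2x2 case-control table:
    a = exposed cases, b = unexposed cases,
    c = exposed controls, d = unexposed controls."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log odds ratio
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# An allele carried by 30 of 100 cases and 15 of 100 controls (invented)
or_, lo, hi = odds_ratio_ci(30, 70, 15, 85)
```

Here the interval excludes 1, so the (invented) allele would be reported as associated; the binning of low-frequency alleles the record describes exists precisely to keep the cell counts in this table from becoming too small for such a test.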

  6. A Technical Analysis Information Fusion Approach for Stock Price Analysis and Modeling

    Science.gov (United States)

    Lahmiri, Salim

    In this paper, we address the problem of technical analysis information fusion for improving stock market index-level prediction. We present an approach for analyzing stock market price behavior based on different categories of technical analysis metrics and a multiple predictive system. Each category of technical analysis measures is used to characterize stock market price movements. The predictive system is based on an ensemble of neural networks (NN) coupled with particle swarm intelligence for parameter optimization, where each neural network is trained with a specific category of technical analysis measures. The experimental evaluation on three international stock market indices and three individual stocks shows that the presented ensemble-based technical indicator fusion system significantly improves forecasting accuracy in comparison with a single NN. It also outperforms the classical neural network trained with index-level lagged values and an NN trained with stationary wavelet transform details and approximation coefficients. As a result, technical information fusion in an NN ensemble architecture helps improve prediction accuracy.
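    The idea of fusing forecasts from different categories of technical measures can be sketched in miniature. The two indicator categories (trend and momentum), the naive per-category forecasts, and the simple average standing in for the NN ensemble are all hypothetical simplifications of the paper's method.

```python
# Two hypothetical categories of technical-analysis measures for a price
# series, and a naive fusion of the per-category one-step-ahead forecasts.
def sma(prices, window):
    """Trend category: simple moving average of the last `window` prices."""
    return sum(prices[-window:]) / window

def momentum(prices, lag):
    """Momentum category: last price minus the price `lag` steps earlier."""
    return prices[-1] - prices[-1 - lag]

def fused_forecast(prices):
    """Average two single-category forecasts, standing in for the NN ensemble."""
    trend_pred = sma(prices, 3)                       # forecast = recent average
    momentum_pred = prices[-1] + momentum(prices, 1)  # forecast = extrapolate last move
    return (trend_pred + momentum_pred) / 2

prices = [100.0, 101.0, 103.0, 106.0]
print(fused_forecast(prices))
```

    In the paper, each category instead feeds a separately trained neural network whose parameters are tuned by particle swarm optimization, and the ensemble combines the networks' outputs.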

  7. EMPIRICAL RESEARCH AND CONGREGATIONAL ANALYSIS ...

    African Journals Online (AJOL)

    empirical research has made to the process of congregational analysis. Part of this ... contextual congregational analysis – meeting social and divine desires”) at the IAPT ... methodology of a congregational analysis should be regarded as a process. ... essential to create space for a qualitative and quantitative approach.

  8. Canister storage building hazard analysis report

    International Nuclear Information System (INIS)

    Krahn, D.E.; Garvin, L.J.

    1997-01-01

    This report describes the methodology used in conducting the Canister Storage Building (CSB) hazard analysis to support the final CSB safety analysis report (SAR), and documents the results. The hazard analysis was performed in accordance with DOE-STD-3009-94, Preparation Guide for US Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports, and implements the requirements of DOE Order 5480.23, Nuclear Safety Analysis Reports.

  9. Sensory analysis of pet foods.

    Science.gov (United States)

    Koppel, Kadri

    2014-08-01

    Pet food palatability depends first and foremost on the pet and is related to the pet food sensory properties such as aroma, texture and flavor. Sensory analysis of pet foods may be conducted by humans via descriptive or hedonic analysis, pets via acceptance or preference tests, and through a number of instrumental analysis methods. Sensory analysis of pet foods provides additional information on reasons behind palatable and unpalatable foods as pets lack linguistic capabilities. Furthermore, sensory analysis may be combined with other types of information such as personality and environment factors to increase understanding of acceptable pet foods. Most pet food flavor research is proprietary and, thus, there are a limited number of publications available. Funding opportunities for pet food studies would increase research and publications and this would help raise public awareness of pet food related issues. This mini-review addresses current pet food sensory analysis literature and discusses future challenges and possibilities. © 2014 Society of Chemical Industry.

  10. Thermal Power Plant Performance Analysis

    CERN Document Server

    2012-01-01

    The analysis of the reliability and availability of power plants is frequently based on simple indexes that do not take into account the criticality of some failures used for availability analysis. This criticality should be evaluated based on concepts of reliability which consider the effect of a component failure on the performance of the entire plant. System reliability analysis tools provide a root-cause analysis leading to the improvement of the plant maintenance plan. Considering that power plant performance can be evaluated not only on the basis of thermodynamics-related indexes, such as heat rate, Thermal Power Plant Performance Analysis focuses on the presentation of reliability-based tools used to define the performance of complex systems and introduces the basic concepts of reliability, maintainability and risk analysis, aiming at their application as tools for power plant performance improvement, including: selection of critical equipment and components, defini...

  11. Portable and Automatic Moessbauer Analysis

    International Nuclear Information System (INIS)

    Souza, P. A. de; Garg, V. K.; Klingelhoefer, G.; Gellert, R.; Guetlich, P.

    2002-01-01

    A portable Moessbauer spectrometer, developed for extraterrestrial applications, opens up new industrial applications of MBS. But for industrial applications, an easily handled tool for fast data analysis is also required. The analysis of Moessbauer spectra and their parameters is a barrier to the popularity of this widely applicable spectroscopic technique in industry. Based on experience, the analysis of a Moessbauer spectrum is time-consuming and requires the dedication of a specialist. However, the analysis of Moessbauer spectra, from the fitting to the identification of the sample phases, can be made faster by using genetic algorithms, fuzzy logic and artificial neural networks. Industrial applications are very specific ones and the data analysis can be performed using these algorithms. In combination with automatic analysis, the Moessbauer spectrometer can be used as a probe instrument which covers the main industrial needs for on-line monitoring of products, processes and case studies. Some of these real industrial applications will be discussed.

  12. MGR External Events Hazards Analysis

    International Nuclear Information System (INIS)

    Booth, L.

    1999-01-01

    The purpose and objective of this analysis is to apply an external events Hazards Analysis (HA) to the License Application Design Selection Enhanced Design Alternative II (LADS EDA II design, Reference 8.32). The output of the HA is called a Hazards List (HL). This analysis supersedes the external hazards portion of Rev. 00 of the PHA (Reference 8.1). The PHA for internal events will also be updated to the LADS EDA II design, but under a separate analysis. Like the PHA methodology, the HA methodology provides a systematic method to identify potential hazards during the 100-year Monitored Geologic Repository (MGR) operating period, updated to reflect the EDA II design. The resulting events on the HL are candidates that may have potential radiological consequences as determined during Design Basis Events (DBEs) analyses. Therefore, the HL that results from this analysis will undergo further screening and analysis based on the criteria that apply during the performance of DBE analyses.

  13. Foundations of Risk Analysis

    CERN Document Server

    Aven, Terje

    2012-01-01

    Foundations of Risk Analysis presents the issues core to risk analysis - understanding what risk means, expressing risk, building risk models, addressing uncertainty, and applying probability models to real problems. The author provides the readers with the knowledge and basic thinking they require to successfully manage risk and uncertainty to support decision making. This updated edition reflects recent developments on risk and uncertainty concepts, representations and treatment. New material in Foundations of Risk Analysis includes:An up to date presentation of how to understand, define and

  14. Nuclear analysis techniques and environmental sciences

    International Nuclear Information System (INIS)

    1997-10-01

    31 theses are collected in this book. They introduce molecular activation analysis, micro-PIXE and micro-probe analysis, X-ray fluorescence analysis and accelerator mass spectrometry. The applications of these nuclear analysis techniques to environmental sciences are presented and reviewed.

  15. Alternatives to Center of Gravity Analysis

    Science.gov (United States)

    2013-04-04

    Figure 11. SWOT Analysis ... Comparison between COG analysis and SMTs ... Benefits of using SMT in COG Analysis ... Strengths, Weaknesses, Opportunities, and Threats (SWOT) analysis. SWOT identifies external and internal factors that impinge on the business (Figure 11). SWOT can be as

  16. 40 CFR 763.87 - Analysis.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment, Asbestos-Containing Materials in Schools, § 763.87 Analysis. (a) Local education agencies shall have bulk samples, collected under § 763.86 and submitted for analysis, analyzed for asbestos using laboratories...

  17. Portfolio Analysis for Vector Calculus

    Science.gov (United States)

    Kaplan, Samuel R.

    2015-01-01

    Classic stock portfolio analysis provides an applied context for Lagrange multipliers that undergraduate students appreciate. Although modern methods of portfolio analysis are beyond the scope of vector calculus, classic methods reinforce the utility of this material. This paper discusses how to introduce classic stock portfolio analysis in a…

  18. Combining network analysis with Cognitive Work Analysis: insights into social organisational and cooperation analysis.

    Science.gov (United States)

    Houghton, Robert J; Baber, Chris; Stanton, Neville A; Jenkins, Daniel P; Revell, Kirsten

    2015-01-01

    Cognitive Work Analysis (CWA) allows complex, sociotechnical systems to be explored in terms of their potential configurations. However, CWA does not explicitly analyse the manner in which person-to-person communication is performed in these configurations. Consequently, the combination of CWA with Social Network Analysis provides a means by which CWA output can be analysed to consider communication structure. The approach is illustrated through a case study of a military planning team. The case study shows how actor-to-actor and actor-to-function mapping can be analysed, in terms of centrality, to produce metrics of system structure under different operating conditions. In this paper, a technique for building social network diagrams from CWA is demonstrated. The approach allows analysts to appreciate the potential impact of organisational structure on a command system.
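    The centrality metric mentioned in the abstract can be illustrated with a small sketch. The actors and links below are hypothetical, not taken from the military planning case study, and degree centrality is only one of several centrality measures such an analysis might use.

```python
# Degree centrality for a small actor-to-actor communication network,
# in the spirit of combining CWA output with Social Network Analysis.
def degree_centrality(edges, actors):
    """Normalized degree centrality: links per actor divided by (n - 1)."""
    degree = {a: 0 for a in actors}
    for u, v in edges:
        degree[u] += 1
        degree[v] += 1
    n = len(actors)
    return {a: degree[a] / (n - 1) for a in actors}

# Hypothetical planning-team configuration.
actors = ["commander", "planner", "analyst", "liaison"]
edges = [("commander", "planner"), ("commander", "analyst"),
         ("commander", "liaison"), ("planner", "analyst")]
print(degree_centrality(edges, actors))
```

    Recomputing such metrics for each CWA-derived configuration is what allows structures to be compared across operating conditions.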

  19. Structural analysis of fuel handling systems

    Energy Technology Data Exchange (ETDEWEB)

    Lee, L S.S. [Atomic Energy of Canada Ltd., Mississauga, ON (Canada)

    1997-12-31

    The purpose of this paper has three aspects: (i) to review `why` and `what` types of structural analysis, testing and report are required for the fuel handling systems according to the codes, or needed for design of a product, (ii) to review the input requirements for analysis and the analysis procedures, and (iii) to improve the communication between the analysis and other elements of the product cycle. The required or needed types of analysis and report may be categorized into three major groups: (i) Certified Stress Reports for design by analysis, (ii) Design Reports not required for certification and registration, but still required by codes, and (iii) Design Calculations required by codes or needed for design. Input requirements for structural analysis include: design, code classification, loadings, and jurisdictional boundary. Examples of structural analysis for the fueling machine head and support structure are given. For improving communication between the structural analysis and the other elements of the product cycle, some areas in the specification of design requirements and load rating are discussed. (author). 6 refs., 1 tab., 4 figs.

  20. Structural analysis of fuel handling systems

    International Nuclear Information System (INIS)

    Lee, L.S.S.

    1996-01-01

    The purpose of this paper has three aspects: (i) to review 'why' and 'what' types of structural analysis, testing and report are required for the fuel handling systems according to the codes, or needed for design of a product, (ii) to review the input requirements for analysis and the analysis procedures, and (iii) to improve the communication between the analysis and other elements of the product cycle. The required or needed types of analysis and report may be categorized into three major groups: (i) Certified Stress Reports for design by analysis, (ii) Design Reports not required for certification and registration, but still required by codes, and (iii) Design Calculations required by codes or needed for design. Input requirements for structural analysis include: design, code classification, loadings, and jurisdictional boundary. Examples of structural analysis for the fueling machine head and support structure are given. For improving communication between the structural analysis and the other elements of the product cycle, some areas in the specification of design requirements and load rating are discussed. (author). 6 refs., 1 tab., 4 figs.

  1. Oscillation Baselining and Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    2017-03-27

    PNNL developed a new tool for oscillation analysis and baselining. This tool was developed under a new DOE Grid Modernization Laboratory Consortium (GMLC) project (GM0072, “Suite of open-source applications and models for advanced synchrophasor analysis”) and is based on the open platform for PMU analysis. The Oscillation Baselining and Analysis Tool (OBAT) performs oscillation analysis and identifies modes of oscillation (frequency, damping, energy, and shape). The tool also performs oscillation-event baselining (finding correlations between oscillation characteristics and system operating conditions).

  2. Trace analysis of semiconductor materials

    CERN Document Server

    Cali, J Paul; Gordon, L

    1964-01-01

    Trace Analysis of Semiconductor Materials is a guidebook concerned with procedures of ultra-trace analysis. This book discusses six distinct techniques of trace analysis. These techniques are the most common and can be applied to various problems compared to other methods. Each of the four chapters basically includes an introduction to the principles and general statements. The theoretical basis for the technique involved is then briefly discussed. Practical applications of the techniques and the different instrumentations are explained. Then, the applications to trace analysis as pertaining

  3. Activation analysis in national economy

    International Nuclear Information System (INIS)

    1974-01-01

    The collected papers are based on the materials of the III All-Union Activation Analysis Meeting. The selected papers deal with theoretical questions of activation analysis, its hardware, the latest developments in automatic analysis, and the employment of computer methods in the treatment of analytical information. New techniques for the determination of a large number of elements in samples of biological and geological origin are described. Some results of the use of activation analysis in various fields of science and technology are provided. The volume reflects the present status of activation analysis techniques in the USSR and might be of interest both for specialists and for those involved in obtaining and using information on the composition of substances. (auth.)

  4. Global optimization and sensitivity analysis

    International Nuclear Information System (INIS)

    Cacuci, D.G.

    1990-01-01

    A new direction for the analysis of nonlinear models of nuclear systems is suggested to overcome fundamental limitations of sensitivity analysis and optimization methods currently prevalent in nuclear engineering usage. This direction is toward a global analysis of the behavior of the respective system as its design parameters are allowed to vary over their respective design ranges. Presented is a methodology for global analysis that unifies and extends the current scopes of sensitivity analysis and optimization by identifying all the critical points (maxima, minima) and solution bifurcation points together with corresponding sensitivities at any design point of interest. The potential applicability of this methodology is illustrated with test problems involving multiple critical points and bifurcations and comprising both equality and inequality constraints

  5. Analysis of Some Egyptian Cosmetic Samples by Fast Neutron Activation Analysis

    CERN Document Server

    Medhat, M E; Fayez-Hassan, M

    2001-01-01

    A description of a D-T neutron generator (NG) is presented. This generator can be used for fast neutron activation analysis to determine selected elements, especially light elements, in different materials. In our work, the concentrations of the elements Na, Mg, Al, Si, K, Cl, Ca and Fe were determined in two domestic brands of face powder using 14 MeV neutron activation analysis.

  6. Activation analysis in Greece

    International Nuclear Information System (INIS)

    Grimanis, A.P.

    1985-01-01

    A review of research and development on NAA as well as examples of applications of this method are presented, taken from work carried out over the last 21 years at the Radioanalytical Laboratory of the Department of Chemistry in the Greek Nuclear Research Center ''Demokritos''. Improved and faster radiochemical NAA methods have been developed for the determination of Au, Ni, Cl, As, Cu, U, Cr, Eu, Hg and Mo in several materials, and for the simultaneous determination of Br and I; Mg, Sr and Ni; As and Cu; As, Sb and Hg; Mn, Sr and Ba; Cd and Zn; Se and As; Mo and Cr in biological materials. Instrumental NAA methods have also been developed for the determination of Ag, Cl and Na in lake waters, Al, Ca, Mg and V in wines, 7 trace elements in biological materials, 17 trace elements in sediments and 20 minor and trace elements in ceramics. A comprehensive computer program for routine activation analysis using Ge(Li) detectors has been worked out. A rather extended charged-particle activation analysis program has been carried out over the last 10 years, including particle induced X-ray emission (PIXE) analysis, particle induced prompt gamma-ray emission analysis (PIGE), other nuclear reactions and proton activation analysis. A special neutron activation method, the delayed fission neutron counting method, is used for the analysis of fissionable elements, such as U, Th and Pu, in samples of the whole nuclear fuel cycle, including geological, enriched and nuclear safeguards samples.

  7. Technical Note: Introduction of variance component analysis to setup error analysis in radiotherapy

    Energy Technology Data Exchange (ETDEWEB)

    Matsuo, Yukinori, E-mail: ymatsuo@kuhp.kyoto-u.ac.jp; Nakamura, Mitsuhiro; Mizowaki, Takashi; Hiraoka, Masahiro [Department of Radiation Oncology and Image-applied Therapy, Kyoto University, 54 Shogoin-Kawaharacho, Sakyo, Kyoto 606-8507 (Japan)

    2016-09-15

    Purpose: The purpose of this technical note is to introduce variance component analysis to the estimation of systematic and random components in setup error of radiotherapy. Methods: Balanced data according to the one-factor random effect model were assumed. Results: Analysis-of-variance (ANOVA)-based computation was applied to estimate the values and their confidence intervals (CIs) for systematic and random errors and the population mean of setup errors. The conventional method overestimates systematic error, especially in hypofractionated settings. The CI for systematic error becomes much wider than that for random error. The ANOVA-based estimation can be extended to a multifactor model considering multiple causes of setup errors (e.g., interpatient, interfraction, and intrafraction). Conclusions: Variance component analysis may lead to novel applications to setup error analysis in radiotherapy.
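    The ANOVA-based computation described in the abstract can be sketched for balanced data under the one-factor (patient) random-effect model. This is a minimal illustration, not the authors' code; the setup errors below are hypothetical values in millimetres, and the function name is an assumption.

```python
import math

# ANOVA-based estimate of systematic and random setup-error components
# for balanced data under a one-factor (patient) random-effect model.
def variance_components(data):
    """data: list of per-patient lists, each with the same number of fractions.
    Returns (systematic SD, random SD, population mean)."""
    n = len(data)          # number of patients
    m = len(data[0])       # fractions per patient
    patient_means = [sum(row) / m for row in data]
    grand_mean = sum(patient_means) / n
    # Mean squares between and within patients
    msb = m * sum((pm - grand_mean) ** 2 for pm in patient_means) / (n - 1)
    msw = sum((x - pm) ** 2
              for row, pm in zip(data, patient_means) for x in row) / (n * (m - 1))
    # Between-patient (systematic) variance is (MSB - MSW) / m; floor at zero.
    systematic_sd = math.sqrt(max(msb - msw, 0.0) / m)
    random_sd = math.sqrt(msw)
    return systematic_sd, random_sd, grand_mean

data = [[1.0, 3.0], [3.0, 5.0], [5.0, 7.0]]  # 3 patients x 2 fractions (mm)
print(variance_components(data))
```

    Note how the systematic estimate subtracts the within-patient mean square before dividing by the number of fractions; the conventional approach, which treats each patient mean as error-free, omits this correction and therefore overestimates the systematic component, especially with few fractions.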

  8. Technical Note: Introduction of variance component analysis to setup error analysis in radiotherapy

    International Nuclear Information System (INIS)

    Matsuo, Yukinori; Nakamura, Mitsuhiro; Mizowaki, Takashi; Hiraoka, Masahiro

    2016-01-01

    Purpose: The purpose of this technical note is to introduce variance component analysis to the estimation of systematic and random components in setup error of radiotherapy. Methods: Balanced data according to the one-factor random effect model were assumed. Results: Analysis-of-variance (ANOVA)-based computation was applied to estimate the values and their confidence intervals (CIs) for systematic and random errors and the population mean of setup errors. The conventional method overestimates systematic error, especially in hypofractionated settings. The CI for systematic error becomes much wider than that for random error. The ANOVA-based estimation can be extended to a multifactor model considering multiple causes of setup errors (e.g., interpatient, interfraction, and intrafraction). Conclusions: Variance component analysis may lead to novel applications to setup error analysis in radiotherapy.

  9. Windows forensic analysis toolkit advanced analysis techniques for Windows 7

    CERN Document Server

    Carvey, Harlan

    2012-01-01

    Now in its third edition, Harlan Carvey has updated "Windows Forensic Analysis Toolkit" to cover Windows 7 systems. The primary focus of this edition is on analyzing Windows 7 systems and on processes using free and open-source tools. The book covers live response, file analysis, malware detection, timeline, and much more. The author presents real-life experiences from the trenches, making the material realistic and showing the why behind the how. New to this edition, the companion and toolkit materials are now hosted online. This material consists of electronic printable checklists, cheat sheets, free custom tools, and walk-through demos. This edition complements "Windows Forensic Analysis Toolkit, 2nd Edition", (ISBN: 9781597494229), which focuses primarily on XP. It includes complete coverage and examples on Windows 7 systems. It contains Lessons from the Field, Case Studies, and War Stories. It features companion online material, including electronic printable checklists, cheat sheets, free custom tools, ...

  10. Is risk analysis scientific?

    Science.gov (United States)

    Hansson, Sven Ove; Aven, Terje

    2014-07-01

    This article discusses to what extent risk analysis is scientific in view of a set of commonly used definitions and criteria. We consider scientific knowledge to be characterized by its subject matter, its success in developing the best available knowledge in its fields of study, and the epistemic norms and values that guide scientific investigations. We proceed to assess the field of risk analysis according to these criteria. For this purpose, we use a model for risk analysis in which science is used as a base for decision making on risks, which covers the five elements evidence, knowledge base, broad risk evaluation, managerial review and judgment, and the decision; and that relates these elements to the domains experts and decisionmakers, and to the domains fact-based or value-based. We conclude that risk analysis is a scientific field of study, when understood as consisting primarily of (i) knowledge about risk-related phenomena, processes, events, etc., and (ii) concepts, theories, frameworks, approaches, principles, methods and models to understand, assess, characterize, communicate, and manage risk, in general and for specific applications (the instrumental part). © 2014 Society for Risk Analysis.

  11. Nuclear analysis software. Pt. 2: Gamma spectrum analysis, activity calculations and neutron activiation analysis (GANAAS)

    International Nuclear Information System (INIS)

    1991-01-01

    A spectrum acquired with a multichannel analyzer is usually stored on a suitable device (tape, cassette tape, diskette, hard disk). Every manufacturer of multichannel analyzers uses his own method for storage, and records the spectra in his own format. Furthermore, the formats used to save the spectra evolve in time: the same manufacturer can have several formats for different generations of multichannel analyzers. A similar situation prevails with the spectrum analysis programmes. They require spectra in a particular format as the input to the analysis. Again, these input formats are many and differ from each other considerably. The SPEDAC set of routines was developed to provide the spectroscopist with a tool for converting spectral formats. The routines can read spectra recorded in a number of formats used by different multichannel analyzers and convert them into the input formats of a number of analysis programmes. In fact, all the major formats are represented. Another serious problem for the user of a stand-alone multichannel analyzer is the transfer of spectra from the MCA to the computer. For several well-known types of MCAs, Version 5.0 of SPEDAC offers a set of routines for spectrum transfer, using the simplest methods of interfacing. All the transfer programmes described in this manual have been carefully tested with the corresponding stand-alone multichannel analyzers.

  12. Automated Communications Analysis System using Latent Semantic Analysis

    National Research Council Canada - National Science Library

    Foltz, Peter W

    2006-01-01

    ... and during the debriefing process to assess knowledge proficiency. In this report, the contractor describes prior research on communication analysis and how it can inform assessment of individual and team cognitive processing...

  13. The same analysis approach: Practical protection against the pitfalls of novel neuroimaging analysis methods.

    Science.gov (United States)

    Görgen, Kai; Hebart, Martin N; Allefeld, Carsten; Haynes, John-Dylan

    2017-12-27

    Standard neuroimaging data analysis based on traditional principles of experimental design, modelling, and statistical inference is increasingly complemented by novel analysis methods, driven, e.g., by machine learning. While these novel approaches provide new insights into neuroimaging data, they often have unexpected properties, generating a growing literature on possible pitfalls. We propose to meet this challenge by adopting a habit of systematic testing of experimental design, analysis procedures, and statistical inference. Specifically, we suggest applying the analysis method used for the experimental data also to aspects of the experimental design, simulated confounds, simulated null data, and control data. We stress the importance of keeping the analysis method the same in the main and test analyses, because only in this way can possible confounds and unexpected properties be reliably detected and avoided. We describe and discuss this Same Analysis Approach in detail, and demonstrate it in two worked examples using multivariate decoding. With these examples, we reveal two sources of error: a mismatch between counterbalancing (crossover designs) and cross-validation, which leads to systematic below-chance accuracies, and linear decoding of a nonlinear effect, a difference in variance. Copyright © 2017 Elsevier Inc. All rights reserved.

  14. Visual time series analysis

    DEFF Research Database (Denmark)

    Fischer, Paul; Hilbert, Astrid

    2012-01-01

    We introduce a platform which supplies an easy-to-handle, interactive, extendable, and fast analysis tool for time series analysis. In contrast to other software suites like Maple, Matlab, or R, which use a command-line-like interface and where the user has to memorize or look up the appropriate commands, our application is select-and-click-driven. It allows the user to derive many different sequences of deviations for a given time series and to visualize them in different ways in order to judge their expressive power and to reuse the procedure found. For many transformations or model fits, the user may choose between manual and automated parameter selection. The user can define new transformations and add them to the system. The application contains efficient implementations of advanced and recent techniques for time series analysis, including techniques related to extreme value analysis and filtering...

  15. Statistical Energy Analysis (SEA) and Energy Finite Element Analysis (EFEA) Predictions for a Floor-Equipped Composite Cylinder

    Science.gov (United States)

    Grosveld, Ferdinand W.; Schiller, Noah H.; Cabell, Randolph H.

    2011-01-01

    Comet Enflow is a commercially available, high-frequency vibroacoustic analysis software package founded on Energy Finite Element Analysis (EFEA) and Energy Boundary Element Analysis (EBEA). EFEA was validated on a floor-equipped composite cylinder by comparing EFEA vibroacoustic response predictions with Statistical Energy Analysis (SEA) and experimental results. SEA predictions were made using the commercial software program VA One 2009 from ESI Group. The frequency region of interest for this study covers the one-third octave bands with center frequencies from 100 Hz to 4000 Hz.

  16. Android malware and analysis

    CERN Document Server

    Dunham, Ken

    2014-01-01

    The rapid growth and development of Android-based devices has resulted in a wealth of sensitive information on mobile devices that offer minimal malware protection. This has created an immediate demand for security professionals who understand how best to approach the subject of Android malware threats and analysis. In Android Malware and Analysis, Ken Dunham, renowned global malware expert and author, teams up with international experts to document the best tools and tactics available for analyzing Android malware. The book covers both methods of malware analysis: dynamic and static. This tact

  17. Methods of Multivariate Analysis

    CERN Document Server

    Rencher, Alvin C

    2012-01-01

    Praise for the Second Edition "This book is a systematic, well-written, well-organized text on multivariate analysis packed with intuition and insight . . . There is much practical wisdom in this book that is hard to find elsewhere."-IIE Transactions Filled with new and timely content, Methods of Multivariate Analysis, Third Edition provides examples and exercises based on more than sixty real data sets from a wide variety of scientific fields. It takes a "methods" approach to the subject, placing an emphasis on how students and practitioners can employ multivariate analysis in real-life sit

  18. An analyst's self-analysis.

    Science.gov (United States)

    Calder, K T

    1980-01-01

    I have told you why I selected the topic of self-analysis, and I have described my method for it: recording primary data such as dreams, daydreams, memories, and symptoms; recording associations to these primary data; and then attempting to analyze this written material. I have described a dream, a memory, and a daydream that is also a symptom, each of which I found useful in understanding myself. Finally, I reached some conclusions regarding the uses of self-analysis, including self-analysis as a research tool.

  19. Distributed analysis at LHCb

    International Nuclear Information System (INIS)

    Williams, Mike; Egede, Ulrik; Paterson, Stuart

    2011-01-01

    The distributed analysis experience to date at LHCb has been positive: job success rates are high and wait times for high-priority jobs are low. LHCb users access the grid using the GANGA job-management package, while the LHCb virtual organization manages its resources using the DIRAC package. This clear division of labor has benefitted LHCb and its users greatly; it is a major reason why distributed analysis at LHCb has been so successful. The newly formed LHCb distributed analysis support team has also proved to be a success.

  20. Analysis of metal samples

    International Nuclear Information System (INIS)

    Ramirez T, J.J.; Lopez M, J.; Sandoval J, A.R.; Villasenor S, P.; Aspiazu F, J.A.

    2001-01-01

    Elemental, metallographic and phase analyses were carried out in order to determine the oxidation states of Fe contained in three metallic pieces of unknown material: a block, a plate and a cylinder. Results are presented from the elemental analysis, which was carried out at the Tandem Accelerator of ININ by proton-induced X-ray emission (PIXE). The phase analysis was carried out by X-ray diffraction, which allowed the type of alloy or alloys formed to be identified. The combined application of nuclear techniques with metallographic techniques allows the integral characterization of industrial metals. (Author)