Rectal absorption of homatropine [14C] methylbromide in the rat
International Nuclear Information System (INIS)
Cramer, M.B.; Cates, L.A.; Clarke, D.E.
1978-01-01
Homatropine [14C]methylbromide (HMB-14C) was administered to rats by intramuscular injection, oral gavage and rectal suppository. Plasma concentrations of 14C were measured over the subsequent 12 h. Peak plasma concentrations were higher and achieved more rapidly after rectal administration than by other routes, whether HMB-14C was administered in a water-soluble suppository base or in aqueous solution. Twelve hours after the suppositories were inserted and retained, 28% of the 14C had been excreted in the urine, while 56% remained in the large intestine. Unlabelled HMB, given in rectal suppositories to anaesthetized rats, caused prompt blockade of the effects of vagal stimulation on pulse rate and of intravenous acetylcholine on blood pressure. These results confirm the rapid rectal absorption of the drug. (author)
Regan, E C; Ramsey, A D
1996-03-01
Regan and Price (1994) investigated the frequency of occurrence and severity of side-effects of using an immersive virtual reality system in 150 subjects: 61% of the subjects reported symptoms of malaise at some point during a 20-min immersion and 10-min post-immersion period. This paper describes a double-blind placebo-controlled study that investigated whether 300 micrograms of hyoscine (scopolamine) hydrobromide administered to subjects prior to immersion in virtual reality was effective in reducing side-effects experienced during immersion. It was hypothesized that the hyoscine hydrobromide would cause a significant reduction in reported symptoms. We administered 300 micrograms of hyoscine hydrobromide to 19 subjects, and 20 subjects were administered a placebo compound, 40 min prior to a 20-min immersion in VR. Data on malaise were collected using a simulator sickness questionnaire and a malaise scale. A 2 × 2 chi-square analysis comparing the numbers of subjects reporting no symptoms on the malaise scale with those reporting some symptoms in the placebo and hyoscine conditions showed the difference between the two groups to be statistically significant at the 0.01 level (chi-square = 7.392 with 1 df, p = 0.007). This difference was clearly in the direction of fewer symptoms being reported in the hyoscine condition. The results of the study showed that hyoscine was effective in reducing symptoms that are commonly observed during immersion in virtual reality.
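The 2 × 2 chi-square analysis described above can be sketched in a few lines of Python. The counts used here are hypothetical: the abstract reports only the group sizes (19 hyoscine, 20 placebo) and the resulting statistic (chi-square = 7.392), so these numbers illustrate the calculation, not the study's raw data.

```python
def chi_square_2x2(a, b, c, d):
    """Pearson chi-square (no continuity correction) for the 2x2 table [[a, b], [c, d]]."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# Rows: hyoscine, placebo; columns: no symptoms, some symptoms (hypothetical counts).
chi2 = chi_square_2x2(12, 7, 4, 16)
print(round(chi2, 3))  # exceeds the df=1 critical value of 6.635, i.e. p < 0.01
```

With a table this size, the statistic is compared directly against the chi-square critical value for one degree of freedom.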
International Nuclear Information System (INIS)
Mahfoud, J.
2007-01-01
A simple and accurate HPLC method was developed for the analysis of carbinoxamine maleate, dextromethorphan hydrobromide and pseudoephedrine hydrochloride content in pure form and in pharmaceutical preparations. Analysis was conducted on a silica column (6 μm) with a mobile phase consisting of ethanol - ammonium acetate (0.05 M) in the ratio 85:15, at a detection wavelength of 276 nm and a flow rate of 1 ml/min. Results were linear (correlation coefficient R > 0.9996) over the range of concentrations studied for the active materials. The relative standard deviations (n=6) of the intra- and interday assays were 0.931% and 1.527% for carbinoxamine maleate, 0.717% and 1.058% for dextromethorphan hydrobromide, and 0.309% and 0.891% for pseudoephedrine hydrochloride, respectively. The method proved to be easy, precise and economical, and is useful for quality control of industrial pharmaceutical samples. (author)
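Linearity figures such as the R > 0.9996 quoted above come from an ordinary least-squares fit of detector response against concentration. A minimal sketch, with illustrative data points rather than the paper's raw values:

```python
import math

def linear_fit(x, y):
    """Ordinary least squares: returns (slope, intercept, Pearson r)."""
    n = len(x)
    sx, sy = sum(x), sum(y)
    sxx = sum(v * v for v in x)
    syy = sum(v * v for v in y)
    sxy = sum(a * b for a, b in zip(x, y))
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    intercept = (sy - slope * sx) / n
    r = (n * sxy - sx * sy) / math.sqrt((n * sxx - sx * sx) * (n * syy - sy * sy))
    return slope, intercept, r

conc = [5, 10, 20, 40, 60, 80]        # ug/mL, illustrative
area = [51, 99, 202, 399, 601, 802]   # peak areas, illustrative
slope, intercept, r = linear_fit(conc, area)
print(round(r, 5))                    # acceptance criterion: r close to 1
```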
Directory of Open Access Journals (Sweden)
Amber L. Thompson
2009-11-01
Full Text Available X-ray crystallographic analysis of the title hydrobromide salt, C10H20N+·Br−, of (1R,2S,3R,5R,8aR)-3-hydroxymethyl-5-methyloctahydroindolizine-1,2-diol defines the absolute and relative stereochemistry at the five chiral centres in steviamine, a new class of polyhydroxylated indolizidine alkaloid isolated from Stevia rebaudiana (Asteraceae) leaves. In the crystal structure, molecules are linked by intermolecular O—H...Br and N—H...Br hydrogen bonds, forming double chains around the twofold screw axes along the b-axis direction. Intramolecular O—H...O interactions occur.
Rajan, Sekar; Colaco, Socorrina; Ramesh, N; Meyyanathan, Subramania Nainar; Elango, K
2014-02-01
This study describes the development and validation of dissolution tests for sustained-release dextromethorphan hydrobromide tablets using an HPLC method. Chromatographic separation was achieved on a C18 column utilizing 0.5% triethylamine (pH 7.5) and acetonitrile in the ratio of 50:50. The detection wavelength was 280 nm. The method was validated, and the response was found to be linear in the drug concentration range of 10-80 microg mL(-1). Suitable conditions were decided after testing sink conditions, dissolution medium and agitation intensity. The best dissolution conditions tested for dextromethorphan hydrobromide were applied to appraise the dissolution profiles. The method was established to have sufficient intermediate precision, as similar separation was achieved on another instrument handled by different operators. Mean recovery was 101.82%. Intra-run precisions (% RSD) for three different concentrations were 1.23, 1.10, 0.72 and 1.57, 1.69, 0.95, and inter-run precisions were 0.83, 1.36 and 1.57%, respectively. The method was successfully applied to the dissolution study of the developed dextromethorphan hydrobromide tablets.
Arafa, Nadia M S; Ali, Elham H A; Hassan, Mohamed Kamel
2017-11-01
Canagliflozin (CAN) is a sodium-glucose co-transporter 2 (SGLT2) inhibitor indicated to improve glycemic control in adults with type 2 diabetes mellitus. There is little information about its effect on the cholinergic system, a proposed mechanism for the memory improvement produced by SGLT2 drugs. This study aimed to estimate the effect of two weeks of CAN treatment, compared with galantamine (GAL), on scopolamine hydrobromide (SCO)-induced memory dysfunction in experimental rats. Animals were divided into six groups: control (CON), CAN, GAL, SCO, SCO + CAN and SCO + GAL. Results indicated a significant decrease in body weight in the CAN groups as compared to control values. Moreover, in the SCO + CAN and SCO + GAL groups the number of arm entries and number of correct alternations in the Y-maze task increased and performance in the water maze task improved; acetylcholinesterase (AChE) activities decreased significantly, while monoamine levels significantly increased compared with the SCO group values. Results also recorded acetylcholine M1 receptor (M1 mAChR) changes in the SCO + CAN and SCO + GAL groups in comparison with the SCO group. The study suggests that canagliflozin might improve memory dysfunction induced by scopolamine hydrobromide via the cholinergic and monoamine systems.
[The protection of hydrogen-rich saline on a rat dry eye model induced by scopolamine hydrobromide].
Chu, Y Y; Hua, N; Ru, Y S; Zhao, S Z
2017-05-11
Objective: To evaluate the effect of hydrogen-rich saline (HRS) on dry eye in rats induced by subcutaneous injection of scopolamine hydrobromide. Methods: Experimental research. Thirty female Wistar rats about six weeks old were randomly divided into a normal group, dry eye group, HRS eyedrops group, normal saline (NS) eyedrops group, HRS intraperitoneal injection group and NS intraperitoneal injection group, with 5 rats in each group. Dry eye was induced by subcutaneous injection of scopolamine hydrobromide in the latter five groups. The clinical signs of dry eye, such as tear volume (SIt), tear break-up time (BUT) and corneal epithelial fluorescein staining scores, were evaluated on days 7, 14, 21 and 28. On the 28th day, ten eyes in each group were enucleated and processed for paraffin sections for HE, PAS and immunohistochemistry staining. Analysis of variance was used to test the data, and the independent-samples t-test was used for comparison between two groups. Two-way repeated-measures ANOVA was used to compare the difference among groups at different time points, one-way ANOVA was used to test comparisons of the clinical signs at one time point, and LSD was used for comparison between two groups. Results: Before the experiment and on days 7, 14, 21 and 28, the values of SIt in the HRS eyedrops group and the HRS intraperitoneal injection group were, respectively: (3.625±1.157), (3.313±0.704), (3.250±0.535), (3.313±0.372), (3.375±0.582) mm and (3.500±1.019), (2.893±0.656), (3.321±0.668), (3.179±0.575), (3.214±0.871) mm. The values of BUT were, respectively: (2.750±0.707), (2.688±0.594), (2.813±0.753), (3.000±0.756), (2.750±0.707) s and (3.000±0.679), (2.321±0.464), (2.750±0.753), (3.214±0.699), (2.679±0.608) s. The values of the fluorescein staining score were, respectively: (6.250±0.707), (8.875±0.641), (8.750±0.707), (9.250±0.463), (8.250±1.282) and (6.000±0.679), (9.143±1.027), (8.857±0.770), (9.143±0.949), (8.500±0.760). The difference
Abdel-Haleem, Fatehy M; Saad, Mohamed; Barhoum, Ahmed; Bechelany, Mikhael; Rizk, Mahmoud S
2018-08-01
We report highly sensitive ion-selective electrodes (ISEs) for the potentiometric determination of galantamine hydrobromide (GB) in physiological fluids. Galantamine hydrobromide was selected for this study due to its medical importance for treating Alzheimer's disease. Three different types of ISEs were investigated: a PVC membrane electrode (PVCE), a carbon-paste electrode (CPE), and a coated-wire electrode (CWE). In the construction of these electrodes, the galantaminium-reineckate (GR) ion-pair was used as the sensing species for GB in solution. Modified carbon-paste electrodes (MCPEs) were prepared using graphene oxide (MCPE-GO) and sodium tetrakis(trifluoromethyl)phenyl borate (MCPE-STFPB) as ion-exchanger. The modified CPEs (MCPE-GO and MCPE-STFPB) show improved performance in terms of Nernstian slope, selectivity, response time, and response stability compared to the unmodified CPE. The prepared electrodes PVCE, CWE, CPE, MCPE-GO and MCPE-STFPB show Nernstian slopes of 59.9, 59.5, 58.1, 58.3 and 57.0 mV/concentration decade, and detection limits of 5.0 × 10-6, 6.3 × 10-6, 8.0 × 10-6, 6.0 × 10-6 and 8.0 × 10-6 mol L-1, respectively. The prepared ISEs also show high selectivity against cations (Na+, K+, NH4+, Ca2+, Al3+, Fe3+), amino acids (glycine, L-alanine), and sugars (fructose, glucose, maltose, lactose). The prepared ISEs are applicable for determining GB in spiked serum, urine, and pharmaceutical preparations, using standard addition and direct potentiometric methods. The fast response time (<10 s), long lifetime (1-5 weeks), and the reversibility and stability of the measured signals facilitate the application of these sensors for routine analysis of real samples.
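A Nernstian slope like the 57.0-59.9 mV/decade values above is the least-squares slope of electrode potential against log10 of concentration (the ideal slope for a monovalent cation at 25 °C is about 59.2 mV/decade). A minimal sketch with illustrative calibration data, not the paper's measurements:

```python
import math

def nernst_slope(concs, emfs):
    """Least-squares slope of EMF (mV) versus log10(concentration): E = E0 + S*log10(C)."""
    logc = [math.log10(c) for c in concs]
    n = len(logc)
    sx, sy = sum(logc), sum(emfs)
    sxy = sum(a * b for a, b in zip(logc, emfs))
    sxx = sum(v * v for v in logc)
    return (n * sxy - sx * sy) / (n * sxx - sx * sx)

# Illustrative calibration of a monovalent-cation electrode, one point per decade.
concs = [1e-5, 1e-4, 1e-3, 1e-2, 1e-1]   # mol/L
emfs = [100.0, 159.2, 218.4, 277.6, 336.8]  # mV
print(nernst_slope(concs, emfs))  # close to the ideal 59.2 mV/decade
```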
Guenin, Eric; Armogida, Marianna; Riff, Dennis
2014-09-01
Dextromethorphan hydrobromide (DM) is a widely used antitussive. This study determined, for the first time, the basic pharmacokinetic profile of DM and its active metabolite, dextrorphan (DP), in children and adolescents. Thirty-eight male and female subjects at risk for developing an upper respiratory tract infection (URTI), or symptomatic with cough due to URTI, were enrolled in this single-dose, open-label study: ages 2-5 years (Group A, n = 8), 6-11 years (Group B, n = 17), 12-17 years (Group C, n = 13). Subjects were genotyped for cytochrome P450 (CYP) 2D6 polymorphisms and characterized as poor (PM) or non-poor metabolizers (non-PM). Groups A and B were dosed using an age-weight dosing schedule (DM range 7.5-24.75 mg); a 30-mg dose was used for Group C. Average exposures to total DP increased as age group increased, and average exposure to DM was highest in the adolescent group. One subject in that group was a PM. The terminal half-life (t½) values were longer in the adolescent group due in part to the single PM subject. No relationship between body weight and pharmacokinetic parameters was noted. This is the first evaluation of the pharmacokinetic characteristics of DM in children and adolescents. A single dose of DM in this population was safe, and well tolerated at all doses tested. The data are used to model and compare pediatric DM exposures with those of adults.
Directory of Open Access Journals (Sweden)
Jun Fan
2016-03-01
AIM: To explore the clinical significance of triamcinolone acetonide combined with compound anisodine hydrobromide injection for the treatment of mild (non-ischemic) central retinal vein occlusion (CRVO) in the early stage. METHODS: One hundred and sixteen eyes of 116 patients with non-ischemic CRVO in the early stage were randomly divided into four groups, A, B, C and D, of 29 eyes each. Group A received no treatment. Group B was given compound anisodine hydrobromide by subcutaneous injection beside the superficial temporal artery of the eye. Group C was injected with triamcinolone acetonide beside the eyeball, and Group D was given triamcinolone acetonide combined with compound anisodine hydrobromide injection. In each group, we observed and recorded the best corrected visual acuity (BCVA) using the ETDRS chart, bleeding, optical coherence tomography (OCT) scanning for central macular thickness (CMT), and fundus fluorescence angiography (FFA) imaging to check for progression to ischemic CRVO at 1, 2, 4, 8 and 12 wk, respectively. The total curative effect after 3 mo was compared among the groups. RESULTS: After 12 weeks' treatment, the mean BCVA was lower and the mean CMT was higher in group A than before the treatment. The mean BCVA increased and the mean CMT decreased in groups B, C and D after treatment for 3 mo. Comparing group D with the other groups, the variation in BCVA and CMT was statistically significant (P<0.05), while differences among the other groups were not (P>0.05). Ischemic CRVO was found in 8 cases of group A, 6 cases of group B, 5 cases of group C, and 2 cases of group D, and the difference was not statistically significant (χ2=4.361; P=0.225). Flame-shaped bleeding was found in 14 cases of group A, 7 cases of group B, 9 cases of group C and 4 cases of group D, and the difference was statistically significant (χ2=8.821; P=0.032). CONCLUSION: The combination of triamcinolone acetonide and compound anisodine hydrobromide
Directory of Open Access Journals (Sweden)
Vicente José Assencio-Ferreira
2001-06-01
OBJECTIVE: To warn that use of the dimethicone/homatropine association (Espasmo Luftal®) in newborns and infants up to two months of age may cause transitory extrapyramidal dysfunctional episodes. METHOD: Report of 6 infants under 2 months of age, in daily use of the dimethicone/homatropine association, who presented acute symptoms characterized by repeated crises of short duration with tonic backward deviation of the head (opisthotonos), upward deviation of the eyes with fixed gaze and an expression of terror, sustained extensor hypertonia of the four limbs, and crying and/or guttural sounds. RESULTS: The extrapyramidal symptoms disappeared (and did not return) after discontinuation of the dimethicone/homatropine association. No abnormalities were found on neurological examination, electroencephalogram or blood tests. CONCLUSIONS: The dimethicone/homatropine association may cause basal ganglia dysfunction in children under 2 months of age. It is important to differentiate this condition from generalized epileptic seizures, in order to avoid the erroneous use of antiepileptic drugs.
Directory of Open Access Journals (Sweden)
Li FQ
2011-04-01
Feng-Qian Li1, Cheng Yan2, Juan Bi1, Wei-Lin Lv3, Rui-Rui Ji3, Xu Chen1, Jia-Can Su3, Jin-Hong Hu3; 1Department of Pharmaceutics, Shanghai Eighth People’s Hospital, Shanghai, People’s Republic of China; 2Department of Pharmacy, Bethune International Peace Hospital, Shijiazhuang, People’s Republic of China; 3Changhai Hospital, Second Military Medical University, Shanghai, People’s Republic of China. Abstract: Scopolamine hydrobromide (SH)-loaded microparticles were prepared from a colloidal fluid containing ionotropic-gelated chitosan nanoparticles using a spray-drying method. The spray-dried microparticles were then formulated into orally disintegrating tablets (ODTs) using a wet granulation tablet formation process. A drug entrapment efficiency of about 90% (w/w) and a loading capacity of 20% (w/w) were achieved for the microparticles, which ranged from 2 µm to 8 µm in diameter. Results of disintegration tests showed that the formulated ODTs could be completely dissolved within 45 seconds. Drug dissolution profiles suggested that SH is released more slowly from tablets made using the microencapsulation process compared with tablets containing SH that is free or in the form of nanoparticles. The time it took for 90% of the drug to be released increased significantly from 3 minutes for conventional ODTs to 90 minutes for ODTs with crosslinked microparticles. Compared with ODTs made with noncrosslinked microparticles, it was thus possible to achieve an even lower drug release rate using tablets with appropriate chitosan crosslinking. Results obtained indicate that the development of new ODTs designed with crosslinked microparticles might be a rational way to overcome the unwanted taste of conventional ODTs and the side effects related to SH’s intrinsic characteristics. Keywords: scopolamine hydrobromide, chitosan, nanoparticles-in-microparticles system, spray-drying, orally disintegrating tablets
Crystal structure and thermochemical properties of 1-decylammonium hydrobromide (C10H21NH3Br)(s)
International Nuclear Information System (INIS)
Zhang Lijun; Di Youying; Lu Dongfei
2011-01-01
Highlights: → Crystal structure of 1-decylammonium hydrobromide was reported. → Lattice potential energy of the compound was obtained. → Molar volumes of the compound and its cation were obtained. → Ionic radius of the cation of the compound was calculated. → Molar enthalpy of dissolution at infinite dilution was determined. → Hydration enthalpies of the compound and its cation were calculated. - Abstract: The crystal structure of 1-decylammonium hydrobromide was determined by X-ray crystallography. The lattice potential energy and the molar volumes of the solid compound and its cation were obtained. The ionic radius of the cation was calculated from the corresponding effective volume of the cation. The molar enthalpies of dissolution of the compound at different concentrations m/(mol·kg-1) were measured by an isoperibol solution-reaction calorimeter at T = 298.15 K. According to Pitzer's electrolyte solution theory, the molar enthalpy of dissolution of the compound at infinite dilution (ΔsHm∞) and the Pitzer parameters (βMX(0)L and βMX(1)L) were obtained. The values of the apparent relative molar enthalpies (ΦL) of the title compound and the relative partial molar enthalpies (L̄2 and L̄1) of the solute and the solvent at different concentrations were derived from the experimental enthalpies of dissolution of the compound. Finally, hydration enthalpies of the compound and its cation were calculated by designing a thermochemical cycle based on the lattice potential energy and the molar enthalpy of dissolution of the title compound at infinite dilution.
Azmi, Syed Najmul Hejaz; Al-Fazari, Ahlam; Al-Badaei, Munira; Al-Mahrazi, Ruqiya
2015-12-01
An accurate, selective and sensitive spectrofluorimetric method was developed for the determination of citalopram hydrobromide in commercial dosage forms. The method was based on the formation of a fluorescent ion-pair complex between citalopram hydrobromide and eosin Y in the presence of a disodium hydrogen phosphate/citric acid buffer solution of pH 3.4, extractable into dichloromethane. The extracted complex showed fluorescence intensity at λem = 554 nm after excitation at 259 nm. The calibration curve was linear over the concentration range 2.0-26.0 µg/mL. Under optimized experimental conditions, the proposed method was validated as per ICH guidelines. The effect of common excipients used as additives was tested and the tolerance limit calculated. The limit of detection for the proposed method was 0.121 μg/mL. The proposed method was successfully applied to the determination of citalopram hydrobromide in commercial dosage forms, and the results were compared with a reference RP-HPLC method.
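A detection limit such as the 0.121 µg/mL quoted above is typically estimated with the ICH Q2-style relation LOD = 3.3σ/S (and LOQ = 10σ/S), where σ is the standard deviation of blank responses and S is the calibration slope. The inputs below are illustrative values chosen only to show the arithmetic, not this paper's data:

```python
def detection_limit(sigma, slope, k=3.3):
    """ICH-style limit estimate: k * (SD of blank response) / (calibration slope)."""
    return k * sigma / slope

def quantitation_limit(sigma, slope):
    # LOQ uses k = 10 with the same sigma and slope.
    return detection_limit(sigma, slope, k=10.0)

sigma = 1.1    # SD of blank fluorescence readings (illustrative)
slope = 30.0   # fluorescence units per ug/mL (illustrative)
print(round(detection_limit(sigma, slope), 3))  # 3.3 * 1.1 / 30 = 0.121
```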
International Nuclear Information System (INIS)
Peremans, K.; Hoybergs, Y.; Gielen, I.; Audenaert, K.; Vervaet, M.; Heeringen, C. van; Otte, A.; Goethals, I.; Dierckx, R.; Blankaert, P.
2005-01-01
Involvement of the serotonergic system in impulsive aggression has been demonstrated in both human and animal studies. The purpose of the present study was to investigate the effect of citalopram hydrobromide (a selective serotonin re-uptake inhibitor) on the 5-HT2A receptor and brain perfusion in impulsive-aggressive dogs by means of single-photon emission computed tomography. The binding index of the radioligand 123I-5-I-R91150 was measured before and after treatment with citalopram hydrobromide in nine impulsive-aggressive dogs. Regional perfusion was measured with 99mTc-ethyl cysteinate dimer (ECD). Behaviour was assessed before treatment and again after 6 weeks of treatment. A correlation was found between decreased binding and behavioural improvement in eight out of nine dogs. The 5-HT2A receptor binding index was significantly reduced after citalopram hydrobromide treatment in all cortical regions but not in the subcortical area. None of the dogs displayed alterations in perfusion on the post-treatment scans. This study supports previous findings regarding the involvement of the serotonergic system in impulsive aggression in dogs in general. More specifically, the effect of treatment on the 5-HT2A receptor binding index could be demonstrated, and the decreased binding index correlated with behavioural improvement. (orig.)
Directory of Open Access Journals (Sweden)
Ayman A. Gouda
2013-01-01
Three simple, sensitive, and accurate spectrophotometric methods have been developed for the determination of eletriptan hydrobromide (ELT) in pure and dosage forms. The first two methods are based on charge transfer complex formation between ELT and the chromogenic reagents quinalizarin (Quinz) and alizarin red S (ARS), producing charge transfer complexes with absorption maxima at 569 and 533 nm for Quinz and ARS, respectively. The third method is based on the formation of an ion-pair complex between ELT and the molybdenum(V)-thiocyanate inorganic complex in hydrochloric acid medium, followed by extraction of the colored ion-pair with dichloromethane and measurement at 470 nm. Different variables affecting the reactions were studied and optimized. Beer's law is obeyed in the concentration ranges 2.0-18, 1.0-8.0, and 2.0-32 μg mL−1 for Quinz, ARS, and Mo(V)-thiocyanate, respectively. The molar absorptivity, Sandell sensitivity, and detection and quantification limits were also calculated. The correlation coefficients were ≥0.9994, with a relative standard deviation (RSD%) of ≤0.925%. The proposed methods were successfully applied to the determination of ELT in tablets with good accuracy and precision and without interference from common additives; validity was assessed by applying the standard addition technique, and the results were compared with those obtained using the reported method.
Directory of Open Access Journals (Sweden)
Alexey P. Sarapultsev
2016-05-01
Substituted thiadiazines exert a reliable therapeutic effect in treating stress, and a schematic description of their ability to influence all aspects of a stress response has been depicted. This study was conducted to pharmacologically evaluate compound L-17, a substituted thiadiazine (2-morpholino-5-phenyl-6H-1,3,4-thiadiazine hydrobromide), for possible anti-psychotic/antidepressant activity. Compound L-17 was synthesized by cyclocondensation of α-bromoacetophenone with the original morpholine-4-carbothionic acid hydrazide. Pharmacologic evaluations were conducted using methods described by E.F. Lavretskaya (1985), and in accordance with published guidelines for studying drugs for neuroleptic activity. Compound L-17 was evaluated for various possible mechanisms of action, including its effects on cholinergic system agonists/antagonists, dopaminergic neurotransmission, the adrenergic system, and 5-HT3 serotonin receptors. One or more of these mechanisms may be responsible for the beneficial effects shown by thiadiazine compounds in experiments conducted to evaluate their activity in models of acute stress and acute myocardial infarction.
Li, Feng-Qian; Yan, Cheng; Bi, Juan; Lv, Wei-Lin; Ji, Rui-Rui; Chen, Xu; Su, Jia-Can; Hu, Jin-Hong
2011-01-01
Scopolamine hydrobromide (SH)-loaded microparticles were prepared from a colloidal fluid containing ionotropic-gelated chitosan nanoparticles using a spray-drying method. The spray-dried microparticles were then formulated into orally disintegrating tablets (ODTs) using a wet granulation tablet formation process. A drug entrapment efficiency of about 90% (w/w) and loading capacity of 20% (w/w) were achieved for the microparticles, which ranged from 2 μm to 8 μm in diameter. Results of disintegration tests showed that the formulated ODTs could be completely dissolved within 45 seconds. Drug dissolution profiles suggested that SH is released more slowly from tablets made using the microencapsulation process compared with tablets containing SH that is free or in the form of nanoparticles. The time it took for 90% of the drug to be released increased significantly from 3 minutes for conventional ODTs to 90 minutes for ODTs with crosslinked microparticles. Compared with ODTs made with noncrosslinked microparticles, it was thus possible to achieve an even lower drug release rate using tablets with appropriate chitosan crosslinking. Results obtained indicate that the development of new ODTs designed with crosslinked microparticles might be a rational way to overcome the unwanted taste of conventional ODTs and the side effects related to SH’s intrinsic characteristics. PMID:21720502
International Nuclear Information System (INIS)
Basu, S.K.; Srinivasan, M.N.; Chuttani, K.; George, S.
1992-01-01
The rate of glycolysis in vivo at different time intervals following 8 Gy [LD100(30)] whole body gamma radiation (WBGR) was evaluated by estimating liver glycogen, blood sugar, serum lactic dehydrogenase (LDH) and lactic acid concentrations in adult male Sprague Dawley rats. Within 1 hr of radiation exposure, a significant fall in liver glycogen was observed in rats fed food and water ad libitum. The glycogen content increased after 24 hr and had returned to the control level on the 7th day after radiation exposure. Blood sugar, serum LDH and blood lactate levels increased significantly as compared to non-irradiated controls. Pretreatment with 5-hydroxy-L-tryptophan (5-HTP; 100 mg/kg) + 2-aminoethylisothiuronium bromide hydrobromide (AET; 20 mg/kg) i.p. 30 min before 8 Gy WBGR modified these values and restored them to normal levels on the 7th day post-irradiation. (author). 24 refs
Energy Technology Data Exchange (ETDEWEB)
Peremans, K.; Hoybergs, Y.; Gielen, I. [Ghent University, Department of Medical Imaging, Faculty of Veterinary Medicine, Merelbeke (Belgium); Audenaert, K.; Vervaet, M.; Heeringen, C. van [Ghent University, Department of Psychiatry and Medical Psychology, Gent (Belgium); Otte, A.; Goethals, I.; Dierckx, R. [Ghent University Hospital, Division of Nuclear Medicine, Gent (Belgium); Blankaert, P. [Ghent University, Laboratory of Radiopharmacy, Gent (Belgium)
2005-06-01
Involvement of the serotonergic system in impulsive aggression has been demonstrated in both human and animal studies. The purpose of the present study was to investigate the effect of citalopram hydrobromide (a selective serotonin re-uptake inhibitor) on the 5-HT2A receptor and brain perfusion in impulsive-aggressive dogs by means of single-photon emission computed tomography. The binding index of the radioligand 123I-5-I-R91150 was measured before and after treatment with citalopram hydrobromide in nine impulsive-aggressive dogs. Regional perfusion was measured with 99mTc-ethyl cysteinate dimer (ECD). Behaviour was assessed before treatment and again after 6 weeks of treatment. A correlation was found between decreased binding and behavioural improvement in eight out of nine dogs. The 5-HT2A receptor binding index was significantly reduced after citalopram hydrobromide treatment in all cortical regions but not in the subcortical area. None of the dogs displayed alterations in perfusion on the post-treatment scans. This study supports previous findings regarding the involvement of the serotonergic system in impulsive aggression in dogs in general. More specifically, the effect of treatment on the 5-HT2A receptor binding index could be demonstrated, and the decreased binding index correlated with behavioural improvement. (orig.)
Analysis of Dextromethorphan in Cough Drops and Syrups: A Medicinal Chemistry Laboratory
Hamilton, Todd M.; Wiseman, Frank L., Jr.
2009-01-01
Fluorescence spectroscopy is used to determine the quantity of dextromethorphan hydrobromide (DM) in over-the-counter (OTC) cough drops and syrups. This experiment is appropriate for an undergraduate medicinal chemistry laboratory course when studying OTC medicines and active ingredients. Students prepare the cough drops and syrups for analysis,…
Institute of Scientific and Technical Information of China (English)
Liu Yu-Pu; Di You-Ying; Dan Wen-Yan; He Dong-Hua; Kong Yu-Xia; Yang Wei-Wei
2011-01-01
This paper reports that 1-dodecylamine hydrobromide (1-C12H25NH3·Br)(s) has been synthesized using the liquid phase reaction method. The lattice potential energy of the compound 1-C12H25NH3·Br and the ionic volume and radius of the 1-C12H25NH3+ cation are obtained from the crystallographic data and other auxiliary thermodynamic data. The constant-volume energy of combustion of 1-C12H25NH3·Br(s) is measured to be ΔcUm°(1-C12H25NH3·Br, s) = -(7369.03±3.28) kJ·mol-1 by means of an RBC-II precision rotating-bomb combustion calorimeter at T = (298.15±0.001) K. The standard molar enthalpy of combustion of the compound is derived to be ΔcHm°(1-C12H25NH3·Br, s) = -(7384.52±3.28) kJ·mol-1 from the constant-volume energy of combustion. The standard molar enthalpy of formation of the compound is calculated to be ΔfHm°(1-C12H25NH3·Br, s) = -(1317.86±3.67) kJ·mol-1 from the standard molar enthalpy of combustion of the title compound and other auxiliary thermodynamic quantities through a thermochemical cycle.
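The step from the constant-volume combustion energy to the standard molar combustion enthalpy is ΔcHm° = ΔcUm° + Δn(g)·RT, where Δn(g) is the change in moles of gas over the combustion reaction. In the sketch below, Δn(g) = -6.25 is simply back-calculated from the two values reported above (an assumption for illustration, not taken from the paper's balanced combustion equation):

```python
R = 8.314462   # gas constant, J mol^-1 K^-1
T = 298.15     # K

def combustion_enthalpy(delta_u_kj, delta_n_gas):
    """dH = dU + dn(g) * R * T, with dU given in kJ/mol and the RT term converted to kJ."""
    return delta_u_kj + delta_n_gas * R * T / 1000.0

# Gas-mole change back-calculated from the reported dU and dH (assumption).
dH = combustion_enthalpy(-7369.03, -6.25)
print(round(dH, 2))   # close to the reported -7384.52 kJ/mol
```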
Directory of Open Access Journals (Sweden)
Maria Lucilia Motinha Zamuner
2008-12-01
A simple spectrophotometric method has been developed for the determination of fenoterol hydrobromide (FH) in tablets, drops and syrup, as the only active principle and in association with ibuprofen. The method is based on the oxidative coupling reaction of FH with 3-methyl-2-benzothiazolinone hydrazone (MBTH), with ceric sulphate as the oxidant. The mixture of the drug, MBTH and ceric sulphate, in acid medium, produces a red-brown compound with an absorption maximum at 475 nm. The calibration curve was linear over a concentration range from 3.0 to 12.0 µg/mL, with a correlation coefficient of 0.9998. The experimental parameters affecting the development and stability of the colored product were carefully studied and optimized. The method was applied to commercial and simulated samples, yielding coefficients of variation between 0.25% and 0.82% and mean standard recoveries ranging from 98% to 102%. The proposed method proved to be accurate, precise and linear, and is not subject to interference from excipients for the tablet and drop dosage forms. There was no interference from the ibuprofen present, in association with FH, in one of the analyzed formulations. For the syrup, there was interference from the vehicle, suggesting reactions of its components with MBTH.
LC for analysis of two sustained-release mixtures containing cough cold suppressant drugs.
El-Gindy, Alaa; Sallam, Shehab; Abdel-Salam, Randa A
2010-07-01
A liquid chromatographic method was applied to the analysis of two sustained-release mixtures containing dextromethorphan hydrobromide and carbinoxamine maleate, with either phenylephrine hydrochloride in pharmaceutical capsules (Mix 1) or phenylpropanolamine, methylparaben and propylparaben, bound as the drug base to an ion-exchange resin, in a pharmaceutical syrup (Mix 2). The method was used for their simultaneous determination using a CN column with a mobile phase consisting of acetonitrile-12 mM ammonium acetate in a ratio of 60:40 (v/v, pH 6.0) for Mix 1 and 45:55 (v/v, pH 6.0) for Mix 2.
Ivković, Branka; Marković, Bojan; Vladimirov, Sote
2014-01-01
In this study, a reversed-phase HPLC method for the rapid and simultaneous identification and quantification of doxylamine succinate, ephedrine sulfate, dextromethorphan hydrobromide, paracetamol and sodium benzoate in a cough-cold syrup formulation is described. Separation was carried out on an XTerraTM RP18 column, Waters (150 mm x 4.6 mm, 5 μm particle size). Gradient elution was used for the analysis of the investigated substances, employing water adjusted to pH 2.5 with 85 % orthophosphoric acid as ...
2011-12-30
... discontinued from marketing for reasons other than safety or effectiveness. ANDAs that refer to HYCODAN... Effectiveness AGENCY: Food and Drug Administration, HHS. ACTION: Notice. SUMMARY: The Food and Drug... effectiveness. This determination will allow FDA to approve abbreviated new drug applications (ANDAs) for...
Directory of Open Access Journals (Sweden)
Linlin Fang
Full Text Available An open-tubular capillary electrochromatography column was prepared by chemically immobilizing β-cyclodextrin-modified gold nanoparticles onto a capillary surface pre-derivatized with (3-mercaptopropyl)trimethoxysilane. The synthesized nanoparticles and the prepared column were characterized by transmission electron microscopy, scanning electron microscopy, infrared spectroscopy and ultraviolet-visible spectroscopy. When the column was employed as the chiral stationary phase, no enantioselectivity was observed for ten model basic drugs. β-Cyclodextrin was therefore added to the background electrolyte as a chiral additive, in the expectation that a synergistic effect would yield better separation. Indeed, a significant improvement in enantioselectivity was obtained for the ten pairs of drug enantiomers. The effects of β-cyclodextrin concentration and background electrolyte pH on the chiral separation were then investigated. With the developed separation mode, all the enantiomers except venlafaxine were baseline separated, with resolutions of 4.49, 1.68, 1.88, 1.57, 2.52, 2.33, 3.24, 1.63 and 3.90 for zopiclone, chlorphenamine maleate, brompheniramine maleate, dioxopromethazine hydrochloride, carvedilol, homatropine hydrobromide, homatropine methylbromide, sibutramine hydrochloride and terbutaline sulfate, respectively. Finally, the possible separation mechanism involved is discussed.
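"Baseline separated" is conventionally judged against the standard chromatographic resolution formula, Rs = 2(t2 − t1)/(w1 + w2), with Rs ≥ 1.5 taken as baseline separation. A minimal sketch with illustrative retention times and peak widths (not values from this paper):

```python
def resolution(t1, t2, w1, w2):
    """Chromatographic resolution Rs = 2*(t2 - t1) / (w1 + w2),
    with t the retention times and w the baseline peak widths."""
    return 2.0 * (t2 - t1) / (w1 + w2)

# Hypothetical enantiomer pair: retention times and baseline widths in minutes
rs = resolution(t1=8.2, t2=9.4, w1=0.70, w2=0.76)
print(f"Rs = {rs:.2f}")  # Rs >= 1.5 indicates baseline separation
```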
DEFF Research Database (Denmark)
Mathiesen, Brian Vad; Liu, Wen; Zhang, Xiliang
2014-01-01
three major technological changes: energy savings on the demand side, efficiency improvements in energy production, and the replacement of fossil fuels by various sources of renewable energy. Consequently, the analysis of these systems must include strategies for integrating renewable sources...
Heneedak, Hala M; Salama, Ismail; Mostafa, Samia; El-Kady, Ehab; El-Sadek, Mohamed
2015-07-01
The prerequisites for forensic confirmatory analysis by LC/MS/MS with respect to European Union guidelines are chromatographic separation, a minimum of two MS/MS transitions to obtain the required identification points, and predefined thresholds for the variability of the relative intensities of the MS/MS (MRM) transitions in samples and reference standards. In the present study, a fast, sensitive and robust method is described to quantify tramadol, chlorpheniramine, dextromethorphan and their major metabolites, O-desmethyltramadol, desmethylchlorpheniramine and dextrorphan, respectively, in human plasma, using ibuprofen as internal standard (IS). The analytes and the IS were extracted from plasma by liquid-liquid extraction using ethyl acetate-diethyl ether (1:1). Extracted samples were analyzed by ultra-high-performance liquid chromatography coupled to electrospray ionization tandem mass spectrometry (UHPLC-ESI-MS/MS). Chromatographic separation was performed by pumping a mobile phase containing acetonitrile, water and formic acid (89.2:11.7:0.1) for 2.0 min at a flow rate of 0.25 mL/min through a Hypersil-Gold C18 column, 20 × 2.0 mm (1.9 µm), from Thermo Scientific, New York, USA. The calibration curve was linear for the six analytes. The intraday precision (RSD) and accuracy (RE) of the method were 3.0-9.8% and -1.7 to 4.5%, respectively. The analytical procedure described herein was used to assess the pharmacokinetics of the analytes in 24 healthy volunteers after a single oral dose containing 50 mg of tramadol hydrochloride, 3 mg of chlorpheniramine maleate and 15 mg of dextromethorphan hydrobromide. Copyright © 2014 John Wiley & Sons, Ltd.
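The reported precision (RSD) and accuracy (RE) figures are simple summary statistics over replicate measurements of quality-control samples. A minimal sketch with invented replicate values (not data from this study):

```python
import statistics

def rsd_percent(values):
    """Relative standard deviation: 100 * sample stdev / mean."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

def re_percent(values, nominal):
    """Relative error: 100 * (mean - nominal concentration) / nominal."""
    return 100.0 * (statistics.mean(values) - nominal) / nominal

# Hypothetical intraday replicates (ng/mL) for a 100 ng/mL QC sample
replicates = [98.6, 101.2, 99.8, 102.4, 100.9]
print(f"RSD = {rsd_percent(replicates):.1f}%")
print(f"RE  = {re_percent(replicates, 100.0):.1f}%")
```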
Pattern of Refractive Errors Among Ophthalmic Outpatients of ...
African Journals Online (AJOL)
The etiologic mechanism of RE can be both genetic ... system of a nonaccommodating eye fails to bring parallel rays of light to focus on the .... homatropine 1% eye‑drops. .... Abdull MM, Sivasubramaniam S, Murthy GV, Gilbert C, Abubakar T,.
Decision analysis multicriteria analysis
International Nuclear Information System (INIS)
Lombard, J.
1986-09-01
The ALARA procedure covers a wide range of decisions, from the simplest to the most complex. For the simplest, engineering judgement is generally enough and a decision-aiding technique is therefore not necessary. For some decisions, the available protection options may be compared on two or a few criteria (or attributes), such as protection cost and collective dose, and rather simple decision-aiding techniques, like cost-effectiveness analysis or cost-benefit analysis, are quite sufficient. For the more complex decisions, involving numerous criteria, large uncertainties or qualitative judgement, these techniques, even extended cost-benefit analysis, are not recommended, and appropriate techniques such as multi-attribute decision-aiding techniques are more relevant. There are many such techniques and it is not possible to present all of them; therefore only two broad categories of multi-attribute decision-aiding techniques are presented here: decision analysis and outranking analysis.
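The simplest multi-attribute technique of the kind referred to above is a weighted sum over normalized criteria. A hedged sketch with invented protection options, criterion values and weights (none of which come from the report):

```python
# Hypothetical ALARA protection options, each scored on two criteria:
# annualized protection cost (k$) and collective dose (man-Sv); lower is better for both.
options = {
    "option A": {"cost": 120.0, "dose": 0.80},
    "option B": {"cost": 200.0, "dose": 0.55},
    "option C": {"cost": 320.0, "dose": 0.40},
}
weights = {"cost": 0.4, "dose": 0.6}  # assumed decision-maker preferences

def normalize(values):
    """Map each value to [0, 1], with 1 = best (lowest) and 0 = worst (highest)."""
    lo, hi = min(values), max(values)
    return {v: (hi - v) / (hi - lo) for v in values}

# Normalize each criterion, then aggregate with the weights
norm = {c: normalize([opt[c] for opt in options.values()]) for c in weights}
scores = {
    name: sum(weights[c] * norm[c][opt[c]] for c in weights)
    for name, opt in options.items()
}
best = max(scores, key=scores.get)
print(scores, "->", best)
```

Real multi-attribute methods (decision analysis, outranking) are considerably richer, handling uncertainty and non-compensatory preferences, but the normalization-and-aggregation skeleton is the same.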
International Nuclear Information System (INIS)
2008-05-01
This book introduces the energy and resource technology development programme together with a performance analysis. It covers the programme's divisions and definitions, an analysis of the current state of support, the substance of the national basic plan for energy and resource technology development, the selection of analysis indices, performance-analysis results by index, survey results, and the analysis and appraisal of the energy and resource technology development programme in 2007.
International Nuclear Information System (INIS)
Jae, Myeong Gi; Lee, Won Seong; Kim, Ha Hyeok
1989-02-01
This book describes electronic engineering (circuit elements and devices, circuit analysis, and digital logic circuits); electrochemical methods such as conductometry, potentiometry and current measurement; spectrochemical analysis with electromagnetic radiation, optical components, absorption spectroscopy, X-ray analysis and atomic absorption spectrometry, with references; chromatography (gas chromatography and liquid chromatography); and automated analysis, covering control-system evaluation of automated analysis and automated analysis systems, with references.
Energy Technology Data Exchange (ETDEWEB)
Kim, Seung Jae; Seo, Seong Gyu
1995-03-15
This textbook deals with instrumental analysis and consists of nine chapters: an introduction to analytical chemistry (the process of analysis and the types and forms of analysis); electrochemistry (basic theory, potentiometry and conductometry); electromagnetic radiation and optical components (introduction and applications); ultraviolet and visible spectrophotometry; atomic absorption spectrophotometry (introduction, flame emission spectrometry and plasma emission spectrometry); and further chapters on infrared spectrophotometry, X-ray spectrophotometry and mass spectrometry, chromatography, and other instrumental methods such as radiochemistry.
Sensitivity analysis determines the effectiveness of antibiotics against microorganisms (germs).
McShane, Edward James
2013-01-01
This text surveys practical elements of real function theory, general topology, and functional analysis. Discusses the maximality principle, the notion of convergence, the Lebesgue-Stieltjes integral, function spaces and harmonic analysis. Includes exercises. 1959 edition.
Cerebrospinal fluid analysis ... Analysis of CSF can help detect certain conditions and diseases. All of the following can be, but ... An abnormal CSF analysis result may be due to many different causes, ... Encephalitis (such as West Nile and Eastern Equine) Hepatic ...
Semen analysis measures the amount and quality of a man's semen and sperm.
Kantorovich, L V
1982-01-01
Functional Analysis examines trends in functional analysis as a mathematical discipline and the ever-increasing role played by its techniques in applications. The theory of topological vector spaces is emphasized, along with the applications of functional analysis to applied analysis. Some topics of functional analysis connected with applications to mathematical economics and control theory are also discussed. Comprised of 18 chapters, this book begins with an introduction to the elements of the theory of topological spaces, the theory of metric spaces, and the theory of abstract measure space
International Nuclear Information System (INIS)
Berman, M.; Bischof, L.M.; Breen, E.J.; Peden, G.M.
1994-01-01
This paper provides an overview of modern image analysis techniques pertinent to materials science. The usual approach in image analysis contains two basic steps: first, the image is segmented into its constituent components (e.g. individual grains), and second, measurement and quantitative analysis are performed. Usually, the segmentation part of the process is the harder of the two. Consequently, much of the paper concentrates on this aspect, reviewing both fundamental segmentation tools (commonly found in commercial image analysis packages) and more advanced segmentation tools. There is also a review of the most widely used quantitative analysis methods for measuring the size, shape and spatial arrangements of objects. Many of the segmentation and analysis methods are demonstrated using complex real-world examples. Finally, there is a discussion of hardware and software issues. 42 refs., 17 figs
Thiemann, Francis C.
Semiotic analysis is a method of analyzing signs (e.g., words) to reduce non-numeric data to their component parts without losing essential meanings. Semiotics dates back to Aristotle's analysis of language; it was much advanced by nineteenth-century analyses of style and logic and by Whitehead and Russell's description in this century of the role…
Indian Academy of Sciences (India)
Dimensional analysis is a useful tool which finds important applications in physics and engineering. It is most effective when there exist a maximal number of dimensionless quantities constructed out of the relevant physical variables. Though a complete theory of dimensional analysis was developed way back in 1914 in a ...
Bravená, Helena
2009-01-01
This bachelor's thesis deals with the importance of job analysis for personnel activities in a company. The aim of this work is to find the most suitable method of job analysis in a particular enterprise, and then to create descriptions and specifications for each job.
Validation of an HPLC method for the simultaneous determination of eletriptan and UK 120.413
Directory of Open Access Journals (Sweden)
LJILJANA ZIVANOVIC
2006-11-01
Full Text Available A rapid and sensitive RP-HPLC method was developed for the routine control analysis of eletriptan hydrobromide and its organic impurity UK 120.413 in Relpax® tablets. The chromatography was performed at 20 °C on a C18 XTerraTM column (5 µm, 150 × 4.6 mm) at a flow rate of 1.0 ml/min. The drug and its impurity were detected at 225 nm. The mobile phase consisted of TEA (1 %) – methanol (67.2:32.8 v/v), the pH of which was adjusted to 6.8 with 85 % orthophosphoric acid. Quantification was accomplished by the internal standard method. The developed RP-HPLC method was validated by testing accuracy, precision, repeatability, specificity, detection limit, quantification limit, linearity, robustness and sensitivity. High linearity of the analytical procedure was confirmed over the concentration range 0.05 – 1.00 mg/ml for eletriptan hydrobromide and 0.10 – 1.50 µg/ml for UK 120.413, with correlation coefficients greater than r = 0.995. The low RSD values reflect the good repeatability and precision of the method. Experimental design and a response-surface method were used to test the robustness of the analytical procedure and to evaluate the effect of variation of the method parameters, namely the mobile phase composition, pH and temperature; these showed only small deviations from the method settings. The good recovery and low RSD confirm the suitability of the proposed RP-HPLC method for the routine determination of eletriptan hydrobromide and its impurity UK 120.413 in Relpax® tablets.
International Nuclear Information System (INIS)
Francois, P.
1996-01-01
We undertook a study programme at the end of 1991. To start with, we performed some exploratory studies aimed at learning some preliminary lessons on this type of analysis: assessment of the interest of probabilistic incident analysis; the possibility of using PSA scenarios; the skills and resources required. At the same time, EPN created a working group whose assignment was to define a new approach for the analysis of incidents on NPPs. This working group gave thought to both aspects of operating feedback that EPN wished to improve: analysis of significant incidents and analysis of potential consequences. We took part in the work of this group and, for the second aspect, we proposed a method based on an adaptation of the event-tree method in order to establish a link between existing PSA models and actual incidents. Since PSAs provide an exhaustive database of accident scenarios applicable to the two most common types of units in France, they are obviously of interest for this sort of analysis. With this method we performed some incident analyses, and at the same time explored some methods employed abroad, particularly ASP (Accident Sequence Precursor, a method used by the NRC). Early in 1994 EDF began a systematic analysis programme. The first, transient phase will set up methods and an organizational structure. 7 figs
Khabaza, I M
1960-01-01
Numerical Analysis is an elementary introduction to numerical analysis, its applications, limitations, and pitfalls. Methods suitable for digital computers are emphasized, but some desk computations are also described. Topics covered range from the use of digital computers in numerical work to errors in computations using desk machines, finite difference methods, and numerical solution of ordinary differential equations. This book is comprised of eight chapters and begins with an overview of the importance of digital computers in numerical analysis, followed by a discussion on errors in comput
International Nuclear Information System (INIS)
Warner, M.
1987-01-01
What is the current state of quantitative trace analytical chemistry? What are today's research efforts? And what challenges does the future hold? These are some of the questions addressed at a recent four-day symposium sponsored by the National Bureau of Standards (NBS) entitled Accuracy in Trace Analysis - Accomplishments, Goals, Challenges. The two plenary sessions held on the first day of the symposium reviewed the history of quantitative trace analysis, discussed the present situation from academic and industrial perspectives, and summarized future needs. The remaining three days of the symposium consisted of parallel sessions dealing with the measurement process; quantitation in materials; environmental, clinical, and nutrient analysis; and advances in analytical techniques
Goodstein, R L
2010-01-01
Recursive analysis develops natural number computations into a framework appropriate for real numbers. This text is based upon primary recursive arithmetic and presents a unique combination of classical analysis and intuitional analysis. Written by a master in the field, it is suitable for graduate students of mathematics and computer science and can be read without a detailed knowledge of recursive arithmetic.Introductory chapters on recursive convergence and recursive and relative continuity are succeeded by explorations of recursive and relative differentiability, the relative integral, and
Tao, Terence
2016-01-01
This is part one of a two-volume book on real analysis and is intended for senior undergraduate students of mathematics who have already been exposed to calculus. The emphasis is on rigour and foundations of analysis. Beginning with the construction of the number systems and set theory, the book discusses the basics of analysis (limits, series, continuity, differentiation, Riemann integration), through to power series, several variable calculus and Fourier analysis, and then finally the Lebesgue integral. These are almost entirely set in the concrete setting of the real line and Euclidean spaces, although there is some material on abstract metric and topological spaces. The book also has appendices on mathematical logic and the decimal system. The entire text (omitting some less central topics) can be taught in two quarters of 25–30 lectures each. The course material is deeply intertwined with the exercises, as it is intended that the student actively learn the material (and practice thinking and writing ri...
Tao, Terence
2016-01-01
This is part two of a two-volume book on real analysis and is intended for senior undergraduate students of mathematics who have already been exposed to calculus. The emphasis is on rigour and foundations of analysis. Beginning with the construction of the number systems and set theory, the book discusses the basics of analysis (limits, series, continuity, differentiation, Riemann integration), through to power series, several variable calculus and Fourier analysis, and then finally the Lebesgue integral. These are almost entirely set in the concrete setting of the real line and Euclidean spaces, although there is some material on abstract metric and topological spaces. The book also has appendices on mathematical logic and the decimal system. The entire text (omitting some less central topics) can be taught in two quarters of 25–30 lectures each. The course material is deeply intertwined with the exercises, as it is intended that the student actively learn the material (and practice thinking and writing ri...
International Nuclear Information System (INIS)
1988-01-01
Basic studies in nuclear analytical techniques include the examination of underlying assumptions and the development and extension of techniques involving the use of ion beams for elemental and mass analysis. 1 ref., 1 tab
Energy Technology Data Exchange (ETDEWEB)
2016-06-01
Fact sheet summarizing NREL's techno-economic analysis and life-cycle assessment capabilities to connect research with future commercial process integration, a critical step in the scale-up of biomass conversion technologies.
Gasinski, Leszek
2005-01-01
Hausdorff Measures and Capacity. Lebesgue-Bochner and Sobolev Spaces. Nonlinear Operators and Young Measures. Smooth and Nonsmooth Analysis and Variational Principles. Critical Point Theory. Eigenvalue Problems and Maximum Principles. Fixed Point Theory.
DEFF Research Database (Denmark)
Bauer-Gottwein, Peter; Riegels, Niels; Pulido-Velazquez, Manuel
2017-01-01
Hydroeconomic analysis and modeling provides a consistent and quantitative framework to assess the links between water resources systems and economic activities related to water use, simultaneously modeling water supply and water demand. It supports water managers and decision makers in assessing trade-offs between different water uses, different geographic regions, and various economic sectors, and between the present and the future. Hydroeconomic analysis provides consistent economic performance criteria for infrastructure development and institutional reform in water policies and management organizations. This chapter presents an introduction to hydroeconomic analysis and modeling, and reviews the state of the art in the field. We review available economic water-valuation techniques and summarize the main types of decision problems encountered in hydroeconomic analysis. Popular solution strategies ...
Schiffrin, Deborah
1990-01-01
Summarizes the current state of research in conversation analysis, referring primarily to six different perspectives that have developed from the philosophy, sociology, anthropology, and linguistics disciplines. These include pragmatics; speech act theory; interactional sociolinguistics; ethnomethodology; ethnography of communication; and…
Gorsuch, Richard L
2013-01-01
Comprehensive and comprehensible, this classic covers the basic and advanced topics essential for using factor analysis as a scientific tool in psychology, education, sociology, and related areas. Emphasizing the usefulness of the techniques, it presents sufficient mathematical background for understanding and sufficient discussion of applications for effective use. This includes not only theory but also the empirical evaluations of the importance of mathematical distinctions for applied scientific analysis.
International Nuclear Information System (INIS)
1959-01-01
Radioactivation analysis is a technique for determining the constituents of a very small sample of matter by making the sample artificially radioactive. The first stage is to make the sample radioactive by artificial means, e.g. by subjecting it to neutron bombardment. Once the sample has been activated, or made radioactive, the next task is to analyze the radiations given off by the sample. This analysis indicates the nature and quantities of the various elements present, because the radiation from a particular radioisotope is characteristic of that isotope. In 1959 a symposium on 'Radioactivation Analysis' was organized in Vienna by the IAEA and the Joint Commission on Applied Radioactivity (ICSU). It was pointed out that certain factors create uncertainties, and it was elaborated how to overcome them. Attention was drawn to the fact that radioactivation analysis had proven a powerful tool for tackling fundamental problems in geo- and cosmochemistry, and a review was given of the recent work in this field. Because of its extreme sensitivity, radioactivation analysis had been principally employed for trace detection, and its most extensive use had been in the control of semiconductors and very pure metals. An account was given of the experience gained in the USA, where radioactivation analysis was being used by many investigators in various scientific fields as a practical and useful tool for elemental analyses. Much of this work had been concerned with determining sub-microgramme and microgramme concentrations of many different elements in samples of biological materials, drugs, fertilizers, fine chemicals, foods, fuels, glass, ceramic materials, metals, minerals, paints, petroleum products, resinous materials, soils, toxicants, water and other materials. In addition to these studies, radioactivation analysis had been used by other investigators to determine isotopic ratios of the stable isotopes of some of the elements. Another paper dealt with radioactivation ...
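The activity induced by the neutron bombardment described above follows the standard activation equation A = N·σ·φ·(1 − e^(−λt)). A minimal sketch with illustrative values (not figures from the symposium):

```python
import math

def induced_activity(n_atoms, sigma_cm2, flux, half_life_s, t_irr_s):
    """Induced activity A = N * sigma * phi * (1 - exp(-lambda * t)), in Bq.
    n_atoms: target atoms; sigma_cm2: activation cross-section (cm^2);
    flux: neutron flux (n/cm^2/s); the exponential term is the saturation factor."""
    lam = math.log(2) / half_life_s
    return n_atoms * sigma_cm2 * flux * (1.0 - math.exp(-lam * t_irr_s))

# Illustrative case: 1 µg of a nuclide with atomic mass 60 g/mol,
# sigma = 30 barns (30e-24 cm^2), flux = 1e12 n/cm^2/s,
# product half-life 5 h, irradiated for 1 h
N = 1e-6 / 60.0 * 6.022e23  # atoms in the sample
A = induced_activity(N, 30e-24, 1e12, 5 * 3600, 3600)
print(f"induced activity ~ {A:.3e} Bq")
```

The sensitivity of the technique follows directly from this relation: even microgramme samples yield tens of kilobecquerels under a reactor-grade flux, which is readily measurable.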
DEFF Research Database (Denmark)
Brænder, Morten; Andersen, Lotte Bøgh
2014-01-01
Based on our 2013 article, ”Does Deployment to War Affect Soldiers' Public Service Motivation – A Panel Study of Soldiers Before and After their Service in Afghanistan”, we present panel analysis as a methodological discipline. Panels consist of multiple units of analysis, observed at two or more points in time. In comparison with traditional cross-sectional studies, the advantage of using panel studies is that the time dimension enables us to study effects. Whereas experimental designs may have a clear advantage in regard to causal inference, the strength of panel studies is difficult to match in research settings where it is not possible to distribute units of analysis randomly or where the independent variables cannot be manipulated. The greatest disadvantage in regard to using panel studies is that data may be difficult to obtain. This is most clearly vivid in regard to the use of panel surveys ...
Loeb, Peter A
2016-01-01
This textbook is designed for a year-long course in real analysis taken by beginning graduate and advanced undergraduate students in mathematics and other areas such as statistics, engineering, and economics. Written by one of the leading scholars in the field, it elegantly explores the core concepts in real analysis and introduces new, accessible methods for both students and instructors. The first half of the book develops both Lebesgue measure and, with essentially no additional work for the student, general Borel measures for the real line. Notation indicates when a result holds only for Lebesgue measure. Differentiation and absolute continuity are presented using a local maximal function, resulting in an exposition that is both simpler and more general than the traditional approach. The second half deals with general measures and functional analysis, including Hilbert spaces, Fourier series, and the Riesz representation theorem for positive linear functionals on continuous functions with compact support....
Scott, L Ridgway
2011-01-01
Computational science is fundamentally changing how technological questions are addressed. The design of aircraft, automobiles, and even racing sailboats is now done by computational simulation. The mathematical foundation of this new approach is numerical analysis, which studies algorithms for computing expressions defined with real numbers. Emphasizing the theory behind the computation, this book provides a rigorous and self-contained introduction to numerical analysis and presents the advanced mathematics that underpin industrial software, including complete details that are missing from most textbooks. Using an inquiry-based learning approach, Numerical Analysis is written in a narrative style, provides historical background, and includes many of the proofs and technical details in exercises. Students will be able to go beyond an elementary understanding of numerical simulation and develop deep insights into the foundations of the subject. They will no longer have to accept the mathematical gaps that ex...
Rao, G Shanker
2006-01-01
About the Book: This book provides an introduction to Numerical Analysis for students of Mathematics and Engineering. The book is designed in accordance with the common core syllabus of Numerical Analysis of the Universities of Andhra Pradesh and also the syllabus prescribed in most Indian universities. Salient features: Approximate and Numerical Solutions of Algebraic and Transcendental Equations Interpolation of Functions Numerical Differentiation and Integration and Numerical Solution of Ordinary Differential Equations The last three chapters deal with Curve Fitting, Eigen Values and Eigen Vectors of a Matrix and Regression Analysis. Each chapter is supplemented with a number of worked-out examples as well as a number of problems to be solved by the students. This helps in a better understanding of the subject. Contents: Errors Solution of Algebraic and Transcendental Equations Finite Differences Interpolation with Equal Intervals Interpolation with Unequal Int...
Jacques, Ian
1987-01-01
This book is primarily intended for undergraduates in mathematics, the physical sciences and engineering. It introduces students to most of the techniques forming the core component of courses in numerical analysis. The text is divided into eight chapters which are largely self-contained. However, with a subject as intricately woven as mathematics, there is inevitably some interdependence between them. The level of difficulty varies and, although emphasis is firmly placed on the methods themselves rather than their analysis, we have not hesitated to include theoretical material when we consider it to be sufficiently interesting. However, it should be possible to omit those parts that do seem daunting while still being able to follow the worked examples and to tackle the exercises accompanying each section. Familiarity with the basic results of analysis and linear algebra is assumed since these are normally taught in first courses on mathematical methods. For reference purposes a list of theorems used in the t...
DiBenedetto, Emmanuele
2016-01-01
The second edition of this classic textbook presents a rigorous and self-contained introduction to real analysis with the goal of providing a solid foundation for future coursework and research in applied mathematics. Written in a clear and concise style, it covers all of the necessary subjects as well as those often absent from standard introductory texts. Each chapter features a “Problems and Complements” section that includes additional material that briefly expands on certain topics within the chapter and numerous exercises for practicing the key concepts. The first eight chapters explore all of the basic topics for training in real analysis, beginning with a review of countable sets before moving on to detailed discussions of measure theory, Lebesgue integration, Banach spaces, functional analysis, and weakly differentiable functions. More topical applications are discussed in the remaining chapters, such as maximal functions, functions of bounded mean oscillation, rearrangements, potential theory, a...
International Nuclear Information System (INIS)
Romli
1997-01-01
Cluster analysis is the name of a group of multivariate techniques whose principal purpose is to distinguish similar entities by the characteristics they possess. Several algorithms can be used for this analysis, and this paper focuses on them: similarity measures and hierarchical clustering, which includes the single linkage, complete linkage and average linkage methods. A non-hierarchical clustering method, popularly known as the K-means method, is also discussed. Finally, the paper describes the advantages and disadvantages of each method.
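The non-hierarchical K-means method mentioned above fits in a few lines: assign each point to its nearest centroid, recompute each centroid as its cluster mean, and repeat. The following is a minimal sketch in pure Python; the data and parameters are invented for illustration, not taken from the paper.

```python
import math
import random

def kmeans(points, k, iters=50, seed=0):
    """Plain K-means on points given as coordinate tuples."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)  # initialize from the data itself
    for _ in range(iters):
        # Assignment step: each point joins its nearest centroid.
        clusters = [[] for _ in range(k)]
        for p in points:
            j = min(range(k), key=lambda i: math.dist(p, centroids[i]))
            clusters[j].append(p)
        # Update step: centroid becomes the mean of its cluster
        # (an empty cluster keeps its old centroid).
        centroids = [
            tuple(sum(c) / len(cl) for c in zip(*cl)) if cl else centroids[i]
            for i, cl in enumerate(clusters)
        ]
    return centroids, clusters

# Two well-separated groups of 2-D points.
data = [(0.0, 0.1), (0.2, 0.0), (0.1, 0.2),
        (5.0, 5.1), (5.2, 4.9), (4.9, 5.0)]
centroids, clusters = kmeans(data, k=2)
print(sorted(len(cl) for cl in clusters))  # both groups recovered: [3, 3]
```

Hierarchical (linkage) methods differ in that they never reassign: they repeatedly merge the two closest clusters, with "closest" defined by the single, complete, or average linkage rule.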
Rockafellar, Ralph Tyrell
2015-01-01
Available for the first time in paperback, R. Tyrrell Rockafellar's classic study presents readers with a coherent branch of nonlinear mathematical analysis that is especially suited to the study of optimization problems. Rockafellar's theory differs from classical analysis in that differentiability assumptions are replaced by convexity assumptions. The topics treated in this volume include: systems of inequalities, the minimum or maximum of a convex function over a convex set, Lagrange multipliers, minimax theorems and duality, as well as basic results about the structure of convex sets and
Brezinski, C
2012-01-01
Numerical analysis has witnessed many significant developments in the 20th century. This book brings together 16 papers dealing with historical developments, survey papers and papers on recent trends in selected areas of numerical analysis, such as: approximation and interpolation, solution of linear systems and eigenvalue problems, iterative methods, quadrature rules, solution of ordinary-, partial- and integral equations. The papers are reprinted from the seven-volume "Numerical Analysis 2000" project of the Journal of Computational and Applied Mathematics.
International Nuclear Information System (INIS)
Biehl, F.A.
1984-05-01
This paper presents the criteria, previous nuclear experience in space, analysis techniques, and possible breakup enhancement devices applicable to an acceptable SP-100 reentry from space. Reactor operation in nuclear-safe orbit will minimize the radiological risk; the remaining safeguards criteria need to be defined. A simple analytical point mass reentry technique and a more comprehensive analysis method that considers vehicle dynamics and orbit insertion malfunctions are presented. Vehicle trajectory, attitude, and possible breakup enhancement devices will be integrated in the simulation as required to ensure an adequate representation of the reentry process
Aggarwal, Charu C
2013-01-01
With the increasing advances in hardware technology for data collection, and advances in software technology (databases) for data organization, computer scientists have increasingly participated in the latest advancements of the outlier analysis field. Computer scientists, specifically, approach this field based on their practical experiences in managing large amounts of data, and with far fewer assumptions: the data can be of any type, structured or unstructured, and may be extremely large. Outlier Analysis is a comprehensive exposition, as understood by data mining experts, statisticians and
Everitt, Brian S; Leese, Morven; Stahl, Daniel
2011-01-01
Cluster analysis comprises a range of methods for classifying multivariate data into subgroups. By organizing multivariate data into such subgroups, clustering can help reveal the characteristics of any structure or patterns present. These techniques have proven useful in a wide range of areas such as medicine, psychology, market research and bioinformatics. This fifth edition of the highly successful Cluster Analysis includes coverage of the latest developments in the field and a new chapter dealing with finite mixture models for structured data. Real life examples are used throughout to demons...
Snell, K S; Langford, W J; Maxwell, E A
1966-01-01
Elementary Analysis, Volume 2 introduces several of the ideas of modern mathematics in a casual manner and provides the practical experience in algebraic and analytic operations that lays a sound foundation of basic skills. This book focuses on the nature of number, algebraic and logical structure, groups, rings, fields, vector spaces, matrices, sequences, limits, functions and inverse functions, complex numbers, and probability. The logical structure of analysis given through the treatment of differentiation and integration, with applications to the trigonometric and logarithmic functions, is
International Nuclear Information System (INIS)
Baron, J.H.; Nunez McLeod, J.; Rivera, S.S.
1997-01-01
This book contains a selection of research works performed at the CEDIAC Institute (Cuyo National University) in the area of risk analysis, with specific orientation to the subjects of uncertainty and sensitivity studies, software reliability, severe accident modeling, etc. This volume presents important material for all those researchers who want an insight into the risk analysis field as a tool for solving several problems frequently found in engineering and the applied sciences, as well as for academic teachers who want to keep up to date with the new developments and improvements continuously arising in this field
Alan Gallegos
2002-01-01
Watershed analyses and assessments for the Kings River Sustainable Forest Ecosystems Project were done on about 33,000 acres of the 45,500-acre Big Creek watershed and 32,000 acres of the 85,100-acre Dinkey Creek watershed. Following procedures developed for analysis of cumulative watershed effects (CWE) in the Pacific Northwest Region of the USDA Forest Service, the...
Freund, Rudolf J; Sa, Ping
2006-01-01
The book provides complete coverage of the classical methods of statistical analysis. It is designed to give students an understanding of the purpose of statistical analyses, to allow the student to determine, at least to some degree, the correct type of statistical analyses to be performed in a given situation, and have some appreciation of what constitutes good experimental design
International Nuclear Information System (INIS)
Unterberger, A.
1987-01-01
We study the Klein-Gordon symbolic calculus of operators acting on solutions of the free Klein-Gordon equation. It contracts to the Weyl calculus as c→∞. Mathematically, it may also be considered as a pseudodifferential analysis on the unit ball of R^n
International Nuclear Information System (INIS)
Woodard, K.
1985-01-01
The objectives of this paper are to: Provide a realistic assessment of consequences; Account for plant and site-specific characteristics; Adjust accident release characteristics to account for results of plant-containment analysis; Produce conditional risk curves for each of five health effects; and Estimate uncertainties
DEFF Research Database (Denmark)
Hjørland, Birger
2017-01-01
The domain-analytic approach to knowledge organization (KO) (and to the broader field of library and information science, LIS) is outlined. The article reviews the discussions and proposals on the definition of domains, and provides an example of a domain-analytic study in the field of art studies. Varieties of domain analysis as well as criticism and controversies are presented and discussed.
International Nuclear Information System (INIS)
Rhoades, W.A.; Dray, B.J.
1970-01-01
The effect of Gadolinium-155 on the prompt kinetic behavior of a zirconium hydride reactor has been deduced, using experimental data from the SNAPTRAN machine. The poison material makes the temperature coefficient more positive, and the Type IV sleeves were deduced to give a positive coefficient above 1100 °F. A thorough discussion of the data and analysis is included. (U.S.)
International Nuclear Information System (INIS)
Saadi, Radouan; Marah, Hamid
2014-01-01
This report presents results related to Tritium analysis carried out at the CNESTEN DASTE in Rabat (Morocco), on behalf of Senegal, within the framework of the RAF7011 project. It describes the analytical method and instrumentation, including general uncertainty estimation: electrolytic enrichment and liquid scintillation counting. The results are expressed in Tritium Units (TU); lower limit of detection: 0.02 TU
Miller, Rupert G
2011-01-01
A concise summary of the statistical methods used in the analysis of survival data with censoring. Emphasizes recently developed nonparametric techniques. Outlines methods in detail and illustrates them with actual data. Discusses the theory behind each method. Includes numerous worked problems and numerical exercises.
Koornneef, M.; Alonso-Blanco, C.; Stam, P.
2006-01-01
The Mendelian analysis of genetic variation, available as induced mutants or as natural variation, requires a number of steps that are described in this chapter. These include the determination of the number of genes involved in the observed trait's variation, the determination of dominance
DEFF Research Database (Denmark)
Nielsen, Kirsten
2010-01-01
The first part of this article presents the characteristics of Hebrew poetry: features associated with rhythm and phonology, grammatical features, structural elements like parallelism, and imagery and intertextuality. The second part consists of an analysis of Psalm 121. It is argued that assonan...
Adrian Ioana; Tiberiu Socaciu
2013-01-01
The article presents specific aspects of management and models for economic analysis. Thus, we present the main types of economic analysis: statistical analysis, dynamic analysis, static analysis, mathematical analysis, and psychological analysis. We also present the main objects of the analysis: the technological activity of a company, the analysis of production costs, the economic activity of a company, the analysis of equipment, the analysis of labor productivity, the anal...
International Nuclear Information System (INIS)
Smith, M.; Jones, D.R.
1991-01-01
The goal of exploration is to find reserves that will earn an adequate rate of return on the capital invested. Neither exploration nor economics is an exact science. The authors must therefore explore in those trends (plays) that have the highest probability of achieving this goal. Trend analysis is a technique for organizing the available data to make these strategic exploration decisions objectively, in conformance with their goals and risk attitudes. Trend analysis differs from resource estimation in its purpose. It seeks to determine the probability of economic success for an exploration program, not the ultimate results of the total industry effort. Thus the recent past is assumed to be the best estimate of the exploration probabilities for the near future. This information is combined with economic forecasts. The computer software tools necessary for trend analysis are (1) Information data base - requirements and sources. (2) Data conditioning program - assignment to trends, correction of errors, and conversion into usable form. (3) Statistical processing program - calculation of probability of success and discovery size probability distribution. (4) Analytical processing - Monte Carlo simulation to develop the probability distribution of the economic return/investment ratio for a trend. Limited capital (short-run) effects are analyzed using the Gambler's Ruin concept in the Monte Carlo simulation and by a short-cut method. Multiple trend analysis is concerned with comparing and ranking trends, allocating funds among acceptable trends, and characterizing program risk by using risk profiles. In summary, trend analysis is a reality check for long-range exploration planning
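A Monte Carlo step like item (4) above can be sketched very simply: draw well outcomes from a success probability, draw discovery sizes from a skewed (here lognormal) distribution, and accumulate the program's return/investment ratio over many trials. All parameters below (success probability, well cost, distribution shape) are invented placeholders, not values from the article.

```python
import random
import statistics

def simulate_trend(p_success=0.15, n_wells=20, well_cost=2.0,
                   mean_log=2.0, sd_log=0.8, trials=10_000, seed=1):
    """Monte Carlo sketch of a trend's return/investment ratio.

    Each well succeeds with probability p_success; a discovery's value
    (in $MM) is drawn from a lognormal distribution.  Returns one
    simulated ratio per trial.
    """
    rng = random.Random(seed)
    ratios = []
    for _ in range(trials):
        program_value = sum(rng.lognormvariate(mean_log, sd_log)
                            for _ in range(n_wells)
                            if rng.random() < p_success)
        ratios.append(program_value / (n_wells * well_cost))
    return ratios

ratios = simulate_trend()
print(round(statistics.median(ratios), 2))          # typical outcome
print(round(statistics.mean(ratios), 2))            # expected ratio
print(sum(r >= 1.0 for r in ratios) / len(ratios))  # fraction of programs breaking even
```

The Gambler's Ruin refinement mentioned in the abstract would additionally track capital well by well and record the fraction of trials in which the budget is exhausted before the first discovery.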
DEFF Research Database (Denmark)
The 19th Scandinavian Conference on Image Analysis was held at the IT University of Copenhagen in Denmark during June 15-17, 2015. The SCIA conference series has been an ongoing biannual event for more than 30 years and over the years it has nurtured a world-class regional research and development area within the four participating Nordic countries. It is a regional meeting of the International Association for Pattern Recognition (IAPR). We would like to thank all authors who submitted works to this year’s SCIA, the invited speakers, and our Program Committee. In total 67 papers were submitted. The topics of the accepted papers range from novel applications of vision systems, pattern recognition, machine learning, feature extraction, segmentation, 3D vision, to medical and biomedical image analysis. The papers originate from all the Scandinavian countries and several other European countries...
Helson, Henry
2010-01-01
This second edition has been enlarged and considerably rewritten. Among the new topics are infinite product spaces with applications to probability, disintegration of measures on product spaces, positive definite functions on the line, and additional information about Weyl's theorems on equidistribution. Topics that have continued from the first edition include Minkowski's theorem, measures with bounded powers, idempotent measures, spectral sets of bounded functions and a theorem of Szego, and the Wiener Tauberian theorem. Readers of the book should have studied the Lebesgue integral, the elementary theory of analytic and harmonic functions, and the basic theory of Banach spaces. The treatment is classical and as simple as possible. This is an instructional book, not a treatise. Mathematics students interested in analysis will find here what they need to know about Fourier analysis. Physicists and others can use the book as a reference for more advanced topics.
Bray, Hubert L; Mazzeo, Rafe; Sesum, Natasa
2015-01-01
This volume includes expanded versions of the lectures delivered in the Graduate Minicourse portion of the 2013 Park City Mathematics Institute session on Geometric Analysis. The papers give excellent high-level introductions, suitable for graduate students wishing to enter the field and experienced researchers alike, to a range of the most important areas of geometric analysis. These include: the general issue of geometric evolution, with more detailed lectures on Ricci flow and Kähler-Ricci flow, new progress on the analytic aspects of the Willmore equation as well as an introduction to the recent proof of the Willmore conjecture and new directions in min-max theory for geometric variational problems, the current state of the art regarding minimal surfaces in R^3, the role of critical metrics in Riemannian geometry, and the modern perspective on the study of eigenfunctions and eigenvalues for Laplace-Beltrami operators.
Freitag, Eberhard
2005-01-01
The guiding principle of this presentation of "Classical Complex Analysis" is to proceed as quickly as possible to the central results while using a small number of notions and concepts from other fields. Thus the prerequisites for understanding this book are minimal; only elementary facts of calculus and algebra are required. The first four chapters cover the essential core of complex analysis: - differentiation in C (including elementary facts about conformal mappings) - integration in C (including complex line integrals, Cauchy's Integral Theorem, and the Integral Formulas) - sequences and series of analytic functions, (isolated) singularities, Laurent series, calculus of residues - construction of analytic functions: the gamma function, Weierstrass' Factorization Theorem, Mittag-Leffler Partial Fraction Decomposition, and -as a particular highlight- the Riemann Mapping Theorem, which characterizes the simply connected domains in C. Further topics included are: - the theory of elliptic functions based on...
International Nuclear Information System (INIS)
Quinn, C.A.
1983-01-01
The article deals with spectrographic analysis and the analytical methods based on it. The theory of spectrographic analysis is discussed as well as the layout of a spectrometer system. The infrared absorption spectrum of a compound is probably its most unique property. The absorption of infrared radiation depends on increasing the energy of vibration and rotation associated with a covalent bond. The infrared region is intrinsically low in energy, thus the design of infrared spectrometers is always directed toward maximising energy throughput. The article also considers atomic absorption: flame atomizers, non-flame atomizers and the source of radiation. Under the section on emission spectroscopy, non-electrical energy sources, electrical energy sources and electrical flames are discussed. Digital computers form a part of the development of spectrographic instrumentation
Cheng, Lizhi; Luo, Yong; Chen, Bo
2014-01-01
This book is divided into two parts: fundamental wavelet transform theory and methods, and important applications of the wavelet transform. In the first part, as preliminary knowledge, Fourier analysis, inner product spaces, the characteristics of Haar functions, and the concepts of multi-resolution analysis are introduced, followed by a description of how to construct wavelet functions, both multi-band and multi-wavelet, and finally the design of integer wavelets via lifting schemes and their application to integer transform algorithms. In the second part, many applications in the field of image and signal processing are discussed, introducing other wavelet variants such as complex wavelets, ridgelets, and curvelets. Important application examples include image compression, image denoising/restoration, image enhancement, digital watermarking, numerical solution of partial differential equations, and solving ill-conditioned Toeplitz systems. The book is intended for senior undergraduate stude...
International Nuclear Information System (INIS)
Hwang, Hun
2007-02-01
This book explains potentiometry, voltammetry, amperometry and the basic concepts of conductometry in eleven chapters. It gives specific descriptions of the electrochemical cell and its modes, the basic concepts of electrochemical analysis of oxidation-reduction reactions, standard electrode potential, formal potential, faradaic current and faradaic processes, mass transfer and overvoltage, potentiometry and indirect potentiometry, polarography with TAST, normal pulse and differential pulse, voltammetry, conductometry and conductometric titration.
International Nuclear Information System (INIS)
Badwe, R.A.
1999-01-01
The primary endpoint in the majority of the studies has been either disease recurrence or death. This kind of analysis requires a special method, since not all patients in the study experience the endpoint (their observations are censored). The standard method for estimating such a survival distribution is the Kaplan-Meier method. The survival function is defined as the proportion of individuals who survive beyond a certain time. Multivariate comparison of survival has been carried out with Cox's proportional hazards model
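The Kaplan-Meier estimator named above multiplies, at each observed event time t_i, the factor (1 - d_i/n_i), where d_i is the number of events at t_i and n_i the number of patients still at risk just before it; censored patients leave the risk set without contributing an event. A minimal sketch in Python, using invented follow-up data:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimate from (time, event) pairs.

    events: 1 = endpoint observed (death/recurrence), 0 = censored.
    Returns (time, S(t)) at each time where an event occurred.
    """
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    s = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = count = 0
        # Group all observations tied at the same time point.
        while i < len(data) and data[i][0] == t:
            deaths += data[i][1]
            count += 1
            i += 1
        if deaths:
            s *= 1 - deaths / n_at_risk
            curve.append((t, s))
        n_at_risk -= count  # events and censorings both leave the risk set

    return curve

# Six patients; event=0 marks censored follow-up (often written "3+", "8+").
times  = [2, 3, 3, 5, 8, 9]
events = [1, 1, 0, 1, 0, 1]
for t, s in kaplan_meier(times, events):
    print(t, round(s, 3))  # 2 0.833 / 3 0.667 / 5 0.444 / 9 0.0
```

Note how the patient censored at time 3 never "dies" in the estimate but still shrinks the risk set, which is exactly why the naive proportion surviving would be biased.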
DEFF Research Database (Denmark)
Andersen, Lars
This book contains the lecture notes for the 9th semester course on elastodynamics. The first chapter gives an overview of the basic theory of stress waves propagating in viscoelastic media. In particular, the effect of surfaces and interfaces in a viscoelastic material is studied, and different ... Thus, in Chapter 3, an alternative semi-analytic method is derived, which may be applied for the analysis of layered half-spaces subject to moving or stationary loads.
Mucha, Hans-Joachim; Sofyan, Hizir
2000-01-01
As an explorative technique, cluster analysis provides a description or a reduction in the dimension of the data. It classifies a set of observations into two or more mutually exclusive unknown groups based on combinations of many variables. Its aim is to construct groups in such a way that the profiles of objects in the same groups are relatively homogeneous whereas the profiles of objects in different groups are relatively heterogeneous. Clustering is distinct from classification techniques, ...
International Nuclear Information System (INIS)
Garbarino, J.R.; Steinheimer, T.R.; Taylor, H.E.
1985-01-01
This is the twenty-first biennial review of the inorganic and organic analytical chemistry of water. The format of this review differs somewhat from previous reviews in this series - the most recent of which appeared in Analytical Chemistry in April 1983. Changes in format have occurred in the presentation of material concerning review articles and the inorganic analysis of water sections. Organic analysis of water sections are organized as in previous reviews. Review articles have been compiled and tabulated in an Appendix with respect to subject, title, author(s), citation, and number of references cited. The inorganic water analysis sections are now grouped by constituent using the periodic chart; for example, alkali, alkaline earth, 1st series transition metals, etc. Within these groupings the references are roughly grouped by instrumental technique; for example, spectrophotometry, atomic absorption spectrometry, etc. Multiconstituent methods for determining analytes that cannot be grouped in this manner are compiled into a separate section sorted by instrumental technique. References used in preparing this review were compiled from nearly 60 major journals published during the period from October 1982 through September 1984. Conference proceedings, most foreign journals, most trade journals, and most government publications are excluded. References cited were obtained using the American Chemical Society's Chemical Abstracts for sections on inorganic analytical chemistry, organic analytical chemistry, water, and sewage waste. Cross-references of these sections were also included. 860 references
Energy Technology Data Exchange (ETDEWEB)
None
1980-06-01
The Energy Policy and Conservation Act (EPCA) mandated that minimum energy efficiency standards be established for classes of refrigerators and refrigerator-freezers, freezers, clothes dryers, water heaters, room air conditioners, home heating equipment, kitchen ranges and ovens, central air conditioners, and furnaces. EPCA requires that standards be designed to achieve the maximum improvement in energy efficiency that is technologically feasible and economically justified. Following the introductory chapter, Chapter Two describes the methodology used in the economic analysis and its relationship to legislative criteria for consumer product efficiency assessment; details how the CPES Value Model systematically compared and evaluated the economic impacts of regulation on the consumer, manufacturer and Nation. Chapter Three briefly displays the results of the analysis and lists the proposed performance standards by product class. Chapter Four describes the reasons for developing a baseline forecast, characterizes the baseline scenario from which regulatory impacts were calculated and summarizes the primary models, data sources and assumptions used in the baseline formulations. Chapter Five summarizes the methodology used to calculate regulatory impacts; describes the impacts of energy performance standards relative to the baseline discussed in Chapter Four. Also discussed are regional standards and other program alternatives to performance standards. Chapter Six describes the procedure for balancing consumer, manufacturer, and national impacts to select standard levels. Details of models and data bases used in the analysis are included in Appendices A through K.
Newell, Homer E
2006-01-01
When employed with skill and understanding, vector analysis can be a practical and powerful tool. This text develops the algebra and calculus of vectors in a manner useful to physicists and engineers. Numerous exercises (with answers) not only provide practice in manipulation but also help establish students' physical and geometric intuition in regard to vectors and vector concepts.Part I, the basic portion of the text, consists of a thorough treatment of vector algebra and the vector calculus. Part II presents the illustrative matter, demonstrating applications to kinematics, mechanics, and e
Brand, Louis
2006-01-01
The use of vectors not only simplifies treatments of differential geometry, mechanics, hydrodynamics, and electrodynamics, but also makes mathematical and physical concepts more tangible and easy to grasp. This text for undergraduates was designed as a short introductory course to give students the tools of vector algebra and calculus, as well as a brief glimpse into these subjects' manifold applications. The applications are developed to the extent that the uses of the potential function, both scalar and vector, are fully illustrated. Moreover, the basic postulates of vector analysis are brou
Ding, Ying; Huang, Kai; Chen, Lan; Yang, Jie; Xu, Wen-Yan; Xu, Xue-Jiao; Duan, Ru; Zhang, Jing; He, Qing
2014-03-01
A sensitive and accurate HPLC-MS/MS method was developed for the simultaneous determination of dextromethorphan, dextrorphan and chlorphenamine in human plasma. The three analytes were extracted from plasma by liquid-liquid extraction using ethyl acetate and separated on a Kromasil 60-5CN column (3 µm, 2.1 × 150 mm) with a mobile phase of acetonitrile-water (containing 0.1% formic acid; 50:50, v/v) at a flow rate of 0.2 mL/min. Quantification was performed on a triple quadrupole tandem mass spectrometer in multiple reaction monitoring mode using positive electrospray ionization. The calibration curve was linear over the range of 0.01-5 ng/mL for dextromethorphan, 0.02-5 ng/mL for dextrorphan and 0.025-20 ng/mL for chlorphenamine. The lower limits of quantification for dextromethorphan, dextrorphan and chlorphenamine were 0.01, 0.02 and 0.025 ng/mL, respectively. The intra- and inter-day precisions were within 11% and accuracies were in the range of 92.9-102.5%. All analytes were proved to be stable during sample storage, preparation and analytical procedures. This method was first applied to the pharmacokinetic study in healthy Chinese volunteers after a single oral dose of the formulation containing dextromethorphan hydrobromide (18 mg) and chlorpheniramine maleate (8 mg). Copyright © 2013 John Wiley & Sons, Ltd.
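A calibration curve like those reported here is, at its simplest, an ordinary least-squares line through the standards, with accuracy expressed as back-calculated concentration over nominal. The sketch below uses invented peak-area responses over the paper's dextromethorphan range (0.01-5 ng/mL); note that real LC-MS/MS assays commonly use weighted regression (e.g. 1/x or 1/x²) to control bias at the low end, which this unweighted sketch omits.

```python
def fit_line(x, y):
    """Ordinary least-squares fit y = a*x + b for a calibration curve."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
             / sum((xi - mx) ** 2 for xi in x))
    return slope, my - slope * mx

# Hypothetical calibration standards: nominal concentration (ng/mL)
# and measured peak-area ratio (illustrative numbers only).
conc = [0.01, 0.05, 0.2, 1.0, 2.5, 5.0]
area = [1.0, 5.1, 19.8, 100.5, 249.0, 501.0]
a, b = fit_line(conc, area)

# Back-calculate each standard and report accuracy (found/nominal, %).
for c, y in zip(conc, area):
    found = (y - b) / a
    print(c, round(100 * found / c, 1))
```

The intra-/inter-day accuracy figures quoted in the abstract (92.9-102.5%) are exactly this found/nominal statistic, computed on quality-control samples rather than the calibrators themselves.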
Abbott, Stephen
2015-01-01
This lively introductory text exposes the student to the rewards of a rigorous study of functions of a real variable. In each chapter, informal discussions of questions that give analysis its inherent fascination are followed by precise, but not overly formal, developments of the techniques needed to make sense of them. By focusing on the unifying themes of approximation and the resolution of paradoxes that arise in the transition from the finite to the infinite, the text turns what could be a daunting cascade of definitions and theorems into a coherent and engaging progression of ideas. Acutely aware of the need for rigor, the student is much better prepared to understand what constitutes a proper mathematical proof and how to write one. Fifteen years of classroom experience with the first edition of Understanding Analysis have solidified and refined the central narrative of the second edition. Roughly 150 new exercises join a selection of the best exercises from the first edition, and three more project-sty...
DEFF Research Database (Denmark)
Moore, R; Brødsgaard, I; Miller, ML
1997-01-01
A quantitative method for validating qualitative interview results and checking sample parameters is described and illustrated using common pain descriptions among a sample of Anglo-American and mandarin Chinese patients and dentists matched by age and gender. Assumptions were that subjects were ... Measures of consistency in use of descriptors within groups, validity of description, accuracy of individuals compared with others in their group, and minimum required sample size were calculated using Cronbach's alpha, factor analysis, and Bayesian probability. Ethnic and professional differences within and across ... Methods of covalidating questionnaires that reflect results of qualitative interviews are recommended in order to estimate sample parameters such as intersubject agreement, individual subject accuracy, and minimum required sample sizes.
International Nuclear Information System (INIS)
Iorio, A.F.; Crespi, J.C.
1987-01-01
After ten years of operation at the Atucha I Nuclear Power Station, a gear belonging to a pressurized heavy water reactor refuelling machine failed. The gear box was used to operate the inlet-outlet heavy-water valve of the machine. Visual examination of the gear device showed an absence of lubricant and that several gear teeth were broken at the root. Motion was transmitted with a speed-reducing device with controlled adjustable times in order to produce a proper fit of the valve closure. The aim of this paper is to discuss the results of the gear failure analysis in order to recommend the proper solution to prevent further failures. (Author)
International Nuclear Information System (INIS)
1988-01-01
In a search for correlations between the elemental composition of trace elements in human stones and the stone types with relation to their growth pattern, a combined PIXE and x-ray diffraction spectrometry approach was implemented. The combination of scanning PIXE and XRD has proved to be an advance in the methodology of stone analysis and may point to the growth pattern in the body. The exact role of trace elements in the formation and growth of urinary stones is not fully understood. Efforts are thus continuing firstly to solve the analytical problems concerned and secondly to design suitable experiments that would provide information about the occurrence and distribution of trace elements in urine. 1 fig., 1 ref
International Nuclear Information System (INIS)
Straub, W.A.
1987-01-01
This review is the seventh in the series compiled by using the Dialog on-line CA Search facilities at the Information Resource Center of the USS Technical Center, covering the period from Oct. 1984 to Nov. 1, 1986. The quest for better surface properties, through the application of various electrochemical and other coating techniques, seems to have increased, and reinforces the notion that only through the value added to a steel by proper finishing steps can a major supplier hope to compete profitably. The detection, determination, and control of microalloying constituents have also been generating a lot of interest, as evidenced by the number of publications devoted to this subject in the last few years. Several recent review articles expand on the recent trends in the application of modern analytical technology to steelmaking. Another review has been devoted to the determination of trace elements and the simultaneous determination of elements in metals by mass spectrometry, atomic absorption spectrometry, and multielement emission spectrometry. Problems associated with the analysis of electroplating wastewaters have been reviewed in a recent publication that describes the use of various spectrophotometric methods for this purpose. The collection and treatment of analytical data in the modern steelmaking environment have been extensively reviewed, with emphasis on the interaction of the providers and users of the analytical data, its quality, and the cost of its collection. Raw material treatment and beneficiation was the dominant theme
Bhatia, Rajendra
1997-01-01
A good part of matrix theory is functional analytic in spirit. This statement can be turned around. There are many problems in operator theory, where most of the complexities and subtleties are present in the finite-dimensional case. My purpose in writing this book is to present a systematic treatment of methods that are useful in the study of such problems. This book is intended for use as a text for upper division and gradu ate courses. Courses based on parts of the material have been given by me at the Indian Statistical Institute and at the University of Toronto (in collaboration with Chandler Davis). The book should also be useful as a reference for research workers in linear algebra, operator theory, mathe matical physics and numerical analysis. A possible subtitle of this book could be Matrix Inequalities. A reader who works through the book should expect to become proficient in the art of deriving such inequalities. Other authors have compared this art to that of cutting diamonds. One first has to...
International Nuclear Information System (INIS)
Thomas, R.E.
1982-03-01
An evaluation is made of the suitability of analytical and statistical sampling methods for making uncertainty analyses. The adjoint method is found to be well-suited for obtaining sensitivity coefficients for computer programs involving large numbers of equations and input parameters. For this purpose the Latin Hypercube Sampling method is found to be inferior to conventional experimental designs. The Latin hypercube method can be used to estimate output probability density functions, but requires supplementary rank transformations followed by stepwise regression to obtain uncertainty information on individual input parameters. A simple Cork and Bottle problem is used to illustrate the efficiency of the adjoint method relative to certain statistical sampling methods. For linear models of the form Ax=b it is shown that a complete adjoint sensitivity analysis can be made without formulating and solving the adjoint problem. This can be done either by using a special type of statistical sampling or by reformulating the primal problem and using suitable linear programming software
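The Latin hypercube scheme evaluated above can be sketched in a few lines. This is a generic illustration (NumPy assumed; the function name, parameters and the unit-cube setting are illustrative, not taken from the report):

```python
import numpy as np

def latin_hypercube(n_samples, n_dims, seed=None):
    """Latin hypercube sample on the unit cube: each dimension is cut
    into n_samples equal strata, and exactly one point falls in each."""
    rng = np.random.default_rng(seed)
    # one uniform draw inside each stratum, per dimension
    u = rng.random((n_samples, n_dims))
    pts = (np.arange(n_samples)[:, None] + u) / n_samples
    # decouple the dimensions by shuffling each column independently
    for d in range(n_dims):
        pts[:, d] = rng.permutation(pts[:, d])
    return pts

sample = latin_hypercube(10, 3, seed=0)
```

Each column of `sample` contains exactly one point in every interval [k/10, (k+1)/10), which is the stratification property the report contrasts with conventional experimental designs.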
Energy Technology Data Exchange (ETDEWEB)
Ibsen, Lars Bo; Liingaard, M.
2006-12-15
This technical report concerns the basic theory and principles for experimental modal analysis. The sections within the report are: Output-only modal analysis software, general digital analysis, basics of structural dynamics and modal analysis and system identification. (au)
Theoretical numerical analysis a functional analysis framework
Atkinson, Kendall
2005-01-01
This textbook prepares graduate students for research in numerical analysis/computational mathematics by giving to them a mathematical framework embedded in functional analysis and focused on numerical analysis. This helps the student to move rapidly into a research program. The text covers basic results of functional analysis, approximation theory, Fourier analysis and wavelets, iteration methods for nonlinear equations, finite difference methods, Sobolev spaces and weak formulations of boundary value problems, finite element methods, elliptic variational inequalities and their numerical solu
International Nuclear Information System (INIS)
2003-08-01
This book deals with analysis of heat transfer, including nonlinear analysis examples, radiation heat transfer, analysis of heat transfer in ANSYS, verification of analysis results, analysis of transient heat transfer with automatic time stepping and open control, analysis of heat transfer using arrangement of ANSYS, resistance of thermal contact, coupled-field analysis such as thermal-structural interaction, cases of coupled-field analysis, and phase change.
Information security risk analysis
Peltier, Thomas R
2001-01-01
Effective Risk Analysis; Qualitative Risk Analysis; Value Analysis; Other Qualitative Methods; Facilitated Risk Analysis Process (FRAP); Other Uses of Qualitative Risk Analysis; Case Study; Appendix A: Questionnaire; Appendix B: Facilitated Risk Analysis Process Forms; Appendix C: Business Impact Analysis Forms; Appendix D: Sample of Report; Appendix E: Threat Definitions; Appendix F: Other Risk Analysis Opinions; Index
International Nuclear Information System (INIS)
Son, Seung Hui
2004-02-01
This book deals with information technology and business process, information system architecture, methods of system development, planning of system development such as problem analysis and feasibility analysis, cases of system development, comprehension of the analysis of user demands, analysis of user demands using traditional analysis, analysis of user demands using integrated information system architecture, system design using integrated information system architecture, system implementation, and system maintenance.
Preparation of Chemicals and Bulk Drug Substances for the U.S. Army Drug Development Program
1997-12-01
Turning to Chart No. 7, dextromethorphan hydrobromide was converted to the free base, then treated with 1-chloroethyl chloroformate to give the N... route, shown in Chart No. 9, was used in the current resynthesis. Commercial dextromethorphan hydrobromide was treated with concentrated hydrobromic
Analysis of Project Finance | Energy Analysis | NREL
NREL analysis helps potential renewable energy developers and investors gain insights into the complex world of project finance. Renewable energy project finance is complex, requiring knowledge of federal tax credits, state-level incentives, renewable
International Nuclear Information System (INIS)
Wright, A.C.D.
2002-01-01
This paper discusses the fundamentals of safety analysis in reactor design. Safety analysis is performed to show that the consequences of postulated accidents are acceptable. Safety analysis is also used to set the design of special safety systems, and includes design-assist analysis to support conceptual design. Safety analysis is necessary for licensing a reactor, maintaining an operating license, and supporting changes in plant operations.
An example of multidimensional analysis: Discriminant analysis
International Nuclear Information System (INIS)
Lutz, P.
1990-01-01
Among the approaches to multidimensional data analysis, lectures on discriminant analysis covering theoretical and practical aspects are presented. The discrimination problem, the analysis steps and the discrimination categories are stressed. Examples are given of descriptive historical analysis, discrimination for decision making, and the demonstration and separation of the top quark. In the linear discriminant analysis the following subjects are discussed: Huyghens' theorem, projection, the discriminant variable, geometrical interpretation, the case g=2, the classification method, and the separation of top events. Criteria allowing relevant results to be obtained are included [fr
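For the two-group case (g=2) mentioned in the lecture outline, the linear discriminant direction is w = Sw⁻¹(μ₁ − μ₀), the within-class-scatter-weighted difference of the class means. A minimal NumPy sketch (the synthetic data and all names are illustrative, not from the lecture):

```python
import numpy as np

def fisher_discriminant(X0, X1):
    """Two-class linear discriminant: w maximizes between-class over
    within-class scatter, w = Sw^{-1} (mu1 - mu0)."""
    mu0, mu1 = X0.mean(axis=0), X1.mean(axis=0)
    # pooled within-class scatter matrix
    Sw = np.cov(X0, rowvar=False) * (len(X0) - 1) \
       + np.cov(X1, rowvar=False) * (len(X1) - 1)
    w = np.linalg.solve(Sw, mu1 - mu0)
    threshold = w @ (mu0 + mu1) / 2   # midpoint rule for classification
    return w, threshold

rng = np.random.default_rng(0)
X0 = rng.normal(0.0, 1.0, size=(200, 2))   # "background" class
X1 = rng.normal(3.0, 1.0, size=(200, 2))   # "signal" class
w, t = fisher_discriminant(X0, X1)
acc = ((X1 @ w > t).mean() + (X0 @ w <= t).mean()) / 2
```

Projecting events onto `w` yields the one-dimensional discriminant variable on which the classification (e.g. separating top events from background) is performed.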
Also known as: sperm analysis, sperm count, seminal fluid analysis. Formal name: semen analysis. This ... semen viscosity (consistency or thickness of the semen); sperm count (total number of sperm); sperm concentration (density): number ...
Papageorgiou, Nikolaos S
2009-01-01
Offers an examination of important theoretical methods and procedures in applied analysis. This book details the important theoretical trends in nonlinear analysis and applications to different fields. It is suitable for those working on nonlinear analysis.
Shape analysis in medical image analysis
Tavares, João
2014-01-01
This book contains thirteen contributions from invited experts of international recognition addressing important issues in shape analysis in medical image analysis, including techniques for image segmentation, registration, modelling and classification, and applications in biology, as well as in cardiac, brain, spine, chest, lung and clinical practice. This volume treats topics such as, anatomic and functional shape representation and matching; shape-based medical image segmentation; shape registration; statistical shape analysis; shape deformation; shape-based abnormity detection; shape tracking and longitudinal shape analysis; machine learning for shape modeling and analysis; shape-based computer-aided-diagnosis; shape-based medical navigation; benchmark and validation of shape representation, analysis and modeling algorithms. This work will be of interest to researchers, students, and manufacturers in the fields of artificial intelligence, bioengineering, biomechanics, computational mechanics, computationa...
MMAB SST Analysis (Marine Modeling and Analysis Branch, EMC): the (RTG_SST_HR) analysis. For a regional map, click the desired area in the global SST analysis and anomaly maps
Foundations of factor analysis
Mulaik, Stanley A
2009-01-01
Introduction; Factor Analysis and Structural Theories; Brief History of Factor Analysis as a Linear Model; Example of Factor Analysis; Mathematical Foundations for Factor Analysis; Scalar Algebra; Vectors; Matrix Algebra; Determinants; Treatment of Variables as Vectors; Maxima and Minima of Functions; Composite Variables and Linear Transformations; Composite Variables; Unweighted Composite Variables; Differentially Weighted Composites; Matrix Equations; Multi...
International Nuclear Information System (INIS)
PECH, S.H.
2000-01-01
This report describes the methodology used in conducting the K Basins Hazard Analysis, which provides the foundation for the K Basins Final Safety Analysis Report. This hazard analysis was performed in accordance with guidance provided by DOE-STD-3009-94, Preparation Guide for U. S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports and implements the requirements of DOE Order 5480.23, Nuclear Safety Analysis Report
Quantitative analysis chemistry
International Nuclear Information System (INIS)
Ko, Wansuk; Lee, Choongyoung; Jun, Kwangsik; Hwang, Taeksung
1995-02-01
This book is about quantitative analytical chemistry. It is divided into ten chapters, which deal with basic concepts of matter, the meaning of analytical chemistry and SI units, chemical equilibrium, basic preparation for quantitative analysis, an introduction to volumetric analysis, an outline of acid-base titration with experimental examples, chelate titration, oxidation-reduction titration (introduction, titration curves, and diazotization titration), precipitation titration, electrometric titration, and quantitative analysis.
Energy Technology Data Exchange (ETDEWEB)
PECH, S.H.
2000-08-23
This report describes the methodology used in conducting the K Basins Hazard Analysis, which provides the foundation for the K Basins Final Safety Analysis Report. This hazard analysis was performed in accordance with guidance provided by DOE-STD-3009-94, Preparation Guide for U. S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports and implements the requirements of DOE Order 5480.23, Nuclear Safety Analysis Report.
International Nuclear Information System (INIS)
WEBB, R.H.
1999-01-01
This report describes the methodology used in conducting the K Basins Hazard Analysis, which provides the foundation for the K Basins Safety Analysis Report (HNF-SD-WM-SAR-062/Rev.4). This hazard analysis was performed in accordance with guidance provided by DOE-STD-3009-94, Preparation Guide for U. S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports and implements the requirements of DOE Order 5480.23, Nuclear Safety Analysis Report
Philipp Mayring
2000-01-01
The article describes an approach of systematic, rule guided qualitative text analysis, which tries to preserve some methodological strengths of quantitative content analysis and widen them to a concept of qualitative procedure. First the development of content analysis is delineated and the basic principles are explained (units of analysis, step models, working with categories, validity and reliability). Then the central procedures of qualitative content analysis, inductive development of ca...
RELIABILITY ANALYSIS OF BENDING ...
African Journals Online (AJOL)
Reliability analysis of the safety levels of the criteria slabs has been ... It was also noted [2] that if the risk level β < 3.1, the ... reliability analysis. A study [6] has shown that all geometric variables ...
DTI analysis methods : Voxel-based analysis
Van Hecke, Wim; Leemans, Alexander; Emsell, Louise
2016-01-01
Voxel-based analysis (VBA) of diffusion tensor imaging (DTI) data permits the investigation of voxel-wise differences or changes in DTI metrics in every voxel of a brain dataset. It is applied primarily in the exploratory analysis of hypothesized group-level alterations in DTI parameters, as it does
Analysis of Precision of Activation Analysis Method
DEFF Research Database (Denmark)
Heydorn, Kaj; Nørgaard, K.
1973-01-01
The precision of an activation-analysis method prescribes the estimation of the precision of a single analytical result. The adequacy of these estimates to account for the observed variation between duplicate results from the analysis of different samples and materials, is tested by the statistic T...
Hazard Analysis Database Report
Grams, W H
2000-01-01
The Hazard Analysis Database was developed in conjunction with the hazard analysis activities conducted in accordance with DOE-STD-3009-94, Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports, for HNF-SD-WM-SAR-067, Tank Farms Final Safety Analysis Report (FSAR). The FSAR is part of the approved Authorization Basis (AB) for the River Protection Project (RPP). This document describes, identifies, and defines the contents and structure of the Tank Farms FSAR Hazard Analysis Database and documents the configuration control changes made to the database. The Hazard Analysis Database contains the collection of information generated during the initial hazard evaluations and the subsequent hazard and accident analysis activities. The Hazard Analysis Database supports the preparation of Chapters 3, 4, and 5 of the Tank Farms FSAR and the Unreviewed Safety Question (USQ) process and consists of two major, interrelated data sets: (1) Hazard Analysis Database: Data from t...
Santiago, John
2013-01-01
Circuits overloaded from electric circuit analysis? Many universities require that students pursuing a degree in electrical or computer engineering take an Electric Circuit Analysis course to determine who will "make the cut" and continue in the degree program. Circuit Analysis For Dummies will help these students to better understand electric circuit analysis by presenting the information in an effective and straightforward manner. Circuit Analysis For Dummies gives you clear-cut information about the topics covered in an electric circuit analysis course to help
Cluster analysis for applications
Anderberg, Michael R
1973-01-01
Cluster Analysis for Applications deals with methods and various applications of cluster analysis. Topics covered range from variables and scales to measures of association among variables and among data units. Conceptual problems in cluster analysis are discussed, along with hierarchical and non-hierarchical clustering methods. The necessary elements of data analysis, statistics, cluster analysis, and computer implementation are integrated vertically to cover the complete path from raw data to a finished analysis.Comprised of 10 chapters, this book begins with an introduction to the subject o
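As a concrete instance of the non-hierarchical clustering methods the book covers, here is a minimal k-means sketch in Python (NumPy assumed; the farthest-point seeding and all names are illustrative choices, not the book's):

```python
import numpy as np

def kmeans(X, k, n_iter=50):
    """Minimal k-means: seed centroids with a farthest-point heuristic,
    then alternate assigning points to the nearest centroid and
    recomputing centroids as cluster means."""
    centroids = [X[0]]
    for _ in range(1, k):
        # next seed: the point farthest from all chosen seeds
        d = np.min([np.linalg.norm(X - c, axis=1) for c in centroids], axis=0)
        centroids.append(X[np.argmax(d)])
    centroids = np.array(centroids)
    for _ in range(n_iter):
        # distance from every data unit to every centroid
        d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        centroids = np.array([X[labels == j].mean(axis=0) for j in range(k)])
    return labels, centroids

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 0.3, (50, 2)),   # two well-separated groups
               rng.normal(5, 0.3, (50, 2))])
labels, centroids = kmeans(X, 2)
```

The same path from raw data to finished partition (choose a dissimilarity measure, cluster, inspect) is what the book integrates across its chapters.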
Activation analysis in food analysis. Pt. 9
International Nuclear Information System (INIS)
Szabo, S.A.
1992-01-01
An overview is presented of the application of activation analysis (AA) techniques for food analysis, as reflected at a recent international conference titled Activation Analysis and its Applications. The most popular analytical techniques include instrumental neutron AA (INAA or NAA), radiochemical NAA (RNAA), X-ray fluorescence analysis and mass spectrometry. Data are presented for the multielemental NAA of instant soups, for the elemental composition of drinking water in Iraq, for the Na, K and Mn contents of various Indian rices, for As, Hg, Sb and Se determination in various seafoods, for daily microelement intake in China, and for the elemental composition of Chinese teas. Expected development trends in AA are outlined. (R.P.) 24 refs.; 8 tabs
Joint fluid analysis; joint fluid aspiration ... El-Gabalawy HS. Synovial fluid analysis, synovial biopsy, and synovial pathology. In: Firestein GS, Budd RC, Gabriel SE, McInnes IB, O'Dell JR, eds. Kelley's Textbook of ...
International Nuclear Information System (INIS)
Burgess, R.L.
1978-01-01
Progress is reported on the following research programs: analysis and modeling of ecosystems; EDFB/IBP data center; biome analysis studies; land/water interaction studies; and computer programs for development of models
Confirmatory Composite Analysis
Schuberth, Florian; Henseler, Jörg; Dijkstra, Theo K.
2018-01-01
We introduce confirmatory composite analysis (CCA) as a structural equation modeling technique that aims at testing composite models. CCA entails the same steps as confirmatory factor analysis: model specification, model identification, model estimation, and model testing. Composite models are
Introductory numerical analysis
Pettofrezzo, Anthony J
2006-01-01
Written for undergraduates who require a familiarity with the principles behind numerical analysis, this classical treatment encompasses finite differences, least squares theory, and harmonic analysis. Over 70 examples and 280 exercises. 1967 edition.
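The finite-difference material can be illustrated in a few lines: the second-order central difference for f'(x) (a generic sketch in pure Python, not taken from the book):

```python
import math

def central_diff(f, x, h=1e-5):
    """Central difference approximation of f'(x).
    Truncation error is O(h^2), versus O(h) for the one-sided formula."""
    return (f(x + h) - f(x - h)) / (2 * h)

approx = central_diff(math.sin, 1.0)   # derivative of sin is cos
exact = math.cos(1.0)
```

Halving `h` quarters the truncation error, the kind of convergence behaviour the exercises ask students to verify.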
Gap Analysis: Application to Earned Value Analysis
Langford, Gary O.; Franck, Raymond (Chip)
2008-01-01
Sponsored Report (for Acquisition Research Program) Earned Value is regarded as a useful tool to monitor commercial and defense system acquisitions. This paper applies the theoretical foundations and systematics of Gap Analysis to improve Earned Value Management. As currently implemented, Earned Value inaccurately provides a higher value for the work performed. This preliminary research indicates that Earned Value calculations can be corrected. Value Analysis, properly defined and enacted,...
Importance-performance analysis based SWOT analysis
Phadermrod, Boonyarat; Crowder, Richard M.; Wills, Gary B.
2016-01-01
SWOT analysis, a commonly used tool for strategic planning, is traditionally a form of brainstorming. Hence, it has been criticised that it is likely to hold subjective views of the individuals who participate in a brainstorming session and that SWOT factors are not prioritized by their significance thus it may result in an improper strategic action. While most studies of SWOT analysis have only focused on solving these shortcomings separately, this study offers an approach to diminish both s...
Discourse analysis and Foucault's
Directory of Open Access Journals (Sweden)
Jansen I.
2008-01-01
Discourse analysis is a method that until now has been little recognized in nursing science, although more recently nursing scientists have been discovering it for their purposes. However, several authors have criticized that discourse analysis is often misinterpreted because of a lack of understanding of its theoretical background. In this article, I reconstruct Foucault's writings in his "Archaeology of Knowledge" to provide a theoretical base for future archaeological discourse analysis, which can be categorized as a socio-linguistic discourse analysis.
Yuan, Ying; MacKinnon, David P.
2009-01-01
This article proposes Bayesian analysis of mediation effects. Compared to conventional frequentist mediation analysis, the Bayesian approach has several advantages. First, it allows researchers to incorporate prior information into the mediation analysis, thus potentially improving the efficiency of estimates. Second, under the Bayesian mediation analysis, inference is straightforward and exact, which makes it appealing for studies with small samples. Third, the Bayesian approach is conceptua...
Fitzmaurice, Garrett M; Ware, James H
2012-01-01
Praise for the First Edition: ". . . [this book] should be on the shelf of everyone interested in . . . longitudinal data analysis." (Journal of the American Statistical Association) Features newly developed topics and applications of the analysis of longitudinal data. Applied Longitudinal Analysis, Second Edition presents modern methods for analyzing data from longitudinal studies and now features the latest state-of-the-art techniques. The book emphasizes practical, rather than theoretical, aspects of methods for the analysis of diverse types of lo...
Regression analysis by example
Chatterjee, Samprit
2012-01-01
Praise for the Fourth Edition: "This book is . . . an excellent source of examples for regression analysis. It has been and still is readily readable and understandable." (Journal of the American Statistical Association) Regression analysis is a conceptually simple method for investigating relationships among variables. Carrying out a successful application of regression analysis, however, requires a balance of theoretical results, empirical rules, and subjective judgment. Regression Analysis by Example, Fifth Edition has been expanded
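In the example-driven spirit of the book, a minimal ordinary least-squares fit of y = b0 + b1·x (NumPy assumed; the data are synthetic and illustrative):

```python
import numpy as np

# synthetic data: true intercept 2.0, true slope 0.5, small noise
rng = np.random.default_rng(0)
x = np.linspace(0, 10, 50)
y = 2.0 + 0.5 * x + rng.normal(0, 0.1, size=x.shape)

# design matrix with an intercept column, solved by least squares
X = np.column_stack([np.ones_like(x), x])
beta, residuals, rank, _ = np.linalg.lstsq(X, y, rcond=None)

# coefficient of determination R^2 from the fitted values
r2 = 1 - ((y - X @ beta) ** 2).sum() / ((y - y.mean()) ** 2).sum()
```

The theoretical result (the least-squares estimator) does the fitting; the empirical rules and subjective judgment the book stresses enter when inspecting residuals and deciding whether the linear form is adequate.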
2014-01-01
M.Ing. (Electrical & Electronic Engineering). One of the most important steps to be taken before a site is selected for the extraction of wind energy is the analysis of the energy within the wind on that particular site. No wind energy analysis system exists for the measurement and analysis of wind power. This dissertation documents the design and development of a Wind Energy Analysis System (WEAS). Using a micro-controller-based design in conjunction with sensors, WEAS measures, calcu...
Slice hyperholomorphic Schur analysis
Alpay, Daniel; Sabadini, Irene
2016-01-01
This book defines and examines the counterpart of Schur functions and Schur analysis in the slice hyperholomorphic setting. It is organized into three parts: the first introduces readers to classical Schur analysis, while the second offers background material on quaternions, slice hyperholomorphic functions, and quaternionic functional analysis. The third part represents the core of the book and explores quaternionic Schur analysis and its various applications. The book includes previously unpublished results and provides the basis for new directions of research.
Computational movement analysis
Laube, Patrick
2014-01-01
This SpringerBrief discusses the characteristics of spatiotemporal movement data, including uncertainty and scale. It investigates three core aspects of Computational Movement Analysis: Conceptual modeling of movement and movement spaces, spatiotemporal analysis methods aiming at a better understanding of movement processes (with a focus on data mining for movement patterns), and using decentralized spatial computing methods in movement analysis. The author presents Computational Movement Analysis as an interdisciplinary umbrella for analyzing movement processes with methods from a range of fi
Thompson, Cheryl Bagley
2009-01-01
This 13th article of the Basics of Research series is first in a short series on statistical analysis. These articles will discuss creating your statistical analysis plan, levels of measurement, descriptive statistics, probability theory, inferential statistics, and general considerations for interpretation of the results of a statistical analysis.
Trend Analysis Using Microcomputers.
Berger, Carl F.
A trend analysis statistical package and additional programs for the Apple microcomputer are presented. They illustrate strategies of data analysis suitable to the graphics and processing capabilities of the microcomputer. The programs analyze data sets using examples of: (1) analysis of variance with multiple linear regression; (2) exponential…
Yuan, Ying; MacKinnon, David P.
2009-01-01
In this article, we propose Bayesian analysis of mediation effects. Compared with conventional frequentist mediation analysis, the Bayesian approach has several advantages. First, it allows researchers to incorporate prior information into the mediation analysis, thus potentially improving the efficiency of estimates. Second, under the Bayesian…
Automation of activation analysis
International Nuclear Information System (INIS)
Ivanov, I.N.; Ivanets, V.N.; Filippov, V.V.
1985-01-01
The basic data on the methods and equipment of activation analysis are presented. Recommendations on the selection of activation analysis techniques, especially techniques envisaging the use of short-lived isotopes, are given. The possibilities of the equipment to increase dataway carrying capacity, using modern computers for the automation of the analysis and data-processing procedure, are shown
Cuesta, Hector
2013-01-01
Each chapter of the book quickly introduces a key 'theme' of Data Analysis, before immersing you in the practical aspects of each theme. You'll learn quickly how to perform all aspects of Data Analysis.Practical Data Analysis is a book ideal for home and small business users who want to slice & dice the data they have on hand with minimum hassle.
Mathematical analysis fundamentals
Bashirov, Agamirza
2014-01-01
The author's goal is a rigorous presentation of the fundamentals of analysis, starting from elementary level and moving to the advanced coursework. The curricula of all mathematics (pure or applied) and physics programs include a compulsory course in mathematical analysis. This book can serve as the main textbook for such (one-semester) courses. The book can also serve as additional reading for such courses as real analysis, functional analysis, harmonic analysis etc. For non-math major students requiring math beyond calculus, this is a more friendly approach than many math-centric o
Foundations of mathematical analysis
Johnsonbaugh, Richard
2010-01-01
This classroom-tested volume offers a definitive look at modern analysis, with views of applications to statistics, numerical analysis, Fourier series, differential equations, mathematical analysis, and functional analysis. Upper-level undergraduate students with a background in calculus will benefit from its teachings, along with beginning graduate students seeking a firm grounding in modern analysis. A self-contained text, it presents the necessary background on the limit concept, and the first seven chapters could constitute a one-semester introduction to limits. Subsequent chapters discuss
Analysis in usability evaluations
DEFF Research Database (Denmark)
Følstad, Asbjørn; Lai-Chong Law, Effie; Hornbæk, Kasper
2010-01-01
While the planning and implementation of usability evaluations are well described in the literature, the analysis of the evaluation data is not. We present interviews with 11 usability professionals on how they conduct analysis, describing the resources, collaboration, creation of recommendations, and prioritization involved. The interviews indicate a lack of structure in the analysis process and suggest activities, such as generating recommendations, that are unsupported by existing methods. We discuss how to better support analysis, and propose four themes for future research on analysis in usability...
Death rattle: prevalence, prevention and treatment.
Wildiers, Hans; Menten, Johan
2002-04-01
A retrospective analysis was performed to study the occurrence and treatment of death rattle (DR) in 107 consecutive dying patients on the palliative care unit of the University Hospital Leuven. The incidence of DR (23%) is lower than reported in the literature, possibly due to low hydration. We found 2 types of rattle: "real DR" responds generally very well to anticholinergic therapy, and is probably caused by non-expectorated secretions. "Pseudo DR" is poorly responsive to therapy and is probably caused by bronchial secretions due to pulmonary pathology, such as infection, tumor, fluid retention, or aspiration. Rattle disappeared in >90% of the patients with real DR. Real DR is a strong predictor for death, and 76% (19/25) died within 48 h after onset. Administration of subcutaneous hyoscine hydrobromide, as a bolus or continuous infusion, is effective therapy for real DR and is comfortable for the patient and caregivers.
Synthesis of potential schistosomicides: new 2-(alkylamino)-1-octanethiosulfuric acids
International Nuclear Information System (INIS)
Oliveira Penido, M.L. de; Nelson, D.L.; Pilo-Veloso, D.
1990-01-01
Four new 2-(alkylamino)-1-octanethiosulfuric acids (1) were synthesized from 1-octene. 1-Octene was epoxidized with MCPBA or with a two-phase system composed of H2O2, sodium tungstate, phosphoric acid, 1-octene and a phase-transfer agent. Reaction of the 1,2-epoxyoctane with primary amines furnished 1-(alkylamino)-2-octanols, which were converted to the respective N-alkyl-2-bromo-1-octanamine hydrobromides by reaction with hydrobromic acid, followed by phosphorus tribromide. Finally, substitution of the bromide ion with sodium thiosulfate was accompanied by rearrangement via an aziridine intermediate, resulting in formation of the products 1. The intermediates and the final products were screened for activity against infection by Schistosoma mansoni; only the final products in which the N-alkyl group was sec-butyl or isopropyl exhibited activity. Nuclear magnetic resonance, infrared and mass spectrometry analyses are presented. (author) [pt
DEFF Research Database (Denmark)
Bøving, Kristian Billeskov; Simonsen, Jesper
2004-01-01
This article documents how log analysis can inform qualitative studies concerning the usage of web-based information systems (WIS). No prior research has used http log files as data to study collaboration between multiple users in organisational settings. We investigate how to perform http log analysis; what http log analysis says about the nature of collaborative WIS use; and how results from http log analysis may support other data collection methods such as surveys, interviews, and observation. The analysis of log files initially lends itself to research designs which serve to test hypotheses using a quantitative methodology. We show that http log analysis can also be valuable in qualitative research such as case studies. The results from http log analysis can be triangulated with other data sources and, for example, serve as a means of supporting the interpretation of interview data.
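The kind of http log analysis discussed above can be illustrated with a small sketch. The log format and the sample lines below are hypothetical (a simplified Apache Common Log Format), not data from the study:

```python
import re
from collections import Counter

# Simplified Common Log Format pattern; real WIS logs may differ.
LOG_RE = re.compile(
    r'(?P<host>\S+) \S+ (?P<user>\S+) \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) \S+" (?P<status>\d{3}) \S+'
)

def requests_per_user(lines):
    """Count successful (2xx) requests per authenticated user."""
    counts = Counter()
    for line in lines:
        m = LOG_RE.match(line)
        if m and m.group("status").startswith("2") and m.group("user") != "-":
            counts[m.group("user")] += 1
    return counts

sample = [
    '10.0.0.1 - alice [10/Oct/2000:13:55:36 -0700] "GET /doc/index.html HTTP/1.0" 200 2326',
    '10.0.0.2 - bob [10/Oct/2000:13:56:01 -0700] "GET /doc/plan.html HTTP/1.0" 404 512',
    '10.0.0.1 - alice [10/Oct/2000:13:57:12 -0700] "POST /doc/edit HTTP/1.0" 200 128',
]
print(requests_per_user(sample))  # Counter({'alice': 2})
```

Per-user counts such as these are the kind of quantitative result that can then be triangulated with interviews and observation.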
Amir Farbin
The ATLAS Analysis Model is a continually developing vision of how to reconcile physics analysis requirements with the ATLAS offline software and computing model constraints. In the past year this vision has influenced the evolution of the ATLAS Event Data Model, the Athena software framework, and physics analysis tools. These developments, along with the October Analysis Model Workshop and the planning for CSC analyses, have led to a rapid refinement of the ATLAS Analysis Model in the past few months. This article introduces some of the relevant issues and presents the current vision of the future ATLAS Analysis Model. Event Data Model The ATLAS Event Data Model (EDM) consists of several levels of detail, each targeted at a specific set of tasks. For example, the Event Summary Data (ESD) stores calorimeter cells and tracking system hits, thereby permitting many calibration and alignment tasks, but will be accessible only at particular computing sites with potentially large latency. In contrast, the Analysis...
Multivariate analysis with LISREL
Jöreskog, Karl G; Y Wallentin, Fan
2016-01-01
This book traces the theory and methodology of multivariate statistical analysis and shows how it can be conducted in practice using the LISREL computer program. It presents not only the typical uses of LISREL, such as confirmatory factor analysis and structural equation models, but also several other multivariate analysis topics, including regression (univariate, multivariate, censored, logistic, and probit), generalized linear models, multilevel analysis, and principal component analysis. It provides numerous examples from several disciplines and discusses and interprets the results, illustrated with sections of output from the LISREL program, in the context of the example. The book is intended for masters and PhD students and researchers in the social, behavioral, economic and many other sciences who require a basic understanding of multivariate statistical theory and methods for their analysis of multivariate data. It can also be used as a textbook on various topics of multivariate statistical analysis.
International Nuclear Information System (INIS)
Actis, Oxana; Brodski, Michael; Erdmann, Martin; Fischer, Robert; Hinzmann, Andreas; Mueller, Gero; Muenzer, Thomas; Plum, Matthias; Steggemann, Jan; Winchen, Tobias; Klimkovich, Tatsiana
2010-01-01
VISPA is a development environment for high energy physics analyses which enables physicists to combine graphical and textual work. A physics analysis cycle consists of prototyping, performing, and verifying the analysis. The main feature of VISPA is a multipurpose window for visual steering of analysis steps, creation of analysis templates, and browsing physics event data at different steps of an analysis. VISPA follows an experiment-independent approach and incorporates various tools for steering and controlling required in a typical analysis. Connection to different frameworks of high energy physics experiments is achieved by using different types of interfaces. We present the look-and-feel for an example physics analysis at the LHC and explain the underlying software concepts of VISPA.
Cost benefit analysis cost effectiveness analysis
International Nuclear Information System (INIS)
Lombard, J.
1986-09-01
The comparison of various protection options in order to determine which is the best compromise between cost of protection and residual risk is the purpose of the ALARA procedure. The use of decision-aiding techniques is valuable as an aid to selection procedures. The purpose of this study is to introduce two rather simple and well known decision aiding techniques: the cost-effectiveness analysis and the cost-benefit analysis. These two techniques are relevant for the great part of ALARA decisions which need the use of a quantitative technique. The study is based on an hypothetical case of 10 protection options. Four methods are applied to the data
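The two techniques can be contrasted with a toy sketch (option names, costs, doses averted, and the monetary value ALPHA are hypothetical, not the report's ten-option data set): cost-effectiveness ranks options by cost per unit of collective dose averted, while cost-benefit selects the option maximising monetized benefit minus cost.

```python
# Hypothetical protection options: (name, cost in k$, collective dose averted in man-Sv)
options = [("A", 50, 0.8), ("B", 120, 1.6), ("C", 300, 1.9)]

ALPHA = 100.0  # assumed monetary value of averted dose, k$ per man-Sv

def cost_effectiveness(opts):
    """Rank options by cost per man-Sv averted (lowest ratio first)."""
    return sorted(opts, key=lambda o: o[1] / o[2])

def cost_benefit(opts):
    """Select the option maximising net benefit: ALPHA * dose_averted - cost."""
    return max(opts, key=lambda o: ALPHA * o[2] - o[1])

print(cost_effectiveness(options)[0][0])  # A (62.5 k$/man-Sv is the lowest ratio)
print(cost_benefit(options)[0])           # B (net benefit 40 k$ beats A's 30 k$)
```

Note that the two criteria need not agree: here cost-effectiveness prefers A while cost-benefit prefers B, which is precisely why comparing decision-aiding techniques on the same data is instructive.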
International Nuclear Information System (INIS)
Sommer, S; Tinh Tran, T.
2008-01-01
Washington Safety Management Solutions, LLC developed web-based software to improve the efficiency and consistency of hazard identification and analysis, control selection and classification, and to standardize analysis reporting at Savannah River Site. In the new nuclear age, information technology provides methods to improve the efficiency of the documented safety analysis development process which includes hazard analysis activities. This software provides a web interface that interacts with a relational database to support analysis, record data, and to ensure reporting consistency. A team of subject matter experts participated in a series of meetings to review the associated processes and procedures for requirements and standard practices. Through these meetings, a set of software requirements were developed and compiled into a requirements traceability matrix from which software could be developed. The software was tested to ensure compliance with the requirements. Training was provided to the hazard analysis leads. Hazard analysis teams using the software have verified its operability. The software has been classified as NQA-1, Level D, as it supports the analysis team but does not perform the analysis. The software can be transported to other sites with alternate risk schemes. The software is being used to support the development of 14 hazard analyses. User responses have been positive with a number of suggestions for improvement which are being incorporated as time permits. The software has enforced a uniform implementation of the site procedures. The software has significantly improved the efficiency and standardization of the hazard analysis process
Functional analysis and applications
Siddiqi, Abul Hasan
2018-01-01
This self-contained textbook discusses all major topics in functional analysis. Combining classical materials with new methods, it supplies numerous relevant solved examples and problems and discusses the applications of functional analysis in diverse fields. The book is unique in its scope, and a variety of applications of functional analysis and operator-theoretic methods are devoted to each area of application. Each chapter includes a set of problems, some of which are routine and elementary, and some of which are more advanced. The book is primarily intended as a textbook for graduate and advanced undergraduate students in applied mathematics and engineering. It offers several attractive features making it ideally suited for courses on functional analysis intended to provide a basic introduction to the subject and the impact of functional analysis on applied and computational mathematics, nonlinear functional analysis and optimization. It introduces emerging topics like wavelets, Gabor system, inverse pro...
DEFF Research Database (Denmark)
Bemman, Brian; Meredith, David
In recent years, a significant body of research has focused on developing algorithms for computing analyses of musical works automatically from encodings of these works' surfaces [3,4,7,10,11]. The quality of the output of such analysis algorithms is typically evaluated by comparing it with a "ground truth" analysis of the same music produced by a human expert (see, in particular, [5]). In this paper, we explore the problem of generating an encoding of the musical surface of a work automatically from a systematic encoding of an analysis. The ability to do this depends on one having an effective (i.e., computable), correct and complete description of some aspect of the structure of the music. Generating the surface structure of a piece from an analysis in this manner serves as a proof of the analysis' correctness, effectiveness and completeness. We present a reductive analysis...
Fundamentals of functional analysis
Farenick, Douglas
2016-01-01
This book provides a unique path for graduate or advanced undergraduate students to begin studying the rich subject of functional analysis with fewer prerequisites than is normally required. The text begins with a self-contained and highly efficient introduction to topology and measure theory, which focuses on the essential notions required for the study of functional analysis, and which are often buried within full-length overviews of the subjects. This is particularly useful for those in applied mathematics, engineering, or physics who need to have a firm grasp of functional analysis, but not necessarily some of the more abstruse aspects of topology and measure theory normally encountered. The reader is assumed to only have knowledge of basic real analysis, complex analysis, and algebra. The latter part of the text provides an outstanding treatment of Banach space theory and operator theory, covering topics not usually found together in other books on functional analysis. Written in a clear, concise manner,...
DEFF Research Database (Denmark)
This book provides an in-depth introduction and overview of current research in computational music analysis. Its seventeen chapters, written by leading researchers, collectively represent the diversity as well as the technical and philosophical sophistication of the work being done today … on well-established theories in music theory and analysis, such as Forte's pitch-class set theory, Schenkerian analysis, the methods of semiotic analysis developed by Ruwet and Nattiez, and Lerdahl and Jackendoff's Generative Theory of Tonal Music. The book is divided into six parts, covering … music analysis, the book provides an invaluable resource for researchers, teachers and students in music theory and analysis, computer science, music information retrieval and related disciplines. It also provides a state-of-the-art reference for practitioners in the music technology industry.
Analysis apparatus and method of analysis
International Nuclear Information System (INIS)
1976-01-01
A continuous streaming method developed for the execution of immunoassays is described in this patent. In addition, a suitable apparatus for the method was developed, whereby magnetic particles are automatically employed for the consecutive analysis of a series of liquid samples via the RIA technique
International Nuclear Information System (INIS)
Dougherty, E.M.; Fragola, J.R.
1988-01-01
The authors present a treatment of human reliability analysis incorporating an introduction to probabilistic risk assessment for nuclear power generating stations. They treat the subject according to the framework established for general systems theory, drawing upon reliability analysis, psychology, human factors engineering, and statistics, and integrating elements of these fields within a systems framework. The book provides a history of human reliability analysis and includes examples of the application of the systems approach
Emission spectrochemical analysis
International Nuclear Information System (INIS)
Rives, R.D.; Bruks, R.R.
1983-01-01
The emission spectrochemical method of analysis, based on the fact that atoms of elements can be excited in an electric arc or in a laser beam and will emit radiation with characteristic wavelengths, is considered. The review contains data on the spectrochemical analysis of liquids and geological materials and a scheme of a laser microprobe. The main characteristics of emission spectroscopy, atomic absorption spectroscopy and X-ray fluorescence analysis are generalized
International Nuclear Information System (INIS)
Crawford, H.J.; Lindstrom, P.J.
1983-06-01
Our analysis program LULU has proven very useful in all stages of experiment analysis, from prerun detector debugging through final data reduction. It has solved our problem of having arbitrary word length events and is easy enough to use that many separate experimenters are now analyzing with LULU. The ability to use the same software for all stages of experiment analysis greatly eases the programming burden. We may even get around to making the graphics elegant someday
Mastering Clojure data analysis
Rochester, Eric
2014-01-01
This book consists of a practical, example-oriented approach that aims to help you learn how to use Clojure for data analysis quickly and efficiently. This book is great for those who have experience with Clojure and who need to use it to perform data analysis. It will also be hugely beneficial for readers with basic experience in data analysis and statistics.
Fast neutron activation analysis
International Nuclear Information System (INIS)
Pepelnik, R.
1986-01-01
Since 1981, numerous 14 MeV neutron activation analyses have been performed at Korona. On the basis of that work, the advantages of this analysis technique and the results obtained with it are compared with those of other analytical methods. The procedure of activation analysis, the characteristics of Korona, some analytical investigations in environmental research and material physics, as well as sources of systematic errors in trace analysis are described. (orig.)
Crisan, Dan
2011-01-01
"Stochastic Analysis" aims to provide mathematical tools to describe and model high dimensional random systems. Such tools arise in the study of Stochastic Differential Equations and Stochastic Partial Differential Equations, Infinite Dimensional Stochastic Geometry, Random Media and Interacting Particle Systems, Super-processes, Stochastic Filtering, Mathematical Finance, etc. Stochastic Analysis has emerged as a core area of late 20th century Mathematics and is currently undergoing a rapid scientific development. The special volume "Stochastic Analysis 2010" provides a sa
The ATLAS Analysis Architecture
International Nuclear Information System (INIS)
Cranmer, K.S.
2008-01-01
We present an overview of the ATLAS analysis architecture including the relevant aspects of the computing model and the major architectural aspects of the Athena framework. Emphasis will be given to the interplay between the analysis use cases and the technical aspects of the architecture including the design of the event data model, transient-persistent separation, data reduction strategies, analysis tools, and ROOT interoperability
Circuit analysis with Multisim
Baez-Lopez, David
2011-01-01
This book is concerned with circuit simulation using National Instruments Multisim. It focuses on the use and comprehension of the working techniques for electrical and electronic circuit simulation. The first chapters are devoted to basic circuit analysis. It starts by describing in detail how to perform a DC analysis using only resistors and independent and controlled sources. Then, it introduces capacitors and inductors to make a transient analysis. In the case of transient analysis, it is possible to have an initial condition either in the capacitor voltage or in the inductor current, or bo
Textile Technology Analysis Lab
Federal Laboratory Consortium — The Textile Analysis Lab is built for evaluating and characterizing the physical properties of an array of textile materials, but specifically those used in aircrew...
DEFF Research Database (Denmark)
Sørensen, Olav Jull
2009-01-01
The review presents the book International Market Analysis: Theories and Methods, written by John Kuiada, professor at the Centre of International Business, Department of Business Studies, Aalborg University. The book is refreshingly new in its way of looking at a classical problem: it looks at market analysis from the point of view of ways of thinking about markets. Furthermore, the book includes the concept of learning in the analysis of markets and how the way we understand business reality influences our choice of methodology for market analysis.
Chemical Security Analysis Center
Federal Laboratory Consortium — In 2006, by Presidential Directive, DHS established the Chemical Security Analysis Center (CSAC) to identify and assess chemical threats and vulnerabilities in the...
Geospatial Data Analysis Facility
Federal Laboratory Consortium — Geospatial application development, location-based services, spatial modeling, and spatial analysis are examples of the many research applications that this facility...
National Research Council Canada - National Science Library
Gilbert, John
1984-01-01
... quantification methods used in the analysis of mycotoxins in foods - Confirmation and quantification of trace organic food contaminants by mass spectrometry-selected ion monitoring - Chemiluminescence...
Federal Laboratory Consortium — FUNCTION: Uses state-of-the-art instrumentation for qualitative and quantitative analysis of organic and inorganic compounds, and biomolecules from gas, liquid, and...
Thermogravimetric Analysis Laboratory
Federal Laboratory Consortium — At NETL’s Thermogravimetric Analysis Laboratory in Morgantown, WV, researchers study how chemical looping combustion (CLC) can be applied to fossil energy systems....
Sensitivity and uncertainty analysis
Cacuci, Dan G; Navon, Ionel Michael
2005-01-01
As computer-assisted modeling and analysis of physical processes have continued to grow and diversify, sensitivity and uncertainty analyses have become indispensable scientific tools. Sensitivity and Uncertainty Analysis. Volume I: Theory focused on the mathematical underpinnings of two important methods for such analyses: the Adjoint Sensitivity Analysis Procedure and the Global Adjoint Sensitivity Analysis Procedure. This volume concentrates on the practical aspects of performing these analyses for large-scale systems. The applications addressed include two-phase flow problems, a radiative c
Hox, J.J.; Maas, C.J.M.; Lensvelt-Mulders, G.J.L.M.
2004-01-01
The goal of meta-analysis is to integrate the research results of a number of studies on a specific topic. Characteristic for meta-analysis is that in general only the summary statistics of the studies are used and not the original data. When the published research results to be integrated
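The idea of integrating only summary statistics can be illustrated with a fixed-effect (inverse-variance) pooling sketch; the effect sizes and variances below are invented for illustration, not taken from any of the studies:

```python
import math

def fixed_effect_meta(effects, variances):
    """Inverse-variance weighted pooled effect and its standard error."""
    weights = [1.0 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    return pooled, se

# Hypothetical summary statistics from three studies: effect size and its variance
effects = [0.30, 0.10, 0.25]
variances = [0.04, 0.02, 0.08]

pooled, se = fixed_effect_meta(effects, variances)
print(f"pooled effect = {pooled:.3f}, SE = {se:.3f}")  # pooled effect = 0.179, SE = 0.107
```

Each study contributes in proportion to the precision (inverse variance) of its reported summary statistic, so the original data are never needed.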
International Nuclear Information System (INIS)
Hahn, A.A.
1994-11-01
The complexity of instrumentation sometimes requires data analysis to be done before the result is presented to the control room. This tutorial reviews some of the theoretical assumptions underlying the more popular forms of data analysis and presents simple examples to illuminate the advantages and hazards of different techniques
Activation analysis. Detection limits
International Nuclear Information System (INIS)
Revel, G.
1999-01-01
Numerical data and limits of detection related to the four irradiation modes often used in activation analysis (reactor neutrons, 14 MeV neutrons, gamma photons and charged particles) are presented here. The technical presentation of activation analysis is detailed in paper P 2565 of Techniques de l'Ingenieur. (A.L.B.)
SMART performance analysis methodology
International Nuclear Information System (INIS)
Lim, H. S.; Kim, H. C.; Lee, D. J.
2001-04-01
To ensure the required and desired operation over the plant lifetime, the performance analysis for the SMART NSSS design is done by means of the specified analysis methodologies for the performance-related design basis events (PRDBE). A PRDBE is an occurrence (event) that shall be accommodated in the design of the plant and whose consequence would be no more severe than normal service effects of the plant equipment. The performance analysis methodology, which systematizes the methods and procedures to analyze the PRDBEs, is as follows. Based on the operation mode suitable to the characteristics of the SMART NSSS, the corresponding PRDBEs and the allowable range of process parameters for these events are deduced. With the developed control logic for each operation mode, the system thermal-hydraulics are analyzed for the chosen PRDBEs using the system analysis code. In particular, because the system characteristics of SMART differ from those of existing commercial nuclear power plants, the operation mode, PRDBEs, control logic, and analysis code should be consistent with the SMART design. This report presents the categories of the PRDBEs chosen based on each operation mode, the transitions among these, and the acceptance criteria for each PRDBE. It also includes the analysis methods and procedures for each PRDBE and the concept of the control logic for each operation mode. Therefore this report, in which the overall details for SMART performance analysis are specified based on the current SMART design, can be utilized as a guide for the detailed performance analysis
Contrast analysis : A tutorial
Haans, A.
2018-01-01
Contrast analysis is a relatively simple but effective statistical method for testing theoretical predictions about differences between group means against the empirical data. Despite its advantages, contrast analysis is hardly used to date, perhaps because it is not implemented in a convenient
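As an illustration of the method described above, a planned contrast on group means might be computed as follows. All numbers are hypothetical, and the pooled-variance formula assumes equal within-group variances:

```python
import math

def contrast_test(means, sds, ns, weights):
    """t statistic for a planned contrast L = sum(c_i * mean_i).
    Uses the pooled within-group variance (equal-variance assumption)."""
    assert abs(sum(weights)) < 1e-9, "contrast weights must sum to zero"
    L = sum(c * m for c, m in zip(weights, means))
    df = sum(n - 1 for n in ns)
    pooled_var = sum((n - 1) * s**2 for n, s in zip(ns, sds)) / df
    se = math.sqrt(pooled_var * sum(c**2 / n for c, n in zip(weights, ns)))
    return L / se, df

# Hypothetical three-group example: does the treatment mean exceed
# the average of the two control groups?
t, df = contrast_test(means=[5.0, 3.0, 2.8], sds=[1.0, 1.1, 0.9],
                      ns=[10, 10, 10], weights=[1.0, -0.5, -0.5])
print(round(t, 2), df)  # 5.4 27
```

The weights (1, -0.5, -0.5) encode the theoretical prediction directly, which is what makes contrast analysis more focused than an omnibus F test.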
Interactive Controls Analysis (INCA)
Bauer, Frank H.
1989-01-01
Version 3.12 of INCA provides user-friendly environment for design and analysis of linear control systems. System configuration and parameters easily adjusted, enabling INCA user to create compensation networks and perform sensitivity analysis in convenient manner. Full complement of graphical routines makes output easy to understand. Written in Pascal and FORTRAN.
Marketing research cluster analysis
Directory of Open Access Journals (Sweden)
Marić Nebojša
2002-01-01
Full Text Available One area of application of cluster analysis in marketing is the identification of groups of cities and towns with similar demographic profiles. This paper considers the main aspects of cluster analysis by way of an example: clustering 12 cities with the use of Minitab software.
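The abstract's Minitab example is not reproduced here; as an illustrative stand-in, a plain k-means sketch in Python clusters six hypothetical city profiles (population, median age) into two groups:

```python
import math
import random

def kmeans(points, k, iters=100, seed=0):
    """Plain Lloyd's algorithm; returns a cluster label for each point."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    labels = [0] * len(points)
    for _ in range(iters):
        # assignment step: each point joins its nearest center
        labels = [min(range(k), key=lambda j: math.dist(p, centers[j]))
                  for p in points]
        # update step: move each center to the mean of its members
        for j in range(k):
            members = [p for p, l in zip(points, labels) if l == j]
            if members:
                centers[j] = tuple(sum(c) / len(members) for c in zip(*members))
    return labels

# Hypothetical demographic profiles: (population in 100k, median age)
cities = [(1.2, 34), (1.0, 36), (0.9, 35),    # small, older
          (8.5, 29), (9.1, 28), (7.8, 30)]    # large, younger
labels = kmeans(cities, k=2)
print(labels)  # the first three cities share one label, the last three the other
```

In practice the demographic variables would be standardized first so that no single variable dominates the distance computation.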
SWOT ANALYSIS - CHINESE PETROLEUM
Directory of Open Access Journals (Sweden)
Chunlan Wang
2014-01-01
Full Text Available This article was written in early December 2013 and carries out a SWOT analysis combining the historical development of Chinese petroleum with the latest data. The paper discusses corporate resources, cost, management, and external factors such as the political environment and market supply and demand, conducting a comprehensive and profound analysis.
de Roon, F.A.; Nijman, T.E.; Ter Horst, J.R.
2000-01-01
In this paper we evaluate applications of (return based) style analysis. The portfolio and positivity constraints imposed by style analysis are useful in constructing mimicking portfolios without short positions. Such mimicking portfolios can be used, e.g., to construct efficient portfolios of mutual
F.A. de Roon (Frans); T.E. Nijman (Theo); B.J.M. Werker
2000-01-01
In this paper we evaluate applications of (return based) style analysis. The portfolio and positivity constraints imposed by style analysis are useful in constructing mimicking portfolios without short positions. Such mimicking portfolios can be used e.g. to construct efficient
Directory of Open Access Journals (Sweden)
Satu Elo
2014-02-01
Full Text Available Qualitative content analysis is commonly used for analyzing qualitative data. However, few articles have examined the trustworthiness of its use in nursing science studies. The trustworthiness of qualitative content analysis is often presented by using terms such as credibility, dependability, confirmability, transferability, and authenticity. This article focuses on trustworthiness based on a review of previous studies, our own experiences, and methodological textbooks. Trustworthiness was described for the main qualitative content analysis phases from data collection to reporting of the results. We concluded that it is important to scrutinize the trustworthiness of every phase of the analysis process, including the preparation, organization, and reporting of results. Together, these phases should give a reader a clear indication of the overall trustworthiness of the study. Based on our findings, we compiled a checklist for researchers attempting to improve the trustworthiness of a content analysis study. The discussion in this article helps to clarify how content analysis should be reported in a valid and understandable manner, which would be of particular benefit to reviewers of scientific articles. Furthermore, we discuss that it is often difficult to evaluate the trustworthiness of qualitative content analysis studies because of defective descriptions of the data collection method and/or the analysis.
Schraagen, J.M.C.
2000-01-01
Cognitive task analysis is defined as the extension of traditional task analysis techniques to yield information about the knowledge, thought processes and goal structures that underlie observable task performance. Cognitive task analyses are conducted for a wide variety of purposes, including the
DEFF Research Database (Denmark)
Damkilde, Lars
2007-01-01
Limit State analysis has a long history and many prominent researchers have contributed. The theoretical foundation is based on the upper- and lower-bound theorems, which give a very comprehensive and elegant formulation of complicated physical problems. In the pre-computer age, Limit State analysis also enabled engineers to solve practical problems within reinforced concrete, steel structures and geotechnics.
Verhoosel, C.V.; Scott, M.A.; Borden, M.J.; Borst, de R.; Hughes, T.J.R.; Mueller-Hoeppe, D.; Loehnert, S.; Reese, S.
2011-01-01
Isogeometric analysis is a versatile tool for failure analysis. On the one hand, the excellent control over the inter-element continuity conditions enables a natural incorporation of continuum constitutive relations that incorporate higher-order strain gradients, as in gradient plasticity or damage.
DEFF Research Database (Denmark)
Durbin, Richard; Eddy, Sean; Krogh, Anders Stærmose
This book provides an up-to-date and tutorial-level overview of sequence analysis methods, with particular emphasis on probabilistic modelling. Discussed methods include pairwise alignment, hidden Markov models, multiple alignment, profile searches, RNA secondary structure analysis, and phylogene...
International Nuclear Information System (INIS)
Arien, B.
2000-01-01
The objective of SCK-CEN's programme on reactor safety is to develop expertise in probabilistic and deterministic reactor safety analysis. The research programme consists of two main activities, in particular the development of software for reliability analysis of large systems and participation in the international PHEBUS-FP programme for severe accidents. Main achievements in 1999 are reported
Factorial Analysis of Profitability
Georgeta VINTILA; Ilie GHEORGHE; Ioana Mihaela POCAN; Madalina Gabriela ANGHEL
2012-01-01
The DuPont analysis system is based on decomposing the profitability ratio into its factors of influence. This paper describes the factorial analysis of profitability based on the DuPont system. Significant importance is given to the impact of various indicators on the share value and profitability.
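The DuPont decomposition described above can be sketched numerically; the financial figures below are hypothetical:

```python
def dupont_roe(net_income, sales, assets, equity):
    """Return ROE and its three DuPont factors:
    net profit margin * asset turnover * equity multiplier."""
    margin = net_income / sales      # profitability of each unit of sales
    turnover = sales / assets        # efficiency of asset use
    leverage = assets / equity       # equity multiplier (financial leverage)
    return margin * turnover * leverage, (margin, turnover, leverage)

# Hypothetical figures (all in the same currency units)
roe, (margin, turnover, leverage) = dupont_roe(
    net_income=120, sales=1500, assets=1000, equity=600)
print(round(roe, 3))  # 0.2, i.e. 8% margin * 1.5 turnover * 1.667 leverage
```

The decomposition shows which factor drives a change in ROE: the same 20% return could come from a thin-margin, high-turnover business or a high-margin, low-turnover one.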
Spool assembly support analysis
International Nuclear Information System (INIS)
Norman, B.F.
1994-01-01
This document provides the wind/seismic analysis and evaluation for the pump pit spool assemblies. Hand calculations were used for the analysis. UBC and AISC load factors were used in this evaluation. The results show that the actual loads are under the allowable loads and all requirements are met
International Nuclear Information System (INIS)
Hansen, J.D.
1976-01-01
This article discusses the partial wave analysis of two, three and four meson systems. The difference between the two approaches, referred to as amplitude and Ascoli analysis is discussed. Some of the results obtained with these methods are shown. (B.R.H.)
Enabling interdisciplinary analysis
L. M. Reid
1996-01-01
New requirements for evaluating environmental conditions in the Pacific Northwest have led to increased demands for interdisciplinary analysis of complex environmental problems. Procedures for watershed analysis have been developed for use on public and private lands in Washington State (Washington Forest Practices Board 1993) and for federal lands in the Pacific...
Shot loading platform analysis
International Nuclear Information System (INIS)
Norman, B.F.
1994-01-01
This document provides the wind/seismic analysis and evaluation for the shot loading platform. Hand calculations were used for the analysis. AISC and UBC load factors were used in this evaluation. The results show that the actual loads are under the allowable loads and all requirements are met
Towards Cognitive Component Analysis
DEFF Research Database (Denmark)
Hansen, Lars Kai; Ahrendt, Peter; Larsen, Jan
2005-01-01
Cognitive component analysis (COCA) is here defined as the process of unsupervised grouping of data such that the ensuing group structure is well-aligned with that resulting from human cognitive activity. We have earlier demonstrated that independent components analysis is relevant for representing...
Interaction Analysis and Supervision.
Amidon, Edmund
This paper describes a model that uses interaction analysis as a tool to provide feedback to a teacher in a microteaching situation. The author explains how interaction analysis can be used for teacher improvement, describes the category system used in the model, the data collection methods used, and the feedback techniques found in the model. (JF)
Activation analysis. Chapter 4
International Nuclear Information System (INIS)
1976-01-01
The principle, sample and calibration standard preparation, activation by neutrons, charged particles and gamma radiation, sample transport after activation, activity measurement, and chemical sample processing are described for activation analysis. Possible applications of nondestructive activation analysis are shown. (J.P.)
Donahue, Craig J.; Rais, Elizabeth A.
2009-01-01
This lab experiment illustrates the use of thermogravimetric analysis (TGA) to perform proximate analysis on a series of coal samples of different rank. Peat and coke are also examined. A total of four exercises are described. These are dry exercises in which students interpret previously recorded scans. The weight percent moisture, volatile matter,…
Ian M. Franks; Mike Hughes
2004-01-01
This book addresses and appropriately explains the notational analysis of technique, tactics, individual athlete/team exercise and work-rate in sport. The book offers guidance in: developing a system, analysis of data, effective coaching using notational performance analysis, and modeling sport behaviors. It updates and improves the 1997 edition
Directory of Open Access Journals (Sweden)
Ian M. Franks
2004-06-01
Full Text Available This book addresses and appropriately explains the notational analysis of technique, tactics, individual athlete/team exercise and work-rate in sport. The book offers guidance in: developing a system, analysis of data, effective coaching using notational performance analysis, and modeling sport behaviors. It updates and improves the 1997 edition.
Allain, Rhett
2016-05-01
We currently live in a world filled with videos. There are videos on YouTube, feature movies and even videos recorded with our own cameras and smartphones. These videos present an excellent opportunity to not only explore physical concepts, but also inspire others to investigate physics ideas. With video analysis, we can explore the fantasy world in science-fiction films. We can also look at online videos to determine if they are genuine or fake. Video analysis can be used in the introductory physics lab and it can even be used to explore the make-believe physics embedded in video games. This book covers the basic ideas behind video analysis along with the fundamental physics principles used in video analysis. The book also includes several examples of the unique situations in which video analysis can be used.
Ramsay, J O
1997-01-01
Scientists today collect samples of curves and other functional observations. This monograph presents many ideas and techniques for such data. Included are expressions in the functional domain of such classics as linear regression, principal components analysis, linear modelling, and canonical correlation analysis, as well as specifically functional techniques such as curve registration and principal differential analysis. Data arising in real applications are used throughout for both motivation and illustration, showing how functional approaches allow us to see new things, especially by exploiting the smoothness of the processes generating the data. The data sets exemplify the wide scope of functional data analysis; they are drawn from growth analysis, meteorology, biomechanics, equine science, economics, and medicine. The book presents novel statistical technology while keeping the mathematical level widely accessible. It is designed to appeal to students, to applied data analysts, and to experienced researc...
Systems engineering and analysis
Blanchard, Benjamin S
2010-01-01
For senior-level undergraduate and first and second year graduate systems engineering and related courses. A total life-cycle approach to systems and their analysis. This practical introduction to systems engineering and analysis provides the concepts, methodologies, models, and tools needed to understand and implement a total life-cycle approach to systems and their analysis. The authors focus first on the process of bringing systems into being--beginning with the identification of a need and extending that need through requirements determination, functional analysis and allocation, design synthesis, evaluation, and validation, operation and support, phase-out, and disposal. Next, the authors discuss the improvement of systems currently in being, showing that by employing the iterative process of analysis, evaluation, feedback, and modification, most systems in existence can be improved in their affordability, effectiveness, and stakeholder satisfaction.
International Nuclear Information System (INIS)
Ishii, Keizo
1997-01-01
Elemental analysis based on particle induced X-ray emission (PIXE) is a novel technique for analyzing trace elements. It is a very simple method: its sensitivity is very high, multiple elements in a sample can be analyzed simultaneously, and a few tens of micrograms of a sample are enough for analysis. Owing to these characteristics, PIXE analysis is now used in many fields (e.g. biology, medicine, dentistry, environmental pollution, archaeology, cultural assets, etc.). Fundamentals of the PIXE analysis are described here: the production of characteristic X-rays and inner shell ionization by heavy charged particles, the continuous background in the PIXE spectrum, quantitative formulae of the PIXE analysis, the detection limit of PIXE analysis, etc. (author)
International Nuclear Information System (INIS)
Porten, D.R.; Crowe, R.D.
1994-01-01
The purpose of this accident safety analysis is to document in detail the analyses whose results were reported in summary form in the K Basins Safety Analysis Report WHC-SD-SNF-SAR-001. The safety analysis addressed the potential for release of radioactive and non-radioactive hazardous material located in the K Basins and their supporting facilities. The safety analysis covers the hazards associated with normal K Basin fuel storage and handling operations, fuel encapsulation, sludge encapsulation, and canister clean-up and disposal. After a review of the Criticality Safety Evaluation of the K Basin activities, the following postulated events were evaluated: Crane failure and casks dropped into loadout pit; Design basis earthquake; Hypothetical loss of basin water accident analysis; Combustion of uranium fuel following dryout; Crane failure and cask dropped onto floor of transfer area; Spent ion exchange shipment for burial; Hydrogen deflagration in ion exchange modules and filters; Release of chlorine; Power availability and reliability; and Ashfall.
International Nuclear Information System (INIS)
Goetz, A.; Gerring, M.; Svensson, O.; Brockhauser, S.
2012-01-01
Data Analysis Workbench (DAWB) is a new software tool being developed at the ESRF. Its goal is to provide a tool for online data analysis on the beamlines and for offline data analysis that users can run during experiments or take home. The tool includes support for data visualization and work-flows. Work-flows allow algorithms which exploit parallel architectures to be designed from existing high-level modules for data analysis in combination with data collection. The workbench uses Passerelle as the work-flow engine and EDNA plug-ins for data analysis. Actors talking to Tango are used for sending commands to a limited set of hardware to start existing data collection algorithms. A Tango server allows work-flows to be executed from existing applications. There are scripting interfaces to Python, Javascript and SPEC. At present, the workbench is being tested on a selected number of ESRF beamlines. (authors)
J Olive, David
2017-01-01
This text presents methods that are robust to the assumption of a multivariate normal distribution or methods that are robust to certain types of outliers. Instead of using exact theory based on the multivariate normal distribution, the simpler and more applicable large sample theory is given. The text develops among the first practical robust regression and robust multivariate location and dispersion estimators backed by theory. The robust techniques are illustrated for methods such as principal component analysis, canonical correlation analysis, and factor analysis. A simple way to bootstrap confidence regions is also provided. Much of the research on robust multivariate analysis in this book is being published for the first time. The text is suitable for a first course in Multivariate Statistical Analysis or a first course in Robust Statistics. This graduate text is also useful for people who are familiar with the traditional multivariate topics, but want to know more about handling data sets with...
Field, Michael
2017-01-01
This book provides a rigorous introduction to the techniques and results of real analysis, metric spaces and multivariate differentiation, suitable for undergraduate courses. Starting from the very foundations of analysis, it offers a complete first course in real analysis, including topics rarely found in such detail in an undergraduate textbook such as the construction of non-analytic smooth functions, applications of the Euler-Maclaurin formula to estimates, and fractal geometry. Drawing on the author’s extensive teaching and research experience, the exposition is guided by carefully chosen examples and counter-examples, with the emphasis placed on the key ideas underlying the theory. Much of the content is informed by its applicability: Fourier analysis is developed to the point where it can be rigorously applied to partial differential equations or computation, and the theory of metric spaces includes applications to ordinary differential equations and fractals. Essential Real Analysis will appeal t...
Real analysis and applications
Botelho, Fabio Silva
2018-01-01
This textbook introduces readers to real analysis in one and n dimensions. It is divided into two parts: Part I explores real analysis in one variable, starting with key concepts such as the construction of the real number system, metric spaces, and real sequences and series. In turn, Part II addresses the multi-variable aspects of real analysis. Further, the book presents detailed, rigorous proofs of the implicit function theorem for the vectorial case by applying the Banach fixed-point theorem and the differential forms concept to surfaces in Rn. It also provides a brief introduction to Riemannian geometry. With its rigorous, elegant proofs, this self-contained work is easy to read, making it suitable for undergraduate and beginning graduate students seeking a deeper understanding of real analysis and applications, and for all those looking for a well-founded, detailed approach to real analysis.
Nonactivation interaction analysis. Chapter 5
International Nuclear Information System (INIS)
1976-01-01
Analyses are described including the alpha scattering analysis, beta absorption and scattering analysis, gamma and X-ray absorption and scattering analysis, X-ray fluorescence analysis, neutron absorption and scattering analysis, Moessbauer effect application and an analysis based on the application of radiation ionizing effects. (J.P.)
Is activation analysis still active?
International Nuclear Information System (INIS)
Chai Zhifang
2001-01-01
This paper reviews some aspects of neutron activation analysis (NAA), covering instrumental neutron activation analysis (INAA), the k0 method, prompt gamma-ray neutron activation analysis (PGNAA), radiochemical neutron activation analysis (RNAA) and molecular activation analysis (MAA). A comparison of neutron activation analysis with other analytical techniques is also made. (author)
International Nuclear Information System (INIS)
González Caballero, I; Cuesta Noriega, A; Rodríguez Marrero, A; Fernández del Castillo, E
2012-01-01
The analysis of the complex LHC data usually follows a standard path that aims at minimizing not only the amount of data but also the number of observables used. After a number of steps of slimming and skimming the data, the remaining few terabytes of ROOT files hold a selection of the events and a flat structure for the variables needed that can be more easily inspected and traversed in the final stages of the analysis. PROOF arises at this point as an efficient mechanism to distribute the analysis load by taking advantage of all the cores in modern CPUs through PROOF Lite, or by using PROOF Cluster or PROOF on Demand tools to build dynamic PROOF clusters on computing facilities with spare CPUs. However, using PROOF at the level required for a serious analysis introduces some difficulties that may scare new adopters. We have developed the PROOF Analysis Framework (PAF) to facilitate the development of new analyses by uniformly exposing the PROOF related configurations across technologies and by taking care of the routine tasks as much as possible. We describe the details of the PAF implementation as well as how we succeeded in engaging a group of CMS physicists to use PAF as their daily analysis framework.
Hazard Analysis Database Report
Energy Technology Data Exchange (ETDEWEB)
GAULT, G.W.
1999-10-13
The Hazard Analysis Database was developed in conjunction with the hazard analysis activities conducted in accordance with DOE-STD-3009-94, Preparation Guide for US Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports, for the Tank Waste Remediation System (TWRS) Final Safety Analysis Report (FSAR). The FSAR is part of the approved TWRS Authorization Basis (AB). This document describes, identifies, and defines the contents and structure of the TWRS FSAR Hazard Analysis Database and documents the configuration control changes made to the database. The TWRS Hazard Analysis Database contains the collection of information generated during the initial hazard evaluations and the subsequent hazard and accident analysis activities. The database supports the preparation of Chapters 3, 4, and 5 of the TWRS FSAR and the USQ process and consists of two major, interrelated data sets: (1) Hazard Evaluation Database--Data from the results of the hazard evaluations; and (2) Hazard Topography Database--Data from the system familiarization and hazard identification.
Containment vessel stability analysis
International Nuclear Information System (INIS)
Harstead, G.A.; Morris, N.F.; Unsal, A.I.
1983-01-01
The stability analysis for a steel containment shell is presented herein. The containment is a freestanding shell consisting of a vertical cylinder with a hemispherical dome. It is stiffened by large ring stiffeners and relatively small longitudinal stiffeners. The containment vessel is subjected to both static and dynamic loads which can cause buckling. These loads must be combined prior to their use in a stability analysis. The buckling loads were computed with the aid of ASME Code case N-284 used in conjunction with general purpose computer codes and in-house programs. The equations contained in the Code case were used to compute the knockdown factors due to shell imperfections. After these knockdown factors were applied to the critical stress states determined by freezing the maximum dynamic stresses and combining them with other static stresses, a linear bifurcation analysis was carried out with the aid of the BOSOR4 program. Since the containment shell contained large penetrations, the Code case had to be supplemented by a local buckling analysis of the shell area surrounding the largest penetration. This analysis was carried out with the aid of the NASTRAN program. Although the factor of safety against buckling obtained in this analysis was satisfactory, it is claimed that the use of the Code case knockdown factors is unduly conservative when applied to the analysis of buckling around penetrations. (orig.)
International Nuclear Information System (INIS)
Thompson, W.A. Jr.
1979-11-01
This paper briefly describes WASH 1400 and the Lewis report. It attempts to define basic concepts such as risk and risk analysis, common mode failure, and rare event. Several probabilistic models which go beyond the WASH 1400 methodology are introduced; the common characteristic of these models is that they recognize explicitly that risk analysis is time dependent, whereas WASH 1400 takes a per-demand failure rate approach which obscures the important fact that accidents are time related. Further, the presentation of a realistic risk analysis should recognize that there are various risks which compete with one another for the lives of the individuals at risk. A way of doing this is suggested.
International Nuclear Information System (INIS)
Kartiwa Sumadi; Yayah Rohayati
1996-01-01
The 'monazit' analytical program has been set up for routine analysis of rare earth elements in monazite and xenotime mineral samples. The total relative error of the analysis is very low, less than 2.50%, and the reproducibility of the counting statistics and the stability of the instrument were excellent. The precision and accuracy of the analytical program are very good, with maximum relative percentages of 5.22% and 1.61%, respectively. The mineral compositions of the 30 monazite samples have also been calculated from their chemical constituents, and the results were compared to grain-counting microscopic analysis
Methods of Multivariate Analysis
Rencher, Alvin C
2012-01-01
Praise for the Second Edition "This book is a systematic, well-written, well-organized text on multivariate analysis packed with intuition and insight . . . There is much practical wisdom in this book that is hard to find elsewhere."-IIE Transactions Filled with new and timely content, Methods of Multivariate Analysis, Third Edition provides examples and exercises based on more than sixty real data sets from a wide variety of scientific fields. It takes a "methods" approach to the subject, placing an emphasis on how students and practitioners can employ multivariate analysis in real-life sit
Dunham, Ken
2014-01-01
The rapid growth and development of Android-based devices has resulted in a wealth of sensitive information on mobile devices that offer minimal malware protection. This has created an immediate demand for security professionals that understand how to best approach the subject of Android malware threats and analysis. In Android Malware and Analysis, Ken Dunham, renowned global malware expert and author, teams up with international experts to document the best tools and tactics available for analyzing Android malware. The book covers both methods of malware analysis: dynamic and static. This tact...
Aven, Terje
2012-01-01
Foundations of Risk Analysis presents the issues core to risk analysis - understanding what risk means, expressing risk, building risk models, addressing uncertainty, and applying probability models to real problems. The author provides the readers with the knowledge and basic thinking they require to successfully manage risk and uncertainty to support decision making. This updated edition reflects recent developments on risk and uncertainty concepts, representations and treatment. New material in Foundations of Risk Analysis includes: An up-to-date presentation of how to understand, define and
International Nuclear Information System (INIS)
Ko, Myeong Su; Kim, Tae Hwa; Park, Gyu Hyeon; Yang, Jong Beom; Oh, Chang Hwan; Lee, Kyoung Hye
2010-04-01
This textbook describes instrument analysis in an accessible way across twelve chapters. The topics covered are: pH measurement (principle, pH meters, measurement procedure, and example experiments); centrifugation; absorptiometry; fluorescence methods; atomic absorption analysis; gas chromatography; gas chromatography-mass spectrometry; high-performance liquid chromatography and liquid chromatograph-mass spectrometry; electrophoresis (practical cases, analysis of results, and examples); PCR (principle, devices, applications, and examples); and enzyme-linked immunosorbent assay, including indirect ELISA, sandwich ELISA, and ELISA readers.
International Nuclear Information System (INIS)
Williams, Mike; Egede, Ulrik; Paterson, Stuart
2011-01-01
The distributed analysis experience to date at LHCb has been positive: job success rates are high and wait times for high-priority jobs are low. LHCb users access the grid using the GANGA job-management package, while the LHCb virtual organization manages its resources using the DIRAC package. This clear division of labor has benefitted LHCb and its users greatly; it is a major reason why distributed analysis at LHCb has been so successful. The newly formed LHCb distributed analysis support team has also proved to be a success.
Factor analysis and scintigraphy
International Nuclear Information System (INIS)
Di Paola, R.; Penel, C.; Bazin, J.P.; Berche, C.
1976-01-01
The goal of factor analysis is usually to achieve a reduction of a large set of data, extracting essential features without a prior hypothesis. With the development of computerized systems, the use of larger samples, the possibility of sequential data acquisition, and the increase in dynamic studies, the problem of data compression is now encountered routinely. Thus, results obtained for compression of scintigraphic images were presented first. Then the possibilities given by factor analysis for scan processing were discussed. Finally, the use of this analysis for multidimensional studies, and especially dynamic studies, was considered for compression and processing [fr
Energy Technology Data Exchange (ETDEWEB)
Ko, Myeong Su; Kim, Tae Hwa; Park, Gyu Hyeon; Yang, Jong Beom; Oh, Chang Hwan; Lee, Kyoung Hye
2010-04-15
This textbook describes instrument analysis in an accessible way across twelve chapters. The topics covered are: pH measurement (principle, pH meters, measurement procedure, and example experiments); centrifugation; absorptiometry; fluorescence methods; atomic absorption analysis; gas chromatography; gas chromatography-mass spectrometry; high-performance liquid chromatography and liquid chromatograph-mass spectrometry; electrophoresis (practical cases, analysis of results, and examples); PCR (principle, devices, applications, and examples); and enzyme-linked immunosorbent assay, including indirect ELISA, sandwich ELISA, and ELISA readers.
DEFF Research Database (Denmark)
Raket, Lars Lau
We propose a direction in the field of statistics which we will call functional object analysis. This subfield considers the analysis of functional objects defined on continuous domains. In this setting we will focus on model-based statistics, with a particular emphasis on mixed-effect formulations, where the observed functional signal is assumed to consist of both fixed and random functional effects. This thesis takes the initial steps toward the development of likelihood-based methodology for functional objects. We first consider analysis of functional data defined on high...
Bayesian nonparametric data analysis
Müller, Peter; Jara, Alejandro; Hanson, Tim
2015-01-01
This book reviews nonparametric Bayesian methods and models that have proven useful in the context of data analysis. Rather than providing an encyclopedic review of probability models, the book’s structure follows a data analysis perspective. As such, the chapters are organized by traditional data analysis problems. In selecting specific nonparametric models, simpler and more traditional models are favored over specialized ones. The discussed methods are illustrated with a wealth of examples, including applications ranging from stylized examples to case studies from recent literature. The book also includes an extensive discussion of computational methods and details on their implementation. R code for many examples is included in on-line software pages.
International Nuclear Information System (INIS)
Ramirez T, J.J.; Lopez M, J.; Sandoval J, A.R.; Villasenor S, P.; Aspiazu F, J.A.
2001-01-01
Elemental, metallographic, and phase analyses were carried out to determine the oxidation states of Fe contained in three metallic pieces of unknown material: a block, a plate, and a cylinder. Results are presented from the elemental analysis, which was carried out at the Tandem Accelerator of ININ by proton-induced X-ray emission (PIXE). The phase analysis was carried out by X-ray diffraction, which made it possible to identify the type of alloy or alloys formed. The combined application of nuclear techniques with metallographic techniques allows the integral characterization of industrial metals. (Author)
Ash, Robert B; Lukacs, E
1972-01-01
Real Analysis and Probability provides the background in real analysis needed for the study of probability. Topics covered range from measure and integration theory to functional analysis and basic concepts of probability. The interplay between measure theory and topology is also discussed, along with conditional probability and expectation, the central limit theorem, and strong laws of large numbers with respect to martingale theory.Comprised of eight chapters, this volume begins with an overview of the basic concepts of the theory of measure and integration, followed by a presentation of var
Iremonger, M J
1982-01-01
BASIC Stress Analysis aims to help students to become proficient at BASIC programming by actually using it in an important engineering subject. It also enables the student to use computing as a means of learning stress analysis, because writing a program is analogous to teaching: it is necessary to understand the subject matter. The book begins by introducing the BASIC approach and the concept of stress analysis at first- and second-year undergraduate level. Subsequent chapters contain a summary of relevant theory, worked examples containing computer programs, and a set of problems. Topics c
Fundamentals of mathematical analysis
Paul J Sally, Jr
2013-01-01
This is a textbook for a course in Honors Analysis (for freshman/sophomore undergraduates) or Real Analysis (for junior/senior undergraduates) or Analysis-I (beginning graduates). It is intended for students who completed a course in "AP Calculus", possibly followed by a routine course in multivariable calculus and a computational course in linear algebra. There are three features that distinguish this book from many other books of a similar nature and which are important for the use of this book as a text. The first, and most important, feature is the collection of exercises. These are spread
Systems analysis-independent analysis and verification
Energy Technology Data Exchange (ETDEWEB)
Badin, J.S.; DiPietro, J.P. [Energetics, Inc., Columbia, MD (United States)
1995-09-01
The DOE Hydrogen Program is supporting research, development, and demonstration activities to overcome the barriers to the integration of hydrogen into the Nation's energy infrastructure. Much work is required to gain acceptance of hydrogen energy system concepts and to develop them for implementation. A systems analysis database has been created that includes a formal documentation of technology characterization profiles and cost and performance information. Through a systematic and quantitative approach, system developers can understand and address important issues and thereby assure effective and timely commercial implementation. This project builds upon and expands the previously developed and tested pathway model and provides the basis for a consistent and objective analysis of all hydrogen energy concepts considered by the DOE Hydrogen Program Manager. This project can greatly accelerate the development of a system by minimizing the risk of costly design evolutions, and by stimulating discussions, feedback, and coordination of key players and allows them to assess the analysis, evaluate the trade-offs, and to address any emerging problem areas. Specific analytical studies will result in the validation of the competitive feasibility of the proposed system and identify system development needs. Systems that are investigated include hydrogen bromine electrolysis, municipal solid waste gasification, electro-farming (biomass gasifier and PEM fuel cell), wind/hydrogen hybrid system for remote sites, home electrolysis and alternate infrastructure options, renewable-based electrolysis to fuel PEM fuel cell vehicle fleet, and geothermal energy used to produce hydrogen. These systems are compared to conventional and benchmark technologies. Interim results and findings are presented. Independent analyses emphasize quality, integrity, objectivity, a long-term perspective, corporate memory, and the merging of technical, economic, operational, and programmatic expertise.
Plasma data analysis using statistical analysis system
International Nuclear Information System (INIS)
Yoshida, Z.; Iwata, Y.; Fukuda, Y.; Inoue, N.
1987-01-01
Multivariate factor analysis has been applied to a plasma data base of REPUTE-1. The characteristics of the reverse field pinch plasma in REPUTE-1 are shown to be explained by four independent parameters which are described in the report. The well-known scaling laws F_χ ∝ I_p, T_e ∝ I_p, and τ_E ∝ N_e are also confirmed. 4 refs., 8 figs., 1 tab
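Scaling laws of the form y ∝ x, like those above, are typically checked by a least-squares fit constrained through the origin. The following is a minimal sketch of that check, not taken from the report; the current and temperature values below are synthetic and purely illustrative.

```python
# Minimal sketch (illustrative, not from the report): verifying a
# proportionality such as T_e ∝ I_p with a through-the-origin fit y = a*x.

def fit_proportional(x, y):
    """Least-squares slope for y = a*x, constrained through the origin."""
    return sum(xi * yi for xi, yi in zip(x, y)) / sum(xi * xi for xi in x)

# Hypothetical plasma current (kA) and electron temperature (eV) samples:
I_p = [50.0, 100.0, 150.0, 200.0]
T_e = [10.0, 21.0, 29.0, 41.0]

a = fit_proportional(I_p, T_e)
residuals = [t - a * i for i, t in zip(I_p, T_e)]
```

If the residuals are small relative to the data, the proportionality is a reasonable description of the sample.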
Summary Analysis: Hanford Site Composite Analysis Update
Energy Technology Data Exchange (ETDEWEB)
Nichols, W. E. [CH2M HILL Plateau Remediation Company, Richland, WA (United States); Lehman, L. L. [CH2M HILL Plateau Remediation Company, Richland, WA (United States)
2017-06-05
The Hanford Site’s currently maintained Composite Analysis, originally completed in 1998, requires an update. A previous update effort was undertaken by the U.S. Department of Energy (DOE) in 2001-2005, but was ended before completion to allow the Tank Closure & Waste Management Environmental Impact Statement (TC&WM EIS) (DOE/EIS-0391) to be prepared without potential for conflicting sitewide models. This EIS was issued in 2012, and the deferral was ended with guidance in memorandum “Modeling to Support Regulatory Decision Making at Hanford” (Williams, 2012) provided with the aim of ensuring subsequent modeling is consistent with the EIS.
He, Jingrui
2012-01-01
This book focuses on rare category analysis where the majority classes have smooth distributions and the minority classes exhibit the compactness property. It focuses on challenging cases where the support regions of the majority and minority classes overlap.
Longitudinal categorical data analysis
Sutradhar, Brajendra C
2014-01-01
This is the first book in longitudinal categorical data analysis with parametric correlation models developed based on dynamic relationships among repeated categorical responses. This book is a natural generalization of the longitudinal binary data analysis to the multinomial data setup with more than two categories. Thus, unlike the existing books on cross-sectional categorical data analysis using log linear models, this book uses multinomial probability models both in cross-sectional and longitudinal setups. A theoretical foundation is provided for the analysis of univariate multinomial responses, by developing models systematically for the cases with no covariates as well as categorical covariates, both in cross-sectional and longitudinal setups. In the longitudinal setup, both stationary and non-stationary covariates are considered. These models have also been extended to the bivariate multinomial setup along with suitable covariates. For the inferences, the book uses the generalized quasi-likelihood as w...
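A common building block for the multinomial probability models the abstract describes is the baseline-category (softmax) link, which maps per-category linear predictors to category probabilities. This sketch is an assumption about a standard formulation, not code from the book, and the predictor values are hypothetical.

```python
import math

# Minimal sketch (standard formulation, not from the book): a multinomial
# logit turns linear predictors eta_j into category probabilities via
# P(Y = j) = exp(eta_j) / sum_k exp(eta_k).

def multinomial_probs(linear_predictors):
    m = max(linear_predictors)               # subtract max for numerical stability
    exps = [math.exp(e - m) for e in linear_predictors]
    s = sum(exps)
    return [e / s for e in exps]

# Hypothetical predictors for a 3-category response at one covariate value:
probs = multinomial_probs([0.0, 1.0, -0.5])
```

The probabilities always sum to one, and the category with the largest linear predictor receives the largest probability.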
International Nuclear Information System (INIS)
1981-09-01
Suggestions are made concerning the method of fault tree analysis and the use of certain symbols in the examination of system failures. The purpose of the fault tree analysis is to find logical connections of component or subsystem failures leading to undesirable occurrences. The results of these examinations are part of the system assessment concerning operation and safety. The objectives of the analysis are: systematic identification of all possible failure combinations (causes) leading to a specific undesirable occurrence, and the determination of reliability parameters such as the frequency of failure combinations, the frequency of the undesirable occurrence, or the non-availability of the system when required. The fault tree analysis provides clear and reconstructable documentation of the examination. (orig./HP) [de
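The quantities described above (frequency of failure combinations, top-event probability) follow from combining basic-event probabilities through the tree's gates. A minimal sketch, assuming independent basic events and a small hypothetical tree, not an example from the report:

```python
# Minimal sketch (illustrative, not from the report): top-event probability
# of a small fault tree with independent basic events.
# OR gate:  P = 1 - prod(1 - p_i)     AND gate: P = prod(p_i)

def gate_or(probs):
    q = 1.0
    for p in probs:
        q *= (1.0 - p)
    return 1.0 - q

def gate_and(probs):
    q = 1.0
    for p in probs:
        q *= p
    return q

# Hypothetical tree: TOP = OR(AND(pump_a fails, pump_b fails), valve stuck)
pump_a, pump_b, valve_stuck = 1e-2, 1e-2, 1e-3
top = gate_and([pump_a, pump_b])       # redundant pumps: both must fail
top = gate_or([top, valve_stuck])      # either branch causes the top event
```

The redundant pump pair contributes only 1e-4, so the single valve failure dominates the top-event probability, which is the kind of insight a fault tree makes systematic.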
Denker, A; Rauschenberg, J; Röhrich, J; Strub, E
2006-01-01
Materials analysis with ion beams exploits the interaction of ions with the electrons and nuclei in the sample. Among the vast variety of analytical techniques available with ion beams, we restrict ourselves to ion beam analysis in the energy range from one to several MeV per mass unit. It is possible to use either the back-scattered projectiles (RBS - Rutherford Back Scattering) or the recoiling target atoms themselves (ERDA - Elastic Recoil Detection Analysis) from the elastic scattering processes. These techniques allow the simultaneous and absolute determination of stoichiometry and depth profiles of the detected elements. The interaction of the ions with the electrons in the sample produces holes in the inner electronic shells of the sample atoms, which recombine and emit X-rays characteristic of the element in question. Particle Induced X-ray Emission (PIXE) has been shown to be a fast technique for the analysis of elements with an atomic number above 11.
DEFF Research Database (Denmark)
Vatrapu, Ravi; Mukkamala, Raghava Rao; Hussain, Abid
2016-01-01
When it comes to organizational and societal units of analysis, there exists no approach to conceptualize, model, analyze, explain, and predict social media interactions as individuals' associations with ideas, values, identities, and so on. To address this gap, this work presents conceptual and formal models of social data, and an analytical framework for combining big social data sets with organizational and societal data sets. Three empirical studies of big social data are presented to illustrate and demonstrate social set analysis in terms of fuzzy set-theoretical sentiment analysis, crisp set-theoretical interaction analysis, and event-studies-oriented set-theoretical visualizations. Implications for big data analytics, current limitations of the set-theoretical approach, and future directions are outlined.
PWR systems transient analysis
International Nuclear Information System (INIS)
Kennedy, M.F.; Peeler, G.B.; Abramson, P.B.
1985-01-01
Analysis of transients in pressurized water reactor (PWR) systems involves the assessment of the response of the total plant, including primary and secondary coolant systems, steam piping and turbine (possibly including the complete feedwater train), and various control and safety systems. Transient analysis is performed as part of the plant safety analysis to ensure the adequacy of the reactor design and operating procedures and to verify the applicable plant emergency guidelines. Event sequences which must be examined are developed by considering possible failures or maloperations of plant components. These vary in severity (and calculational difficulty) from a series of normal operational transients, such as minor load changes, reactor trips, and valve and pump malfunctions, up to the double-ended guillotine rupture of a primary reactor coolant system pipe, known as a Large Break Loss of Coolant Accident (LBLOCA). The focus of this paper is the analysis of all those transients and accidents except loss of coolant accidents.
Full closure strategic analysis.
2014-07-01
The full closure strategic analysis was conducted to create a decision process whereby full roadway closures for construction and maintenance activities can be evaluated and approved or denied by CDOT Traffic personnel. The study reviewed current...
Electrical Subsurface Grounding Analysis
International Nuclear Information System (INIS)
J.M. Calle
2000-01-01
The purpose and objective of this analysis is to determine the present grounding requirements of the Exploratory Studies Facility (ESF) subsurface electrical system and to verify that the actual grounding system and devices satisfy the requirements
DEFF Research Database (Denmark)
Skrypnyuk, Nataliya; Nielson, Flemming; Pilegaard, Henrik
2009-01-01
We present the ongoing work on the pathway analysis of a stochastic calculus. Firstly we present a particular stochastic calculus that we have chosen for our modeling: the Interactive Markov Chains calculus, IMC for short. After that we specify a few restrictions that we have introduced into the syntax of IMC in order to make our analysis feasible. Finally we describe the analysis itself together with several theoretical results that we have proved for it.
Canonical Information Analysis
DEFF Research Database (Denmark)
Vestergaard, Jacob Schack; Nielsen, Allan Aasbjerg
2015-01-01
Canonical correlation analysis is an established multivariate statistical method in which correlation between linear combinations of multivariate sets of variables is maximized. In canonical information analysis, introduced here, linear correlation as a measure of association between variables is replaced by the information-theoretical, entropy-based measure mutual information, which is a much more general measure of association. We make canonical information analysis feasible for large sample problems, including for example multispectral images, through the use of a fast kernel density estimator for entropy estimation. Canonical information analysis is applied successfully to (1) simple simulated data to illustrate the basic idea and evaluate performance, (2) fusion of weather radar and optical geostationary satellite data in a situation with heavy precipitation, and (3) change detection in optical...
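The core idea above, replacing linear correlation with mutual information as the association measure, can be illustrated with a minimal sketch. A simple histogram estimator stands in for the paper's fast kernel density estimator, and the simulated data are hypothetical.

```python
import numpy as np

def mutual_information(x, y, bins=16):
    # Histogram-based mutual information estimate in nats; a crude
    # stand-in for the kernel density estimator used in the paper.
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)   # marginal of x
    py = pxy.sum(axis=0, keepdims=True)   # marginal of y
    mask = pxy > 0
    return float((pxy[mask] * np.log(pxy[mask] / (px @ py)[mask])).sum())

rng = np.random.default_rng(0)
x = rng.normal(size=10_000)
y_quad = x**2 + 0.1 * rng.normal(size=10_000)   # nonlinear dependence

# Linear correlation barely sees the quadratic relation,
# while mutual information clearly detects it.
print("corr:", round(float(np.corrcoef(x, y_quad)[0, 1]), 3))
print("MI  :", round(mutual_information(x, y_quad), 3))
```

Maximizing such a measure over linear combinations of two variable sets, instead of maximizing correlation, is exactly the generalization the abstract describes.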
Qualitative Data Analysis Strategies
Greaves, Kristoffer
2014-01-01
A set of concept maps for qualitative data analysis strategies, inspired by Corbin, JM & Strauss, AL 2008, Basics of qualitative research: Techniques and procedures for developing grounded theory, 3rd edn, Sage Publications, Inc, Thousand Oaks, California.
Statistical data analysis handbook
National Research Council Canada - National Science Library
Wall, Francis J
1986-01-01
It must be emphasized that this is not a text book on statistics. Instead it is a working tool that presents data analysis in clear, concise terms which can be readily understood even by those without formal training in statistics...
Fanfani, Alessandra; Sanches, Jose Afonso; Andreeva, Julia; Bagliesi, Giusepppe; Bauerdick, Lothar; Belforte, Stefano; Bittencourt Sampaio, Patricia; Bloom, Ken; Blumenfeld, Barry; Bonacorsi, Daniele; Brew, Chris; Calloni, Marco; Cesini, Daniele; Cinquilli, Mattia; Codispoti, Giuseppe; D'Hondt, Jorgen; Dong, Liang; Dongiovanni, Danilo; Donvito, Giacinto; Dykstra, David; Edelmann, Erik; Egeland, Ricky; Elmer, Peter; Eulisse, Giulio; Evans, Dave; Fanzago, Federica; Farina, Fabio; Feichtinger, Derek; Fisk, Ian; Flix, Josep; Grandi, Claudio; Guo, Yuyi; Happonen, Kalle; Hernandez, Jose M; Huang, Chih-Hao; Kang, Kejing; Karavakis, Edward; Kasemann, Matthias; Kavka, Carlos; Khan, Akram; Kim, Bockjoo; Klem, Jukka; Koivumaki, Jesper; Kress, Thomas; Kreuzer, Peter; Kurca, Tibor; Kuznetsov, Valentin; Lacaprara, Stefano; Lassila-Perini, Kati; Letts, James; Linden, Tomas; Lueking, Lee; Maes, Joris; Magini, Nicolo; Maier, Gerhild; McBride, Patricia; Metson, Simon; Miccio, Vincenzo; Padhi, Sanjay; Pi, Haifeng; Riahi, Hassen; Riley, Daniel; Rossman, Paul; Saiz, Pablo; Sartirana, Andrea; Sciaba, Andrea; Sekhri, Vijay; Spiga, Daniele; Tuura, Lassi; Vaandering, Eric; Vanelderen, Lukas; Van Mulders, Petra; Vedaee, Aresh; Villella, Ilaria; Wicklund, Eric; Wildish, Tony; Wissing, Christoph; Wurthwein, Frank
2009-01-01
The CMS experiment expects to manage several Pbytes of data each year during the LHC programme, distributing them over many computing sites around the world and enabling data access at those centers for analysis. CMS has identified the distributed sites as the primary location for physics analysis, to support a wide community with thousands of potential users. This represents an unprecedented experimental challenge in terms of the scale of distributed computing resources and the number of users. An overview of the computing architecture, the software tools and the distributed infrastructure is reported. Summaries of the experience in establishing efficient and scalable operations to get prepared for CMS distributed analysis are presented, followed by the user experience in their current analysis activities.
NOAA's Inundation Analysis Tool
National Oceanic and Atmospheric Administration, Department of Commerce — Coastal storms and other meteorological phenomenon can have a significant impact on how high water levels rise and how often. The inundation analysis program is...
Multidimensional nonlinear descriptive analysis
Nishisato, Shizuhiko
2006-01-01
Quantification of categorical, or non-numerical, data is a problem that scientists face across a wide range of disciplines. Exploring data analysis in various areas of research, such as the social sciences and biology, Multidimensional Nonlinear Descriptive Analysis presents methods for analyzing categorical data that are not necessarily sampled randomly from a normal population and often involve nonlinear relations. This reference not only provides an overview of multidimensional nonlinear descriptive analysis (MUNDA) of discrete data, it also offers new results in a variety of fields. The first part of the book covers conceptual and technical preliminaries needed to understand the data analysis in subsequent chapters. The next two parts contain applications of MUNDA to diverse data types, with each chapter devoted to one type of categorical data, a brief historical comment, and basic skills peculiar to the data types. The final part examines several problems and then concludes with suggestions for futu...
Brackney, Larry; Parker, Andrew; Long, Nicholas; Metzger, Ian; Dean, Jesse; Lisell, Lars
2016-04-12
A building energy analysis system includes a building component library configured to store a plurality of building components, a modeling tool configured to access the building component library and create a building model of a building under analysis using building spatial data and using selected building components of the plurality of building components stored in the building component library, a building analysis engine configured to operate the building model and generate a baseline energy model of the building under analysis and further configured to apply one or more energy conservation measures to the baseline energy model in order to generate one or more corresponding optimized energy models, and a recommendation tool configured to assess the one or more optimized energy models against the baseline energy model and generate recommendations for substitute building components or modifications.
Water Quality Analysis Simulation
The Water Quality Analysis Simulation Program, an enhancement of the original WASP. This model helps users interpret and predict water quality responses to natural phenomena and man-made pollution for various pollution management decisions.
What is being tested? Synovial fluid is a thick liquid that acts as a lubricant for the body's ...
Hytönen, Tuomas; Veraar, Mark; Weis, Lutz
The present volume develops the theory of integration in Banach spaces, martingales and UMD spaces, and culminates in a treatment of the Hilbert transform, Littlewood-Paley theory and the vector-valued Mihlin multiplier theorem. Over the past fifteen years, motivated by regularity problems in evolution equations, there has been tremendous progress in the analysis of Banach space-valued functions and processes. The contents of this extensive and powerful toolbox have been mostly scattered around in research papers and lecture notes. Collecting this diverse body of material into a unified and accessible presentation fills a gap in the existing literature. The principal audience that we have in mind consists of researchers who need and use Analysis in Banach Spaces as a tool for studying problems in partial differential equations, harmonic analysis, and stochastic analysis. Self-contained and offering complete proofs, this work is accessible to graduate students and researchers with a background in functional an...
Analysis Streamlining in ATLAS
Heinrich, Lukas; The ATLAS collaboration
2018-01-01
We present recent work within the ATLAS collaboration to centrally provide tools that facilitate analysis management and highly automated container-based analysis execution, in order both to enable non-experts to benefit from these best practices and to allow the collaboration to track and re-execute analyses independently, e.g. during their review phase. Through integration with the ATLAS GLANCE system, users can request a pre-configured but customizable version control setup, including continuous integration for automated build and testing as well as continuous Linux Container image building for software preservation purposes. As analyses typically require many individual steps, analysis workflow pipelines can then be defined using such images and the yadage workflow description language. The integration into the workflow execution service REANA allows the interactive or automated reproduction of the main analysis results by orchestrating a large number of container jobs using Kubernetes. For long-term archival,...
Wolff, Thomas H; Shubin, Carol
2003-01-01
This book demonstrates how harmonic analysis can provide penetrating insights into deep aspects of modern analysis. It is both an introduction to the subject as a whole and an overview of those branches of harmonic analysis that are relevant to the Kakeya conjecture. The usual background material is covered in the first few chapters: the Fourier transform, convolution, the inversion theorem, the uncertainty principle and the method of stationary phase. However, the choice of topics is highly selective, with emphasis on those frequently used in research inspired by the problems discussed in the later chapters. These include questions related to the restriction conjecture and the Kakeya conjecture, distance sets, and Fourier transforms of singular measures. These problems are diverse, but often interconnected; they all combine sophisticated Fourier analysis with intriguing links to other areas of mathematics and they continue to stimulate first-rate work. The book focuses on laying out a solid foundation for fu...
Water Quality Analysis Simulation
U.S. Environmental Protection Agency — The Water Quality Analysis Simulation Program, an enhancement of the original WASP. This model helps users interpret and predict water quality responses to natural...
Federal Laboratory Consortium — Provides engineering design of aircraft components, subsystems and installations using Pro/E, Anvil 1000, CADKEY 97, AutoCAD 13. Engineering analysis tools include...
CSIR Research Space (South Africa)
Khuluse, S
2009-04-01
Full Text Available ... determination of the distribution of the damage and (iii) preparation of products that enable prediction of future risk events. The methodology provided by extreme value theory can also be a powerful tool in risk analysis...
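The extreme-value workflow alluded to above can be sketched as fitting a generalized extreme value (GEV) distribution to block maxima and reading off a return level. Everything below is a hypothetical illustration (synthetic data, assumed location and scale), not an analysis from the record.

```python
from scipy import stats

# Synthetic "annual maxima" drawn from a Gumbel distribution
# (assumed loc/scale, for illustration only).
annual_maxima = stats.gumbel_r.rvs(loc=50.0, scale=10.0, size=200,
                                   random_state=42)

# Fit a GEV distribution and estimate the 100-year return level:
# the value exceeded on average once every 100 blocks.
shape, loc, scale = stats.genextreme.fit(annual_maxima)
return_level_100 = stats.genextreme.ppf(1 - 1 / 100, shape,
                                        loc=loc, scale=scale)
print(round(float(return_level_100), 1))
```

For the assumed Gumbel parameters the true 100-year level is about 96; the fitted estimate scatters around that value, which is the "prediction of future risk events" step the abstract mentions.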
Ziemer, William P
2017-01-01
This first year graduate text is a comprehensive resource in real analysis based on a modern treatment of measure and integration. Presented in a definitive and self-contained manner, it features a natural progression of concepts from simple to difficult. Several innovative topics are featured, including differentiation of measures, elements of Functional Analysis, the Riesz Representation Theorem, Schwartz distributions, the area formula, Sobolev functions and applications to harmonic functions. Together, the selection of topics forms a sound foundation in real analysis that is particularly suited to students going on to further study in partial differential equations. This second edition of Modern Real Analysis contains many substantial improvements, including the addition of problems for practicing techniques, and an entirely new section devoted to the relationship between Lebesgue and improper integrals. Aimed at graduate students with an understanding of advanced calculus, the text will also appeal to mo...
DEFF Research Database (Denmark)
Fischer, Paul; Hilbert, Astrid
2012-01-01
We introduce a platform which supplies an easy-to-handle, interactive, extendable, and fast analysis tool for time series analysis. In contrast to other software suites like Maple, Matlab, or R, which use a command-line-like interface and where the user has to memorize or look up the appropriate commands, our application is select-and-click-driven. It allows the user to derive many different sequences of deviations for a given time series and to visualize them in different ways in order to judge their expressive power and to reuse the procedure found. For many transformations or model fits, the user may choose between manual and automated parameter selection. The user can define new transformations and add them to the system. The application contains efficient implementations of advanced and recent techniques for time series analysis, including techniques related to extreme value analysis and filtering...
International Nuclear Information System (INIS)
Holland, W.E.
1980-02-01
A method was developed to determine if boron-loaded polymeric material contained enriched boron or natural boron. A prototype analyzer was constructed, and initial planning was done for an actual analysis facility.
Stakeholder Analysis Worksheet
Stakeholder Analysis Worksheet
A worksheet that can be used to document potential stakeholder groups, the information or expertise they hold, the role that they can play, and their interests or concerns about the HIA.
Energy Technology Data Exchange (ETDEWEB)
Arent, D.; Benioff, R.; Mosey, G.; Bird, L.; Brown, J.; Brown, E.; Vimmerstedt, L.; Aabakken, J.; Parks, K.; Lapsa, M.; Davis, S.; Olszewski, M.; Cox, D.; McElhaney, K.; Hadley, S.; Hostick, D.; Nicholls, A.; McDonald, S.; Holloman, B.
2006-10-01
This paper presents the results of energy market analysis sponsored by the Department of Energy's (DOE) Weatherization and International Program (WIP) within the Office of Energy Efficiency and Renewable Energy (EERE). The analysis was conducted by a team of DOE laboratory experts from the National Renewable Energy Laboratory (NREL), Oak Ridge National Laboratory (ORNL), and Pacific Northwest National Laboratory (PNNL), with additional input from Lawrence Berkeley National Laboratory (LBNL). The analysis was structured to identify those markets and niches where government can create the biggest impact by informing management decisions in the private and public sectors. The analysis identifies those markets and niches where opportunities exist for increasing energy efficiency and renewable energy use.
Principles of Fourier analysis
Howell, Kenneth B
2001-01-01
Fourier analysis is one of the most useful and widely employed sets of tools for the engineer, the scientist, and the applied mathematician. As such, students and practitioners in these disciplines need a practical and mathematically solid introduction to its principles. They need straightforward verifications of its results and formulas, and they need clear indications of the limitations of those results and formulas. Principles of Fourier Analysis furnishes all this and more. It provides a comprehensive overview of the mathematical theory of Fourier analysis, including the development of Fourier series, "classical" Fourier transforms, generalized Fourier transforms and analysis, and the discrete theory. Much of the author's development is strikingly different from typical presentations. His approach to defining the classical Fourier transform results in a much cleaner, more coherent theory that leads naturally to a starting point for the generalized theory. He also introduces a new generalized theory based ...
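As a small runnable companion to the transform theory surveyed above (an illustrative sketch, not material from the book), the discrete Fourier transform recovers the frequency and amplitude of a sampled sinusoid:

```python
import numpy as np

# Sample 3*sin(2*pi*5*t) over one unit interval at 256 points,
# so the signal sits exactly in frequency bin 5.
n = 256
t = np.arange(n) / n
signal = 3.0 * np.sin(2 * np.pi * 5 * t)

spectrum = np.fft.rfft(signal)
peak_bin = int(np.argmax(np.abs(spectrum)))
# For a real sinusoid, |X[k0]| = A*n/2, so invert for the amplitude.
amplitude = 2 * np.abs(spectrum[peak_bin]) / n

print(peak_bin, round(float(amplitude), 6))  # 5 3.0
```

Because the sine completes an integer number of periods over the window, the energy lands in a single bin; a non-integer frequency would leak across neighboring bins, one of the practical limitations the book's introduction warns about.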
Quantitative investment analysis
DeFusco, Richard
2007-01-01
In the "Second Edition" of "Quantitative Investment Analysis," financial experts Richard DeFusco, Dennis McLeavey, Jerald Pinto, and David Runkle outline the tools and techniques needed to understand and apply quantitative methods to today's investment process.
Energy Technology Data Exchange (ETDEWEB)
Wood, William Monford [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2015-02-23
This report presents a systematic study of the standard analysis of rod-pinch radiographs for obtaining quantitative measurements of areal mass densities, and makes suggestions for improving the methodology of obtaining quantitative information from radiographed objects.
Biodiesel Emissions Analysis Program
Using existing data, the EPA's biodiesel emissions analysis program sought to quantify the air pollution emission effects of biodiesel for diesel engines that have not been specifically modified to operate on biodiesel.
Introduction to global analysis
Kahn, Donald W
2007-01-01
This text introduces the methods of mathematical analysis as applied to manifolds, including the roles of differentiation and integration, infinite dimensions, Morse theory, Lie groups, and dynamical systems. 1980 edition.
Biorefinery Sustainability Analysis
DEFF Research Database (Denmark)
J. S. M. Silva, Carla; Prunescu, Remus Mihail; Gernaey, Krist
2017-01-01
This chapter deals with sustainability analysis of biorefinery systems in terms of environmental and socio-economic indicators. Life cycle analysis has methodological issues related to the functional unit (FU), allocation, land use, and the biogenic carbon neutrality of the reference system and of the biorefinery-based system. Socio-economic criteria and indicators used in sustainability assessment frameworks are presented and discussed. There is not one single methodology that can aptly cover the synergies of environmental, economic, social and governance issues required to assess the sustainable...
Pesticide Instrumental Analysis
International Nuclear Information System (INIS)
Samir, E.; Fonseca, E.; Baldyga, N.; Acosta, A.; Gonzalez, F.; Felicita, F.; Tomasso, M.; Esquivel, D.; Parada, A.; Enriquez, P.; Amilibia, M.
2012-01-01
The aim of this workshop was the evaluation of the impact of pesticides on the vegetable matrix, with the purpose of determining their analysis by GC/MS. The working materials were a lettuce matrix, chard, and a mix of green leaves, together with pesticides.
Kansas Data Access and Support Center — The Kansas GAP Analysis Land Cover database depicts 43 land cover classes for the state of Kansas. The database was generated using a two-stage hybrid classification...
Perspectives in shape analysis
Bruckstein, Alfred; Maragos, Petros; Wuhrer, Stefanie
2016-01-01
This book presents recent advances in the field of shape analysis. Written by experts in the fields of continuous-scale shape analysis, discrete shape analysis and sparsity, and numerical computing who hail from different communities, it provides a unique view of the topic from a broad range of perspectives. Over the last decade, it has become increasingly affordable to digitize shape information at high resolution. Yet analyzing and processing this data remains challenging because of the large amount of data involved, and because modern applications such as human-computer interaction require real-time processing. Meeting these challenges requires interdisciplinary approaches that combine concepts from a variety of research areas, including numerical computing, differential geometry, deformable shape modeling, sparse data representation, and machine learning. On the algorithmic side, many shape analysis tasks are modeled using partial differential equations, which can be solved using tools from the field of n...
National Research Council Canada - National Science Library
1998-01-01
.... Establishing proper job procedures is one of the benefits of conducting a job hazard analysis: carefully studying and recording each step of a job, identifying existing or potential job hazards...
Main: Nucleotide Analysis [KOME
Lifescience Database Archive (English)
Full Text Available Nucleotide Analysis: Japonica genome blast search result. Result of blastn search against japonica genome sequence. kome_japonica_genome_blast_search_result.zip kome_japonica_genome_blast_search_result ...
Schuller, Björn W
2013-01-01
This book provides the reader with the knowledge necessary for comprehension of the field of Intelligent Audio Analysis. It firstly introduces standard methods and discusses the typical Intelligent Audio Analysis chain going from audio data to audio features to audio recognition. Further, an introduction to audio source separation, and enhancement and robustness are given. After the introductory parts, the book shows several applications for the three types of audio: speech, music, and general sound. Each task is shortly introduced, followed by a description of the specific data and methods applied, experiments and results, and a conclusion for this specific task. The book provides benchmark results and standardized test-beds for a broader range of audio analysis tasks. The main focus thereby lies on the parallel advancement of realism in audio analysis, as too often today’s results are overly optimistic owing to idealized testing conditions, and it serves to stimulate synergies arising from transfer of ...
Scientific stream pollution analysis
National Research Council Canada - National Science Library
Nemerow, Nelson Leonard
1974-01-01
A comprehensive description of the analysis of water pollution that presents a careful balance of the biological,hydrological, chemical and mathematical concepts involved in the evaluation of stream...
International Nuclear Information System (INIS)
Andreeva, J; Maier, G; Spiga, D; Calloni, M; Colling, D; Fanzago, F; D'Hondt, J; Maes, J; Van Mulders, P; Villella, I; Klem, J; Letts, J; Padhi, S; Sarkar, S
2010-01-01
During normal data taking CMS expects to support potentially as many as 2000 analysis users. Since the beginning of 2008 there have been more than 800 individuals who submitted a remote analysis job to the CMS computing infrastructure. The bulk of these users will be supported at the over 40 CMS Tier-2 centres. Supporting a globally distributed community of users on a globally distributed set of computing clusters is a task that requires reconsidering the normal methods of user support for Analysis Operations. In 2008 CMS formed an Analysis Support Task Force in preparation for large-scale physics analysis activities. The charge of the task force was to evaluate the available support tools, the user support techniques, and the direct feedback of users with the goal of improving the success rate and user experience when utilizing the distributed computing environment. The task force determined the tools needed to assess and reduce the number of non-zero exit code applications submitted through the grid interfaces and worked with the CMS experiment dashboard developers to obtain the necessary information to quickly and proactively identify issues with user jobs and data sets hosted at various sites. Results of the analysis group surveys were compiled. Reference platforms for testing and debugging problems were established in various geographic regions. The task force also assessed the resources needed to make the transition to a permanent Analysis Operations task. In this presentation the results of the task force will be discussed as well as the CMS Analysis Operations plans for the start of data taking.
Invitation to classical analysis
Duren, Peter
2012-01-01
This book gives a rigorous treatment of selected topics in classical analysis, with many applications and examples. The exposition is at the undergraduate level, building on basic principles of advanced calculus without appeal to more sophisticated techniques of complex analysis and Lebesgue integration. Among the topics covered are Fourier series and integrals, approximation theory, Stirling's formula, the gamma function, Bernoulli numbers and polynomials, the Riemann zeta function, Tauberian theorems, elliptic integrals, ramifications of the Cantor set, and a theoretical discussion of differ
Analysis of irradiated materials
International Nuclear Information System (INIS)
Bellamy, B.A.
1988-01-01
Papers presented at the UKAEA Conference on Materials Analysis by Physical Techniques (1987) covered a wide range of techniques as applied to the analysis of irradiated materials. These ranged from reactor component materials and materials associated with the Authority's radwaste disposal programme to fission products and products associated with the decommissioning of nuclear reactors. An invited paper giving a very comprehensive review of Laser Ablation Microprobe Mass Spectroscopy (LAMMS) was included in the programme. (author)
Oktavianto, Digit
2013-01-01
This book is a step-by-step, practical tutorial for analyzing and detecting malware and performing digital investigations. This book features clear and concise guidance in an easily accessible format. Cuckoo Malware Analysis is great for anyone who wants to analyze malware through programming, networking, disassembling, forensics, and virtualization. Whether you are new to malware analysis or have some experience, this book will help you get started with Cuckoo Sandbox so you can start analysing malware effectively and efficiently.
International Nuclear Information System (INIS)
Niehaus, F.
1988-01-01
In this paper, the risks of various energy systems are discussed, considering severe accident analysis, particularly probabilistic safety analysis and probabilistic safety criteria, and the applications of this analysis and these criteria. Comparative risk analysis has demonstrated that the largest source of risk in every society is daily small accidents. Nevertheless, we have to be more concerned about severe accidents. The comparative risk analysis of five different energy systems (coal, oil, gas, LWR and STEC (solar)) for the public has shown that the main sources of risk are coal and oil. The latest comparative risk study of various energy sources, conducted in the USA, has revealed that the number of victims from coal is 42 times as many as the victims from nuclear. A study of severe accidents from hydro-dams in the United States has estimated the probability of dam failure at 1 in 10,000 years and the number of victims between 11,000 and 260,000. The average occupational risk from coal is one fatal accident per 1,000 workers/year. Probabilistic safety analysis is a method that can be used to assess nuclear energy risks, to analyze severe accidents, and to model all possible accident sequences and consequences. Fault tree analysis is used to determine the probability of failure of the different systems at each point of the accident sequences and to calculate the probability of risks. After calculating the probability of failure, criteria for judging the numerical results have to be developed, that is, quantitative and qualitative goals. To achieve these goals, several systems have been devised by various IAEA member countries. The probabilistic safety analysis method has been developed by establishing a computer program permitting access to different categories of safety-related information. 19 tabs. (author)
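The accident-sequence arithmetic described above can be sketched in a few lines. The initiating-event frequency and conditional failure probabilities below are hypothetical placeholders, not figures from the paper.

```python
# Hedged sketch: the frequency of a severe outcome is the
# initiating-event frequency times the conditional probabilities of
# each safety system failing along the sequence (assumed independent).

initiating_event_per_year = 1e-1   # e.g. loss of offsite power (assumed)
p_system_a_fails = 1e-3            # first safety system (assumed)
p_system_b_fails = 1e-2            # second safety system (assumed)

severe_sequence_per_year = (initiating_event_per_year
                            * p_system_a_fails
                            * p_system_b_fails)
print(f"{severe_sequence_per_year:.1e} per reactor-year")
```

Comparing such sequence frequencies against quantitative safety goals is the judgment step the abstract describes.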
International Nuclear Information System (INIS)
Arien, B.
1998-01-01
The objective of SCK-CEN's programme on reactor safety is to develop expertise in probabilistic and deterministic reactor safety analysis. The research programme consists of four main activities: the development of software for reliability analysis of large systems; participation in the international PHEBUS-FP programme on severe accidents; the development of an expert system to aid diagnosis; and the development and application of a probabilistic reactor dynamics method. Main achievements in 1999 are reported.
Brady, Sir Michael; Highnam, Ralph; Irving, Benjamin; Schnabel, Julia A
2016-10-01
Cancer is one of the world's major healthcare challenges and, as such, an important application of medical image analysis. After a brief introduction to cancer, we summarise some of the major developments in oncological image analysis over the past 20 years, concentrating on those in the authors' laboratories, and then outline opportunities and challenges for the next decade. Copyright © 2016 Elsevier B.V. All rights reserved.
International Nuclear Information System (INIS)
Tibari, Elghali; Taous, Fouad; Marah, Hamid
2014-01-01
This report presents results of stable isotope analyses carried out at the CNESTEN DASTE in Rabat (Morocco) on behalf of Senegal, covering 127 samples. Oxygen-18 and deuterium analyses of water were performed by infrared laser spectroscopy using an LGR DLT-100 with autosampler. The results are expressed as δ values (‰) relative to V-SMOW, with uncertainties of ±0.3‰ for oxygen-18 and ±1‰ for deuterium.
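The δ notation used above can be made concrete with a short sketch. The sample ratio below is a made-up illustration, not a measured value; only the V-SMOW reference ratio and the defining formula are standard.

```python
# Sketch of standard delta notation: a delta value expresses an isotope
# ratio relative to a reference standard (here V-SMOW) in per mil (‰).

R_VSMOW_18O = 2005.2e-6  # accepted 18O/16O ratio of V-SMOW

def delta_permil(r_sample, r_standard):
    """delta = (R_sample / R_standard - 1) * 1000, in per mil."""
    return (r_sample / r_standard - 1.0) * 1000.0

r_sample = 1.995e-3  # hypothetical 18O/16O ratio of a water sample
d18O = delta_permil(r_sample, R_VSMOW_18O)
print(f"δ18O = {d18O:.1f} ‰ vs V-SMOW")
```

A sample depleted in the heavy isotope relative to the standard, as here, gives a negative δ value.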
Oden, J Tinsley
2010-01-01
The textbook is designed as a crash course for beginning graduate students majoring in something besides mathematics, introducing the mathematical foundations that lead to classical results in functional analysis. More specifically, Oden and Demkowicz want to prepare students to learn the variational theory of partial differential equations, distributions, and Sobolev spaces, and numerical analysis with an emphasis on finite element methods. The 1996 first edition has been used in a rather intensive two-semester course. -Book News, June 2010
Griffel, DH
2002-01-01
A stimulating introductory text, this volume examines many important applications of functional analysis to mechanics, fluid mechanics, diffusive growth, and approximation. Detailed enough to impart a thorough understanding, the text is also sufficiently straightforward for those unfamiliar with abstract analysis. Its four-part treatment begins with distribution theory and discussions of Green's functions. Essentially independent of the preceding material, the second and third parts deal with Banach spaces, Hilbert space, spectral theory, and variational techniques. The final part outlines the
Ivan Stosic; Drasko Nikolic; Aleksandar Zdravkovic
2012-01-01
The main purpose of this paper is to examine the impact of the current Serbian macro-environment on businesses through the implementation of PEST analysis as a framework for assessing the general or macro environment in which companies are operating. The authors argue that the elements of the presented PEST analysis indicate that the current macro-environment is characterized by the dominance of threats and weaknesses with few opportunities and strengths. Consequently, there is a strong need for faste...
Forensic neutron activation analysis
International Nuclear Information System (INIS)
Kishi, T.
1987-01-01
The progress of forensic neutron activation analysis (FNAA) in Japan is described. FNAA began in 1965 and during the past 20 years many cases have been handled; these include determination of toxic materials, comparison examination of physical evidences (e.g., paints, metal fragments, plastics and inks) and drug sample differentiation. Neutron activation analysis is applied routinely to the scientific criminal investigation as one of multielement analytical techniques. This paper also discusses these routine works. (author) 14 refs
Probabilistic Structural Analysis Program
Pai, Shantaram S.; Chamis, Christos C.; Murthy, Pappu L. N.; Stefko, George L.; Riha, David S.; Thacker, Ben H.; Nagpal, Vinod K.; Mital, Subodh K.
2010-01-01
NASA/NESSUS 6.2c is a general-purpose, probabilistic analysis program that computes probability of failure and probabilistic sensitivity measures of engineered systems. Because NASA/NESSUS uses highly computationally efficient and accurate analysis techniques, probabilistic solutions can be obtained even for extremely large and complex models. Once the probabilistic response is quantified, the results can be used to support risk-informed decisions regarding reliability for safety-critical and one-of-a-kind systems, as well as for maintaining a level of quality while reducing manufacturing costs for larger-quantity products. NASA/NESSUS has been successfully applied to a diverse range of problems in aerospace, gas turbine engines, biomechanics, pipelines, defense, weaponry, and infrastructure. This program combines state-of-the-art probabilistic algorithms with general-purpose structural analysis and life-prediction methods to compute the probabilistic response and reliability of engineered structures. Uncertainties in load, material properties, geometry, boundary conditions, and initial conditions can be simulated. The structural analysis methods include non-linear finite-element methods, heat-transfer analysis, polymer/ceramic matrix composite analysis, monolithic (conventional metallic) materials life-prediction methodologies, boundary element methods, and user-written subroutines. Several probabilistic algorithms are available such as the advanced mean value method and the adaptive importance sampling method. NASA/NESSUS 6.2c is structured in a modular format with 15 elements.
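The core idea of probability-of-failure computation can be sketched with brute-force Monte Carlo on a simple limit state. This is not the NESSUS approach (which uses far more efficient methods such as the advanced mean value method), and the distributions below are invented for illustration.

```python
# Minimal Monte Carlo sketch of probability-of-failure estimation for a
# limit state g = capacity - load; failure occurs when g < 0.
import random

random.seed(0)

def prob_failure(n=200_000):
    failures = 0
    for _ in range(n):
        capacity = random.gauss(10.0, 1.0)  # resistance ~ N(10, 1)
        load = random.gauss(6.0, 1.5)       # load ~ N(6, 1.5)
        if capacity - load < 0.0:           # limit state violated
            failures += 1
    return failures / n

pf = prob_failure()
print(f"estimated P(failure) = {pf:.4f}")
```

For these normal inputs the margin g is itself normal, so the estimate can be checked analytically; for small failure probabilities plain sampling becomes expensive, which is why importance-sampling and mean-value methods exist.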
Directory of Open Access Journals (Sweden)
Iulian N. BUJOREANU
2011-01-01
Full Text Available Sensitivity analysis is such a well-known and deeply analyzed subject that anyone entering the field feels unable to add anything new. Still, there are many facets to be taken into consideration. The paper introduces the reader to the various ways sensitivity analysis is implemented and the reasons for which it has to be implemented in most analyses in decision-making processes. Risk analysis is of utmost importance in dealing with resource allocation and is presented at the beginning of the paper as the initial reason to implement sensitivity analysis. Different views and approaches are added during the discussion of sensitivity analysis so that the reader develops as thorough an opinion as possible on the use and utility of sensitivity analysis. Finally, a round-up conclusion brings us to the question of whether the future can be generated and analyzed before it unfolds, so that when it happens it brings less uncertainty.
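One common way sensitivity analysis is implemented is the one-at-a-time (OAT) scheme: perturb each input in turn and rank inputs by the resulting output change. The cost model and parameter values below are illustrative assumptions, not taken from the paper.

```python
# Minimal one-at-a-time (OAT) sensitivity sketch for a toy decision model.

def total_cost(params):
    """Hypothetical cost model: units * unit_cost + fixed overhead."""
    return params["units"] * params["unit_cost"] + params["overhead"]

base = {"units": 100.0, "unit_cost": 25.0, "overhead": 500.0}

def oat_sensitivity(model, base, rel=0.10):
    """Bump each input by `rel` (10%) and record the output change."""
    y0 = model(base)
    out = {}
    for name in base:
        bumped = dict(base)
        bumped[name] = base[name] * (1.0 + rel)
        out[name] = model(bumped) - y0  # absolute output change
    return out

sens = oat_sensitivity(total_cost, base)
for name, dy in sorted(sens.items(), key=lambda kv: -abs(kv[1])):
    print(f"{name}: change in output = {dy:+.1f}")
```

OAT is the simplest scheme; it misses interaction effects between inputs, which is one reason the literature discusses many alternative approaches.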
International Nuclear Information System (INIS)
Grimanis, A.P.
1985-01-01
A review of research and development on NAA as well as examples of applications of this method are presented, taken from work carried out over the last 21 years at the Radioanalytical Laboratory of the Department of Chemistry in the Greek Nuclear Research Center ''Demokritos''. Improved and faster radiochemical NAA methods have been developed for the determination of Au, Ni, Cl, As, Cu, U, Cr, Eu, Hg and Mo in several materials, and for the simultaneous determination of Br and I; Mg, Sr and Ni; As and Cu; As, Sb and Hg; Mn, Sr and Ba; Cd and Zn; Se and As; Mo and Cr in biological materials. Instrumental NAA methods have also been developed for the determination of Ag, Cl and Na in lake waters, Al, Ca, Mg and V in wines, 7 trace elements in biological materials, 17 trace elements in sediments and 20 minor and trace elements in ceramics. A comprehensive computer program for routine activation analysis using Ge(Li) detectors has been worked out. A rather extensive charged-particle activation analysis programme has been carried out over the last 10 years, including particle-induced X-ray emission (PIXE) analysis, particle-induced prompt gamma-ray emission (PIGE) analysis, other nuclear reactions and proton activation analysis. A special neutron activation method, delayed fission neutron counting, is used for the analysis of fissionable elements such as U, Th and Pu in samples from the whole nuclear fuel cycle, including geological, enriched and nuclear safeguards samples
Integrated genetic analysis microsystems
International Nuclear Information System (INIS)
Lagally, Eric T; Mathies, Richard A
2004-01-01
With the completion of the Human Genome Project and the ongoing DNA sequencing of the genomes of other animals, bacteria, plants and others, a wealth of new information about the genetic composition of organisms has become available. However, as the demand for sequence information grows, so does the workload required both to generate this sequence and to use it for targeted genetic analysis. Microfabricated genetic analysis systems are well poised to assist in the collection and use of these data through increased analysis speed, lower analysis cost and higher parallelism leading to increased assay throughput. In addition, such integrated microsystems may point the way to targeted genetic experiments on single cells and in other areas that are otherwise very difficult. Concomitant with these advantages, such systems, when fully integrated, should be capable of forming portable systems for high-speed in situ analyses, enabling a new standard in disciplines such as clinical chemistry, forensics, biowarfare detection and epidemiology. This review will discuss the various technologies available for genetic analysis on the microscale, and efforts to integrate them to form fully functional robust analysis devices. (topical review)
INTERNAL ENVIRONMENT ANALYSIS TECHNIQUES
Directory of Open Access Journals (Sweden)
Caescu Stefan Claudiu
2011-12-01
Theme: The situation analysis, as a separate component of strategic planning, involves collecting and analysing relevant types of information on the components of the marketing environment and their evolution on the one hand, and on the organization’s resources and capabilities on the other. Objectives of the Research: The main purpose of the study of the analysis techniques of the internal environment is to provide insight into those aspects that are of strategic importance to the organization. Literature Review: The marketing environment consists of two distinct components: the internal environment, made up of specific variables within the organization, and the external environment, made up of variables external to the organization. Although analysing the external environment is essential for corporate success, it is not enough unless it is backed by a detailed analysis of the internal environment of the organization. The internal environment includes all elements that are endogenous to the organization, which are influenced to a great extent and totally controlled by it. The study of the internal environment must answer all resource-related questions, solve all resource management issues, and represents the first step in drawing up the marketing strategy. Research Methodology: The present paper carried out a documentary study of the main techniques used for the analysis of the internal environment. Results: The literature emphasizes that differences in performance from one organization to another depend primarily not on differences between fields of activity, but especially on differences between resources and capabilities and the ways these are capitalized on. The main methods of analysing the internal environment addressed in this paper are: the analysis of organizational resources, the performance analysis, the value chain analysis and the functional analysis. Implications: Basically such
Professionalizing Intelligence Analysis
Directory of Open Access Journals (Sweden)
James B. Bruce
2015-09-01
This article examines the current state of professionalism in national security intelligence analysis in the U.S. Government. Since the introduction of major intelligence reforms directed by the Intelligence Reform and Terrorism Prevention Act (IRTPA) in December 2004, we have seen notable strides in many aspects of intelligence professionalization, including in analysis. But progress is halting, uneven, and by no means permanent. To consolidate its gains, and if it is to continue improving, the U.S. intelligence community (IC) should commit itself to a new program of further professionalizing analysis, to ensure that it develops an analytic cadre fully prepared to deal with the complexities of the emerging multipolar and highly dynamic world that the IC itself is forecasting. Some recent reforms in intelligence analysis can be assessed against the established standards of more fully developed professions; these reforms may well fall short of moving the IC closer to the more fully professionalized analytical capability required to produce the kind of analysis needed now by the United States.
Harmonic and geometric analysis
Citti, Giovanna; Pérez, Carlos; Sarti, Alessandro; Zhong, Xiao
2015-01-01
This book presents an expanded version of four series of lectures delivered by the authors at the CRM. Harmonic analysis, understood in a broad sense, has a very wide interplay with partial differential equations and in particular with the theory of quasiconformal mappings and its applications. Some areas in which real analysis has been extremely influential are PDE's and geometric analysis. Their foundations and subsequent developments made extensive use of the Calderón–Zygmund theory, especially the Lp inequalities for Calderón–Zygmund operators (Beurling transform and Riesz transform, among others) and the theory of Muckenhoupt weights. The first chapter is an application of harmonic analysis and the Heisenberg group to understanding human vision, while the second and third chapters cover some of the main topics on linear and multilinear harmonic analysis. The last serves as a comprehensive introduction to a deep result from De Giorgi, Moser and Nash on the regularity of elliptic partial differen...
Hansson, Sven Ove; Aven, Terje
2014-07-01
This article discusses to what extent risk analysis is scientific in view of a set of commonly used definitions and criteria. We consider scientific knowledge to be characterized by its subject matter, its success in developing the best available knowledge in its fields of study, and the epistemic norms and values that guide scientific investigations. We proceed to assess the field of risk analysis according to these criteria. For this purpose, we use a model for risk analysis in which science is used as a base for decision making on risks, which covers the five elements evidence, knowledge base, broad risk evaluation, managerial review and judgment, and the decision; and that relates these elements to the domains experts and decisionmakers, and to the domains fact-based or value-based. We conclude that risk analysis is a scientific field of study, when understood as consisting primarily of (i) knowledge about risk-related phenomena, processes, events, etc., and (ii) concepts, theories, frameworks, approaches, principles, methods and models to understand, assess, characterize, communicate, and manage risk, in general and for specific applications (the instrumental part). © 2014 Society for Risk Analysis.
Zhou, Qing; Son, Kyungjin; Liu, Ying; Revzin, Alexander
2015-01-01
Biosensors first appeared several decades ago to address the need for monitoring physiological parameters such as oxygen or glucose in biological fluids such as blood. More recently, a new wave of biosensors has emerged in order to provide more nuanced and granular information about the composition and function of living cells. Such biosensors exist at the confluence of technology and medicine and often strive to connect cell phenotype or function to physiological or pathophysiological processes. Our review aims to describe some of the key technological aspects of biosensors being developed for cell analysis. The technological aspects covered in our review include biorecognition elements used for biosensor construction, methods for integrating cells with biosensors, approaches to single-cell analysis, and the use of nanostructured biosensors for cell analysis. Our hope is that the spectrum of possibilities for cell analysis described in this review may pique the interest of biomedical scientists and engineers and may spur new collaborations in the area of using biosensors for cell analysis.
Clinical reasoning: concept analysis.
Simmons, Barbara
2010-05-01
This paper is a report of a concept analysis of clinical reasoning in nursing. Clinical reasoning is an ambiguous term that is often used synonymously with decision-making and clinical judgment. Clinical reasoning has not been clearly defined in the literature. Healthcare settings are increasingly filled with uncertainty, risk and complexity due to increased patient acuity, multiple comorbidities, and enhanced use of technology, all of which require clinical reasoning. Data sources. Literature for this concept analysis was retrieved from several databases, including CINAHL, PubMed, PsycINFO, ERIC and OvidMEDLINE, for the years 1980 to 2008. Rodgers's evolutionary method of concept analysis was used because of its applicability to concepts that are still evolving. Multiple terms have been used synonymously to describe the thinking skills that nurses use. Research in the past 20 years has elucidated differences among these terms and identified the cognitive processes that precede judgment and decision-making. Our concept analysis defines one of these terms, 'clinical reasoning,' as a complex process that uses cognition, metacognition, and discipline-specific knowledge to gather and analyse patient information, evaluate its significance, and weigh alternative actions. This concept analysis provides a middle-range descriptive theory of clinical reasoning in nursing that helps clarify meaning and gives direction for future research. Appropriate instruments to operationalize the concept need to be developed. Research is needed to identify additional variables that have an impact on clinical reasoning and what are the consequences of clinical reasoning in specific situations.
International Nuclear Information System (INIS)
Sitek, J.; Degmova, J.; Dekan, J.
2011-01-01
The Kosice meteorite fell on 28 February 2010 near Kosice and represents a unique find, because the last observed meteorite fall in Slovakia was in 1895. For this kind of meteorite the orbit in space can be calculated, which is especially important: until now only 13 meteorite finds worldwide have had their cosmic orbits calculated. Slovakia is a member of the international bolide network dealing with meteorite analysis in Central Europe. Analysis of the Kosice meteorite will also cover the long-lived and short-lived nuclides; the results should contribute to the determination of radiation and formation ages. Structural analysis of the meteorite will make it possible to compare it with similar types of meteorites. In this work Moessbauer spectroscopy is used for phase analysis of the iron-containing components, with the aim of identifying magnetic and non-magnetic fractions. From the analysis of the magnetic part we find that the first sextet, with hyperfine magnetic field 33.5 T, corresponds to the bcc Fe-Ni alloy (kamacite) and the second, with field 31.5 T, to FeS (troilite). Meteorites with this composition belong to the mineral group of chondrites. Comparing our parameters with measurements on similar meteorites, we conclude that the Kosice meteorite contains the same components. According to all Moessbauer parameters we can also place this meteorite in the mineral group of chondrites. (authors)
Foundations of VISAR analysis.
Energy Technology Data Exchange (ETDEWEB)
Dolan, Daniel H.
2006-06-01
The Velocity Interferometer System for Any Reflector (VISAR) is a widely used diagnostic at Sandia National Laboratories. Although the operating principles of the VISAR are well established, recently deployed systems (such as the fast push-pull and air delay VISAR) require more careful consideration, and many common assumptions about VISAR are coming into question. This report presents a comprehensive review of VISAR analysis to address these issues. Detailed treatment of several interferometer configurations is given to identify important aspects of the operation and characterization of VISAR systems. The calculation of velocity from interferometer measurements is also described. The goal is to derive the standard VISAR analysis relationships, indicate when these relationships are valid, and provide alternative methods when the standard analysis fails.
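The standard velocity relation underlying VISAR analysis can be sketched briefly: velocity is proportional to the fringe count, with the velocity-per-fringe (VPF) constant set by the laser wavelength and interferometer delay. This is the simplified textbook relation, not the report's full treatment (no window correction or dispersion term), and the parameter values are illustrative.

```python
# Simplified VISAR velocity-per-fringe sketch: v = (lambda / (2 tau)) * F,
# where F is the fringe count. Window and dispersion corrections omitted.

WAVELENGTH = 532e-9   # laser wavelength (m), illustrative
TAU = 1.0e-9          # interferometer delay time (s), illustrative

def vpf(wavelength=WAVELENGTH, tau=TAU):
    """Velocity per fringe: lambda / (2 * tau)."""
    return wavelength / (2.0 * tau)

def velocity(fringe_count):
    """Apparent velocity for a given accumulated fringe count."""
    return vpf() * fringe_count

print(f"VPF = {vpf():.1f} m/s per fringe")
print(f"3.5 fringes -> {velocity(3.5):.1f} m/s")
```

The report's point is precisely that assumptions like this simple proportionality break down for some deployed systems, which motivates the more careful derivations it presents.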
Pugh, Charles C
2015-01-01
Based on an honors course taught by the author at UC Berkeley, this introduction to undergraduate real analysis gives a different emphasis by stressing the importance of pictures and hard problems. Topics include: a natural construction of the real numbers, four-dimensional visualization, basic point-set topology, function spaces, multivariable calculus via differential forms (leading to a simple proof of the Brouwer Fixed Point Theorem), and a pictorial treatment of Lebesgue theory. Over 150 detailed illustrations elucidate abstract concepts and salient points in proofs. The exposition is informal and relaxed, with many helpful asides, examples, some jokes, and occasional comments from mathematicians such as Littlewood, Dieudonné, and Osserman. This book thus succeeds in being more comprehensive, more comprehensible, and more enjoyable than standard introductions to analysis. New to the second edition of Real Mathematical Analysis is a presentation of Lebesgue integration done almost entirely using the un...
Digital Fourier analysis fundamentals
Kido, Ken'iti
2015-01-01
This textbook is a thorough, accessible introduction to digital Fourier analysis for undergraduate students in the sciences. Beginning with the principles of sine/cosine decomposition, the reader walks through the principles of discrete Fourier analysis before reaching the cornerstone of signal processing: the Fast Fourier Transform. Saturated with clear, coherent illustrations, "Digital Fourier Analysis - Fundamentals" includes practice problems and thorough Appendices for the advanced reader. As a special feature, the book includes interactive applets (available online) that mirror the illustrations. These user-friendly applets animate concepts interactively, allowing the user to experiment with the underlying mathematics. For example, a real sine signal can be treated as a sum of clockwise and counter-clockwise rotating vectors. The applet illustration included with the book animates the rotating vectors and the resulting sine signal. By changing parameters such as amplitude and frequency, the reader ca...
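The rotating-vector picture described above can be verified numerically: a real sine is the sum of a counter-clockwise and a clockwise rotating complex vector, sin(ωt) = (e^{iωt} − e^{−iωt}) / (2i). A minimal check:

```python
# Numerical check that a real sine equals the sum of two counter-rotating
# complex vectors, whose imaginary parts cancel at every instant.
import cmath, math

w = 2.0 * math.pi * 5.0  # angular frequency for a 5 Hz sine
for k in range(8):
    t = k / 64.0
    ccw = cmath.exp(1j * w * t) / (2j)    # counter-clockwise vector
    cw = -cmath.exp(-1j * w * t) / (2j)   # clockwise vector
    s = ccw + cw
    assert abs(s.imag) < 1e-12            # imaginary parts cancel
    assert abs(s.real - math.sin(w * t)) < 1e-12
print("sine reconstructed from two counter-rotating vectors")
```

This identity is the same decomposition the book's interactive applet animates, and it is why the DFT of a real sine shows a pair of conjugate-symmetric peaks.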
Frank, IE
1994-01-01
Analyzing observed or measured data is an important step in applied sciences. The recent increase in computer capacity has resulted in a revolution both in data collection and data analysis. An increasing number of scientists, researchers and students are venturing into statistical data analysis; hence the need for more guidance in this field, which was previously dominated mainly by statisticians. This handbook fills the gap in the range of textbooks on data analysis. Written in a dictionary format, it will serve as a comprehensive reference book in a rapidly growing field. However, this book is more structured than an ordinary dictionary, where each entry is a separate, self-contained entity. The authors provide not only definitions and short descriptions, but also offer an overview of the different topics. Therefore, the handbook can also be used as a companion to textbooks for undergraduate or graduate courses. 1700 entries are given in alphabetical order grouped into 20 topics and each topic is organized...
International Nuclear Information System (INIS)
Arien, B.
1998-01-01
Risk assessments of nuclear installations require accurate safety and reliability analyses to estimate the consequences of accidental events and their probability of occurrence. The objective of the work performed in this field at the Belgian Nuclear Research Centre SCK-CEN is to develop expertise in probabilistic and deterministic reactor safety analysis. The four main activities of the research project on reactor safety analysis are: (1) the development of software for the reliable analysis of large systems; (2) the development of an expert system for the aid to diagnosis; (3) the development and the application of a probabilistic reactor-dynamics method, and (4) to participate in the international PHEBUS-FP programme for severe accidents. Progress in research during 1997 is described
Energy Technology Data Exchange (ETDEWEB)
Haurykiewicz, John Paul [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Dinehart, Timothy Grant [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Parker, Robert Young [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2016-05-12
The purpose of this process analysis was to analyze the Badge Offices’ current processes from a systems perspective and consider ways of pursuing objectives set forth by SEC-PS, namely increased customer flow (throughput) and reduced customer wait times. Information for the analysis was gathered for the project primarily through Badge Office Subject Matter Experts (SMEs), and in-person observation of prevailing processes. Using the information gathered, a process simulation model was constructed to represent current operations and allow assessment of potential process changes relative to factors mentioned previously. The overall purpose of the analysis was to provide SEC-PS management with information and recommendations to serve as a basis for additional focused study and areas for potential process improvements in the future.
International Nuclear Information System (INIS)
Deville, J.P.
1998-01-01
Nowadays there are many surface analysis methods, each with its own specificity, qualities, constraints (for instance vacuum) and limits. Expensive in time and in investment, these methods have to be used deliberately. This article is aimed at non-specialists. It gives some elements of choice according to the information sought, the sensitivity, the constraints of use or the answer to a precise question. After recalling the fundamental principles governing these analysis methods, based on the interaction of radiation (ultraviolet, X-rays) or particles (ions, electrons) with matter, two methods are described in more detail: Auger electron spectroscopy (AES) and X-ray photoemission spectroscopy (ESCA or XPS). These are the most widespread methods in laboratories, the easiest to use and probably the most productive for the analysis of surfaces of industrial materials or samples subjected to treatments in aggressive media. (O.M.)
Power electronics reliability analysis.
Energy Technology Data Exchange (ETDEWEB)
Smith, Mark A.; Atcitty, Stanley
2009-12-01
This report provides the DOE and industry with a general process for analyzing power electronics reliability. The analysis can help with understanding the main causes of failures, downtime, and cost and how to reduce them. One approach is to collect field maintenance data and use it directly to calculate reliability metrics related to each cause. Another approach is to model the functional structure of the equipment using a fault tree to derive system reliability from component reliability. Analysis of a fictitious device demonstrates the latter process. Optimization can use the resulting baseline model to decide how to improve reliability and/or lower costs. It is recommended that both electric utilities and equipment manufacturers make provisions to collect and share data in order to lay the groundwork for improving reliability into the future. Reliability analysis helps guide reliability improvements in hardware and software technology including condition monitoring and prognostics and health management.
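The fault-tree route from component reliability to system reliability mentioned above can be sketched with the standard series/parallel combination rules. The component reliabilities below are illustrative, and independence of failures is assumed.

```python
# Sketch of deriving system reliability from component reliabilities,
# assuming independent failures.

def series(*rs):
    """Series system: works only if every component works."""
    r = 1.0
    for x in rs:
        r *= x
    return r

def parallel(*rs):
    """Parallel (redundant) system: works if any component works."""
    q = 1.0
    for x in rs:
        q *= (1.0 - x)
    return 1.0 - q

# Illustrative topology: a converter feeding the load through either of
# two redundant inverters.
r_system = series(0.99, parallel(0.95, 0.95))
print(f"system reliability = {r_system:.4f}")
```

The same structure generalizes to a full fault tree: each gate combines the probabilities of its inputs, and optimization can then test which component upgrade most improves the top-level figure.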
International Nuclear Information System (INIS)
Gregg, H.R.; Meltzer, M.P.
1996-01-01
The portable Contamination Analysis Unit (CAU) measures trace quantities of surface contamination in real time. The detector head of the portable contamination analysis unit has an opening with an O-ring seal, one or more vacuum valves and a small mass spectrometer. With the valve closed, the mass spectrometer is evacuated with one or more pumps. The O-ring seal is placed against a surface to be tested and the vacuum valve is opened. Data is collected from the mass spectrometer and a portable computer provides contamination analysis. The CAU can be used to decontaminate and decommission hazardous and radioactive surfaces by measuring residual hazardous surface contamination, such as tritium and trace organics. It provides surface contamination data for research and development applications as well as real-time process control feedback for industrial cleaning operations and can be used to determine the readiness of a surface to accept bonding or coatings. 1 fig
Jorgensen, Palle
2017-01-01
The book features new directions in analysis, with an emphasis on Hilbert space, mathematical physics, and stochastic processes. We interpret 'non-commutative analysis' broadly to include representations of non-Abelian groups and non-Abelian algebras, with emphasis on Lie groups and operator algebras (C*-algebras and von Neumann algebras). A second theme is commutative and non-commutative harmonic analysis, spectral theory, operator theory and their applications. The list of topics includes shift-invariant spaces, group actions in differential geometry, and frame theory (over-complete bases) and their applications to engineering (signal processing and multiplexing), projective multi-resolutions, and free probability algebras. The book serves as an accessible introduction, offering a timeless presentation, attractive and accessible to students, both in mathematics and in neighboring fields.
DEFF Research Database (Denmark)
Frigaard, Peter; Andersen, Thomas Lykke
The present book describes the most important aspects of wave analysis techniques applied to physical model tests. Moreover, the book serves as technical documentation for the wave analysis software WaveLab 3, cf. Aalborg University (2012). In that respect it should be mentioned that, supplementary to the present technical documentation, there also exists an online help document describing the WaveLab software in detail, including all the input and output fields. In addition to the two main authors, Tue Hald, Jacob Helm-Petersen and Morten Møller Jakobsen have also contributed to the note. Their input is highly acknowledged. The outline of the book is as follows: • Chapters 2 and 3 describe analysis of waves in the time and frequency domains. • Chapters 4 and 5 describe the separation of incident and reflected waves for the two-dimensional case. • Chapter 6 describes the estimation of the directional spectra which also...
Canuto, Claudio
2015-01-01
The purpose of the volume is to provide a support textbook for a second lecture course on Mathematical Analysis. The contents are organised to suit, in particular, students of Engineering, Computer Science and Physics, all areas in which mathematical tools play a crucial role. The basic notions and methods concerning integral and differential calculus for multivariable functions, series of functions and ordinary differential equations are presented in a manner that elicits critical reading and prompts a hands-on approach to concrete applications. The pedagogical layout echoes the one used in the companion text Mathematical Analysis I. The book’s structure has a specifically-designed modular nature, which allows for great flexibility in the preparation of a lecture course on Mathematical Analysis. The style privileges clarity in the exposition and a linear progression through the theory. The material is organised on two levels. The first, reflected in this book, allows students to grasp the essential ideas, ...
Bandemer, Hans
1992-01-01
Fuzzy data such as marks, scores, verbal evaluations, imprecise observations, experts' opinions and grey tone pictures, are quite common. In Fuzzy Data Analysis the authors collect their recent results providing the reader with ideas, approaches and methods for processing such data when looking for sub-structures in knowledge bases for an evaluation of functional relationship, e.g. in order to specify diagnostic or control systems. The modelling presented uses ideas from fuzzy set theory and the suggested methods solve problems usually tackled by data analysis if the data are real numbers. Fuzzy Data Analysis is self-contained and is addressed to mathematicians oriented towards applications and to practitioners in any field of application who have some background in mathematics and statistics.
Zorich, Vladimir A
2015-01-01
VLADIMIR A. ZORICH is professor of mathematics at Moscow State University. His areas of specialization are analysis, conformal geometry, quasiconformal mappings, and mathematical aspects of thermodynamics. He solved the problem of global homeomorphism for space quasiconformal mappings. He holds a patent in the technology of mechanical engineering, and he is also known by his book Mathematical Analysis of Problems in the Natural Sciences . This second English edition of a very popular two-volume work presents a thorough first course in analysis, leading from real numbers to such advanced topics as differential forms on manifolds; asymptotic methods; Fourier, Laplace, and Legendre transforms; elliptic functions; and distributions. Especially notable in this course are the clearly expressed orientation toward the natural sciences and the informal exploration of the essence and the roots of the basic concepts and theorems of calculus. Clarity of exposition is matched by a wealth of instructive exercises, problems...
DEFF Research Database (Denmark)
Christensen, Ole; Feichtinger, Hans G.; Paukner, Stephan
2015-01-01
...it characterizes a function by its transform over phase space, which is the time–frequency plane (TF-plane) in a musical context or the location–wave-number domain in the context of image processing. Since the transition from the signal domain to the phase-space domain introduces an enormous amount of data... of the generalities relevant for an understanding of Gabor analysis of functions on Rd. We pay special attention to the case d = 2, which is the most important case for image processing and image analysis applications. The chapter is organized as follows. Section 2 presents central tools from functional analysis...; the application of Gabor expansions to image representation is considered in Sect. 6.
Sohrab, Houshang H
2014-01-01
This expanded second edition presents the fundamentals and touchstone results of real analysis in full rigor, but in a style that requires little prior familiarity with proofs or mathematical language. The text is a comprehensive and largely self-contained introduction to the theory of real-valued functions of a real variable. The chapters on Lebesgue measure and integral have been rewritten entirely and greatly improved. They now contain Lebesgue’s differentiation theorem as well as his versions of the Fundamental Theorem(s) of Calculus. With expanded chapters, additional problems, and an expansive solutions manual, Basic Real Analysis, Second Edition, is ideal for senior undergraduates and first-year graduate students, both as a classroom text and a self-study guide. Reviews of first edition: The book is a clear and well-structured introduction to real analysis aimed at senior undergraduate and beginning graduate students. The prerequisites are few, but a certain mathematical sophistication is required. ....
Software safety hazard analysis
International Nuclear Information System (INIS)
Lawrence, J.D.
1996-02-01
Techniques for analyzing the safety and reliability of analog-based electronic protection systems that serve to mitigate hazards in process control systems have been developed over many years, and are reasonably well understood. An example is the protection system in a nuclear power plant. The extension of these techniques to systems which include digital computers is not well developed, and there is little consensus among software engineering experts and safety experts on how to analyze such systems. One possible technique is to extend hazard analysis to include digital computer-based systems. Software is frequently overlooked during system hazard analyses, but this is unacceptable when the software is in control of a potentially hazardous operation. In such cases, hazard analysis should be extended to fully cover the software. A method for performing software hazard analysis is proposed in this paper
Vágner, Petr; Pavelka, Michal; Maršík, František
2017-04-01
The well-known Gouy-Stodola theorem states that a device produces maximum useful power when working reversibly, that is with no entropy production inside the device. This statement then leads to a method of thermodynamic optimization based on entropy production minimization. Exergy destruction (difference between exergy of fuel and exhausts) is also given by entropy production inside the device. Therefore, assessing efficiency of a device by exergy analysis is also based on the Gouy-Stodola theorem. However, assumptions that had led to the Gouy-Stodola theorem are not satisfied in several optimization scenarios, e.g. non-isothermal steady-state fuel cells, where both entropy production minimization and exergy analysis should be used with caution. We demonstrate, using non-equilibrium thermodynamics, a few cases where entropy production minimization and exergy analysis should not be applied.
Directory of Open Access Journals (Sweden)
Sutawanir Darwis
2012-05-01
Full Text Available Empirical decline curve analysis of oil production data gives reasonable answers in hyperbolic type-curve situations; however, the methodology has limitations in fitting real historical production data in the presence of unusual observations caused by treatments applied to the well in order to increase production capacity. The development of robust least squares offers new possibilities for better fitting production data using decline curve analysis by down-weighting the unusual observations. This paper proposes a robust least squares fitting (lmRobMM) approach to estimate the decline rate of daily production data and compares the results with reservoir simulation results. As a case study, we use the oil production data at TBA Field, West Java. The results demonstrate that the approach is suitable for decline curve fitting and offers new insight into decline curve analysis in the presence of unusual observations.
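The down-weighting idea behind robust decline curve fitting can be sketched in a few lines of Python. This is an illustrative Huber-weighted iteratively reweighted least squares (IRLS) fit of a simple exponential decline, not the paper's lmRobMM MM-estimator, and the synthetic data are invented for the example:

```python
import math

def median(xs):
    s = sorted(xs)
    n = len(s)
    mid = n // 2
    return s[mid] if n % 2 else 0.5 * (s[mid - 1] + s[mid])

def robust_decline_fit(t, q, n_iter=30, c=1.345):
    """Fit exponential decline q(t) = qi * exp(-D*t) by iteratively
    reweighted least squares with Huber weights, so that unusual
    observations (e.g. well treatments) are down-weighted."""
    y = [math.log(v) for v in q]          # linearize: log q = log qi - D*t
    w = [1.0] * len(t)
    a = b = 0.0                           # y ~ a + b*t
    for _ in range(n_iter):
        sw  = sum(w)
        st  = sum(wi * ti for wi, ti in zip(w, t))
        sy  = sum(wi * yi for wi, yi in zip(w, y))
        stt = sum(wi * ti * ti for wi, ti in zip(w, t))
        sty = sum(wi * ti * yi for wi, ti, yi in zip(w, t, y))
        det = sw * stt - st * st
        a = (stt * sy - st * sty) / det   # intercept = log qi
        b = (sw * sty - st * sy) / det    # slope = -D
        r = [yi - (a + b * ti) for ti, yi in zip(t, y)]
        s = max(1.4826 * median([abs(ri) for ri in r]), 1e-9)  # robust MAD scale
        w = [1.0 if abs(ri) <= c * s else c * s / abs(ri) for ri in r]
    return math.exp(a), -b                # (initial rate qi, decline rate D)

# Synthetic daily production with a few treatment spikes (unusual observations)
t = list(range(30))
q = [1000.0 * math.exp(-0.05 * ti) for ti in t]
for i in (5, 12, 20):
    q[i] *= 3.0
qi_hat, D_hat = robust_decline_fit(t, q)
```

Because the spiked points receive near-zero Huber weights, the fitted decline rate recovers the underlying trend rather than being dragged by the treatments.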
Applied multivariate statistical analysis
Härdle, Wolfgang Karl
2015-01-01
Focusing on high-dimensional applications, this 4th edition presents the tools and concepts used in multivariate data analysis in a style that is also accessible for non-mathematicians and practitioners. It surveys the basic principles and emphasizes both exploratory and inferential statistics; a new chapter on Variable Selection (Lasso, SCAD and Elastic Net) has also been added. All chapters include practical exercises that highlight applications in different multivariate data analysis fields: in quantitative financial studies, where the joint dynamics of assets are observed; in medicine, where recorded observations of subjects in different locations form the basis for reliable diagnoses and medication; and in quantitative marketing, where consumers’ preferences are collected in order to construct models of consumer behavior. All of these examples involve high to ultra-high dimensions and represent a number of major fields in big data analysis. The fourth edition of this book on Applied Multivariate ...
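The new chapter's Lasso variable selection can be illustrated with a minimal cyclic coordinate descent in Python. This is a didactic sketch (no intercept, no standardization, toy data), not code from the book:

```python
def soft_threshold(z, g):
    """Soft-thresholding operator, the building block of Lasso updates."""
    if z > g:
        return z - g
    if z < -g:
        return z + g
    return 0.0

def lasso_cd(X, y, lam, n_iter=100):
    """Lasso via cyclic coordinate descent on
    (1/2)*||y - X b||^2 + lam * ||b||_1."""
    n, p = len(X), len(X[0])
    b = [0.0] * p
    for _ in range(n_iter):
        for j in range(p):
            # partial residual excluding feature j
            r = [y[i] - sum(X[i][k] * b[k] for k in range(p) if k != j)
                 for i in range(n)]
            zj = sum(X[i][j] * r[i] for i in range(n))
            b[j] = soft_threshold(zj, lam) / sum(X[i][j] ** 2 for i in range(n))
    return b

# y depends only on the first feature; the second is irrelevant noise
X = [[1, 1], [2, -1], [3, 1], [4, -1]]
y = [2, 4, 6, 8]
b = lasso_cd(X, y, lam=5.0)   # the L1 penalty zeroes the irrelevant coefficient
```

With the penalty active, the coefficient of the irrelevant feature is set exactly to zero, which is the variable selection behavior the chapter describes.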
Trajectory Based Traffic Analysis
DEFF Research Database (Denmark)
Krogh, Benjamin Bjerre; Andersen, Ove; Lewis-Kelham, Edwin
2013-01-01
We present the INTRA system for interactive path-based traffic analysis. The analyses are developed in collaboration with traffic researchers and provide novel insights into conditions such as congestion, travel time, choice of route, and traffic flow. INTRA supports interactive point-and-click analysis, due to a novel and efficient indexing structure. With the web-site daisy.aau.dk/its/spqdemo/ we will demonstrate several analyses, using a very large real-world data set consisting of 1.9 billion GPS records (1.5 million trajectories) recorded from more than 13,000 vehicles, and touching most...
International Nuclear Information System (INIS)
Johnstad, H.
1989-06-01
The Physics Analysis Workstation (PAW) is a high-level program providing data presentation and statistical or mathematical analysis. PAW has been developed at CERN as an instrument to assist physicists in the analysis and presentation of their data. The program is interfaced to a high-level graphics package built on basic underlying graphics; 3-D graphics capabilities are being implemented. The major objects in PAW are 1- or 2-dimensional binned event data with a fixed number of entries per event, vectors, functions, graphics pictures, and macros. Command input is handled by an integrated user-interface package, which allows a variety of choices for input, either with typed commands or in a tree-structured, menu-driven mode. 6 refs., 1 fig
Choudary, A D R
2014-01-01
The book targets undergraduate and postgraduate mathematics students and helps them develop a deep understanding of mathematical analysis. Designed as a first course in real analysis, it helps students learn how abstract mathematical analysis solves mathematical problems that relate to the real world. As well as providing a valuable source of inspiration for contemporary research in mathematics, the book helps students read, understand and construct mathematical proofs, develop their problem-solving abilities and comprehend the importance and frontiers of computer facilities and much more. It offers comprehensive material for both seminars and independent study for readers with a basic knowledge of calculus and linear algebra. The first nine chapters followed by the appendix on the Stieltjes integral are recommended for graduate students studying probability and statistics, while the first eight chapters followed by the appendix on dynamical systems will be of use to students of biology and environmental scie...
Banks, David L; Rios Insua, David
2015-01-01
Flexible Models to Analyze Opponent Behavior A relatively new area of research, adversarial risk analysis (ARA) informs decision making when there are intelligent opponents and uncertain outcomes. Adversarial Risk Analysis develops methods for allocating defensive or offensive resources against intelligent adversaries. Many examples throughout illustrate the application of the ARA approach to a variety of games and strategic situations. The book shows decision makers how to build Bayesian models for the strategic calculation of their opponents, enabling decision makers to maximize their expected utility or minimize their expected loss. This new approach to risk analysis asserts that analysts should use Bayesian thinking to describe their beliefs about an opponent's goals, resources, optimism, and type of strategic calculation, such as minimax and level-k thinking. Within that framework, analysts then solve the problem from the perspective of the opponent while placing subjective probability distributions on a...
COMPUTER METHODS OF GENETIC ANALYSIS.
Directory of Open Access Journals (Sweden)
A. L. Osipov
2017-02-01
Full Text Available The paper presents the basic statistical methods used in the genetic analysis of human traits: segregation analysis, linkage analysis and allelic association analysis. Software supporting the implementation of these methods was developed.
International Nuclear Information System (INIS)
Strait, R.S.
1996-01-01
The first phase of the Depleted Uranium Hexafluoride Management Program (Program)--management strategy selection--consists of several program elements: Technology Assessment, Engineering Analysis, Cost Analysis, and preparation of an Environmental Impact Statement (EIS). Cost Analysis will estimate the life-cycle costs associated with each of the long-term management strategy alternatives for depleted uranium hexafluoride (UF6). The scope of Cost Analysis will include all major expenditures, from the planning and design stages through decontamination and decommissioning. The costs will be estimated at a scoping or preconceptual design level and are intended to assist decision makers in comparing alternatives for further consideration. They will not be absolute costs or bid-document costs. The purpose of the Cost Analysis Guidelines is to establish a consistent approach to analyzing cost alternatives for managing the Department of Energy's (DOE's) stocks of depleted uranium hexafluoride (DUF6). The component modules that make up the DUF6 management program differ substantially in process options, requirements for R and D, equipment, facilities, regulatory compliance, operations and maintenance (O and M), and operations risk. To facilitate a consistent and equitable comparison of costs, the guidelines offer common definitions, assumptions and bases, and limitations integrated with a standard approach to the analysis. Further, the goal is to evaluate total net life-cycle costs and display them in a way that gives DOE the capability to evaluate a variety of overall DUF6 management strategies, including commercial potential. The cost estimates reflect the preconceptual level of the designs. They will be appropriate for distinguishing among management strategies
Schramm, Michael J
2008-01-01
This text forms a bridge between courses in calculus and real analysis. It focuses on the construction of mathematical proofs as well as their final content. Suitable for upper-level undergraduates and graduate students of real analysis, it also provides a vital reference book for advanced courses in mathematics.The four-part treatment begins with an introduction to basic logical structures and techniques of proof, including discussions of the cardinality concept and the algebraic and order structures of the real and rational number systems. Part Two presents in-depth examinations of the compl
Fourier analysis an introduction
Stein, Elias M
2003-01-01
This first volume, a three-part introduction to the subject, is intended for students with a beginning knowledge of mathematical analysis who are motivated to discover the ideas that shape Fourier analysis. It begins with the simple conviction that Fourier arrived at in the early nineteenth century when studying problems in the physical sciences--that an arbitrary function can be written as an infinite sum of the most basic trigonometric functions.The first part implements this idea in terms of notions of convergence and summability of Fourier series, while highlighting applications such as th
DEFF Research Database (Denmark)
Reinau, Kristian Hegner
Traditionally, focus in the transport field, both politically and scientifically, has been on private cars and public transport, while freight transport has been a neglected topic. Recent years have seen an increased focus on congestion as a core issue across Europe, resulting in a great need for knowledge... speed data for freight. Secondly, the analytical methods used, space-time cubes and emerging hot spot analysis, are also new in the freight transport field. The analysis thus estimates precisely how fast freight moves on the roads in Northern Jutland and how this has evolved over time.
Abrahams, J R; Hiller, N
1965-01-01
Signal Flow Analysis provides information pertinent to the fundamental aspects of signal flow analysis. This book discusses the basic theory of signal flow graphs and shows their relation to the usual algebraic equations.Organized into seven chapters, this book begins with an overview of properties of a flow graph. This text then demonstrates how flow graphs can be applied to a wide range of electrical circuits that do not involve amplification. Other chapters deal with the parameters as well as circuit applications of transistors. This book discusses as well the variety of circuits using ther
Automated Software Vulnerability Analysis
Sezer, Emre C.; Kil, Chongkyung; Ning, Peng
Despite decades of research, software continues to have vulnerabilities. Successful exploitations of these vulnerabilities by attackers cost millions of dollars to businesses and individuals. Unfortunately, the most effective defensive measures, such as patching and intrusion prevention systems, require an intimate knowledge of the vulnerabilities. Many systems for detecting attacks have been proposed. However, the analysis of the exploited vulnerabilities is left to security experts and programmers. Both the human effort involved and the slow analysis process are unfavorable for timely defensive measures to be deployed. The problem is exacerbated by zero-day attacks.
Spectral analysis by correlation
International Nuclear Information System (INIS)
Fauque, J.M.; Berthier, D.; Max, J.; Bonnet, G.
1969-01-01
The spectral density of a signal, which represents its power distribution along the frequency axis, is a function which is of great importance, finding many uses in all fields concerned with the processing of the signal (process identification, vibrational analysis, etc...). Amongst all the possible methods for calculating this function, the correlation method (correlation function calculation + Fourier transformation) is the most promising, mainly because of its simplicity and of the results it yields. The study carried out here will lead to the construction of an apparatus which, coupled with a correlator, will constitute a set of equipment for spectral analysis in real time covering the frequency range 0 to 5 MHz. (author) [fr
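The correlation method described here (autocorrelation followed by Fourier transformation) rests on the Wiener-Khinchin relation: the Fourier transform of the autocorrelation function equals the power spectral density. A small numerical check in Python, using a discrete circular autocorrelation and a naive DFT on an assumed test signal:

```python
import math
import cmath

def circular_autocorr(x):
    """Biased circular autocorrelation r[k] = (1/N) * sum_n x[n] * x[(n+k) % N]."""
    N = len(x)
    return [sum(x[n] * x[(n + k) % N] for n in range(N)) / N for k in range(N)]

def dft(x):
    """Naive O(N^2) discrete Fourier transform."""
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * math.pi * k * n / N) for n in range(N))
            for k in range(N)]

N = 64
f0 = 8                                        # test tone sits in DFT bin 8
x = [math.cos(2 * math.pi * f0 * n / N) for n in range(N)]

# Correlation method: spectral density = Fourier transform of the autocorrelation
S_corr = [c.real for c in dft(circular_autocorr(x))]
# Direct method: periodogram |X[k]|^2 / N
S_per = [abs(Xk) ** 2 / N for Xk in dft(x)]
```

Both estimates agree bin by bin and concentrate the signal power at bin 8, which is exactly the property the correlation-based spectrum analyser exploits.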
DEFF Research Database (Denmark)
Josefsen, Knud; Nielsen, Henrik
2011-01-01
Northern blotting analysis is a classical method for analysis of the size and steady-state level of a specific RNA in a complex sample. In short, the RNA is size-fractionated by gel electrophoresis and transferred by blotting onto a membrane, to which the RNA is covalently bound. Then, the membrane is analysed by hybridization to one or more specific probes that are labelled for subsequent detection. Northern blotting is relatively simple to perform, inexpensive, and not plagued by artefacts. Recent developments of hybridization membranes and buffers have resulted in increased sensitivity, closing...
Subseabed disposal safety analysis
International Nuclear Information System (INIS)
Koplick, C.M.; Kabele, T.J.
1982-01-01
This report summarizes the status of work performed by Analytic Sciences Corporation (TASC) in FY'81 on subseabed disposal safety analysis. Safety analysis for subseabed disposal is divided into two phases: pre-emplacement which includes all transportation, handling, and emplacement activities; and long-term (post-emplacement), which is concerned with the potential hazard after waste is safely emplaced. Details of TASC work in these two areas are provided in two technical reports. The work to date, while preliminary, supports the technical and environmental feasibility of subseabed disposal of HLW
Sprecher, David A
2010-01-01
This classic text in introductory analysis delineates and explores the intermediate steps between the basics of calculus and the ultimate stage of mathematics: abstraction and generalization.Since many abstractions and generalizations originate with the real line, the author has made it the unifying theme of the text, constructing the real number system from the point of view of a Cauchy sequence (a step which Dr. Sprecher feels is essential to learn what the real number system is).The material covered in Elements of Real Analysis should be accessible to those who have completed a course in
Energy Technology Data Exchange (ETDEWEB)
Johnson, M A
1983-03-01
Energy analysis contributed to the public debate on the gasohol programme in the U.S., where this analysis became a legal requirement. The published energy analyses for gasohol are reviewed, and we assess their inherent assumptions and data sources. The analyses are normalised to S.I. units to facilitate comparisons. The process of rationalising the various treatments uncovered areas of uncertainty, particularly in the methodologies which could be used to analyse some parts of the process. Although the definitive study has still to be written, the consensus is that maize to fuel ethanol via the traditional fermentation route is a net consumer of energy. (Refs. 13).
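The net-energy bookkeeping behind such analyses reduces to comparing the fuel's energy content against the sum of all process energy inputs. A minimal sketch in Python, with hypothetical per-litre figures chosen only to illustrate the accounting (they are not the reviewed studies' data):

```python
def net_energy_ratio(output_mj, inputs_mj):
    """Energy content of the fuel divided by the sum of all energy inputs.
    A ratio below 1.0 means the process is a net consumer of energy."""
    total_input = sum(inputs_mj.values())
    return output_mj / total_input

# Hypothetical per-litre inputs for maize-to-ethanol (illustrative only)
inputs = {
    "maize cultivation": 9.0,   # MJ: fertilizer, diesel, drying
    "fermentation":      3.0,   # MJ
    "distillation":     12.0,   # MJ: steam for water/ethanol separation
}
ratio = net_energy_ratio(21.1, inputs)   # ~21.1 MJ/L is ethanol's lower heating value
```

With these illustrative numbers the ratio comes out below 1.0, matching the reviewed consensus that the traditional fermentation route is a net consumer of energy.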
Bhatia, Rajendra
2009-01-01
These notes are a record of a one semester course on Functional Analysis given by the author to second year Master of Statistics students at the Indian Statistical Institute, New Delhi. Students taking this course have a strong background in real analysis, linear algebra, measure theory and probability, and the course proceeds rapidly from the definition of a normed linear space to the spectral theorem for bounded selfadjoint operators in a Hilbert space. The book is organised as twenty six lectures, each corresponding to a ninety minute class session. This may be helpful to teachers planning a course on this topic. Well prepared students can read it on their own.
Analysis of maintenance strategies
International Nuclear Information System (INIS)
Laakso, K.; Simola, K.
1998-01-01
The main topics of the presentation include: (1) an analysis model and methods to evaluate maintenance action programs and to support decisions on changes to them, and (2) understanding maintenance strategies in a systems perspective as a basis for future developments. The subproject showed how systematic models for maintenance analysis and decision support, utilising computerised and statistical tool packages, can be taken into use for evaluation and optimisation of the maintenance of active systems from the safety and economic points of view
Foundations of stochastic analysis
Rao, M M; Lukacs, E
1981-01-01
Foundations of Stochastic Analysis deals with the foundations of the theory of Kolmogorov and Bochner and its impact on the growth of stochastic analysis. Topics covered range from conditional expectations and probabilities to projective and direct limits, as well as martingales and likelihood ratios. Abstract martingales and their applications are also discussed. Comprised of five chapters, this volume begins with an overview of the basic Kolmogorov-Bochner theorem, followed by a discussion on conditional expectations and probabilities containing several characterizations of operators and mea
International Nuclear Information System (INIS)
Chatelus, R.; Schot, P.M.
2010-01-01
In order to verify compliance with safeguards and draw conclusions on the absence of undeclared nuclear material and activities, the International Atomic Energy Agency (IAEA) collects and analyses trade information that it receives from open sources as well as from Member States. Although the IAEA does not intervene in national export controls, it has to monitor the trade in dual-use items. Trade analysis helps the IAEA to evaluate global proliferation threats, to understand States' ability to report exports according to additional protocols, and to compare against State declarations. Consequently, the IAEA has explored sources of trade-related information and has developed analysis methodologies beyond its traditional safeguards approaches. (author)
Cai, Tony
2010-01-01
Over the last few years, significant developments have been taking place in highdimensional data analysis, driven primarily by a wide range of applications in many fields such as genomics and signal processing. In particular, substantial advances have been made in the areas of feature selection, covariance estimation, classification and regression. This book intends to examine important issues arising from highdimensional data analysis to explore key ideas for statistical inference and prediction. It is structured around topics on multiple hypothesis testing, feature selection, regression, cla
Senthilkumar, K.; Ruchika Mehra Vijayan, E.
2017-11-01
This paper aims to illustrate real-time analysis of large-scale data. For practical implementation we perform sentiment analysis on live Twitter feeds, for each individual tweet. To analyze sentiments we train our data model on SentiWordNet, a polarity-assigned WordNet sample by Princeton University. Our main objective is to efficiently analyze large-scale data on the fly using distributed computation. The Apache Spark and Apache Hadoop ecosystem is used as the distributed computation platform, with Java as the development language
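At its core, lexicon-based tweet scoring of the kind described averages per-word polarities from a resource such as SentiWordNet. A minimal single-machine sketch in Python with a toy hand-made lexicon standing in for SentiWordNet (the words and scores are invented, and this omits the paper's distributed Spark pipeline):

```python
# Toy polarity lexicon standing in for SentiWordNet (scores are illustrative)
LEXICON = {
    "good": 0.7, "great": 0.9, "love": 0.8, "happy": 0.6,
    "bad": -0.7, "awful": -0.9, "hate": -0.8, "sad": -0.6,
}

def tweet_sentiment(text):
    """Average the lexicon polarity of each known word in the tweet;
    a positive result indicates positive sentiment, negative indicates negative."""
    words = [w.strip(".,!?#@").lower() for w in text.split()]
    scores = [LEXICON[w] for w in words if w in LEXICON]
    return sum(scores) / len(scores) if scores else 0.0

pos = tweet_sentiment("Love this great phone, so happy!")
neg = tweet_sentiment("Awful battery, I hate it")
```

In the distributed setting, the same per-tweet function would be mapped over a streaming RDD or DStream of tweets; the scoring logic itself is unchanged.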
DEFF Research Database (Denmark)
Vatrapu, Ravi; Hussain, Abid; Buus Lassen, Niels
2015-01-01
This paper argues that the basic premise of Social Network Analysis (SNA) -- namely that social reality is constituted by dyadic relations and that social interactions are determined by structural properties of networks -- is neither necessary nor sufficient for Big Social Data analytics of Facebook or Twitter data. However, there exists no other holistic computational social science approach beyond the relational sociology and graph theory of SNA. To address this limitation, this paper presents an alternative holistic approach to Big Social Data analytics called Social Set Analysis (SSA)...
Hoffman, Kenneth
2007-01-01
Developed for an introductory course in mathematical analysis at MIT, this text focuses on concepts, principles, and methods. Its introductions to real and complex analysis are closely formulated, and they constitute a natural introduction to complex function theory.Starting with an overview of the real number system, the text presents results for subsets and functions related to Euclidean space of n dimensions. It offers a rigorous review of the fundamentals of calculus, emphasizing power series expansions and introducing the theory of complex-analytic functions. Subsequent chapters cover seq
International Nuclear Information System (INIS)
Clark, R.D.
1996-01-01
This analysis defines and evaluates the surface water supply system from the existing J-13 well to the North Portal. This system includes the pipe running from J-13 to a proposed Booster Pump Station at the intersection of H Road and the North Portal access road. Contained herein is an analysis of the proposed Booster Pump Station with a brief description of the system that could be installed to the South Portal and the optional shaft. The tanks that supply the water to the North Portal are sized, and the supply system to the North Portal facilities and up to Topopah Spring North Ramp is defined
Structural analysis for Diagnosis
DEFF Research Database (Denmark)
Izadi-Zamanabadi, Roozbeh; Blanke, M.
2001-01-01
Aiming at the design of algorithms for fault diagnosis, structural analysis of systems offers concise yet easy overall analysis. Graph-based matching, which is the essential technique for obtaining redundant information for diagnosis, is re-considered in this paper. Matching is re-formulated as a problem of relating faults to known parameters and measurements of a system. Using explicit fault modelling, minimal over-determined subsystems are shown to provide the necessary redundancy relations from the matching. Details of the method are presented and a realistic example is used to clearly describe the individual steps...
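The graph-based matching step can be sketched as maximum bipartite matching between constraints (model equations) and unknown variables; constraints left unmatched are the redundancy relations exploitable for diagnosis. A simple augmenting-path implementation in Python, on a hypothetical four-constraint, three-variable structural model (not the paper's example):

```python
def max_matching(adj, n_vars):
    """Maximum bipartite matching via augmenting paths (simple O(V*E) version).
    adj[c] lists the unknown variables occurring in constraint c;
    returns {constraint: matched variable}."""
    match_var = [-1] * n_vars              # variable -> constraint, -1 if free

    def augment(c, seen):
        for v in adj[c]:
            if v in seen:
                continue
            seen.add(v)
            if match_var[v] == -1 or augment(match_var[v], seen):
                match_var[v] = c
                return True
        return False

    for c in range(len(adj)):
        augment(c, set())
    return {c: v for v, c in enumerate(match_var) if c != -1}

# Hypothetical structural model: 4 constraints over 3 unknown variables.
# Constraints left unmatched form the over-determined part: redundancy relations.
adj = [[0, 1], [1], [1, 2], [0, 2]]
matching = max_matching(adj, 3)
unmatched = [c for c in range(len(adj)) if c not in matching]
```

Here all three variables can be matched, so exactly one constraint remains unmatched and provides a redundancy relation for residual generation.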
Structural analysis for diagnosis
DEFF Research Database (Denmark)
Izadi-Zamanabadi, Roozbeh; Blanke, M.
2002-01-01
Aiming at the design of algorithms for fault diagnosis, structural analysis of systems offers concise yet easy overall analysis. Graph-based matching, which is the essential technique for obtaining redundant information for diagnosis, is reconsidered in this paper. Matching is reformulated as a problem of relating faults to known parameters and measurements of a system. Using explicit fault modelling, minimal overdetermined subsystems are shown to provide the necessary redundancy relations from the matching. Details of the method are presented and a realistic example is used to clearly describe the individual steps.
International Nuclear Information System (INIS)
Bergman, R.
1980-12-01
The report describes the development of a method for in vivo Cd analysis. The method is based on analysis of the prompt gamma radiation emitted upon neutron capture by the isotope Cd-113. Different parts of the body can be analysed selectively with neutrons in the interval of 1 to 100 keV. The results show that the level of Cd in the kidneys can be measured without exceeding a dose of 40 mrad, and that only 20% uncertainty is introduced when analysing Cd. The development has been carried out at the R2 reactor in Studsvik using 25 keV neutrons. (G.B.)
Brieda, Lubos
2015-01-01
This talk presents three different tools developed recently for contamination analysis: (1) HTML QCM analyzer: runs in a web browser and allows for data analysis of QCM log files; (2) Java RGA extractor: can load multiple SRS .ana files and extract pressure vs. time data; (3) C++ contamination simulation code: a 3D particle-tracing code for modeling the transport of dust particulates and molecules. It uses residence time to determine whether molecules stick. Particulates can be sampled from IEST-STD-1246 and be accelerated by aerodynamic forces.
Beginning statistics with data analysis
Mosteller, Frederick; Rourke, Robert EK
2013-01-01
This introduction to the world of statistics covers exploratory data analysis, methods for collecting data, formal statistical inference, and techniques of regression and analysis of variance. 1983 edition.
Electronic Circuit Analysis Language (ECAL)
Chenghang, C.
1983-03-01
The computer-aided design technique is an important development in computer applications and an important component of computer science. A special language for electronic circuit analysis is the foundation of computer-aided design, or computer-aided circuit analysis (abbreviated as CACD and CACA), of simulated circuits. Electronic Circuit Analysis Language (ECAL) is a comparatively simple and easy-to-use circuit analysis special language, which uses FORTRAN to carry out interpretive execution. It is capable of conducting dc analysis, ac analysis, and transient analysis of a circuit. Furthermore, the results of the dc analysis can be used directly as the initial conditions for the ac and transient analyses.
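The dc analysis such a language performs amounts to setting up and solving the nodal equations G·v = i of the circuit. A minimal sketch in Python (not ECAL itself) for an assumed two-node resistive circuit, using Gaussian elimination:

```python
def solve(A, b):
    """Gaussian elimination with partial pivoting for small dense systems."""
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for k in range(col, n + 1):
                M[r][k] -= f * M[col][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

# DC nodal analysis, G*v = i, for a small illustrative resistive circuit:
# 1 A current source into node 1; R1 = 1 ohm node1-ground;
# R2 = 2 ohm node1-node2; R3 = 2 ohm node2-ground.
G = [[1/1 + 1/2, -1/2],
     [-1/2, 1/2 + 1/2]]   # conductance (stamp) matrix
i = [1.0, 0.0]            # injected node currents
v = solve(G, i)           # node voltages: v = [0.8, 0.4]
```

The resulting node voltages would then serve as the initial operating point for ac or transient analysis, as the abstract describes.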
Lifescience Database Archive (English)
Full Text Available ...igliptin hydrobromide hydrate (JAN) ... Antidiabetic agent ... DG01601 ... DPP-4 inhibitor Unclassified ... DG02044 ... Hypoglycemics ... DG01601 ... DPP-4 inhibitor ... DPP4 inhibitor, antidiabetics DPP4 [HSA:1803] [KO:K01278]
Prehistory analysis using photon activation analysis
International Nuclear Information System (INIS)
Krausova, I.; Chvatil, D.; Tajer, J.
2017-01-01
Instrumental photon activation analysis (IPAA) is a suitable radio-analytical method for non-destructive determination of total nitrogen in various matrices. IPAA determination of nitrogen is based on the 14N(γ,n)13N nuclear reaction after high-energy photon irradiation. The analytically usable product of this photo-nuclear reaction is a positron emitter whose only gamma signature is the non-specific 511 keV annihilation line, which can also be emitted by other radionuclides present in the sample. Some of them, besides the non-specific 511 keV line, also emit specific lines that allow their contribution to the analytical radionuclide 13N to be subtracted. An efficient source of high-energy photon radiation is the secondary bremsstrahlung generated by the conversion of an electron beam accelerated by a high-frequency circular accelerator, a microtron. The non-destructive IPAA contributed to clarifying the origin of a precious bracelet from a fortified settlement in the area of Karlovy Vary - Drahovice, dating from the late Bronze Age. (authors)
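The interference correction described above reduces to simple counting arithmetic: each interfering positron emitter is quantified via one of its specific gamma lines, its contribution to the 511 keV peak is computed from a known (efficiency-corrected) emission ratio, and the remainder is attributed to 13N. A sketch in Python with hypothetical nuclides and numbers (illustrative only):

```python
def net_n13_counts(total_511, interferents):
    """Subtract interfering contributions from the 511 keV annihilation peak.
    Each interferent is characterized by the counts in one of its *specific*
    gamma lines and by the known ratio of its 511 keV emission to that
    specific line (efficiency-corrected).  What remains is attributed to
    the analytical radionuclide 13N."""
    corrected = total_511
    for nuclide in interferents:
        corrected -= nuclide["specific_counts"] * nuclide["ratio_511_to_specific"]
    return corrected

# Hypothetical example: two interfering positron emitters in the sample
interferents = [
    {"name": "A", "specific_counts": 2000.0, "ratio_511_to_specific": 1.5},
    {"name": "B", "specific_counts":  500.0, "ratio_511_to_specific": 4.0},
]
n13 = net_n13_counts(12000.0, interferents)   # 12000 - 3000 - 2000 = 7000
```

In practice the ratios would come from nuclear data tables and the detector efficiency calibration; the sketch shows only the bookkeeping.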
Systems analysis - independent analysis and verification
Energy Technology Data Exchange (ETDEWEB)
DiPietro, J.P.; Skolnik, E.G.; Badin, J.S. [Energetics, Inc., Columbia, MD (United States)
1996-10-01
The Hydrogen Program of the U.S. Department of Energy (DOE) funds a portfolio of activities ranging from conceptual research to pilot plant testing. The long-term research projects support DOE's goal of a sustainable, domestically based energy system, and the development activities are focused on hydrogen-based energy systems that can be commercially viable in the near-term. Energetics develops analytic products that enable the Hydrogen Program Manager to assess the potential for near- and long-term R&D activities to satisfy DOE and energy market criteria. This work is based on a pathway analysis methodology. The authors consider an energy component (e.g., hydrogen production from biomass gasification, hybrid hydrogen internal combustion engine (ICE) vehicle) within a complete energy system. The work involves close interaction with the principal investigators to ensure accurate representation of the component technology. Comparisons are made with the current cost and performance of fossil-based and alternative renewable energy systems, and sensitivity analyses are conducted to determine the effect of changes in cost and performance parameters on the projects' viability.
Learning Haskell data analysis
Church, James
2015-01-01
If you are a developer, analyst, or data scientist who wants to learn data analysis methods using Haskell and its libraries, then this book is for you. Prior experience with Haskell and a basic knowledge of data science will be beneficial.
International Nuclear Information System (INIS)
Malik, S; Bloom, K; Shipsey, I; Cavanaugh, R; Klima, B; Chan, Kai-Feng; D'Hondt, J; Narain, M; Palla, F; Rolandi, G; Schörner-Sadenius, T
2014-01-01
To impart hands-on training in physics analysis, the CMS experiment initiated the concept of the CMS Data Analysis School (CMSDAS). It was born over three years ago at the LPC (LHC Physics Centre) at Fermilab and is based on earlier workshops held at the LPC and at the CLEO experiment. As CMS transitioned from construction to data-taking mode, the nature of the earlier training also evolved to include more analysis tools, software tutorials and physics analysis. This effort, epitomized as CMSDAS, has proven to be key for new and young physicists to jump-start and contribute to the physics goals of CMS by looking for new physics with the collision data. With over 400 physicists trained in six CMSDAS around the globe, CMS is trying to engage the collaboration in its discovery potential and maximize physics output. As a bigger goal, CMS is striving to nurture and increase engagement of its myriad talents in the development of physics, service, upgrades, the education of those new to CMS and the career development of younger members. An extension of the concept to dedicated software and hardware schools is also planned, keeping in mind the ensuing upgrade phase.
International Nuclear Information System (INIS)
2002-01-01
This document is one in a series of publications known as the ETDE/INIS Joint Reference Series and also constitutes a part of the ETDE Procedures Manual. It presents the rules, guidelines and procedures to be adopted by centers submitting input to the International Nuclear Information System (INIS) or the Energy Technology Data Exchange (ETDE). It is a manual for the subject analysis part of input preparation, meaning the selection, subject classification, abstracting and subject indexing of relevant publications, and is to be used in conjunction with the Thesauruses, Subject Categories documents and the documents providing guidelines for the preparation of abstracts. The concept and structure of the new manual are intended to describe in a logical and efficient sequence all the steps comprising the subject analysis of documents to be reported to INIS or ETDE. The manual includes new chapters on preparatory analysis, subject classification, abstracting and subject indexing, as well as rules, guidelines, procedures, examples and a special chapter on guidelines and examples for subject analysis in particular subject fields. (g.t.; a.n.)
Bedrossian, Nazareth; Jang, Jiann-Woei; McCants, Edward; Omohundro, Zachary; Ring, Tom; Templeton, Jeremy; Zoss, Jeremy; Wallace, Jonathan; Ziegler, Philip
2011-01-01
Draper Station Analysis Tool (DSAT) is a computer program, built on commercially available software, for simulating and analyzing complex dynamic systems. Heretofore used in designing and verifying guidance, navigation, and control systems of the International Space Station, DSAT has a modular architecture that lends itself to modification for application to spacecraft or terrestrial systems. DSAT consists of user-interface, data-structures, simulation-generation, analysis, plotting, documentation, and help components. DSAT automates the construction of simulations and the process of analysis. DSAT provides a graphical user interface (GUI), plus a Web-enabled interface, similar to the GUI, that enables a remotely located user to gain access to the full capabilities of DSAT via the Internet and Web-browser software. Data structures are used to define the GUI, the Web-enabled interface, simulations, and analyses. Three data structures define the type of analysis to be performed: closed-loop simulation, frequency response, and/or stability margins. DSAT can be executed on almost any workstation, desktop, or laptop computer. DSAT provides better than an order of magnitude improvement in cost, schedule, and risk assessment for simulation-based design and verification of complex dynamic systems.
Israel, Carsten W; Ekosso-Ejangue, Lucy; Sheta, Mohamed-Karim
2015-09-01
The key to a successful analysis of a pacemaker electrocardiogram (ECG) is the application of the systematic approach used for any other ECG without a pacemaker: analysis of (1) basic rhythm and rate, (2) QRS axis, (3) PQ, QRS and QT intervals, (4) morphology of P waves, QRS, ST segments and T(U) waves and (5) the presence of arrhythmias. If only the most obvious abnormality of a pacemaker ECG is considered, wrong conclusions can easily be drawn. If a systematic approach is skipped it may be overlooked that e.g. atrial pacing is ineffective, the left ventricle is paced instead of the right ventricle, pacing competes with intrinsic conduction or that the atrioventricular (AV) conduction time is programmed too long. Apart from this analysis, a pacemaker ECG which is not clear should be checked for the presence of arrhythmias (e.g. atrial fibrillation, atrial flutter, junctional escape rhythm and endless loop tachycardia), pacemaker malfunction (e.g. atrial or ventricular undersensing or oversensing, atrial or ventricular loss of capture) and activity of specific pacing algorithms, such as automatic mode switching, rate adaptation, AV delay modifying algorithms, reaction to premature ventricular contractions (PVC), safety window pacing, hysteresis and noise mode. A systematic analysis of the pacemaker ECG almost always allows a probable diagnosis of arrhythmias and malfunctions to be made, which can be confirmed by pacemaker control and can often be corrected at the touch of the right button to the patient's benefit.
Nancy E. Fleenor
2002-01-01
A Landscape Analysis Plan (LAP) sets out broad guidelines for project development within boundaries of the Kings River Sustainable Forest Ecosystems Project. The plan must be a dynamic, living document, subject to change as new information arises over the course of this very long-term project (several decades). Two watersheds, each of 32,000 acres, were dedicated to...
Bayesian logistic regression analysis
Van Erp, H.R.N.; Van Gelder, P.H.A.J.M.
2012-01-01
In this paper we present a Bayesian logistic regression analysis. It is found that if one wishes to derive the posterior distribution of the probability of some event, then, together with the traditional Bayes Theorem and the integrating out of nuisance parameters, the Jacobian transformation is an
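The abstract above breaks off mid-sentence; the gap is left as-is. As background only, here is a minimal, hypothetical sketch of a Bayesian logistic regression posterior on simulated data, using a Laplace approximation rather than the Jacobian-transformation route the abstract alludes to (all data and prior choices are illustrative, not the authors'):

```python
import numpy as np

# Simulated data: one covariate, binary outcome
rng = np.random.default_rng(1)
x = rng.normal(size=200)
true_b = 1.5
y = (rng.uniform(size=200) < 1 / (1 + np.exp(-true_b * x))).astype(float)

# Gradient and Hessian of the log-posterior with a N(0, 10^2) prior on b
def grad_hess(b):
    p = 1 / (1 + np.exp(-b * x))
    g = np.sum((y - p) * x) - b / 100.0            # d logpost / db
    h = -np.sum(p * (1 - p) * x ** 2) - 1 / 100.0  # d^2 logpost / db^2
    return g, h

# Newton iterations to the posterior mode (MAP)
b = 0.0
for _ in range(25):
    g, h = grad_hess(b)
    b -= g / h

# Laplace approximation: posterior ~ Normal(mode, -1 / hessian)
g, h = grad_hess(b)
post_sd = np.sqrt(-1 / h)
```

The mode lands near the generating coefficient, and `post_sd` gives an approximate posterior standard deviation without any sampling.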
Monotowns: A Quantitative Analysis
Directory of Open Access Journals (Sweden)
Shastitko Andrei
2016-06-01
Full Text Available The authors propose an empirical analysis of the current situation in monotowns. The study questions the perceived seriousness of the ‘monotown problem’ as well as the actual challenges it presents. The authors use a cluster analysis to divide monotowns into groups for further structural comparison. The structural differences in the available databases limit the possibilities of empirical analysis. Hence, alternative approaches are required. The authors consider possible reasons for the limitations identified. Special attention is paid to the monotowns that were granted the status of advanced development territories. A comparative analysis makes it possible to study their general characteristics and socioeconomic indicators. The authors apply the theory of opportunistic behaviour to describe potential problems caused by the lack of unified criteria for granting monotowns the status of advanced development territories. The article identifies the main stakeholders and the character of their interaction; it describes a conceptual model built on the principal/agent interactions, and identifies the parametric space of mutually beneficial cooperation. The solution to the principal/agent problem suggested in the article contributes to the development of an alternative approach to the current situation and a rational approach to overcoming the ‘monotown problem’.
SWOT ANALYSIS - CHINESE PETROLEUM
Chunlan Wang; Lei Zhang; Qi Zhong
2014-01-01
This article, written in early December 2013, combines the historical development of Chinese Petroleum with the latest data in a SWOT analysis. It discusses corporate resources, costs and management, as well as external factors such as the political environment and market supply and demand, and conducts a comprehensive and profound analysis.
Sensitivity Analysis Without Assumptions.
Ding, Peng; VanderWeele, Tyler J
2016-05-01
Unmeasured confounding may undermine the validity of causal inference with observational studies. Sensitivity analysis provides an attractive way to partially circumvent this issue by assessing the potential influence of unmeasured confounding on causal conclusions. However, previous sensitivity analysis approaches often make strong and untestable assumptions such as having an unmeasured confounder that is binary, or having no interaction between the effects of the exposure and the confounder on the outcome, or having only one unmeasured confounder. Without imposing any assumptions on the unmeasured confounder or confounders, we derive a bounding factor and a sharp inequality such that the sensitivity analysis parameters must satisfy the inequality if an unmeasured confounder is to explain away the observed effect estimate or reduce it to a particular level. Our approach is easy to implement and involves only two sensitivity parameters. Surprisingly, our bounding factor, which makes no simplifying assumptions, is no more conservative than a number of previous sensitivity analysis techniques that do make assumptions. Our new bounding factor implies not only the traditional Cornfield conditions that both the relative risk of the exposure on the confounder and that of the confounder on the outcome must satisfy but also a high threshold that the maximum of these relative risks must satisfy. Furthermore, this new bounding factor can be viewed as a measure of the strength of confounding between the exposure and the outcome induced by a confounder.
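The bounding factor described above has a simple closed form in the published Ding-VanderWeele result. As a sketch (variable names are ours): for sensitivity parameters RR_EU (exposure-confounder) and RR_UD (confounder-outcome), the bounding factor is RR_EU x RR_UD / (RR_EU + RR_UD - 1), and the smallest common value of both parameters needed to explain away an observed risk ratio RR is RR + sqrt(RR(RR - 1)):

```python
import math

def bounding_factor(rr_eu, rr_ud):
    """Maximum factor by which an unmeasured confounder with these two
    relative-risk parameters can shrink an observed risk ratio."""
    return rr_eu * rr_ud / (rr_eu + rr_ud - 1)

def e_value(rr_obs):
    """Smallest common value of both sensitivity parameters needed to
    explain away an observed risk ratio rr_obs (> 1)."""
    return rr_obs + math.sqrt(rr_obs * (rr_obs - 1))

rr = 3.9
e = e_value(rr)
# At this value the bounding factor exactly cancels the observed effect:
# rr / bounding_factor(e, e) == 1
print(round(e, 2), round(rr / bounding_factor(e, e), 2))
```

The algebra behind the final check is exact: with E = RR + sqrt(RR^2 - RR), one has E^2 / (2E - 1) = RR.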
Seber, George A F
2012-01-01
Concise, mathematically clear, and comprehensive treatment of the subject. * Expanded coverage of diagnostics and methods of model fitting. * Requires no specialized knowledge beyond a good grasp of matrix algebra and some acquaintance with straight-line regression and simple analysis of variance models. * More than 200 problems throughout the book plus outline solutions for the exercises. * This revision has been extensively class-tested.
DEFF Research Database (Denmark)
Kirchmeier-Andersen, Sabine; Møller Christensen, Jakob; Lihn Jensen, Bente
2004-01-01
This article presents the latest version of VIA (version 3.0). The development of the program was initiated by a demand for more systematic training of language analysis in high schools and universities. The system is now web-based, which enables teachers and students to share exercises across...
Shabalin, P L; Yakubenko, A A; Pokhilevich, V A; Krein, M G
1986-01-01
This collection of eleven papers covers a broad spectrum of topics in analysis, from the study of certain classes of analytic functions to the solvability of singular problems for differential and integral equations to computational schemes for the partial differential equations and singular integral equations.
Lyman L. McDonald; Christina D. Vojta; Kevin S. McKelvey
2013-01-01
Perhaps the greatest barrier between monitoring and management is data analysis. Data languish in drawers and spreadsheets because those who collect or maintain monitoring data lack training in how to effectively summarize and analyze their findings. This chapter serves as a first step to surmounting that barrier by empowering any monitoring team with the basic...
Idris, Ivan
2014-01-01
This book is for programmers, scientists, and engineers who have knowledge of the Python language and know the basics of data science. It is for those who wish to learn different data analysis methods using Python and its libraries. This book contains all the basic ingredients you need to become an expert data analyst.
Haskell data analysis cookbook
Shukla, Nishant
2014-01-01
Step-by-step recipes filled with practical code samples and engaging examples demonstrate Haskell in practice, and then the concepts behind the code. This book shows functional developers and analysts how to leverage their existing knowledge of Haskell specifically for high-quality data analysis. A good understanding of data sets and functional programming is assumed.
Energy Technology Data Exchange (ETDEWEB)
Frame, Katherine Chiyoko [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2017-06-28
Neutron multiplicity measurements are widely used for nondestructive assay (NDA) of special nuclear material (SNM). When combined with isotopic composition information, neutron multiplicity analysis can be used to estimate the spontaneous fission rate and leakage multiplication of SNM. When combined with isotopic information, the total mass of fissile material can also be determined. This presentation provides an overview of this technique.
Greenberg, Marc W.; Laing, William
2013-01-01
An Economic Analysis (EA) is a systematic approach to the problem of choosing the best method of allocating scarce resources to achieve a given objective. An EA helps guide decisions on the "worth" of pursuing an action that departs from status quo ... an EA is the crux of decision-support.
Activation Analysis of Aluminium
Energy Technology Data Exchange (ETDEWEB)
Brune, Dag
1961-01-15
An analysis of pure aluminium alloyed with magnesium was performed by means of gamma spectrometry; chemical separations were not employed. The isotopes to be determined were obtained at optimum activity by suitably choosing the times of irradiation and decay. The following elements were detected and measured quantitatively: iron, zinc, copper, gallium, manganese, chromium, scandium and hafnium.
VENTILATION TECHNOLOGY SYSTEMS ANALYSIS
The report gives results of a project to develop a systems analysis of ventilation technology and provide a state-of-the-art assessment of ventilation and indoor air quality (IAQ) research needs. (NOTE: Ventilation technology is defined as the hardware necessary to bring outdoor ...
International Nuclear Information System (INIS)
Kaiser, V.
1993-01-01
In Chapter 2 process energy cost analysis for chemical processing is treated in a general way, independent of the specific form of energy and power production. Especially, energy data collection and data treatment, energy accounting (metering, balance setting), specific energy input, and utility energy costs and prices are discussed. (R.P.) 14 refs., 4 figs., 16 tabs
DEFF Research Database (Denmark)
Mai, Jens Erik
2005-01-01
is presented as an alternative and the paper discusses how this approach includes a broader range of analyses and how it requires a new set of actions from using this approach; analysis of the domain, users and indexers. The paper concludes that the two-step procedure to indexing is insufficient to explain...
On frame multiresolution analysis
DEFF Research Database (Denmark)
Christensen, Ole
2003-01-01
We use the freedom in frame multiresolution analysis to construct tight wavelet frames (even in the case where the refinable function does not generate a tight frame). In cases where a frame multiresolution does not lead to a construction of a wavelet frame we show how one can nevertheless...
Methods in algorithmic analysis
Dobrushkin, Vladimir A
2009-01-01
…helpful to any mathematics student who wishes to acquire a background in classical probability and analysis … This is a remarkably beautiful book that would be a pleasure for a student to read, or for a teacher to make into a year's course.-Harvey Cohn, Computing Reviews, May 2010
Quantitative Moessbauer analysis
International Nuclear Information System (INIS)
Collins, R.L.
1978-01-01
The quantitative analysis of Moessbauer data, as in the measurement of Fe 3+ /Fe 2+ concentration, has not been possible because of the different mean square velocities ⟨x²⟩ of Moessbauer nuclei at chemically different sites. A method is now described which, based on Moessbauer data at several temperatures, permits the comparison of absorption areas at ⟨x²⟩=0. (Auth.)
DEFF Research Database (Denmark)
Ris Hansen, Inge; Søgaard, Karen; Gram, Bibi
2015-01-01
This is the analysis plan for the multicentre randomised control study looking at the effect of training and exercises in chronic neck pain patients that is being conducted in Jutland and Funen, Denmark. This plan will be used as a work description for the analyses of the data collected....
DEFF Research Database (Denmark)
Conti, Roberto; Hong, Jeong Hee; Szymanski, Wojciech
2012-01-01
of such an algebra. Then we outline a powerful combinatorial approach to analysis of endomorphisms arising from permutation unitaries. The restricted Weyl group consists of automorphisms of this type. We also discuss the action of the restricted Weyl group on the diagonal MASA and its relationship...
International Nuclear Information System (INIS)
Kelsey, C.A.; Mettler, F.A.
1988-01-01
An elementary introduction to ROC analysis illustrates how ROC curves depend on observer threshold levels and discusses the relation between ROC curve parameters and other measures of observer performance, including accuracy, sensitivity, specificity, true positive fraction, true negative fraction, false positive fraction and false negative fraction
Information Security Risk Analysis
Peltier, Thomas R
2010-01-01
Offers readers with the knowledge and the skill-set needed to achieve a highly effective risk analysis assessment. This title demonstrates how to identify threats and then determine if those threats pose a real risk. It is suitable for industry and academia professionals.
Lubsch, A.; Timmermans, K.
2017-01-01
Texture analysis is a method to test the physical properties of a material by tension and compression. The growing interest in commercialisation of seaweeds for human food has stimulated research into the physical properties of seaweed tissue. These are important parameters for the survival of
Shifted Independent Component Analysis
DEFF Research Database (Denmark)
Mørup, Morten; Madsen, Kristoffer Hougaard; Hansen, Lars Kai
2007-01-01
Delayed mixing is a problem of theoretical interest and practical importance, e.g., in speech processing, bio-medical signal analysis and financial data modelling. Most previous analyses have been based on models with integer shifts, i.e., shifts by a number of samples, and have often been carried...
Multiscale principal component analysis
International Nuclear Information System (INIS)
Akinduko, A A; Gorban, A N
2014-01-01
Principal component analysis (PCA) is an important tool in exploring data. The conventional approach to PCA leads to a solution which favours the structures with large variances. This is sensitive to outliers and could obfuscate interesting underlying structures. One of the equivalent definitions of PCA is that it seeks the subspaces that maximize the sum of squared pairwise distances between data projections. This definition opens up more flexibility in the analysis of principal components which is useful in enhancing PCA. In this paper we introduce scales into PCA by maximizing only the sum of pairwise distances between projections for pairs of datapoints with distances within a chosen interval of values [l,u]. The resulting principal component decompositions in Multiscale PCA depend on point (l,u) on the plane and for each point we define projectors onto principal components. Cluster analysis of these projectors reveals the structures in the data at various scales. Each structure is described by the eigenvectors at the medoid point of the cluster which represent the structure. We also use the distortion of projections as a criterion for choosing an appropriate scale especially for data with outliers. This method was tested on both artificial distribution of data and real data. For data with multiscale structures, the method was able to reveal the different structures of the data and also to reduce the effect of outliers in the principal component analysis
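The pairwise-distance definition of PCA invoked above is easy to verify numerically. A minimal numpy sketch (illustrative data; the multiscale restriction to distances within [l,u] is not implemented here) showing that, for centred data, the sum of squared pairwise distances between 1-D projections is 2n times the projected variance and is therefore maximized by the first principal component:

```python
import numpy as np

rng = np.random.default_rng(0)
# Illustrative data with one dominant direction of variance
X = rng.normal(size=(200, 3)) * np.array([3.0, 1.0, 0.3])
X -= X.mean(axis=0)  # centre columns, so every projection sums to zero

def pairwise_dist_sum(w):
    """Sum of squared pairwise distances between 1-D projections onto w."""
    p = X @ (w / np.linalg.norm(w))
    return float(np.sum((p[:, None] - p[None, :]) ** 2))

# First principal component from the sample covariance matrix
_, evecs = np.linalg.eigh(X.T @ X / len(X))
pc1 = evecs[:, -1]  # eigh returns eigenvalues in ascending order

# Identity: for centred data the objective is 2n * (sum of squared projections)
w = rng.normal(size=3)
p = X @ (w / np.linalg.norm(w))
assert np.isclose(pairwise_dist_sum(w), 2 * len(X) * np.sum(p ** 2))

# PC1 maximizes the objective over a sample of random directions
others = [pairwise_dist_sum(rng.normal(size=3)) for _ in range(100)]
assert pairwise_dist_sum(pc1) >= max(others)
```

Restricting the sum to pairs whose distance falls in a chosen interval, as the abstract describes, changes which direction wins, which is what makes the decomposition scale-dependent.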
Euler principal component analysis
Liwicki, Stephan; Tzimiropoulos, Georgios; Zafeiriou, Stefanos; Pantic, Maja
Principal Component Analysis (PCA) is perhaps the most prominent learning tool for dimensionality reduction in pattern recognition and computer vision. However, the ℓ2-norm employed by standard PCA is not robust to outliers. In this paper, we propose a kernel PCA method for fast and robust PCA,
International Nuclear Information System (INIS)
Santoro, R.T.; Iida, H.; Khripunov, V.; Petrizzi, L.; Sato, S.; Sawan, M.; Shatalov, G.; Schipakin, O.
2001-01-01
This paper summarizes the main results of nuclear analysis calculations performed during the International Thermonuclear Experimental Reactor (ITER) Engineering Design Activity (EDA). Major efforts were devoted to fulfilling the General Design Requirements to minimize the nuclear heating rate in the superconducting magnets and ensuring that radiation conditions at the cryostat are suitable for hands-on-maintenance after reactor shut-down. (author)
Elementary functional analysis
Shilov, Georgi E
1996-01-01
Introductory text covers basic structures of mathematical analysis (linear spaces, metric spaces, normed linear spaces, etc.), differential equations, orthogonal expansions, Fourier transforms - including problems in the complex domain, especially involving the Laplace transform - and more. Each chapter includes a set of problems, with hints and answers. Bibliography. 1974 edition.
Computer aided safety analysis
International Nuclear Information System (INIS)
1988-05-01
The document reproduces 20 selected papers from the 38 papers presented at the Technical Committee/Workshop on Computer Aided Safety Analysis organized by the IAEA in co-operation with the Institute of Atomic Energy in Otwock-Swierk, Poland on 25-29 May 1987. A separate abstract was prepared for each of these 20 technical papers. Refs, figs and tabs
1979-01-31
but expands 'accordion-like.' (4) The height-integrated intensity ratio of the red (6300 Å) to green (5577 Å) emissions of atomic oxygen is a good... molecular ion: Analysis of two rocket experiments, Planet. Space Sci. 16, 737, 1968. Hays, P. B. and C. D. Anger, The influence of ground scattering on
Communication Network Analysis Methods.
Farace, Richard V.; Mabee, Timothy
This paper reviews a variety of analytic procedures that can be applied to network data, discussing the assumptions and usefulness of each procedure when applied to the complexity of human communication. Special attention is paid to the network properties measured or implied by each procedure. Factor analysis and multidimensional scaling are among…
Making Strategic Analysis Matter
2012-01-01
Bryan Gabbard, Assessing the Tradecraft of Intelligence Analysis, Santa Monica, Calif.: RAND Corporation, TR-293, 2008. 4 See The Commission on the... July 7, 2011: http://www.rand.org/pubs/occasional_papers/OP152.html Treverton, Gregory F., and C. Bryan Gabbard, Assessing the Tradecraft of
Instrumental analysis, second edition
International Nuclear Information System (INIS)
Christian, G.D.; O'Reilly, J.E.
1988-01-01
The second edition of Instrumental Analysis is a survey of the major instrument-based methods of chemical analysis. It appears to be aimed at undergraduates but would be equally useful in a graduate course. The volume explores all of the classical quantitative methods and contains sections on techniques that usually are not included in a semester course in instrumentation (such as electron spectroscopy and the kinetic methods). Adequate coverage of all of the methods contained in this book would require several semesters of focused study. The 25 chapters were written by different authors, yet the style throughout the book is more uniform than in the earlier edition. With the exception of a two-chapter course in analog and digital circuits, the book purports to de-emphasize instrumentation, focusing more on the theory behind the methods and the application of the methods to analytical problems. However, a detailed analysis of the instruments used in each method is by no means absent. The book has the flavor of a user's guide to analysis
Kolmogorov, A N; Silverman, Richard A
1975-01-01
Self-contained and comprehensive, this elementary introduction to real and functional analysis is readily accessible to those with background in advanced calculus. It covers basic concepts and introductory principles in set theory, metric spaces, topological and linear spaces, linear functionals and linear operators, and much more. 350 problems. 1970 edition.
Rotation in correspondence analysis
van de Velden, Michel; Kiers, Henk A.L.
2005-01-01
In correspondence analysis rows and columns of a nonnegative data matrix are depicted as points in a, usually, two-dimensional plot. Although such a two-dimensional plot often provides a reasonable approximation, the situation can occur that an approximation of higher dimensionality is required.
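As background for the record above, a minimal sketch of plain (unrotated) correspondence analysis via the SVD of the matrix of standardized residuals; the contingency table is made up for illustration:

```python
import numpy as np

# Toy contingency table (rows x columns of counts); values are illustrative
N = np.array([[30., 10.,  5.],
              [10., 40., 15.],
              [ 5., 15., 20.]])

P = N / N.sum()        # correspondence matrix
r = P.sum(axis=1)      # row masses
c = P.sum(axis=0)      # column masses

# Standardized residuals; their SVD yields the principal axes
S = (P - np.outer(r, c)) / np.sqrt(np.outer(r, c))
U, d, Vt = np.linalg.svd(S, full_matrices=False)

# Principal coordinates: rows and columns depicted as points,
# typically plotted in the first two dimensions
row_coords = (U * d) / np.sqrt(r)[:, None]
col_coords = (Vt.T * d) / np.sqrt(c)[:, None]

# Total inertia equals the sum of squared singular values
inertia = (S ** 2).sum()
assert np.isclose(inertia, (d ** 2).sum())
```

The two-dimensional plot the abstract refers to keeps only the first two columns of `row_coords` and `col_coords`; the rotation question arises when a higher-dimensional solution must be made interpretable.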
2016-04-01
Expand Childcare Center hours; Dual-military Co-location Policy; Maternity, Paternity, and Adoption leave; Women in Service Increase... Distribution unlimited. Analysis of Undesignated Work, Karan A. Schriver, Edward J. Schmitz, Greggory J. Schell, Hoda Parvin, April 2016... designated and undesignated work requirements. Over time, this mix fluctuates, causing changes to the force profile. Undesignated workload has
Indian Academy of Sciences (India)
Chrissa G. Tsiara
2018-03-13
Mar 13, 2018 ... a meta-analysis of case-control studies was conducted. Univariate and ... recent hepatitis C virus: potential benefit for ribavirin use in HCV/HIV ... C/G polymorphism in breast pathologies and in HIV-infected patients.
International Nuclear Information System (INIS)
Abe, Toshinori
2001-01-01
The North American Linear Collider Detector group has developed simulation and analysis program packages. LCDROOT is one of these packages; it is based on ROOT and the C++ programming language to benefit maximally from object-oriented programming techniques. LCDROOT is constantly improved and now has a new topological vertex finder, ZVTOP3. In this proceeding, the features of the LCDROOT simulation are briefly described.
Communication Analysis of Environment.
Malik, M. F.; Thwaites, H. M.
This textbook was developed for use in a Concordia University (Quebec) course entitled "Communication Analysis of Environment." Designed as a practical application of information theory and cybernetics in the field of communication studies, the course is intended to be a self-instructional process, whereby each student chooses one…
Learning: An Evolutionary Analysis
Swann, Joanna
2009-01-01
This paper draws on the philosophy of Karl Popper to present a descriptive evolutionary epistemology that offers philosophical solutions to the following related problems: "What happens when learning takes place?" and "What happens in human learning?" It provides a detailed analysis of how learning takes place without any direct transfer of…
Stress Analysis of Composites.
1981-01-01
', Finite Elements in Nonlinear Mechanics, 1, 109-130, Tapir Publishers, Norway (1978). 9. A.J. Barnard and P.W. Sharman, 'Elastic-Plastic Analysis Using Hybrid Stress Finite Elements,' Finite Elements in Nonlinear Mechanics, 1, 131-148, Tapir Publishers, Norway (1978). ... Pian, 'Variational
Pavlovic, Dusko; Domenach, Florent; Ignatov, Dmitry I.; Poelmans, Jonas
2012-01-01
Formal Concept Analysis (FCA) begins from a context, given as a binary relation between some objects and some attributes, and derives a lattice of concepts, where each concept is given as a set of objects and a set of attributes, such that the first set consists of all objects that satisfy all
Multidisciplinary System Reliability Analysis
Mahadevan, Sankaran; Han, Song; Chamis, Christos C. (Technical Monitor)
2001-01-01
The objective of this study is to develop a new methodology for estimating the reliability of engineering systems that encompass multiple disciplines. The methodology is formulated in the context of the NESSUS probabilistic structural analysis code, developed under the leadership of NASA Glenn Research Center. The NESSUS code has been successfully applied to the reliability estimation of a variety of structural engineering systems. This study examines whether the features of NESSUS could be used to investigate the reliability of systems in other disciplines such as heat transfer, fluid mechanics, electrical circuits etc., without considerable programming effort specific to each discipline. In this study, the mechanical equivalence between system behavior models in different disciplines are investigated to achieve this objective. A new methodology is presented for the analysis of heat transfer, fluid flow, and electrical circuit problems using the structural analysis routines within NESSUS, by utilizing the equivalence between the computational quantities in different disciplines. This technique is integrated with the fast probability integration and system reliability techniques within the NESSUS code, to successfully compute the system reliability of multidisciplinary systems. Traditional as well as progressive failure analysis methods for system reliability estimation are demonstrated, through a numerical example of a heat exchanger system involving failure modes in structural, heat transfer and fluid flow disciplines.
Deterministic uncertainty analysis
International Nuclear Information System (INIS)
Worley, B.A.
1987-01-01
Uncertainties of computer results are of primary interest in applications such as high-level waste (HLW) repository performance assessment in which experimental validation is not possible or practical. This work presents an alternate deterministic approach for calculating uncertainties that has the potential to significantly reduce the number of computer runs required for conventional statistical analysis. 7 refs., 1 fig
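As an illustration of the general idea (not the report's specific method), first-order deterministic uncertainty propagation replaces many statistical sampling runs with a handful of sensitivity evaluations; the response model and parameter values below are hypothetical:

```python
import numpy as np

# Illustrative response model; stands in for an expensive computer code
def response(x):
    k, s, t = x
    return k * np.exp(-s * t)

x0 = np.array([2.0, 0.5, 1.0])   # nominal parameter values
sd = np.array([0.1, 0.05, 0.0])  # parameter standard deviations

# Central-difference sensitivities df/dx_i at the nominal point:
# two code runs per parameter instead of thousands of random samples
h = 1e-6
grad = np.array([
    (response(x0 + h * e) - response(x0 - h * e)) / (2 * h)
    for e in np.eye(3)
])

# First-order propagated variance: sum of (df/dx_i * sd_i)^2,
# assuming independent parameters and near-linear local behaviour
var_f = np.sum((grad * sd) ** 2)
print(var_f ** 0.5)  # one-sigma uncertainty of the response
```

The trade-off matches the abstract: a few deterministic runs give the sensitivities, at the cost of a linearity assumption that a full statistical analysis does not need.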
International Nuclear Information System (INIS)
1997-10-01
Improving safety at nuclear power stations is an important task. Safety evaluation must therefore be carried out comprehensively and systematically, drawing on operational experience and new safety-relevant knowledge throughout the period of use, as well as before construction and the start of operation. This report describes the results of a safety analysis of ''Fugen'' carried out in the light of the latest technical knowledge. The analysis confirmed that the safety of ''Fugen'' is secured by its inherent safety characteristics and by the facilities designed to secure safety. The basic approach to the safety analysis, including the guidelines to be conformed to, is described. For abnormal transient changes in operation and for accidents, the definitions, the events to be evaluated and the judgement criteria are reported, together with the matters taken into consideration in the analysis. The computation programs used for the analysis were REACT, HEATUP, LAYMON, FATRAC, SENHOR, LOTRAC, FLOOD and CONPOL. The analyses of abnormal transient changes in operation and of accidents are reported with respect to causes, countermeasures, protective functions and results. (K.I.)
International Nuclear Information System (INIS)
Lima-e-Silva, Pedro Paulo de
1996-01-01
Conventional Risk Analysis (RA) usually relates the frequency of an undesired event to its consequences. In Brazil this technique is currently used to analyze accidents and their consequences strictly from a human standpoint, valuing losses of human equipment, human structures and human lives, without considering the damage caused to the natural resources that keep life possible on Earth. This paradigm developed primarily because of Homo sapiens' lack of perception of the natural web needed to sustain his own life. In practice, the Brazilian professionals responsible for licensing, auditing and inspecting the environmental aspects of human activities face huge difficulties in producing technical specifications and procedures that lead to acceptable levels of impact, all the more so given the intrinsic difficulty of defining those levels. In Brazil the RA technique is therefore a weak tool for licensing, for several reasons, among them its narrow scope (accident considerations only) and a wrong paradigm (direct human damages only). A paper by the author on the former point was already proposed for the 7th International Conference on Environmetrics, July 1996, USP-SP. This one discusses the extension of the risk analysis concept to take environmental consequences into account, transforming the conventional analysis into a broader methodology named here Environmental Risk Analysis. (author)
Energy Technology Data Exchange (ETDEWEB)
Fudge, A.
1978-12-15
The following aspects of isotope dilution analysis are covered in this report: fundamental aspects of the technique; elements of interest in the nuclear field, choice and standardization of spike nuclide; pre-treatment to achieve isotopic exchange and chemical separation; sensitivity; selectivity; and accuracy.
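The core of the technique is the isotope balance between spike, sample and mixture. A minimal sketch (illustrative Python; the ratios and amounts below are hypothetical):

```python
def isotope_dilution(b_spike, r_spike, r_sample, r_mixed):
    """Amount of reference isotope B contributed by the sample.

    r_* are A/B isotope ratios of the pure spike, the pure sample and
    the equilibrated spike/sample mixture; b_spike is the known amount
    of isotope B added with the spike.  From the isotope balance
        R_m (b_x + b_s) = R_x b_x + R_s b_s
    it follows that
        b_x = b_s (R_s - R_m) / (R_m - R_x).
    """
    return b_spike * (r_spike - r_mixed) / (r_mixed - r_sample)

# Hypothetical measurement: spike strongly enriched in isotope A.
b_sample = isotope_dilution(b_spike=1.0, r_spike=100.0,
                            r_sample=0.01, r_mixed=2.0)
```

Note that only ratio measurements of the mixture are needed, which is why the method tolerates incomplete chemical recovery once isotopic exchange is achieved.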
Szapacs, Cindy
2006-01-01
Teaching strategies that work for typically developing children often do not work for those diagnosed with an autism spectrum disorder. However, teaching strategies that work for children with autism do work for typically developing children. In this article, the author explains how the principles and concepts of Applied Behavior Analysis can be…
Perfusion dyssynchrony analysis
Chiribiri, A.; Villa, A.D.M.; Sammut, E.; Breeuwer, M.; Nagel, E.
2015-01-01
AIMS: We sought to describe perfusion dyssynchrony analysis specifically to exploit the high temporal resolution of stress perfusion CMR. This novel approach detects differences in the temporal distribution of the wash-in of contrast agent across the left ventricular wall. METHODS AND RESULTS:
Proteoglycan isolation and analysis
DEFF Research Database (Denmark)
Woods, A; Couchman, J R
2001-01-01
Proteoglycans can be difficult molecules to isolate and analyze due to large mass, charge, and tendency to aggregate or form macromolecular complexes. This unit describes detailed methods for purification of matrix, cell surface, and cytoskeleton-linked proteoglycans. Methods for analysis...
Uranium and transuranium analysis
International Nuclear Information System (INIS)
Regnaud, F.
1989-01-01
Analytical chemistry of uranium, neptunium, plutonium, americium and curium is reviewed. Uranium and neptunium are mainly treated and curium is only briefly evoked. Analysis methods include coulometry, titration, mass spectrometry, absorption spectrometry, spectrofluorometry, X-ray spectrometry, nuclear methods and radiation spectrometry [fr
International Nuclear Information System (INIS)
Preyssl, C.
1986-01-01
Safety analysis provides the only tool for evaluation and quantification of rare or hypothetical events leading to system failure. So far probability theory has been used for the fault- and event-tree methodology. Uncertainty constitutes an important aspect of risk analysis. Uncertainties can be classified as originating from 'randomness' or 'fuzziness'. Probability theory addresses randomness only. The use of 'fuzzy set theory' makes it possible to include both types of uncertainty in the mathematical model of risk analysis. Thus the 'fuzzy fault tree' is expressed in 'possibilistic' terms, implying a range of simplifications and improvements. 'Human failure' and 'conditionality' can be treated correctly. Only minimum-maximum relations are used to combine the possibility distributions of events. Various event classifications facilitate the interpretation of the results. The method is demonstrated by application to a TRIGA research reactor. Uncertainty as an implicit part of 'fuzzy risk' can be quantified explicitly using an 'uncertainty measure'. Based on this, the 'degree of relative compliance' with a quantitative safety goal can be defined for a particular risk. The introduction of 'weighting functionals' guarantees that the importance attached to different parts of the risk exceeding or complying with the standard is taken into account. The comparison of two reference systems is demonstrated in a case study. It is concluded that any application of the 'fuzzy risk analysis' has to be free of any hypostatization when reducing subjective to objective information. (Author)
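The minimum-maximum combination of possibility values mentioned in the abstract can be sketched in a few lines (illustrative Python; the event names and possibility values are hypothetical):

```python
def gate_and(*poss):
    """AND gate: the top event needs all inputs, so its possibility
    is the minimum of the input possibilities."""
    return min(poss)

def gate_or(*poss):
    """OR gate: any input suffices, so the possibility is the maximum."""
    return max(poss)

# Basic-event possibilities (degrees of possibility in [0, 1]).
pump_fails, valve_fails, operator_error = 0.3, 0.7, 0.2

# Top event: (pump fails AND valve fails) OR operator error.
top = gate_or(gate_and(pump_fails, valve_fails), operator_error)
# top = max(min(0.3, 0.7), 0.2) = 0.3
```

Unlike probabilistic gates, no products or complements appear; only order relations on the possibility values are used, which is what makes the possibilistic calculus simpler.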
Bayesian Independent Component Analysis
DEFF Research Database (Denmark)
Winther, Ole; Petersen, Kaare Brandt
2007-01-01
In this paper we present an empirical Bayesian framework for independent component analysis. The framework provides estimates of the sources, the mixing matrix and the noise parameters, and is flexible with respect to choice of source prior and the number of sources and sensors. Inside the engine...
Ignalina Safety Analysis Group
International Nuclear Information System (INIS)
Ushpuras, E.
1995-01-01
The article describes the fields of activity of the Ignalina NPP Safety Analysis Group (ISAG) in the Lithuanian Energy Institute and gives an overview of the main achievements since the group's establishment in 1992. The group is working under the following guidelines: in-depth analysis of the fundamental physical processes of RBMK-1500 reactors; collection, systematization and verification of the design and operational data; simulation and analysis of potential accident consequences; analysis of thermohydraulic and neutronic characteristics of the plant; provision of technical and scientific consultations to VATESI, Governmental authorities, and also international institutions participating in various projects aimed at Ignalina NPP safety enhancement. The ISAG is performing broad scientific co-operation programs with both Eastern and Western scientific groups, supplying engineering assistance for Ignalina NPP. ISAG is also participating in the joint Lithuanian - Swedish - Russian project Barselina, the first Probabilistic Safety Assessment (PSA) study of Ignalina NPP. Work is underway together with Maryland University (USA) on assessment of the accident confinement system for a range of breaks in the primary circuit. At present the ISAG personnel are also involved in the project under the grant from the Nuclear Safety Account, administered by the European Bank for Reconstruction and Development, for the preparation and review of an in-depth safety assessment of the Ignalina plant.
Kane, Jonathan M
2016-01-01
This is a textbook on proof writing in the area of analysis, balancing a survey of the core concepts of mathematical proof with a tight, rigorous examination of the specific tools needed for an understanding of analysis. Instead of the standard "transition" approach to teaching proofs, wherein students are taught fundamentals of logic, given some common proof strategies such as mathematical induction, and presented with a series of well-written proofs to mimic, this textbook teaches what a student needs to be thinking about when trying to construct a proof. Covering the fundamentals of analysis sufficient for a typical beginning Real Analysis course, it never loses sight of the fact that its primary focus is about proof writing skills. This book aims to give the student precise training in the writing of proofs by explaining exactly what elements make up a correct proof, how one goes about constructing an acceptable proof, and, by learning to recognize a correct proof, how to avoid writing incorrect proofs. T...
Russian Language Analysis Project
Serianni, Barbara; Rethwisch, Carolyn
2011-01-01
This paper is the result of a language analysis research project focused on the Russian Language. The study included a diverse literature review that included published materials as well as online sources in addition to an interview with a native Russian speaker residing in the United States. Areas of study include the origin and history of the…
Douglas, David
2016-01-01
Doxing is the intentional public release onto the Internet of personal information about an individual by a third party, often with the intent to humiliate, threaten, intimidate, or punish the identified individual. In this paper I present a conceptual analysis of the practice of doxing and how it
Polysome Profile Analysis - Yeast
Czech Academy of Sciences Publication Activity Database
Pospíšek, M.; Valášek, Leoš Shivaya
2013-01-01
Roč. 530, č. 2013 (2013), s. 173-181 ISSN 0076-6879 Institutional support: RVO:61388971 Keywords : grow yeast cultures * polysome profile analysis * sucrose density gradient centrifugation Subject RIV: CE - Biochemistry Impact factor: 2.194, year: 2013
1999-03-01
analysis that takes place in anatomy or circuit diagrams. The goal is to break an entity down into a set of non-overlapping parts, and to specify the...components. For example, one subject, in predicting the fate of different species, broke them into three types: animals that humans would save (e.g., gorillas
ATLAS Distributed Analysis Tools
Gonzalez de la Hoz, Santiago; Liko, Dietrich
2008-01-01
The ATLAS production system has been successfully used to run production of simulation data at an unprecedented scale. Up to 10000 jobs were processed in one day. The experience obtained operating the system on several grid flavours was essential to perform a user analysis using grid resources. First tests of the distributed analysis system were then performed. In the preparation phase, data was registered in the LHC File Catalog (LFC) and replicated to external sites. For the main test, few resources were used. All these tests are only a first step towards the validation of the computing model. The ATLAS management computing board decided to integrate the collaboration's efforts in distributed analysis in a single project, GANGA. The goal is to test the reconstruction and analysis software in a large-scale data production using Grid flavours at several sites. GANGA allows trivial switching between running test jobs on a local batch system and running large-scale analyses on the Grid; it provides job splitting a...
Isaacson, Eugene
1994-01-01
This excellent text for advanced undergraduates and graduate students covers norms, numerical solution of linear systems and matrix factoring, iterative solutions of nonlinear equations, eigenvalues and eigenvectors, polynomial approximation, and other topics. It offers a careful analysis and stresses techniques for developing new methods, plus many examples and problems. 1966 edition.
Dyess, Susan Macleod
2011-12-01
This paper reports a concept analysis of faith. There are numerous scholars who consider spirituality and religiosity as they relate to health and nursing. Faith is often implied as linked to these concepts but deserves distinct exploration. In addition, as nursing practice conducted within communities of faith continues to emerge, concept clarification of faith is warranted. Qualitative analysis deliberately considered the concept of faith within the lens of Margaret Newman's health as expanding consciousness. Data sources included a secondary analysis of stories collected within a study conducted in 2008, two specific reconstructed stories, the identification of attributes noted within these various stories and selected philosophical literature from 1950 to 2009. A definition was identified from the analysis: faith is an evolving pattern of believing that grounds and guides authentic living and gives meaning in the present moment of inter-relating. Four key attributes of faith were also identified: focusing on beliefs, foundational meaning for life, living authentically in accordance with beliefs, and interrelating with self, others and/or Divine. Although a seemingly universal concept, faith was defined individually. Faith appeared to be broader than spiritual practices and religious ritual and became the very foundation that enabled human beings to make sense of their world and circumstances. More work is needed to understand how faith community nursing can expand the traditional understanding of denominationally defined faith community practices and how nurses can support faith for individuals whom they encounter in all nursing practice. © 2011 Blackwell Publishing Ltd.
Don S. Stone; Joseph E. Jakes; Jonathan Puthoff; Abdelmageed A. Elmustafa
2010-01-01
Finite element analysis is used to simulate cone indentation creep in materials across a wide range of hardness, strain rate sensitivity, and work-hardening exponent. Modeling reveals that the commonly held assumption of the hardness strain rate sensitivity (m_H) equaling the flow stress strain rate sensitivity (m_σ...
Energy-Water Modeling and Analysis | Energy Analysis | NREL
NREL's energy-water modeling and analysis examines energy-sector vulnerabilities arising from various factors, including water. Example projects include the Renewable Electricity Futures Study, ReEDS model analysis of electricity generation, and the U.S. Energy Sector Vulnerabilities to Climate Change and Extreme Weather study.
Sensitivity analysis and related analysis : A survey of statistical techniques
Kleijnen, J.P.C.
1995-01-01
This paper reviews the state of the art in five related types of analysis, namely (i) sensitivity or what-if analysis, (ii) uncertainty or risk analysis, (iii) screening, (iv) validation, and (v) optimization. The main question is: when should which type of analysis be applied; which statistical
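As an illustration of the first of these, a one-at-a-time what-if analysis can be sketched in a few lines (illustrative Python; the toy model and step sizes are hypothetical, and this finite-difference sketch stands in for the statistical designs the survey reviews):

```python
def one_at_a_time(model, base, deltas):
    """What-if analysis: perturb each input in turn, holding the
    others at their base values, and report the per-unit effect
    on the model output."""
    y0 = model(base)
    effects = {}
    for name, step in deltas.items():
        scenario = dict(base)          # copy the base-case inputs
        scenario[name] += step         # perturb one factor only
        effects[name] = (model(scenario) - y0) / step
    return effects

# Toy model: y = 2a + 5b, so the exact sensitivities are 2 and 5.
model = lambda x: 2 * x["a"] + 5 * x["b"]
s = one_at_a_time(model, base={"a": 1.0, "b": 1.0},
                  deltas={"a": 0.1, "b": 0.1})
```

For nonlinear models such one-factor-at-a-time designs miss interactions, which is exactly where the screening and experimental-design techniques surveyed in the paper come in.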
Energy Technology Data Exchange (ETDEWEB)
Dwayne C. Kicker
2001-09-28
A statistical description of the probable block sizes formed by fractures around the emplacement drifts has been developed for each of the lithologic units of the repository host horizon. A range of drift orientations, with the drift azimuth varied in 15° increments, has been considered in the static analysis. For the quasi-static seismic analysis, and the time-dependent and thermal effects analysis, two drift orientations have been considered: a drift azimuth of 105° and the current emplacement drift azimuth of 75°. The change in drift profile resulting from progressive deterioration of the emplacement drifts has been assessed both with and without backfill. Drift profiles have been determined for four different time increments, including static (i.e., upon excavation), 200 years, 2,000 years, and 10,000 years. The effect of seismic events on rock fall has been analyzed. Block size distributions and drift profiles have been determined for three seismic levels, including a 1,000-year event, a 5,000-year event, and a 10,000-year event. Data developed in this modeling and analysis activity have been entered into the TDMS (DTN: MO0109RDDAAMRR.003). The following conclusions have resulted from this drift degradation analysis: (1) The available fracture data are suitable for supporting a detailed key block analysis of the repository host horizon rock mass. The available data from the north-south Main Drift and the east-west Cross Drift provide a sufficient representative fracture sample of the repository emplacement drift horizon. However, the Tptpln fracture data are only available from a relatively small section of the Cross Drift, resulting in a smaller fracture sample size compared to the other lithologic units. This results in a lower degree of confidence that the key block data based on the Tptpln data set is actually representative of the overall Tptpln key block population. (2) The seismic effect on the rock fall size distribution for all events
Energy Technology Data Exchange (ETDEWEB)
Kouzes, Richard T.; Zhu, Zihua
2011-09-12
The current focal point of the nuclear physics program at PNNL is the MAJORANA DEMONSTRATOR, and the follow-on Tonne-Scale experiment, a large array of ultra-low background high-purity germanium detectors, enriched in 76Ge, designed to search for zero-neutrino double-beta decay (0νββ). This experiment requires the use of germanium isotopically enriched in 76Ge. The MAJORANA DEMONSTRATOR is a DOE and NSF funded project with a major science impact. The DEMONSTRATOR will utilize 76Ge from Russia, but for the Tonne-Scale experiment it is hoped that an alternate technology, possibly one under development at Nonlinear Ion Dynamics (NID), will be a viable, US-based, lower-cost source of separated material. Samples of separated material from NID require analysis to determine the isotopic distribution and impurities. DOE is funding NID through an SBIR grant for development of their separation technology for application to the Tonne-Scale experiment. The Environmental Molecular Sciences Laboratory (EMSL), a DOE user facility at PNNL, has the required mass spectrometry instruments for making isotopic measurements that are essential to the quality assurance for the MAJORANA DEMONSTRATOR and for the development of the future separation technology required for the Tonne-Scale experiment. A sample of isotopically separated copper was provided by NID to PNNL in January 2011 for isotopic analysis as a test of the NID technology. The results of that analysis are reported here. A second sample of isotopically separated copper was provided by NID to PNNL in August 2011 for isotopic analysis as a test of the NID technology. The results of that analysis are also reported here.
International Nuclear Information System (INIS)
Kunz, P.F.
1991-04-01
There are many tools used in analysis in High Energy Physics (HEP). They range from low-level tools such as a programming language to high-level tools such as a detector simulation package. This paper discusses some aspects of these tools that are directly associated with the process of analyzing HEP data. Physics analysis tools cover the whole range from the simulation of the interactions of particles to the display and fitting of statistical data. For the purposes of this paper, the analysis is broken down into five main stages, which are also classified into the areas of generation, reconstruction, and analysis. Different detector groups use different terms for these stages, so it is useful to define what is meant by them in this paper. The particle generation stage is a simulation of the initial interaction, the production of particles, and the decay of the short-lived particles. The detector simulation stage simulates the behavior of an event in a detector. The track reconstruction stage does pattern recognition on the measured or simulated space points, calorimeter information, etc., and reconstructs track segments of the original event. The event reconstruction stage takes the reconstructed tracks, along with particle identification information, and assigns masses to produce 4-vectors. Finally, the display and fit stage displays statistical data accumulated in the preceding stages in the form of histograms, scatter plots, etc. The remainder of this paper considers what analysis tools are available today, and what one might expect in the future. For each stage, the integration of the tools with other stages and the portability of the tool are analyzed.
Żarnecki, Aleksander F.; Piotrowski, Lech W.; Mankiewicz, Lech; Małek, Sebastian
2012-05-01
GLORIA stands for “GLObal Robotic-telescopes Intelligent Array”. GLORIA will be the first free and open-access network of robotic telescopes in the world. It will be a Web 2.0 environment where users can do research in astronomy by observing with robotic telescopes, and/or by analyzing data that other users have acquired with GLORIA or obtained from other free-access databases, like the European Virtual Observatory. The GLORIA project will define free standards, protocols and methodology for controlling robotic telescopes and related instrumentation, for conducting so-called on-line experiments by scheduling observations in the telescope network, and for conducting so-called off-line experiments based on the analysis of astronomical meta-data produced by GLORIA or other databases. The Luiza analysis framework for GLORIA is based on the Marlin package developed for International Linear Collider (ILC) data analysis. HEP experiments have to deal with enormous amounts of data, and distributed data analysis is a must, so the Marlin framework concept seemed well suited to GLORIA's needs. The idea (and large parts of the code) taken from Marlin is that every computing task is implemented as a processor (module) that analyzes the data stored in an internal data structure; any additional output it creates is added to that collection. The advantage of such a modular approach is to keep things as simple as possible. Every single step of the full analysis chain that goes, e.g., from raw images to light curves can be processed separately, and the output of each step is still self-consistent and can be fed into the next step without any manipulation.
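The processor-chain pattern described above can be sketched as follows (illustrative Python; the processor names and the toy calibration are hypothetical, not Luiza or Marlin code):

```python
class Processor:
    """One step of an analysis chain. Each processor reads from, and
    appends its output to, a shared event-like data collection."""
    def process(self, event: dict) -> None:
        raise NotImplementedError

class Calibrate(Processor):
    def process(self, event):
        # Consume the raw data, add a calibrated collection.
        event["calibrated"] = [2.0 * x for x in event["raw"]]

class Summarize(Processor):
    def process(self, event):
        # Consume what the previous step produced, add a summary.
        event["mean"] = sum(event["calibrated"]) / len(event["calibrated"])

def run_chain(processors, event):
    # The chain is just sequential application; each step's output
    # stays in the collection, self-consistent for the next step.
    for p in processors:
        p.process(event)
    return event

event = run_chain([Calibrate(), Summarize()], {"raw": [1.0, 2.0, 3.0]})
```

Because every step only adds to the shared collection, steps can be reordered, replaced or rerun individually, which is the modularity the framework is after.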
Extended Testability Analysis Tool
Melcher, Kevin; Maul, William A.; Fulton, Christopher
2012-01-01
The Extended Testability Analysis (ETA) Tool is a software application that supports fault management (FM) by performing testability analyses on the fault propagation model of a given system. Fault management includes the prevention of faults through robust design margins and quality assurance methods, or the mitigation of system failures. Fault management requires an understanding of the system design and operation, potential failure mechanisms within the system, and the propagation of those potential failures through the system. The purpose of the ETA Tool software is to process the testability analysis results from a commercial software program called TEAMS Designer in order to provide a detailed set of diagnostic assessment reports. The ETA Tool is a command-line process with several user-selectable report output options. The ETA Tool also extends the COTS testability analysis and enables variation studies with sensor sensitivity impacts on system diagnostics and component isolation using a single testability output. The ETA Tool can also provide extended analyses from a single set of testability output files. The following analysis reports are available to the user: (1) the Detectability Report provides a breakdown of how each tested failure mode was detected, (2) the Test Utilization Report identifies all the failure modes that each test detects, (3) the Failure Mode Isolation Report demonstrates the system's ability to discriminate between failure modes, (4) the Component Isolation Report demonstrates the system's ability to discriminate between failure modes relative to the components containing the failure modes, (5) the Sensor Sensitivity Analysis Report shows the diagnostic impact due to loss of sensor information, and (6) the Effect Mapping Report identifies failure modes that result in specified system-level effects.
Multiparameter Cell Cycle Analysis.
Jacobberger, James W; Sramkoski, R Michael; Stefan, Tammy; Woost, Philip G
2018-01-01
Cell cycle cytometry and analysis are essential tools for studying cells of model organisms and natural populations (e.g., bone marrow). Methods have not changed much for many years. The simplest and most common protocol is DNA content analysis, which is extensively published and reviewed. The next most common protocol, 5-bromo-2-deoxyuridine S phase labeling detected by specific antibodies, is also well published and reviewed. More recently, S phase labeling using 5'-ethynyl-2'-deoxyuridine incorporation and a chemical reaction to label substituted DNA has been established as a basic, reliable protocol. Multiple antibody labeling to detect epitopes on cell cycle regulated proteins, which is what this chapter is about, is the most complex of these cytometric cell cycle assays, requiring knowledge of the chemistry of fixation, the biochemistry of antibody-antigen reactions, and spectral compensation. However, because this knowledge is relatively well presented methodologically in many papers and reviews, this chapter will present a minimal Methods section for one mammalian cell type and an extended Notes section, focusing on aspects that are problematic or not well described in the literature. Most of the presented work involves how to segment the data to produce a complete, progressive, and compartmentalized cell cycle analysis from early G1 to late mitosis (telophase). A more recent development, using fluorescent proteins fused with proteins or peptides that are degraded by ubiquitination during specific periods of the cell cycle, termed "Fucci" (fluorescent, ubiquitination-based cell cycle indicators), provides an analysis similar in concept to multiple antibody labeling, except in this case cells can be analyzed while living, and transgenic organisms can be created to perform cell cycle analysis ex vivo or in vivo (Sakaue-Sawano et al., Cell 132:487-498, 2007). This technology will not be discussed.
Complementing Gender Analysis Methods.
Kumar, Anant
2016-01-01
The existing gender analysis frameworks start with the premise that men and women are equal and should be treated equally. These frameworks emphasize equal distribution of resources between men and women and assume that this will bring equality, which is not always true. Despite equal distribution of resources, women tend to suffer and experience discrimination in many areas of their lives, such as the power to control resources within social relationships and the need for emotional security and reproductive rights within interpersonal relationships. These frameworks hold that patriarchy as an institution plays an important role in women's oppression and exploitation and is a barrier to their empowerment and rights. Thus, some think that by ensuring equal distribution of resources and empowering women economically, institutions like patriarchy can be challenged. These frameworks are based on a proposed equality principle which puts men and women in competing roles; thus, real equality will never be achieved. Contrary to the existing gender analysis frameworks, the Complementing Gender Analysis framework proposed by the author provides a new approach to gender analysis which not only recognizes the role of economic empowerment and equal distribution of resources but also incorporates the concepts and roles of social capital, equity, and doing gender. It is based on a perceived equity principle that puts men and women in complementing roles, which may lead to equality. In this article the author reviews the mainstream gender theories in development from the viewpoint of the complementary roles of gender. This alternative view is argued on the basis of existing literature and an anecdote of observations made by the author. While criticizing equality theory, the author offers equity theory for resolving the gender conflict by using the concepts of social and psychological capital.
Blind Analysis in Particle Physics
International Nuclear Information System (INIS)
Roodman, A
2003-01-01
A review of the blind analysis technique, as used in particle physics measurements, is presented. The history of blind analyses in physics is briefly discussed. Next the dangers of and the advantages of a blind analysis are described. Three distinct kinds of blind analysis in particle physics are presented in detail. Finally, the BABAR collaboration's experience with the blind analysis technique is discussed
Proton exciting X ray analysis
International Nuclear Information System (INIS)
Ma Xinpei
1986-04-01
The capability of proton-excited X-ray analysis for different elements in organisms is discussed, with examples of trace element analysis in the human body and in animal organisms, covering materials such as blood serum, urine, and hair. The sensitivity, accuracy, and multielement analysis capability are discussed, and the technique's strong points for trace element analysis in biomedicine are explained.
DEFF Research Database (Denmark)
Larsen, Michael Holm
1999-01-01
This note introduces the IDEF0 modelling language (semantics and syntax), and associated rules and techniques, for developing structured graphical representations of a system or enterprise. Use of this standard for IDEF0 permits the construction of models comprising system functions (activities... that require a modelling technique for the analysis, development, re-engineering, integration, or acquisition of information systems; and incorporate a systems or enterprise modelling technique into a business process analysis or software engineering methodology. This note is a summary of the Standard... for Integration Definition for Function Modelling (IDEF0), i.e. the Draft Federal Information Processing Standards Publication 183, 1993, December 21, Announcing the Standard for Integration Definition for Function Modelling (IDEF0).
International Nuclear Information System (INIS)
Tomar, B.S.
2016-01-01
In the present talk, the fundamentals of nuclear forensic investigations are discussed, followed by a detailed standard operating procedure (SOP) for nuclear forensic analysis. Characteristics such as dimensions, particle size, and elemental and isotopic composition help the nuclear forensic analyst attribute the interdicted material to a source, as the specifications of the nuclear materials used by different countries differ. The analysis of elemental composition can be done by SEM-EDS, XRF, CHNS analyser, etc., depending upon the type of the material. Often the trace constituents (analysed by ICP-AES, ICP-MS, AAS, etc.) provide valuable information about the processes followed during the production of the material. Likewise, the isotopic composition determined by thermal ionization mass spectrometry provides useful information about the enrichment of the nuclear fuel and hence its intended use.
Visualization analysis and design
Munzner, Tamara
2015-01-01
Visualization Analysis and Design provides a systematic, comprehensive framework for thinking about visualization in terms of principles and design choices. The book features a unified approach encompassing information visualization techniques for abstract data, scientific visualization techniques for spatial data, and visual analytics techniques for interweaving data transformation and analysis with interactive visual exploration. It emphasizes the careful validation of effectiveness and the consideration of function before form. The book breaks down visualization design according to three questions: what data users need to see, why users need to carry out their tasks, and how the visual representations proposed can be constructed and manipulated. It walks readers through the use of space and color to visually encode data in a view, the trade-offs between changing a single view and using multiple linked views, and the ways to reduce the amount of data shown in each view. The book concludes with six case stu...
Invitation to complex analysis
Boas, Ralph P
2010-01-01
Ideal for a first course in complex analysis, this book can be used either as a classroom text or for independent study. Written at a level accessible to advanced undergraduates and beginning graduate students, the book is suitable for readers acquainted with advanced calculus or introductory real analysis. The treatment goes beyond the standard material of power series, Cauchy's theorem, residues, conformal mapping, and harmonic functions by including accessible discussions of intriguing topics that are uncommon in a book at this level. The flexibility afforded by the supplementary topics and applications makes the book adaptable either to a short, one-term course or to a comprehensive, full-year course. Detailed solutions of the exercises both serve as models for students and facilitate independent study. Supplementary exercises, not solved in the book, provide an additional teaching tool. This second edition has been painstakingly revised by the author's son, himself an award-winning mathematical expositor...
Tohyama, Mikio
2015-01-01
What is this sound? What does that sound indicate? These are two questions frequently heard in daily conversation. Sound results from the vibrations of elastic media and in daily life provides informative signals of events happening in the surrounding environment. In interpreting auditory sensations, the human ear seems particularly good at extracting the signal signatures from sound waves. Although exploring auditory processing schemes may be beyond our capabilities, source signature analysis is a very attractive area in which signal-processing schemes can be developed using mathematical expressions. This book is inspired by such processing schemes and is oriented to signature analysis of waveforms. Most of the examples in the book are taken from data of sound and vibrations; however, the methods and theories are mostly formulated using mathematical expressions rather than by acoustical interpretation. This book might therefore be attractive and informative for scientists, engineers, researchers, and graduat...
Energy Technology Data Exchange (ETDEWEB)
Hopke, P K [Department of Chemistry, Clarkson Univ., Potsdam, NY (United States)
2000-07-01
As a consequence of various IAEA programmes to sample airborne particulate matter and determine its elemental composition, the participating research groups are accumulating data on the composition of the atmospheric aerosol. It is necessary to consider ways in which these data can be utilized in order to be certain that the data obtained are correct and that the information then being transmitted to others who may make decisions based on such information is as representative and correct as possible. In order both to examine the validity of those data and to extract appropriate information from them, it is necessary to utilize a variety of data analysis methods. The objective of this workbook is to provide a guide, with examples, to data analysis of airborne particle composition data using a spreadsheet program (EXCEL) and a personal-computer-based statistical package (StatGraphics).
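The validity screening this workbook describes can be sketched without EXCEL or StatGraphics. The snippet below is a stdlib-Python illustration with made-up element names and concentrations (not IAEA data): it computes the kind of summary statistics and inter-element correlations used to sanity-check composition data, e.g. crustal elements that fail to track each other flag a suspect sample.

```python
# Illustrative only: hypothetical elemental concentrations (ng/m^3) for four
# filter samples; the element names and values are NOT from the IAEA data sets.
from statistics import mean, stdev

samples = {
    "Al": [310.0, 250.0, 400.0, 120.0],   # crustal element
    "Fe": [280.0, 230.0, 360.0, 110.0],   # crustal element
    "Pb": [15.0, 40.0, 12.0, 55.0],       # anthropogenic tracer
}

def pearson(x, y):
    """Sample Pearson correlation -- the kind of screening statistic the
    workbook computes with a spreadsheet or statistical package."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    return cov / ((len(x) - 1) * stdev(x) * stdev(y))

for element, values in samples.items():
    print(f"{element}: mean={mean(values):.1f} sd={stdev(values):.1f}")

# Crustal elements should correlate strongly; a low value would flag bad data.
print(f"corr(Al, Fe) = {pearson(samples['Al'], samples['Fe']):.3f}")
```
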
Kass, Robert E; Brown, Emery N
2014-01-01
Continual improvements in data collection and processing have had a huge impact on brain research, producing data sets that are often large and complicated. By emphasizing a few fundamental principles, and a handful of ubiquitous techniques, Analysis of Neural Data provides a unified treatment of analytical methods that have become essential for contemporary researchers. Throughout the book ideas are illustrated with more than 100 examples drawn from the literature, ranging from electrophysiology, to neuroimaging, to behavior. By demonstrating the commonality among various statistical approaches the authors provide the crucial tools for gaining knowledge from diverse types of data. Aimed at experimentalists with only high-school level mathematics, as well as computationally-oriented neuroscientists who have limited familiarity with statistics, Analysis of Neural Data serves as both a self-contained introduction and a reference work.
In Silico Expression Analysis.
Bolívar, Julio; Hehl, Reinhard; Bülow, Lorenz
2016-01-01
Information on the specificity of cis-sequences enables the design of functional synthetic plant promoters that are responsive to specific stresses. Potential cis-sequences may be experimentally tested; however, correlating genomic sequence with gene expression data enables an in silico expression analysis approach that bioinformatically assesses the stress specificity of candidate cis-sequences prior to experimental verification. The present chapter demonstrates an example of the in silico validation of a potential cis-regulatory sequence responsive to cold stress. The described online tool can be applied for the bioinformatic assessment of cis-sequences responsive to most abiotic and biotic stresses of plants. Furthermore, a method is presented, based on a reverted in silico expression analysis approach, that predicts highly specific, potentially functional cis-regulatory elements for a given stress.
Leonard, Kathryn; Tari, Sibel; Hubert, Evelyne; Morin, Geraldine; El-Zehiry, Noha; Chambers, Erin
2018-01-01
Based on the second Women in Shape (WiSH) workshop held in Sirince, Turkey in June 2016, these proceedings offer the latest research on shape modeling and analysis and their applications. The 10 peer-reviewed articles in this volume cover a broad range of topics, including shape representation, shape complexity, and characterization in solving image-processing problems. While the first six chapters establish understanding in the theoretical topics, the remaining chapters discuss important applications such as image segmentation, registration, image deblurring, and shape patterns in digital fabrication. The authors in this volume are members of the WiSH network and their colleagues, and most were involved in the research groups formed at the workshop. This volume sheds light on a variety of shape analysis methods and their applications, and researchers and graduate students will find it to be an invaluable resource for further research in the area.
Kanjilal, S. K.; Lindquist, M. R.; Ulbricht, L. E.
1994-02-01
Jumper connectors are used for remotely connecting pipe lines containing transfer fluids ranging from hazardous chemicals to other nonhazardous liquids. The jumper connector assembly comprises hooks, hookpins, a block, a nozzle, an operating screw, and a nut. The hooks are tightened against the nozzle flanges by the operating screw that is tightened with a remotely connected torque wrench. Stress analysis for the jumper connector assembly (used extensively on the US Department of Energy's Hanford Site, near Richland, Washington) is performed by using hand calculation and finite-element techniques to determine the stress levels resulting from operating and seismic loads on components of the assembly. The analysis addresses loading conditions such as prestress, seismic, operating, thermal, and leakage. The preload torque-generated forces at which each component reaches its stress limits are presented in a tabulated format. Allowable operating loads for the jumper assembly are provided to prevent leakage of the assembly during operating cycles.
DEFF Research Database (Denmark)
Lund, Henrik; Sorknæs, Peter; Mathiesen, Brian Vad
2018-01-01
of electricity, which have been introduced in recent decades. These uncertainties pose a challenge to the design and assessment of future energy strategies and investments, especially in the economic assessment of renewable energy versus business-as-usual scenarios based on fossil fuels. From a methodological point of view, the typical way of handling this challenge has been to predict future prices as accurately as possible and then conduct a sensitivity analysis. This paper includes a historical analysis of such predictions, leading to the conclusion that they are almost always wrong. Not only are they wrong in their prediction of price levels, but also in the sense that they always seem to predict a smooth growth or decrease. This paper introduces a new method and reports the results of applying it to the case of energy scenarios for Denmark. The method implies the expectation of fluctuating fuel...
International Nuclear Information System (INIS)
Hopke, P.K.
2000-01-01
As a consequence of various IAEA programmes to sample airborne particulate matter and determine its elemental composition, the participating research groups are accumulating data on the composition of the atmospheric aerosol. It is necessary to consider ways in which these data can be utilized in order to be certain that the data obtained are correct and that the information then being transmitted to others who may make decisions based on such information is as representative and correct as possible. In order both to examine the validity of those data and to extract appropriate information from them, it is necessary to utilize a variety of data analysis methods. The objective of this workbook is to provide a guide, with examples, to data analysis of airborne particle composition data using a spreadsheet program (EXCEL) and a personal-computer-based statistical package (StatGraphics).
Physical analysis for tribology
International Nuclear Information System (INIS)
Quinn, F.J.
1991-01-01
This textbook by Dr. Quinn contains an interesting and useful combination of subject matter related to tribology and methods of surface analysis pertinent to wear problems. A brief introductory chapter includes a good overview of wear phenomena and mechanisms. Three chapters, comprising about one-third of the book, discuss surface and surface-film diagnostic and analysis methods. These include optical, electrical and magnetic techniques as well as electron and x-ray diffraction methods. Considerable detail is provided on background related to crystallography and diffraction. Those not concerned with technique per se will likely omit these sections. The last five chapters are core subject matter for students, engineers, and researchers interested in wear phenomena. Dr. Quinn draws considerable material from his own extensive background in the area, as well as a good selection of other examples from the research literature.
DEFF Research Database (Denmark)
Feng, Ling
2008-01-01
This dissertation concerns the investigation of the consistency of statistical regularities in a signaling ecology and human cognition, while inferring appropriate actions for a speech-based perceptual task. It is based on unsupervised Independent Component Analysis providing a rich spectrum of audio contexts along with pattern recognition methods to map components to known contexts. It also involves looking for the right representations for auditory inputs, i.e. the data analytic processing pipelines invoked by human brains. The main ideas refer to Cognitive Component Analysis, defined as the process of unsupervised grouping of generic data such that the ensuing group structure is well-aligned with that resulting from human cognitive activity. Its hypothesis runs ecologically: features which are essentially independent in a context-defined ensemble can be efficiently coded as sparse...
Handbook of radioactivity analysis
2012-01-01
The updated and much expanded Third Edition of the "Handbook of Radioactivity Analysis" is an authoritative reference providing the principles, practical techniques, and procedures for the accurate measurement of radioactivity from the very low levels encountered in the environment to higher levels measured in radioisotope research, clinical laboratories, biological sciences, radionuclide standardization, nuclear medicine, nuclear power, fuel cycle facilities and in the implementation of nuclear forensic analysis and nuclear safeguards. The Third Edition contains seven new chapters providing a reference text much broader in scope than the previous Second Edition, and all of the other chapters have been updated and expanded many with new authors. The book describes the basic principles of radiation detection and measurement, the preparation of samples from a wide variety of matrices, assists the investigator or technician in the selection and use of appropriate radiation detectors, and presents state-of-the-ar...
International Nuclear Information System (INIS)
Taure, I.; Riekstina, D.; Veveris, O.
2004-01-01
Neutron activation analysis (NAA) in Latvia began to develop after 1961, when the nuclear reactor in Salaspils started to work. It provided a powerful neutron source, which is necessary for this analytical method. In 1963 the Laboratory of Neutron Activation Analysis was formed at the Institute of Physics of the Latvian Academy of Sciences. At the first stage of development the main tasks concerned theoretical and technical aspects of NAA. Later NAA was used to solve problems in technology, biology, and medicine. In the beginning of the 1980s more attention was focused on the use of NAA in environmental research. Environmental problems remained the main task until the closing of the nuclear reactor in Salaspils in 1998, which ended the existence of the laboratory and of NAA, this significant and powerful analytical method, in Latvia and the Baltics in general. (authors)
Goldstein, Allen A
1967-01-01
This text introduces the methods of applied functional analysis and applied convexity. Suitable for advanced undergraduates and graduate students of mathematics, science, and technology, it focuses on the solutions to two closely related problems. The first concerns finding roots of systems of equations and operative equations in a given region. The second involves extremal problems of minimizing or maximizing functions defined on subsets of finite and infinite dimensional spaces. Rather than citing practical algorithms for solving problems, this treatment provides the tools for studying problem-related algorithms. Topics include iterations and fixed points, metric spaces, nonlinear programming, polyhedral convex programming, and infinite convex programming. Additional subjects include linear spaces and convex sets and applications to integral equations. Students should be familiar with advanced calculus and linear algebra. As an introduction to elementary functional analysis motivated by application, this vol...
UVISS preliminary visibility analysis
DEFF Research Database (Denmark)
Betto, Maurizio
1998-01-01
The goal of this work is to obtain a preliminary assessment of the sky visibility for an astronomical telescope located on the express pallet of the International Space Station (ISS), taking into account the major constraints imposed on the instrument by the ISS attitude and structure. Part of the work is also to set up the kernel of a software tool for the visibility analysis that should be easily expandable to consider more complex structures for future activities. This analysis is part of the UVISS assessment study and is meant to provide elements for the definition and the selection...
Multivariate analysis techniques
Energy Technology Data Exchange (ETDEWEB)
Bendavid, Josh [European Organization for Nuclear Research (CERN), Geneva (Switzerland); Fisher, Wade C. [Michigan State Univ., East Lansing, MI (United States); Junk, Thomas R. [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States)
2016-01-01
The end products of experimental data analysis are designed to be simple and easy to understand: hypothesis tests and measurements of parameters. But, the experimental data themselves are voluminous and complex. Furthermore, in modern collider experiments, many petabytes of data must be processed in search of rare new processes which occur together with much more copious background processes that are of less interest to the task at hand. The systematic uncertainties on the background may be larger than the expected signal in many cases. The statistical power of an analysis and its sensitivity to systematic uncertainty can therefore usually both be improved by separating signal events from background events with higher efficiency and purity.
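One classic multivariate technique for separating signal events from background events with higher efficiency and purity is a linear discriminant. The sketch below is a generic illustration, not taken from this report: a Fisher discriminant in two observables, with invented event values, computed with the standard library only.

```python
# A minimal Fisher linear discriminant sketch; the "signal" and "background"
# event values in two observables are invented for illustration.
from statistics import mean

signal = [(2.1, 1.9), (2.4, 2.2), (1.8, 2.0), (2.2, 1.7)]
background = [(0.9, 1.1), (1.2, 0.8), (0.7, 0.9), (1.1, 1.2)]

def mean_vec(pts):
    return [mean(p[0] for p in pts), mean(p[1] for p in pts)]

def scatter(pts, m):
    """Within-class scatter matrix (unnormalised covariance)."""
    s = [[0.0, 0.0], [0.0, 0.0]]
    for x, y in pts:
        dx, dy = x - m[0], y - m[1]
        s[0][0] += dx * dx; s[0][1] += dx * dy
        s[1][0] += dy * dx; s[1][1] += dy * dy
    return s

ms, mb = mean_vec(signal), mean_vec(background)
ss, sb = scatter(signal, ms), scatter(background, mb)
sw = [[ss[i][j] + sb[i][j] for j in range(2)] for i in range(2)]

# Fisher direction w = Sw^{-1} (ms - mb), via the closed-form 2x2 inverse
det = sw[0][0] * sw[1][1] - sw[0][1] * sw[1][0]
d = [ms[0] - mb[0], ms[1] - mb[1]]
w = [( sw[1][1] * d[0] - sw[0][1] * d[1]) / det,
     (-sw[1][0] * d[0] + sw[0][0] * d[1]) / det]

def score(p):
    """Project an event onto the discriminant axis."""
    return w[0] * p[0] + w[1] * p[1]

# With well-separated classes, every signal event outscores every background event.
print(min(score(p) for p in signal) > max(score(p) for p in background))
```

A cut on this one-dimensional score then replaces cuts on the individual observables, which is the efficiency/purity gain the abstract refers to.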
DEFF Research Database (Denmark)
Esteban, Marta; Schindler, Birgit K; Jiménez-Guerrero, José A
2015-01-01
Human biomonitoring (HBM) is an effective tool for assessing actual exposure to chemicals that takes into account all routes of intake. Although hair analysis is considered to be an optimal biomarker for assessing mercury exposure, the lack of harmonization as regards sampling and analytical ... assurance program (QAP) for assessing mercury levels in hair samples from more than 1800 mother-child pairs recruited in 17 European countries. To ensure the comparability of the results, standard operating procedures (SOPs) for sampling and for mercury analysis were drafted and distributed to participating laboratories. Training sessions were organized for field workers and four external quality-assessment exercises (ICI/EQUAS), followed by the corresponding web conferences, were organized between March 2011 and February 2012. ICI/EQUAS used native hair samples at two mercury concentration ranges (0...
CERN. Geneva; Fitch, Blake
2011-01-01
Traditionally, the primary role of supercomputers was to create data, primarily for simulation applications. Due to usage and technology trends, supercomputers are increasingly also used for data analysis. Some of this data is from simulations, but there is also a rapidly increasing amount of real-world science and business data to be analyzed. We briefly overview Blue Gene and other current supercomputer architectures. We outline future architectures, up to the Exascale supercomputers expected in the 2020 time frame. We focus on the data analysis challenges and opportunities, especially those concerning Flash and other up-and-coming storage-class memory. About the speakers Blake G. Fitch has been with IBM Research, Yorktown Heights, NY since 1987, mainly pursuing interests in parallel systems. He joined the Scalable Parallel Systems Group in 1990, contributing to research and development that culminated in the IBM scalable parallel system (SP*) product. His research interests have focused on applicatio...
DEFF Research Database (Denmark)
Poulsen, Mikael Zebbelin
2002-01-01
Differential algebraic equations (DAEs) constitute a fundamental model class for many modelling purposes in engineering and other sciences, especially for dynamical simulation of component-based systems. This thesis describes a practical methodology and approach for analysing general DAEs ... by the implementation of the Simpy tool box. This is an object-oriented system implemented in the Python language. It can be used for analysis of DAEs, ODEs and non-linear equations and uses e.g. symbolic representations of expressions and equations. The presentation of theory and algorithms for structural index analysis of DAEs is original in the sense that it is based on a new matrix representation of the structural information of a general DAE system instead of a graph-oriented representation. Also, the presentation of the theory is found to be more complete compared to other presentations, since it e.g. proves...
International Nuclear Information System (INIS)
Keeney, R.; Renn, O.; Winterfeldt, D. von; Kotte, U.
1985-01-01
What are the targets and criteria on which national energy policy should be based? What priorities should be set, and how can different social interests be matched? To answer these questions, a new instrument of decision theory is presented which has been applied with good results to controversial political issues in the USA. The new technique is known under the name of value tree analysis. Members of important West German organisations (BDI, VDI, RWE, the Catholic and Protestant Churches, Deutscher Naturschutzring, and ecological research institutions) were asked about the goals of their organisations. These goals were then ordered systematically and arranged in a hierarchical tree structure. The value trees of different groups can be combined into a catalogue of social criteria of acceptability and policy assessment. The authors describe the philosophy and methodology of value tree analysis and give an outline of its application in the development of a socially acceptable energy policy. (orig.) [de
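The roll-up of a hierarchical value tree into a single assessment score can be sketched as a weighted sum over branches and their leaf criteria. The criteria, weights, and scores below are invented for illustration; they are not those elicited from the organisations in the study.

```python
# Toy value tree: branch -> (branch weight, {leaf criterion: leaf weight}).
# All names and weights are hypothetical, not from the elicitation described above.
tree = {
    "environment": (0.40, {"emissions": 0.7, "land_use": 0.3}),
    "economy":     (0.35, {"cost": 0.6, "jobs": 0.4}),
    "security":    (0.25, {"supply": 1.0}),
}

# Scores (0-1) that one policy option receives on each leaf criterion
option = {"emissions": 0.8, "land_use": 0.6, "cost": 0.5, "jobs": 0.7, "supply": 0.9}

def evaluate(tree, scores):
    """Roll leaf scores up the tree as weighted sums."""
    total = 0.0
    for _, (branch_weight, leaves) in tree.items():
        total += branch_weight * sum(w * scores[leaf] for leaf, w in leaves.items())
    return total

print(round(evaluate(tree, option), 3))
```

Combining the value trees of several groups, as the abstract describes, would amount to merging such dictionaries into one shared catalogue of criteria before scoring.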
International Nuclear Information System (INIS)
Schmittroth, F.
1979-09-01
A documentation of the FERRET data analysis code is given. The code provides a way to combine related measurements and calculations in a consistent evaluation. Basically a very general least-squares code, it is oriented towards problems frequently encountered in nuclear data and reactor physics. A strong emphasis is on the proper treatment of uncertainties and correlations and in providing quantitative uncertainty estimates. Documentation includes a review of the method, structure of the code, input formats, and examples
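The core least-squares idea behind a code like FERRET, combining related measurements while treating uncertainties and correlations properly, can be illustrated on the smallest possible case. The sketch below is not the FERRET algorithm itself: it is the textbook best linear unbiased estimate for two correlated measurements of one quantity, with invented numbers.

```python
# A minimal sketch of correlated least-squares combination (not FERRET itself).
import math

def combine(x1, s1, x2, s2, rho=0.0):
    """Best linear unbiased estimate of a quantity measured twice, with
    standard deviations s1, s2 and correlation coefficient rho."""
    c12 = rho * s1 * s2                      # covariance of the two measurements
    denom = s1**2 + s2**2 - 2.0 * c12
    w1 = (s2**2 - c12) / denom               # optimal weights (sum to 1)
    w2 = (s1**2 - c12) / denom
    xhat = w1 * x1 + w2 * x2
    var = (s1**2 * s2**2 - c12**2) / denom   # variance of the combined value
    return xhat, math.sqrt(var)

# Uncorrelated case reduces to inverse-variance weighting:
xhat, s = combine(10.0, 1.0, 12.0, 2.0)
print(f"{xhat:.2f} +/- {s:.2f}")
```

Note that with positive correlation the less precise measurement can receive zero (or even negative) weight, which is exactly the kind of behaviour a proper treatment of correlations must capture.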
International Nuclear Information System (INIS)
Bethge, K.
1995-01-01
Full text: Ion beam analysis is an accelerator application area for the study of materials and the structure of matter; electrostatic accelerators of the Van de Graaff or Dynamitron type are often used for energies up to a few MeV. Two types of machines are available - the single-ended accelerator type with higher beam currents and greater flexibility of beam management, or the tandem accelerator, limited to atomic species with negative ions. The accelerators are not generally installed at specialist accelerator laboratories and have to be easy to maintain and simple to operate. The most common technique for industrial research is Rutherford Back Scattering Spectrometry (RBS). Helium ions are the preferred projectiles, since at elevated energies (above 3 MeV) nuclear resonance scattering can be used to detect photons associated with target molecules containing elements such as carbon, nitrogen or oxygen. Due to the large amount of available data on nuclear reactions in this energy range, activation analysis (detecting trace elements by irradiating the sample) can be performed with charged particles from accelerators over a wider range of atoms than with the conventional use of neutrons, which is more suited to light elements. Resonance reactions have been used to detect trace metals such as aluminium, titanium and vanadium. Hydrogen atoms are vital to the material performance of several classes of materials, such as semiconductors, insulators and ceramics. Prudent selection of the projectile ion aids the analysis of hydrogen composition; the technique is then a simple measurement of the emitted gamma radiation. Solar cell material and glass can be analysed in this way. On a world-wide basis, numerous laboratories perform ion beam analysis for research purposes; considerable work is carried out in cooperation between scientific laboratories and industry, but only a few laboratories provide a completely commercial service
Sroufe, Paul; Phithakkitnukoon, Santi; Dantu, Ram; Cangussu, João
2010-01-01
Email has become an integral part of everyday life. Without a second thought we receive bills, bank statements, and sales promotions, all in our inbox. Each email has hidden features that can be extracted. In this paper, we present a new mechanism, called Email Shape Analysis, to characterize an email without using content or context. We explore the applications of the email shape by carrying out a case study, botnet detection, and two possible applications: spam filtering and social-context bas...
Rangayyan, Rangaraj M
2015-01-01
The book will assist the reader in the development of techniques for analysis of biomedical signals and computer-aided diagnoses, with a pedagogical examination of basic and advanced topics accompanied by over 350 figures and illustrations. A wide range of filtering techniques is presented to address various applications, along with 800 mathematical expressions and equations; practical questions, problems, and laboratory exercises; and coverage of fractals and chaos theory with biomedical applications.
Numerical analysis of bifurcations
International Nuclear Information System (INIS)
Guckenheimer, J.
1996-01-01
This paper is a brief survey of numerical methods for computing bifurcations of generic families of dynamical systems. Emphasis is placed upon algorithms that reflect the structure of the underlying mathematical theory while retaining numerical efficiency. Significant improvements in the computational analysis of dynamical systems are to be expected from greater reliance on geometric insight coming from dynamical systems theory. copyright 1996 American Institute of Physics
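A minimal bifurcation computation in the spirit of such surveys (the example itself is ours, not the paper's) is to continue a fixed point in a parameter and monitor its stability multiplier. For the logistic map f(x) = r·x·(1−x), the flip (period-doubling) bifurcation is detected where |f'(x*)| crosses 1, which theory places at r = 3.

```python
# Continuation of the non-trivial fixed point of the logistic map and
# numerical detection of its flip bifurcation (illustrative example only).
def fixed_point(r, x=0.5, tol=1e-12):
    """Newton iteration for f(x) - x = 0 with f(x) = r*x*(1-x)."""
    for _ in range(100):
        g = r * x * (1.0 - x) - x
        dg = r * (1.0 - 2.0 * x) - 1.0
        step = g / dg
        x -= step
        if abs(step) < tol:
            break
    return x

r_bif = None
prev_stable = True
r = 2.5
while r < 3.5:
    x = fixed_point(r)
    stable = abs(r * (1.0 - 2.0 * x)) < 1.0   # multiplier f'(x*) of the map
    if prev_stable and not stable:
        r_bif = r                              # first parameter value past the flip
        break
    prev_stable = stable
    r += 0.001

print(f"flip bifurcation near r = {r_bif:.3f}")   # theory: r = 3
```

Production continuation codes replace this naive parameter sweep with predictor-corrector (e.g. pseudo-arclength) stepping, but the structure, corrector plus stability monitor, is the same.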
ORGANISATIONAL CULTURE ANALYSIS MODEL
Mihaela Simona Maracine
2012-01-01
The studies and research undertaken have demonstrated the importance of studying organisational culture because of its practical value and its contribution to increasing the organisation's performance. The analysis of the organisational culture's dimensions allows observing human behaviour within the organisation and highlighting reality, identifying the strengths and also the weaknesses which have an impact on its functionality and development. In this paper, we try to...
Liu, Jianghong
2004-01-01
The concept of aggression is important to nursing because further knowledge of aggression can help generate a better theoretical model to drive more effective intervention and prevention approaches. This paper outlines a conceptual analysis of aggression. First, the different forms of aggression are reviewed, including the clinical classification and the stimulus-based classification. Then the manifestations and measurement of aggression are described. Finally, the causes and consequences of ...
International Nuclear Information System (INIS)
Kolev, N.I.
2011-01-01
This paper summarizes the author's results in boiling analysis obtained in the last 17 years. It demonstrates that more information can be extracted from the analysis by consistently incorporating even gross turbulence characteristics and appropriate local volume and time averaging. The main findings are: Even in large-scale analysis (no direct numerical simulation), the steady and transient averaged turbulence characteristics are necessary to increase the quality of predicting heat and mass transfer. This allows simulating the heat transfer change behind spacer grids analytically, which is not the practice up to now. It also allows simulating the change of the deposition behind the spacer grid, and therefore brings us closer to the mechanistic prediction of dryout. Accurate boiling heat transfer predictions require knowledge of the nucleation characteristics of each particular surface. The pulsation characteristics at the wall controlling the heat transfer are associated with the bubble departure frequencies but not identical with them. Considering the mutual interactions of the bubbles leads to the surprising analytical prediction of the departure from nucleate boiling just by using the mechanisms acting during flow boiling only. The performance of the author's analytical two-phase convection model combined with his analytical nucleate boiling model is proven to have the accuracy of the empirical Chen model, with the advantage of predicting analytically the internal characteristics of the flow, each of them validated by experiment. This is also important for the future use in multiphase CFD, where details about the flow field generation have to be predicted by constitutive relations as summarized in this paper. (author)
International Nuclear Information System (INIS)
Kolev, Nikolay Ivanov
2011-01-01
This paper summarizes the author's results in boiling analysis obtained in the last 17 years. It demonstrates that more information can be extracted from the analysis by consistently incorporating even gross turbulence characteristics and appropriate local volume and time averaging. The main findings are: Even in large-scale analysis (no direct numerical simulation), the steady and transient averaged turbulence characteristics are necessary to increase the quality of predicting heat and mass transfer. This allows simulating the heat transfer change behind spacer grids analytically, which is not the practice up to now. It also allows simulating the change of the deposition behind the spacer grid, and therefore brings us closer to the mechanistic prediction of dryout. Accurate boiling heat transfer predictions require knowledge of the nucleation characteristics of each particular surface. The pulsation characteristics at the wall controlling the heat transfer are associated with the bubble departure frequencies but not identical with them. Considering the mutual interactions of the bubbles leads to the surprising analytical prediction of the departure from nucleate boiling just by using the mechanisms acting during flow boiling only. The performance of the author's analytical two-phase convection model combined with his analytical nucleate boiling model is proven to have the accuracy of the empirical Chen model, with the advantage of predicting analytically the internal characteristics of the flow, each of them validated by experiment. This is also important for the future use in multiphase CFD, where details about the flow field generation have to be predicted by constitutive relations as summarized in this paper. (author)
Energy Technology Data Exchange (ETDEWEB)
Salvesen, F.; Sandgren, J. [KanEnergi AS, Rud (Norway)
1997-12-31
The present energy situation in the target area is summarized: 20 million inhabitants without electricity in north-west Russia, 50% of the people in the Baltics without electricity, very high technical skills, and finance as the biggest problem. The energy situation, the advantages of the renewables, the restrictions, and examples of possible technical solutions are reviewed on the basis of short analyses and experience with the Baltics and Russia.
International Nuclear Information System (INIS)
Puglisi, M.; Cornacchia, M.
1981-01-01
The need for a very low final-amplifier output impedance, always associated with class A operation, requires a very large power waste in the final tube. The recently suggested pulsed rf operation, while saving a large amount of power, increases the inherent final-amplifier non-linearity. A method is presented for avoiding the large-signal non-linear analysis, and it is shown how each component of the beam-induced voltage depends upon all the beam harmonics via coupling coefficients, which are evaluated.
International Nuclear Information System (INIS)
James Houseworth
2001-01-01
This Analysis/Model Report (AMR) documents an analysis that was performed to estimate climatic variables for the next 10,000 years by forecasting the timing and nature of climate change at Yucca Mountain (YM), Nevada (Figure 1), the site of a potential repository for high-level radioactive waste. The future-climate estimates are based on an analysis of past-climate data from analog meteorological stations, and this AMR provides the rationale for the selection of these analog stations. The stations selected provide an upper and a lower climate bound for each future climate, and the data from those sites will provide input to the infiltration model (USGS 2000) and for the total system performance assessment for the Site Recommendation (TSPA-SR) at YM. Forecasting long-term future climates, especially for the next 10,000 years, is highly speculative and rarely attempted. A very limited literature exists concerning the subject, largely from the British radioactive waste disposal effort. The discussion presented here is one method, among many, of establishing upper and lower bounds for future climate estimates. The method used here involves selecting a particular past climate from many past climates, as an analog for future climate. Other studies might develop a different rationale or select other past climates resulting in a different future climate analog. Revision 00 of this AMR was prepared in accordance with the ''Work Direction and Planning Document for Future Climate Analysis'' (Peterman 1999) under Interagency Agreement DE-AI08-97NV12033 with the U.S. Department of Energy (DOE). The planning document for the technical scope, content, and management of ICN 01 of this AMR is the ''Technical Work Plan for Unsaturated Zone (UZ) Flow and Transport Process Model Report'' (BSC 2001a). The scope for the TBV resolution actions in this ICN is described in the ''Technical Work Plan for: Integrated Management of Technical Product Input Department''. (BSC 2001b, Addendum B
Thoughts on multielement analysis
International Nuclear Information System (INIS)
Kaiser, H.
1976-01-01
The author discusses, in an informal fashion, some of the important aspects of multielement analysis that are frequently overlooked in the present-day trend of trying to measure everything (elements, compounds) in everything (environmental samples). While many points are touched upon, with the aim of providing 'fuel' for the subsequent General Discussion, two themes are illustrated in some depth: do our backgrounds spoil our results, and do our experiments require the impossible? A basis for planning experimental strategy is outlined. (author)
Analysis of firewall technologies
Directory of Open Access Journals (Sweden)
Б.Я. Корнієнко
2006-02-01
Full Text Available To investigate the properties and effectiveness of information protection in telecommunication and information systems, a comparative analysis of firewall technologies was carried out. Three basic firewall technologies are considered: those based on popular free software, commercial software solutions, and combined hardware-software solutions. The results of the research are presented in tables, followed by conclusions.
Cowell, Simon; Poulin, Philippe
2015-01-01
In ''Early Transcendentals'' (The American Mathematical Monthly, Vol. 104, No. 7), Steven Weintraub presents a rigorous justification of the "early transcendental" calculus textbook approach to the exponential and logarithmic functions. However, he uses tools such as term-by-term differentiation of infinite series. We present a rigorous treatment of the early transcendental approach suitable for a first course in analysis, using mainly the supremum property of the real numbers.
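As a sketch of the flavor of such a supremum-based construction (illustrative, not quoted from the paper): for a fixed base a > 1, the exponential can be defined for every real x directly from its rational values,

```latex
a^{x} \;=\; \sup\bigl\{\, a^{r} \;:\; r \in \mathbb{Q},\ r \le x \,\bigr\},
```

after which monotonicity, continuity, and the law a^{x+y} = a^{x} a^{y} can be verified from the supremum property alone, without recourse to power series.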
Dimensional analysis for engineers
Simon, Volker; Gomaa, Hassan
2017-01-01
This monograph provides the fundamentals of dimensional analysis and illustrates the method by numerous examples for a wide spectrum of applications in engineering. The book covers thoroughly the fundamental definitions and the Buckingham theorem, as well as the choice of the system of basic units. The authors also include a presentation of model theory and similarity solutions. The target audience primarily comprises researchers and practitioners but the book may also be suitable as a textbook at university level.
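A minimal sketch of the kind of reasoning dimensional analysis formalizes (a generic textbook example, not taken from this monograph): assuming the period of a pendulum depends only on its length L and gravity g, matching dimensions fixes the exponents up to a constant.

```python
from fractions import Fraction

# Hypothetical illustration: assume T = C * L^a * g^b.
# Dimensions as (length, time) exponents: [L] = (1, 0), [g] = (1, -2), [T] = (0, 1).
# Matching exponents of length and time gives the linear system
#   a + b  = 0   (length)
#   -2 * b = 1   (time)
b = Fraction(1, -2)   # from the time equation; Fraction normalizes the sign
a = -b                # from the length equation

assert (a, b) == (Fraction(1, 2), Fraction(-1, 2))
print(f"T proportional to L^{a} * g^{b}")  # i.e. T ~ sqrt(L / g)
```

The Buckingham theorem generalizes this: with n variables and k independent base dimensions, the same rank argument yields n - k dimensionless groups.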
International Nuclear Information System (INIS)
Taroco, E.; Feijoo, R.A.
1981-07-01
This paper presents a variational method for the limit analysis of an ideal plastic solid. The method, called Modified Secondary Creep, finds the collapse loads through the minimization of a functional followed by a limit process. Given an ideal plastic material, it is shown how to determine the associated secondary-creep constitutive equation. Finally, as an application, the limit load of a pressurized von Mises rigid-plastic sphere is found. (Author) [pt
DEFF Research Database (Denmark)
Riber-Hansen, Rikke; Vainer, Ben; Steiniche, Torben
2012-01-01
Digital image analysis (DIA) is increasingly implemented in histopathological research to facilitate truly quantitative measurements, decrease inter-observer variation and reduce hands-on time. Originally, efforts were made to enable DIA to reproduce manually obtained results on histological slides...... reproducibility, application of stereology-based quantitative measurements, time consumption, optimization of histological slides, regions of interest selection and recent developments in staining and imaging techniques....
International Nuclear Information System (INIS)
Uppuluri, V.R.R.
1979-01-01
Mathematical foundations of risk analysis are addressed. The importance of having the same probability space in order to compare different experiments is pointed out. Then the following topics are discussed: consequences as random variables with infinite expectations; the phenomenon of rare events; series-parallel systems and different kinds of randomness that could be imposed on such systems; and the problem of consensus of estimates of expert opinion
Polynomials in algebraic analysis
Multarzyński, Piotr
2012-01-01
The concept of polynomials in the sense of algebraic analysis, for a single right invertible linear operator, was introduced and studied originally by D. Przeworska-Rolewicz \\cite{DPR}. One of the elegant results corresponding with that notion is a purely algebraic version of the Taylor formula, being a generalization of its usual counterpart, well known for functions of one variable. In quantum calculus there are some specific discrete derivations analyzed, which are right invertible linear ...
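As a sketch of the flavor of that result (standard notation of algebraic analysis; not quoted from the paper): if D is a right invertible linear operator with right inverse R and initial operator F = I - RD (a projection onto ker D), the algebraic Taylor formula reads

```latex
I \;=\; \sum_{k=0}^{n-1} R^{k} F D^{k} \;+\; R^{n} D^{n},
```

which, for D = d/dt, (Rx)(t) = \int_0^t x(s)\,ds and (Fx)(t) = x(0), reduces to the classical Taylor expansion with integral remainder.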
Numerical analysis II essentials
REA, The Editors of; Staff of Research Education Association
1989-01-01
REA's Essentials provide quick and easy access to critical information in a variety of different fields, ranging from the most basic to the most advanced. As its name implies, these concise, comprehensive study guides summarize the essentials of the field covered. Essentials are helpful when preparing for exams or doing homework, and will remain a lasting reference source for students, teachers, and professionals. Numerical Analysis II covers simultaneous linear systems and matrix methods, differential equations, Fourier transformations, partial differential equations, and Monte Carlo methods.
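As one tiny illustration of the last topic listed (a generic Monte Carlo sketch, not taken from the book), pi can be estimated by sampling points uniformly in the unit square and counting how many fall inside the quarter disc:

```python
import random

def estimate_pi(n: int, seed: int = 0) -> float:
    """Monte Carlo estimate of pi: the fraction of uniform random points
    in the unit square satisfying x^2 + y^2 <= 1 approaches pi/4."""
    rng = random.Random(seed)  # seeded for reproducibility
    hits = sum(1 for _ in range(n)
               if rng.random() ** 2 + rng.random() ** 2 <= 1.0)
    return 4.0 * hits / n

print(estimate_pi(100_000))  # close to 3.14159 for large n
```

The error of such an estimate shrinks like 1/sqrt(n), which is the characteristic (and dimension-independent) convergence rate of Monte Carlo methods.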
Digital Repository Service at National Institute of Oceanography (India)
Fernandes, A.A.; Antony, M.K.; Somayajulu, Y.K.; Sarma, Y.V.B.; Almeida, A.M.; Mahadevan, R.
, Head Applied Statistics Unit, Indian Statistical Institute, Calcutta for going through the section on Canonical Correlation Analysis and offering his comments on the same. This report has been prepared using 'Latex' on a 'Linux' platform, viz., the personal computer Kapila. I wish to thank Mr. Dattaram Shivji for installing the 'Latex' and 'GMT' packages on the personal computer. The style file used for preparing this report has been hacked by me from a Goa University Ph.D. style file prepared by Dr. D...
[Diagnosis: synovial fluid analysis].
Gallo Vallejo, Francisco Javier; Giner Ruiz, Vicente
2014-01-01
Synovial fluid analysis in rheumatological diseases allows a more accurate diagnosis in some entities, mainly infectious and microcrystalline arthritis. Examination of synovial fluid in patients with osteoarthritis is useful when a differential diagnosis with other processes is to be made, and to distinguish between inflammatory and non-inflammatory forms. Joint aspiration is a diagnostic and sometimes therapeutic procedure that is available to primary care physicians. Copyright © 2014 Elsevier España, S.L. All rights reserved.
Efficient Incremental Data Analysis
Nikolic, Milos
2016-01-01
Many data-intensive applications require real-time analytics over streaming data. In a growing number of domains -- sensor network monitoring, social web applications, clickstream analysis, high-frequency algorithmic trading, and fraud detection, to name a few -- applications continuously monitor stream events to promptly react to certain data conditions. These applications demand responsive analytics even when faced with high volume and velocity of incoming changes, large numbers of users, a...
Trace Chemical Analysis Methodology
1980-04-01
List-of-figures fragments: 65 Modified DR/2 spectrophotometer face ... 150; 66 Colorimetric oil analysis field test kit ... 152; 67 Pictorial step... Computer-Assisted Pattern Recognition: perhaps the most promising application of pattern recognition techniques for this research effort is the elucidation of the ... large compartment on the spectrophotometer face. The screwdriver is used to adjust the zero-adjust and light-adjust knobs, and the stainless steel
2012-09-01
Acronyms: FSD, Federated Services Daemon; I&A, Identification and Authentication; IKE, Internet Key Exchange; KPI, Key Performance Indicator; LAN, Local Area... ...spection takes place in different processes in the server architecture. Key Performance Indicators (KPIs) associated with the system need to be ... application and risk analysis of security controls. Thus, measurement of the KPIs is needed before an informed tradeoff between the performance penalties
Energy Technology Data Exchange (ETDEWEB)
Salvesen, F; Sandgren, J [KanEnergi AS, Rud (Norway)
1998-12-31
The present energy situation in the target area is summarized: 20 million inhabitants are without electricity in north-west Russia, and 50% of the people in the Baltics are without electricity; technical skills are very high, and the biggest problem is financing. The energy situation, the advantages of the renewables, the restrictions, and examples of possible technical solutions are reviewed on the basis of short analyses and of experience with the Baltics and Russia.
Regional energy facility siting analysis
International Nuclear Information System (INIS)
Eberhart, R.C.; Eagles, T.W.
1976-01-01
Results of the energy facility siting analysis portion of a regional pilot study performed for the anticipated National Energy Siting and Facility Report are presented. The question of cell analysis versus site-specific analysis is explored, including an evaluation of the difference in depth between the two approaches. A discussion of the possible accomplishments of regional analysis is presented. It is concluded that regional siting analysis could be of use in a national siting study, if its inherent limits are recognized.
Strictness Analysis for Attribute Grammars
DEFF Research Database (Denmark)
Rosendahl, Mads
1992-01-01
interpretation of attribute grammars. The framework is used to construct a strictness analysis for attribute grammars. Results of the analysis enable us to transform an attribute grammar such that attributes are evaluated during parsing, if possible. The analysis is proved correct by relating it to a fixpoint...... semantics for attribute grammars. An implementation of the analysis is discussed and some extensions to the analysis are mentioned....
Energy Technology Data Exchange (ETDEWEB)
Kouzes, Richard T.; Zhu, Zihua
2011-02-01
The current focal point of the nuclear physics program at PNNL is the MAJORANA DEMONSTRATOR, and the follow-on Tonne-Scale experiment, a large array of ultra-low background high-purity germanium detectors, enriched in 76Ge, designed to search for zero-neutrino double-beta decay (0νββ). This experiment requires the use of germanium isotopically enriched in 76Ge. The DEMONSTRATOR will utilize 76Ge from Russia, but for the Tonne-Scale experiment it is hoped that an alternate technology under development at Nonlinear Ion Dynamics (NID) will be a viable, US-based, lower-cost source of separated material. Samples of separated material from NID require analysis to determine the isotopic distribution and impurities. The MAJORANA DEMONSTRATOR is a DOE and NSF funded project with a major science impact. DOE is funding NID through an SBIR grant for development of their separation technology for application to the Tonne-Scale experiment. The Environmental Molecular Sciences Laboratory (EMSL), a DOE user facility at PNNL, has the required mass spectrometry instruments for making these isotopic measurements that are essential to the quality assurance for the MAJORANA DEMONSTRATOR and for the development of the future separation technology required for the Tonne-Scale experiment. A sample of isotopically separated copper was provided by NID to PNNL for isotopic analysis as a test of the NID technology. The results of that analysis are reported here.