Directory of Open Access Journals (Sweden)
Alessandra Andreotti
Full Text Available BACKGROUND: The World Health Organization (WHO) conducted the World Health Survey (WHS) between 2002 and 2004 in 70 countries to provide cross-population comparable data on health, health-related outcomes and risk factors. The aim of this study was to apply Grade of Membership (GoM) modelling as a means to condense extensive health information from the WHS into a set of easily understandable health profiles and to assign the degree to which an individual belongs to each profile. PRINCIPAL FINDINGS: This paper describes the application of GoM models to summarize population health status using World Health Survey data. Grade of Membership analysis is a flexible, non-parametric, multivariate method used to calculate health profiles from WHS self-reported health states and health conditions. The WHS dataset was divided into four country economic categories based on the World Bank economic groupings (high, upper-middle, lower-middle and low income economies) for separate GoM analysis. Three main health profiles were produced for each of the four areas: I. Robust; II. Intermediate; III. Frail. Moreover, population health, wealth and inequalities are described for countries in each economic area as a means to put the health results into perspective. CONCLUSIONS: These analyses provide a robust method to better understand health profiles and the components which can help to identify healthy and non-healthy individuals. The obtained profiles describe concrete levels of health and clearly delineate characteristics of healthy and non-healthy respondents. The GoM results provide both a usable way of summarising complex individual health information and a selection of intermediate determinants which can be targeted for interventions to improve health. As populations age, and with limited budgets for the additional costs of health care and social services, applying GoM methods may assist with identifying higher risk profiles for decision
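The mixed-membership idea behind GoM can be illustrated with a small sketch. Everything below is hypothetical: the three profiles echo the Robust/Intermediate/Frail labels above, but the item probabilities, the EM-style update, and the `membership` helper are invented for illustration and are not the estimation procedure used in the study.

```python
import numpy as np

# Hypothetical "pure profile" response probabilities (lambda): the chance that
# a member of each profile reports a problem on each of 6 health items.
# Labels echo the paper's profiles; the numbers are invented.
LAMBDA = np.array([
    [0.05, 0.05, 0.10, 0.05, 0.10, 0.05],  # I.   Robust
    [0.40, 0.35, 0.45, 0.30, 0.40, 0.35],  # II.  Intermediate
    [0.85, 0.80, 0.90, 0.75, 0.85, 0.80],  # III. Frail
])

def membership(x, lam=LAMBDA, iters=200):
    """EM-style estimate of one respondent's grade-of-membership vector g
    (non-negative, summing to 1) from binary item responses x."""
    k, _ = lam.shape
    g = np.full(k, 1.0 / k)                   # start from uniform membership
    px = np.where(x, lam, 1.0 - lam)          # P(response j | pure profile k)
    for _ in range(iters):
        w = g[:, None] * px                   # joint weight per profile/item
        r = w / w.sum(axis=0, keepdims=True)  # per-item responsibilities
        g = r.mean(axis=1)                    # updated membership shares
    return g

frail = np.array([1, 1, 1, 1, 1, 0])          # problems on nearly every item
print(membership(frail).round(3))             # weight concentrates on Frail
```

Unlike a hard cluster assignment, the output is a vector of partial memberships, which is what lets GoM grade individuals between profiles.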
John, Bonnie; Vera, Alonso; Matessa, Michael; Freed, Michael; Remington, Roger
2002-01-01
CPM-GOMS is a modeling method that combines the task decomposition of a GOMS analysis with a model of human resource usage at the level of cognitive, perceptual, and motor operations. CPM-GOMS models have made accurate predictions about skilled user behavior in routine tasks, but developing such models is tedious and error-prone. We describe a process for automatically generating CPM-GOMS models from a hierarchical task decomposition expressed in a cognitive modeling tool called Apex. Resource scheduling in Apex automates the difficult task of interleaving the cognitive, perceptual, and motor resources underlying common task operators (e.g. mouse move-and-click). Apex's UI automatically generates PERT charts, which allow modelers to visualize a model's complex parallel behavior. Because interleaving and visualization are now automated, it is feasible to construct arbitrarily long sequences of behavior. To demonstrate the process, we present a model of automated teller interactions in Apex and discuss implications for user modeling. …available to model human users, the Goals, Operators, Methods, and Selection (GOMS) method [6, 21] has been the most widely used, providing accurate, often zero-parameter, predictions of the routine performance of skilled users in a wide range of procedural tasks [6, 13, 15, 27, 28]. GOMS is meant to model routine behavior. The user is assumed to have methods that apply sequences of operators to achieve a goal. Selection rules are applied when there is more than one method to achieve a goal. Many routine tasks lend themselves well to such decomposition. Decomposition produces a representation of the task as a set of nested goal states that include an initial state and a final state. The iterative decomposition into goals and nested subgoals can terminate in primitives of any desired granularity, the choice of level of detail dependent on the predictions required. Although GOMS has proven useful in HCI, tools to support the
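The goal/method/operator decomposition and selection rules described above can be sketched in miniature, KLM-style: methods are sequences of primitive operators with assumed durations, and a selection rule picks one method per goal. The operator times, the `delete-word` methods, and the `select_method` rule below are illustrative placeholders, not calibrated GOMS values.

```python
# Illustrative operator durations in seconds (placeholders, not the
# calibrated values from the GOMS/KLM literature).
OPERATOR_TIME = {"K": 0.28, "P": 1.10, "H": 0.40, "M": 1.35}
# K = keystroke, P = point with mouse, H = home hand to device, M = mental prep

METHODS = {
    "delete-word": {
        "mouse":    ["H", "P", "K", "K"],  # home to mouse, point, double-click, delete
        "keyboard": ["M", "K", "K", "K"],  # mental prep, then shortcut keys
    }
}

def select_method(goal, hands_on_mouse):
    """Selection rule: prefer the mouse method when the hand is already there."""
    return "mouse" if hands_on_mouse else "keyboard"

def predict_time(goal, hands_on_mouse):
    """Sum operator durations for the selected method, serially (plain GOMS)."""
    ops = METHODS[goal][select_method(goal, hands_on_mouse)]
    return sum(OPERATOR_TIME[o] for o in ops)

print(round(predict_time("delete-word", hands_on_mouse=False), 2))  # 2.19
```

CPM-GOMS then goes beyond this serial sum by scheduling cognitive, perceptual, and motor operators on parallel resource tracks, which is exactly the interleaving that Apex automates.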
GOM Deepwater Horizon Oil Spill: A Time Series Analysis of Variations in Spilled Hydrocarbons
Palomo, C. M.; Yan, B.
2013-12-01
An estimated 210 million gallons of crude oil was released into the Gulf of Mexico (GOM) from April 20th to July 15th, 2010 during the Deepwater Horizon oil rig explosion. The spill caused tremendous financial, ecological, environmental and health impacts and continues to affect the GOM today. Variations in hydrocarbons, including alkanes, hopanes and polycyclic aromatic hydrocarbons (PAHs), can be analyzed to better understand the oil spill and assist in oil source identification. Twenty-one sediment samples*, two tar ball samples and one surface water oil sample were obtained from distinct locations in the GOM and within varying time frames from May to December 2010. Each sample was extracted with the ASE 200 solvent extractor, concentrated down under nitrogen gas, purified through an alumina column, concentrated down again under nitrogen gas and analyzed via GC X GC-TOF MS. Forty-one different hydrocarbons were quantified in each sample. Various hydrocarbon 'fingerprints,' such as parental:alkylated PAH ratios, high molecular weight PAH:low molecular weight alkane ratios, and the carbon preference index, were calculated. The initial objective of this project was to identify the relative hydrocarbon contributions of petrogenic sources and combustion sources. Based on the calculated ratios, it is evident that the sediment core taken in October of 2010 was greatly affected by combustion sources. Following the first month of the spill, oil in the gulf was burned in attempts to contain the spill. Combustion-related sources have quicker sedimentation rates, so hydrocarbons from a combustion source move to deeper depths more quickly than those from a petrogenic source, as was observed in analyses of the October 2010 sediment. *Of the twenty-one sediment samples prepared, nine were quantified for this project.
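One of the 'fingerprint' metrics named above, the carbon preference index, can be sketched as follows. CPI formulations vary by carbon range and averaging scheme; this sketch uses the simple odd/even n-alkane ratio over C24-C34, and the concentrations are invented for illustration.

```python
# Invented n-alkane concentrations (ng/g) keyed by carbon number, with the
# odd-carbon dominance typical of plant-wax (biogenic) input.
conc = {24: 10.0, 25: 42.0, 26: 12.0, 27: 55.0, 28: 11.0,
        29: 60.0, 30: 13.0, 31: 48.0, 32: 9.0, 33: 35.0, 34: 8.0}

def cpi(concentrations, lo=24, hi=34):
    """Carbon preference index: odd/even n-alkane ratio over [lo, hi]."""
    odd = sum(v for c, v in concentrations.items() if lo <= c <= hi and c % 2 == 1)
    even = sum(v for c, v in concentrations.items() if lo <= c <= hi and c % 2 == 0)
    return odd / even

# CPI well above 1 suggests fresh biogenic input; petrogenic hydrocarbons
# show CPI near 1 because odd and even chain lengths are balanced.
print(round(cpi(conc), 2))
```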
Kinash, N.; Cook, A.; Sawyer, D.; Heber, R.
2017-12-01
In May 2017 the University of Texas led a drilling and pressure coring expedition in the northern Gulf of Mexico, UT-GOM2-01. The holes were located in Green Canyon Block 955, where the Gulf of Mexico Joint Industry Project Leg II identified an approximately 100 m thick hydrate-filled coarse-grained levee unit in 2009. Two separate wells were drilled into this unit: Holes H002 and H005. In Hole H002, a cutting shoe drill bit was used to collect the pressure cores, and only 1 of the 8 cores collected was pressurized during recovery. The core recovery in Hole H002 was generally poor, about 34%, while the only pressurized core had 45% recovery. In Hole H005, a face bit was used during pressure coring where 13 cores were collected and 9 cores remained pressurized. Core recovery in Hole H005 was much higher, at about 75%. The type of bit was not the only difference between the holes, however. Drilling mud was used throughout the drilling and pressure coring of Hole H002, while only seawater was used during the first 80 m of pressure cores collected in Hole H005. Herein we focus on lithologic analysis of Hole H002 with the goal of documenting and understanding core recovery in Hole H002 to compare with Hole H005. X-ray Computed Tomography (XCT) images were collected by Geotek on pressurized cores, mostly from Hole H005, and at Ohio State on unpressurized cores, mostly from Hole H002. The XCT images of unpressurized cores show minimal sedimentary structures and layering, unlike the XCT images acquired on the pressurized, hydrate-bearing cores. Only small sections of the unpressurized cores remained intact. The unpressurized cores appear to have two prominent facies: 1) silt that did not retain original sedimentary fabric and often was loose within the core barrel, and 2) dense mud sections with some sedimentary structures and layering present. On the XCT images, drilling mud appears to be concentrated on the sides of cores, but also appears in layers and fractures within
Using Apex To Construct CPM-GOMS Models
John, Bonnie; Vera, Alonso; Matessa, Michael; Freed, Michael; Remington, Roger
2006-01-01
process for automatically generating computational models of human/computer interactions as well as graphical and textual representations of the models has been built on the conceptual foundation of a method known in the art as CPM-GOMS. This method is so named because it combines (1) the task decomposition of analysis according to an underlying method known in the art as the goals, operators, methods, and selection (GOMS) method with (2) a model of human resource usage at the level of cognitive, perceptual, and motor (CPM) operations. CPM-GOMS models have made accurate predictions about behaviors of skilled computer users in routine tasks, but heretofore, such models have been generated in a tedious, error-prone manual process. In the present process, CPM-GOMS models are generated automatically from a hierarchical task decomposition expressed by use of a computer program, known as Apex, designed previously to be used to model human behavior in complex, dynamic tasks. An inherent capability of Apex for scheduling of resources automates the difficult task of interleaving the cognitive, perceptual, and motor resources that underlie common task operators (e.g., move and click mouse). The user interface of Apex automatically generates Program Evaluation Review Technique (PERT) charts, which enable modelers to visualize the complex parallel behavior represented by a model. Because interleaving and the generation of displays to aid visualization are automated, it is now feasible to construct arbitrarily long sequences of behaviors. The process was tested by using Apex to create a CPM-GOMS model of a relatively simple human/computer-interaction task and comparing the time predictions of the model and measurements of the times taken by human users in performing the various steps of the task. The task was to withdraw $80 in cash from an automated teller machine (ATM). For the test, a Visual Basic mockup of an ATM was created, with a provision for input from (and measurement
A GOMS model applied to a simplified control panel design
International Nuclear Information System (INIS)
Chavez, C.; Edwards, R.M.
1992-01-01
The design of the user interface for a new system requires many decisions to be considered. Developing sensitivity to user needs requires understanding user behavior. The how-to-do-it knowledge is a mixture of task-related and interface-related components. A conscientious analysis of these components allows the designer to construct a model in terms of goals, operators, methods, and selection rules (a GOMS model) that can be advantageously used in the design process and evaluation of a user interface. The emphasis of the present work is on describing the importance and use of a GOMS model as a formal user interface analysis tool in the development of a simplified panel for the control of a nuclear power plant. At Pennsylvania State University, a highly automated control system with a greatly simplified human interface has been proposed to improve power plant safety. Supervisory control is to be conducted with a simplified control panel with the following functions: startup, shutdown, increase power, decrease power, reset, and scram. Initial programming of the operator interface has been initiated within the framework of a U.S. Department of Energy funded university project for intelligent distributed control. A hypothesis to be tested is that this scheme can also be used to estimate mental workload content and predict human performance
2011-11-14
... DEPARTMENT OF THE INTERIOR Bureau of Ocean Energy Management Gulf of Mexico (GOM), Outer... studies, NEPA analysis, resource evaluation, economic analysis, and renewable energy. BSEE is responsible... Program AGENCY: Bureau of Ocean Energy Management (BOEM), Interior. ACTION: Notice of Availability (NOA...
Directory of Open Access Journals (Sweden)
Letícia de Oliveira Cardoso
2011-02-01
Full Text Available In order to identify profiles of food consumption and eating behaviors and describe their prevalence, the Grade of Membership method was applied to data from a survey on health risk factors among elementary school adolescents in the city of Rio de Janeiro, Brazil (N = 1,632). Four profiles were generated: profile "A" (12.1%), characterized by frequent consumption of all marker foods of a healthy diet, less frequent consumption of unhealthy foods, and the presence of healthy eating behaviors; profile "B" (45.8%), marked by the habit of eating breakfast and three meals a day, less frequent consumption of vegetables and fruit, and of five of the markers of unhealthy eating; profile "C" (22.8%), marked by the absence of healthy eating behaviors and by less frequent consumption of vegetables, fruit, milk, processed meats, cookies and soft drinks; and profile "D", characterized by frequent consumption of all unhealthy foods and less frequent consumption of vegetables and fruit. The results point to the need to promote healthy eating in this population.
Louisiana Geographic Information Center — Offshore Minerals Management Platforms for the Gulf of Mexico (GOM). Identifies the location of platforms in GOM. All platforms existing in the database are included.
Using GOMS and NASA-TLX to Evaluate Human-Computer Interaction Process in Interactive Segmentation
Ramkumar, A.; Stappers, P.J.; Niessen, W.J.; Adebahr, S; Schimek-Jasch, T; Nestle, U; Song, Y.
2016-01-01
HCI plays an important role in interactive medical image segmentation. The Goals, Operators, Methods, and Selection rules (GOMS) model and the National Aeronautics and Space Administration Task Load Index (NASA-TLX) questionnaire are different methods that are often used to evaluate the HCI
Performance Evaluation of HYCOM-GOM for Hydrokinetic Resource Assessment in the Florida Strait
Energy Technology Data Exchange (ETDEWEB)
Neary, Vincent S [ORNL; Gunawan, Budi [ORNL; Ryou, Albert S [ORNL
2012-06-01
The U.S. Department of Energy (DoE) is assessing and mapping the potential off-shore ocean current hydrokinetic energy resources along the U.S. coastline, excluding tidal currents, to facilitate market penetration of water power technologies. This resource assessment includes information on the temporal and three-dimensional spatial distribution of the daily averaged power density, and the overall theoretical hydrokinetic energy production, based on modeled historical simulations spanning a 7-year period of record using HYCOM-GOM. HYCOM-GOM is an ocean current observation assimilation model that generates a spatially distributed three-dimensional representation of daily averaged horizontal current magnitude and direction time series, from which power density time series and their statistics can be derived. This study ascertains the deviation of HYCOM-GOM outputs, including transport (flow) and power density, from outputs based on three independent observation sources in order to evaluate HYCOM-GOM performance. The three independent data sources are NOAA's submarine cable data of transport, ADCP data at a high power density location, and HF radar data in the high power density region of the Florida Strait. Comparisons with these three independent observation sets indicate discrepancies with HYCOM model outputs, but overall indicate that the HYCOM-GOM model can provide an adequate assessment of the ocean current hydrokinetic resource in high power density regions like the Florida Strait. Additional independent observational data, in particular stationary ADCP measurements, would be useful for expanding this model performance evaluation. ADCP measurements are rare in ocean environments not influenced by tides, and are limited to one location in the Florida Strait. HF radar data, although providing great spatial coverage, are limited to surface currents only.
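The power density derived from current magnitudes follows the kinetic energy flux relation P = ½ρ|v|³ (W/m²). A minimal sketch, with invented daily averaged speeds standing in for HYCOM-GOM output and a typical seawater density assumed:

```python
import numpy as np

RHO = 1025.0  # seawater density, kg/m^3 (a typical assumed value)

def power_density(speed_m_s):
    """Hydrokinetic power density P = 0.5 * rho * |v|^3, in W/m^2."""
    return 0.5 * RHO * np.asarray(speed_m_s) ** 3

# Invented daily averaged current magnitudes |v| in m/s.
speeds = np.array([1.2, 1.5, 1.8, 1.4])
pd = power_density(speeds)
print(pd.round(1))  # W/m^2; note the cubic sensitivity to speed
```

The cubic dependence is why the assessment focuses on high power density regions like the Florida Strait: modest errors in modeled speed translate into much larger errors in estimated power density.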
El Cardenal Isidro Gomá y la cuestión vasca
Directory of Open Access Journals (Sweden)
Dionisio Vivas, Miguel Ángel
2012-07-01
Full Text Available One of the many areas of activity of Cardinal Isidro Gomá during the Spanish Civil War was his attention to the Basque question. First came the problems arising from the confrontation between the military authorities and the bishop of Vitoria, Mateo Múgica. Then came the controversy between Gomá himself and lehendakari Aguirre, a consequence of the alignment of Basque nationalism with the Republic. The primate also played a central role in the ending of the conflict in the Basque Country, through the negotiations for the surrender of Bilbao. Finally, he had to face the conflicts with the nationalist clergy: those who remained in Spain, those who suffered reprisals, those who were exiled, and those who criticized the position of Cardinal Gomá.
Flemings, P. B.; Phillips, S. C.
2017-12-01
In May 2017, a science team led by the University of Texas-Austin conducted drilling and coring operations from the Helix Q4000 targeting gas hydrates in sand-rich reservoirs in the Green Canyon 955 block in the northern Gulf of Mexico. The UT-GOM2-1 expedition goals were to 1) test two configurations of pressure coring devices to assess relative performance with respect to recovery and quality of samples and 2) gather sufficient samples to allow laboratories throughout the US to investigate a range of outstanding science questions related to the origin and nature of gas hydrate-bearing sands. In the first well (UT-GOM2-1-H002), 1 of the 8 cores was recovered under pressure, with 34% recovery. In the second well (UT-GOM2-1-H005), 12 of 13 cores were recovered under pressure, with 77% recovery. The pressure cores were imaged and logged under pressure. Samples were degassed both shipboard and dockside to interpret hydrate concentration and gas composition. Samples for microbiological and porewater analysis were taken from the depressurized samples. Twenty-one 3-ft pressure cores were returned to the University of Texas for storage, distribution, and further analysis. Preliminary analyses document that the hydrate-bearing interval is composed of two interbedded (cm to m thickness) facies. Lithofacies II is composed of sandy silt and has trough cross bedding, whereas Lithofacies III is composed of clayey silt with no bedforms observed. Lithofacies II has low density (1.7 to 1.9 g/cc) and high velocity (3000-3250 m/s) beds, whereas Lithofacies III has high density (1.9-2.1 g/cc) and low velocity (~1700 m/s). Quantitative degassing was used to determine that Lithofacies II contains high hydrate saturation (66-87%) and Lithofacies III contains moderate saturation (~18-30%). Gas samples were analyzed periodically in each experiment and were composed primarily of methane with an average of 94 ppm ethane and detectable, but not quantifiable, propane. The core data will provide a
2011-08-12
... (BOEMRE), Interior. ACTION: Notice of Availability (NOA) of a Final Supplemental Environmental Impact... sale's incremental contribution to the cumulative impacts on environmental and socioeconomic resources... Mexico (GOM), Outer Continental Shelf (OCS), Western Planning Area (WPA), Oil and Gas Lease Sale for the...
Paloma Fernández Gomá: latidos de poesía que unen orillas
Medrano, Susana de los Ángeles
2010-01-01
In contemporary Spanish lyric poetry there is no longer any doubt that women's poetry constitutes a significant contribution. Within this fabric, in the current panorama of women's lyric poetry in Andalusia, Paloma Fernández Gomá stands out as one of the most interesting and singular poets. Born in Madrid and settled in Algeciras (Cádiz) since childhood, she began writing poetry early but only became known in 1991 with El ocaso del girasol, which has so far been followed by some ten poetry collections...
National Oceanic and Atmospheric Administration, Department of Commerce — The subproject described here is one of several components of ECOHAB-GOM: The Ecology and Oceanography of Toxic Alexandrium Blooms in the Gulf of Maine, a multi-PI,...
Directory of Open Access Journals (Sweden)
Roberto Ceamanos Llorens
2011-06-01
Full Text Available This article examines the work of Isidro Gomá y Tomás at the head of the bishopric of Tarazona and Tudela (1927-1933). Gomá, archbishop of Toledo and primate of the Church of Spain (1933-1940), is one of the principal figures to have attracted the interest of historians of the Spanish Civil War because of the support he gave to the insurgents. The period before his appointment as Primate, however, remains little known. These lines show how, during his time in Tarazona, Gomá ran his diocese with a firm hand and, with the arrival of the Republic, forcefully opposed the secular reforms, consolidating himself as a fundamental reference point in Catholic integrist and anti-Republican circles. This circumstance led the Vatican to think of him when it decided to fill the vacant see of Toledo.
Designing a reservoir flow rate experiment for the GOM hydrate JIP leg 2 LWD drilling
Energy Technology Data Exchange (ETDEWEB)
Gullapalli, I.; Silpngarmlert, S.; Reik, B.; Kamal, M.; Jones, E. [Chevron Energy Technology Co., San Ramon, CA (United States); Moridis, G. [Lawrence Berkeley National Laboratories, CA (United States); Collett, T. [United States Geological Survey, Reston, VA (United States)
2008-07-01
Studies have indicated that the Gulf of Mexico may contain large deep sea hydrate deposits. This paper provided details of short-term production profiles obtained from a geological model of hydrate deposits located in the Gulf area. A well test analysis tool was used to obtain the production parameters. Pressure transients from numerical simulations of various well test designs were used to provide estimates of important flow parameters. The aim of the study was to determine the type and duration of a well test capable of providing data to support the accurate modeling of gas hydrate deposits. Parameters studied in the test included the effects of permeability and hydrate saturation as a function of the duration of the flow test. Results indicated that production at a constant bottomhole pressure is an appropriate method of inducing hydrate dissociation by depressurization. However, changes in transient pressure plots could not be characterized in order to identify regions of varying saturation levels. Results suggested that the ratio of effective water permeability to effective gas permeability was higher than ratios obtained from relative permeability relations, due to low gas saturation levels. Fluid saturation regions were in areas of low confidence in relative permeability curves. However, it was not possible to calculate absolute permeability of the reservoir for systems with short production periods. Further studies are needed to determine effective permeability using history matching and a hydrate simulator. 8 refs., 4 tabs., 27 figs.
Directory of Open Access Journals (Sweden)
Gabriel Laguna Mariscal
2014-12-01
Full Text Available Carlos García Gual, Javier Gomá Lanzón, Fernando Savater. Muchas felicidades. Tres visiones y más de la idea de felicidad. Madrid: Ariel, 2015, 207 pp. ISBN: 978-84-344-1892-9.
2012-01-20
... circumstances and information arising from, among other things, the Deepwater Horizon event. This Final... Supplemental EIS and in consideration of the Deepwater Horizon event, including scientific journals; interviews... resources and socioeconomic factors. This analysis considers both routine activities and accidental events...
Observation and analysis of speciated atmospheric mercury in Shangri-La, Tibetan Plateau, China
Zhang, H.; Fu, X. W.; Lin, C.-J.; Wang, X.; Feng, X. B.
2015-01-01
This study reports the concentrations and potential sources of speciated atmospheric mercury at the Shangri-La Atmosphere Watch Regional Station (SAWRS), a pristine high-altitude site (3580 m a.s.l.) on the Tibetan Plateau, China. Total gaseous mercury (TGM, defined as the sum of gaseous elemental mercury, GEM, and gaseous oxidized mercury, GOM), GOM and particulate-bound mercury (PBM) were monitored from November 2009 to November 2010 to investigate the characteristics and potential influence of the Indian summer monsoon (ISM) and the Westerlies on atmospheric transport of mercury. The mean concentrations (± standard deviation) of TGM, PBM and GOM were 2.55 ± 0.73 ng m-3, 38.82 ± 31.26 pg m-3 and 8.22 ± 7.90 pg m-3, respectively. A notable seasonal pattern of TGM concentrations was observed, with higher concentrations at the beginning and the end of the ISM season. High TGM concentrations (> 2.5 ng m-3) were associated with the transport of dry air that carried regional anthropogenic emissions from both Chinese domestic and foreign (e.g., Myanmar, Bay of Bengal, and northern India) sources, based on analysis of HYSPLIT4 back trajectories. Somewhat lower PBM and GOM levels during the ISM period were attributed to enhanced wet scavenging. The high GOM and PBM levels in cold seasons were likely caused by local photochemical transformation under low RH and by domestic biofuel burning.
Prolégomènes à une théorie des modes de formation des dispositions politiques
Directory of Open Access Journals (Sweden)
Fabienne Federini
2007-11-01
Full Text Available Starting from a critique of Pierre Bourdieu's analyses of politics, the aim is to understand the diversity of actors' political dispositions by focusing attention in particular on the social conditions of their formation, while leaving aside the variety of contexts in which those dispositions are actualized, put into practice, or put on hold. Acquiring a political culture early within one's family is not the same thing as acquiring it later following a loss of social position, a professional rupture, or during a war, for example. We therefore consider that the different moments of political socialization, among actors with equivalent schooling, have effects on the formation of their political dispositions, of which their register of politicization and the practices those dispositions generate are so many indications.
National Oceanic and Atmospheric Administration, Department of Commerce — This data set was taken from CRD 08-18 at the NEFSC. Specifically, the Gulf of Maine diet matrix was developed for the EMAX exercise described in that center...
1990-12-28
as possible, thinking aloud as he played. Gray videotaped KP performing this task to provide observed behavior against which to measure the...
Risk Profiles for Falls among Older Adults: New Directions for Prevention
Directory of Open Access Journals (Sweden)
William A. Satariano
2017-08-01
Full Text Available Objective: To address whether neighborhood factors, together with older adults' levels of health and functioning, suggest new combinations of risk factors for falls and new directions for prevention; and to explore the utility of Grade-of-Membership (GoM) analysis to conduct this descriptive analysis. Method: This is a cross-sectional, descriptive study of 884 people aged ≥65 years from Alameda County, CA, Cook County, IL, Allegheny County, PA, and Wake and Durham counties, NC. Interviews focused on neighborhood characteristics, physical and cognitive function, walking, and falls and injuries. Four risk profiles (higher order interactions of individual and neighborhood factors) were derived from GoM analysis. Results: Profiles 1 and 2 reflect previous results showing that frail older adults are likely to fall indoors (Profile 1) and healthy older adults are likely to fall outdoors (Profile 2). Profile 3 identifies the falls risk for older adults with mild cognitive impairment living in moderately walkable neighborhoods. Profile 4 identifies the risk found for healthy older adults living in neighborhoods with low walkability. Discussion: Neighborhood walkability, in combination with levels of health and functioning, is associated with both indoor and outdoor falls. Descriptive results suggest possible research hypotheses and new directions for prevention, based on individual and neighborhood factors.
GoM Coastal Biopsy Surveys - NRDA
National Oceanic and Atmospheric Administration, Department of Commerce — Small vessel surveys were conducted within estuarine and nearshore coastal waters of Barataria Bay, LA and Mississippi Sound, MS to collect tissue biopsy samples...
EQ-5D-3L as a health measure of Brazilian adult population.
Menezes, Renata de Miranda; Andrade, Mônica Viegas; Noronha, Kenya Valéria Micaela de Souza; Kind, Paul
2015-11-01
This study explores the use of EQ-5D-3L as a measure of population health status in a Brazilian region with significant socioeconomic, demographic, and epidemiological heterogeneity. Data came from a study of 3363 literate individuals aged between 18 and 64 years living in urban areas of the state of Minas Gerais. Descriptive analysis and logistic and OLS regression models were performed to analyze the relationship between EQ-5D-3L (descriptive system and EQ VAS) and other health (self-assessed health status and 8 self-reported diagnosed chronic diseases), socioeconomic (educational level and economic class), and demographic (gender and age) measures. Additionally, a grade of membership (GoM) analysis was performed to identify multidimensional health profiles. A total of 76 health statuses were identified in the Brazilian population. The most prevalent one is full health (44 % of the sample). Elderly people, women, and individuals with poor health and lower socioeconomic conditions generally report more health problems in the EQ-5D-3L dimensions. The GoM analysis demonstrated that health status of older individuals is associated with the socioeconomic condition. Arthritis exhibited the strongest association with the EQ-5D-3L instrument. The results indicate that EQ-5D-3L is a good measure of health status for the Brazilian population. The instrument has a good discriminatory capacity in terms of demographic, socioeconomic, and health measures. The high prevalence of individuals with full health may indicate the presence of ceiling effect. However, this prevalence is smaller than that in other countries.
Directory of Open Access Journals (Sweden)
Kentisbeer J.
2013-04-01
Full Text Available Speciated atmospheric mercury has been measured semi-continuously at the Auchencorth Moss field site in southern Scotland since 2004. Here we present an analysis of the data from 2009 to 2011 for the three species: elemental, gaseous oxidized (GOM), and particulate-bound (PBM) mercury. Measurements of elemental mercury were made using the Tekran 2537A analyser, and the Tekran 1130 and 1135 speciation units were used to collect GOM and PBM, respectively. The data show no upward or downward trend for elemental mercury, with yearly average concentrations between 1.3 and 1.5 ng m-3. We will continue the work started in Kentisbeer et al. (2010) to analyse the effect of wind direction on the mercury species, making further use of air mass back trajectories and introducing cluster analysis to investigate the effects of longer-range transport to the site.
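The planned wind-direction analysis amounts to conditioning concentrations on wind sectors. A minimal sketch of sector averaging on synthetic data (the directional signal, noise level, and sector width are invented, not Auchencorth Moss measurements):

```python
import numpy as np

# Synthetic hourly data: wind direction (degrees) and GEM concentration (ng m^-3).
rng = np.random.default_rng(0)
wdir = rng.uniform(0, 360, 1000)
gem = 1.4 + 0.1 * np.sin(np.radians(wdir)) + rng.normal(0, 0.05, 1000)

# Average concentration in 45-degree wind sectors
edges = np.arange(0, 361, 45)
sector = np.digitize(wdir, edges) - 1          # sector index 0..7
means = np.array([gem[sector == s].mean() for s in range(8)])
for lo, m in zip(edges[:-1], means):
    print(f"{lo:3d}-{lo + 45:3d} deg: {m:.2f} ng m^-3")
```

With real data the same conditioning extends naturally to back-trajectory clusters in place of local wind sectors.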
A MITgcm/DART ensemble analysis and prediction system with application to the Gulf of Mexico
Hoteit, Ibrahim
2013-09-01
This paper describes the development of an advanced ensemble Kalman filter (EnKF)-based ocean data assimilation system for prediction of the evolution of the loop current in the Gulf of Mexico (GoM). The system integrates the Data Assimilation Research Testbed (DART) assimilation package with the Massachusetts Institute of Technology ocean general circulation model (MITgcm). The MITgcm/DART system supports the assimilation of a wide range of ocean observations and uses an ensemble approach to solve the nonlinear assimilation problems. The GoM prediction system was implemented with an eddy-resolving 1/10th-degree configuration of the MITgcm. Assimilation experiments were performed over a 6-month period between May and October during a strong loop current event in 1999. The model was sequentially constrained with weekly satellite sea surface temperature and altimetry data. Experimental results suggest that the ensemble-based assimilation system shows high predictive skill in the GoM, with the estimated ensemble spread concentrated mainly around the front of the loop current. Further analysis of the system estimates demonstrates that the ensemble assimilation accurately reproduces the observed features without imposing any negative impact on the dynamical balance of the system. Results from sensitivity experiments with respect to the ensemble filter parameters are also presented and discussed. © 2013 Elsevier B.V.
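The core of an EnKF-based system like MITgcm/DART is the ensemble analysis update, which replaces the exact error covariance with a sample covariance computed from the forecast ensemble. A minimal stochastic-EnKF update on a toy 3-variable state (the observation operator, error levels, and ensemble size are illustrative; operational systems add localization and inflation):

```python
import numpy as np

def enkf_update(E, H, y, R, rng):
    """Stochastic EnKF analysis step.
    E: (n, N) forecast ensemble; H: (p, n) observation operator;
    y: (p,) observations; R: (p, p) observation-error covariance."""
    n, N = E.shape
    A = E - E.mean(axis=1, keepdims=True)            # ensemble anomalies
    P = A @ A.T / (N - 1)                            # sample forecast covariance
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)     # Kalman gain
    Y = y[:, None] + rng.multivariate_normal(np.zeros(len(y)), R, N).T  # perturbed obs
    return E + K @ (Y - H @ E)

rng = np.random.default_rng(0)
truth = np.array([1.0, -2.0, 0.5])
H = np.array([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])     # observe first two components
R = 0.01 * np.eye(2)
E = truth[:, None] + rng.normal(0, 1.0, (3, 200))    # spread-out prior ensemble
y = H @ truth + rng.multivariate_normal(np.zeros(2), R)
Ea = enkf_update(E, H, y, R, rng)
print(np.abs(Ea.mean(axis=1)[:2] - truth[:2]))       # observed components pulled toward truth
```

The update both shifts the ensemble mean toward the observations and collapses the spread of the observed components, which mirrors the paper's finding of ensemble spread concentrating where the state is least constrained.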
2002-09-01
The Federal Government plans to offer U.S. Outer Continental Shelf (OCS) lands in the Eastern Planning Area of the Gulf of Mexico (GOM) for oil and gas leasing. This report summarizes the results of an oil-spill risk analysis whose objective was to estimate the risk of oil-spill contact to sensitive offshore and onshore environmental resources and socioeconomic features from oil spills accidentally occurring as a result of the OCS activities.
International Nuclear Information System (INIS)
Lin, Chiuhsiang Joe; Hsieh, Tsung-Ling
2016-01-01
Task analysis methods provide insight for quantitative and qualitative predictions of how people will use a proposed system, though different versions have different emphases. Most of the methods can attest to the coverage of a system's functionality, and all provide estimates of task performance time. However, most of the tasks that operators deal with in the digital work environment of the main control room of an advanced nuclear power plant require high mental activity. Such mental tasks overlap and must be dealt with at the same time; most of them can be assumed to be highly parallel in nature. Therefore, the primary aim of this paper was to develop a method that adopts CPM-GOMS (a cognitive-perceptual-motor analysis of goals, operators, methods, and selection rules) as the basic pattern of mental task analysis for the advanced main control room. A within-subjects experiment design was used to examine the validity of the modified CPM-GOMS. Thirty participants performed two task types, of high and low compatibility. The results indicated that performance was significantly higher on the high-compatibility task type than on the low-compatibility type; that is, the modified CPM-GOMS could distinguish between high- and low-compatibility mental tasks. (author)
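In CPM-GOMS, overlapping perceptual, cognitive, and motor operators are scheduled as a dependency graph, and the predicted task time is the length of the critical path through that graph. A sketch with hypothetical operator durations and dependencies (none taken from the study):

```python
# Hypothetical operator durations (ms) and precedence constraints for a
# parallel perceive -> decide/verify -> act sequence; illustrative only.
dur = {"perceive": 100, "verify": 50, "decide": 150, "move_hand": 200, "press": 100}
deps = {"perceive": [], "verify": ["perceive"], "decide": ["perceive"],
        "move_hand": ["decide"], "press": ["move_hand", "verify"]}

def finish(op):
    # Earliest finish time = operator duration + latest finish among prerequisites.
    return dur[op] + max((finish(d) for d in deps[op]), default=0)

total = max(finish(op) for op in dur)
print(total)  # 550: critical path perceive -> decide -> move_hand -> press
```

Note that "verify" runs in parallel with the motor chain and does not add to the total: only operators on the critical path determine the predicted execution time, which is what lets CPM-GOMS model highly parallel mental tasks.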
Directory of Open Access Journals (Sweden)
Claudia Cristina de Aguiar Pereira
2007-03-01
Full Text Available Following the introduction of highly active antiretroviral therapy (HAART), the HIV-related morbidity and mortality profile changed: deaths caused by opportunistic diseases gave way to chronic "pre-AIDS" (non-AIDS-defining) conditions related to the adverse effects of the therapy. This study investigated HIV/AIDS-related mortality through multiple causes of death, using the death certificates of residents of the cities of São Paulo and Santos, Brazil, who died of causes related to HIV disease in 2001. The Grade of Membership (GoM) method was used to create profiles of causes of death. Three mortality profiles were found: the first, related to causes of death identified in the pre-HAART period, with a predominance of opportunistic diseases; the second, a mixture of characteristics of the pre- and post-HAART periods; and the third, a residual profile that did not include HIV disease itself but incorporated groups of causes of death associated with both the pre- and post-HAART periods. The study is expected to contribute to the development of policies aimed at adapting health services to the new scenario of HIV-related morbidity and mortality.
GoM Coastal and Estuarine Biopsy Surveys
National Oceanic and Atmospheric Administration, Department of Commerce — Small vessel surveys are conducted within estuarine and nearshore coastal waters to collect tissue biopsy samples from bottlenose dolphins. Visual surveys are...
GoM Estuarine Bottlenose Dolphin Photo-identification studies
National Oceanic and Atmospheric Administration, Department of Commerce — These data sets include a compilation of small vessel based studies of bottlenose dolphins that reside within Mississippi Sound and nearshore coastal waters. The...
GOSAILT: A hybrid of GOMS and SAILT with topography consideration
Wu, S.; Wen, J.
2017-12-01
Heterogeneous terrain significantly complicates the energy, mass, and momentum exchange between the atmosphere and the terrestrial ecosystem, so understanding topographic effects on forest reflectance is critical for retrieving biophysical parameters over rugged areas. In this paper, a new hybrid bidirectional reflectance distribution function (BRDF) model for sloping forest, GOSAILT, is proposed, coupling the geometric optical mutual shadowing model and the scattering-from-arbitrarily-inclined-leaves model with topography. The effects of slope, aspect, the gravity field acting on tree crowns, a multiple scattering scheme, and diffuse skylight are considered. The area proportions of scene components estimated by the GOSAILT model were compared with those of the geometric optical model for sloping terrains (GOST), and 3-D discrete anisotropic radiative transfer (DART) simulations were used to evaluate GOSAILT's performance. The results indicate that canopy reflectance is distorted by slopes, with a maximum variation of 78.3% in the red band and 17.3% in the NIR band on a steep 60° slope. Compared with the DART simulations, the proposed GOSAILT model captures anisotropic reflectance well, with determination coefficients (R²) of 0.9720 and 0.6701, root-mean-square errors (RMSE) of 0.0024 and 0.0393, and mean absolute percentage errors of 2.4% and 4.61% for the red and near-infrared (NIR) bands, respectively. These comparisons indicate that GOSAILT can accurately reproduce the angular features of a discrete canopy over rugged terrain. The GOSAILT model is promising for retrieving land surface biophysical parameters (e.g., albedo, leaf area index) over heterogeneous terrain.
Analysis and Optimisation of Carcass Production for Flexible Pipes
DEFF Research Database (Denmark)
Nielsen, Peter Søe
Un-bonded flexible pipes are used in the offshore oil and gas industry worldwide transporting hydrocarbons from seafloor to floating production vessels topside. Flexible pipes are advantageous over rigid pipelines in dynamic applications and during installation as they are delivered in full length......-axial tension FLC points were attained. Analysis of weld fracture of duplex stainless steel EN 1.4162 is carried out determining strains with GOM ARAMIS automated strain measurement system, which shows that strain increases faster in the weld zone than the global strain of the parent material. Fracture...... is the analysis and optimisation of the carcass manufacturing process by means of a fundamental investigation in the fields of formability, failure modes / mechanisms, Finite Element Analysis (FEA), simulative testing and tribology. A study of failure mechanisms in carcass production is performed by being present...
Energy Technology Data Exchange (ETDEWEB)
Hangsterfer, A.; Driscoll, N.; Kastner, M. [Scripps Inst. of Oceanography, La Jolla, CA (United States). Geosciences Research Division
2008-07-01
Methane hydrates can form within the gas hydrate stability zone (GHSZ) in seabed sediments. The Gulf of Mexico (GOM) contains an underlying petroleum system and deeply buried, yet dynamic, salt deposits. Salt tectonics and fluid expulsion upward through the sediment column result in the formation of fractures through which high-salinity brines migrate into the GHSZ, destabilizing gas hydrates. Thermogenic and biogenic hydrocarbons, originating from the thermal and biogenic degradation of organic matter, also migrate to the seafloor along the GOM's northern slope. Gas hydrate occurrence can be controlled either by primary permeability, forming in coarse-grained sediment layers, or by secondary permeability, forming in areas where hydrofracture and faulting generate conduits through which hydrocarbon-saturated fluids flow. This paper presented a study that attempted to determine the relationship between grain size, permeability, and gas hydrate distribution. Grain-size analyses were performed on cores taken from Keathley Canyon and Atwater Valley in the GOM, on sections of cores that both contained and lacked gas hydrate. Using thermal anomalies as proxies for the occurrence of methane hydrate within the cores, sediment samples were taken and their grain-size distributions measured to test for a correlation between gas hydrate distribution and grain size. The paper described the methods, including determination of hydrate occurrence and core analysis. It was concluded that gas hydrate occurrence in Keathley Canyon and Atwater Valley was constrained by secondary permeability and was structurally controlled by hydrofractures and faulting that acted as conduits through which methane-rich fluids flowed. 11 refs., 2 tabs., 5 figs.
Availability analysis of subsea blowout preventer using Markov model considering demand rate
Directory of Open Access Journals (Sweden)
Sunghee Kim
2014-12-01
Full Text Available The availability of subsea blowout preventers (BOPs) in the Gulf of Mexico Outer Continental Shelf (GoM OCS) is investigated using a Markov method. An updated β-factor model by SINTEF is used for common-cause failures in multiple redundant systems. Coefficient values of failure rates for the Markov model are derived using the β-factor model of the PDS method (reliability of computer-based safety systems, Norwegian acronym). The blind shear ram preventer system of the subsea BOP is modeled with a demand rate to reflect reality more closely, and Markov models considering the demand rate for one or two components are introduced. Two data sets are compared for the GoM OCS. The results show that three or four pipe ram preventers give similar availabilities, but redundant blind shear ram preventers or annular preventers enhance the availability of the subsea BOP. Sensitivity analysis also shows that the control systems (pods) and connectors are the components whose improvement contributes most to the availability of subsea BOPs.
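Availability figures of this kind come from solving a continuous-time Markov chain at steady state. A minimal sketch for a generic two-unit redundant subsystem with a single repair crew (the failure and repair rates are illustrative, not field BOP data, and common-cause β-factor terms are omitted):

```python
import numpy as np

# Illustrative rates (per hour): each redundant unit fails at rate lam
# and is repaired at rate mu by a single repair crew.
lam, mu = 1e-3, 1e-1

# CTMC states = number of failed units (0, 1, 2); the system is down in state 2.
Q = np.array([[-2 * lam,  2 * lam,      0.0],
              [      mu, -(mu + lam),   lam],
              [     0.0,       mu,      -mu]])

# Steady state: solve pi @ Q = 0 subject to sum(pi) = 1
A = np.vstack([Q.T, np.ones(3)])
b = np.array([0.0, 0.0, 0.0, 1.0])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)
availability = pi[0] + pi[1]          # system is up with 0 or 1 unit failed
print(f"steady-state availability: {availability:.6f}")
```

Adding a demand rate, as the paper does, enlarges the state space with demand states but leaves the solution procedure (steady-state distribution of a generator matrix) unchanged.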
Directory of Open Access Journals (Sweden)
Marcelo de Rezende Pinto
2015-01-01
Full Text Available This paper reports the results of an empirical study of a sample of 368 undergraduate business administration students from five private universities in a large Brazilian city. The objective was to analyze differences in how students from high- and low-income backgrounds perceive the course with respect to: the cultural and symbolic elements surrounding higher education; the relevance of higher education among consumer priorities and its influence on students' consumption behavior; the appropriateness of the course to their reality; and the expected benefits of obtaining a degree. The data were analyzed using the Grade of Membership (GoM) technique and t-tests. The results, compared with the theoretical framework on consumption from a cultural and symbolic perspective, signaled a difference in meaning between the two groups of students.
Directory of Open Access Journals (Sweden)
Heloísa Maria de Assis
2008-12-01
Full Text Available This cross-sectional study, based on secondary data, describes the profile of early neonatal deaths that occurred at a public referral maternity hospital in the State of Minas Gerais, Brazil (Maternidade Odete Valadares, Belo Horizonte) from 2001 to 2006. It used variables related to the newborn (period in which the death occurred, age at death, sex, gestational age, and birth weight), to the mother (type of pregnancy, type of delivery, age, parity, and number of stillborn children), and to categorized multiple causes of death. Three profiles of early neonatal death were obtained through the Grade of Membership (GoM) method, which also made it possible to estimate their prevalence. Profile 1 comprised deaths that are difficult to reduce, with a prevalence of 41.4%; Profile 2, deaths amenable to reduction (prevalence of 28.3%); and Profile 3, reducible deaths (prevalence of 30.4%). These profiles made it possible to understand early neonatal mortality at the Maternidade Odete Valadares and to analyze its relationship with maternal reproductive and obstetric history and with the newborn's condition. The high prevalence of avoidable deaths stands out as a reality that must be confronted by health professionals and by the public health system.
Digital Repository Service at National Institute of Oceanography (India)
Madhu, N.V.; Ullas, N.; Ashwini, R.; Meenu, P.; Rehitha, T.V.; Lallu, K.R.
Phytoplankton marker pigments and their functional groups were identified for the first time in the Gulf of Mannar (GoM) and the Palk Bay (PB), located on the southeast coast of India, using HPLC–CHEMTAX analytical techniques. The GoM generally...
Directory of Open Access Journals (Sweden)
Maria Eponina de Abreu e Torres
2008-06-01
...of the women toward the consultation, which was greater among those with less schooling. It was also noted that the first gynecological visit occurred at very different moments for women of high and low schooling: for the former it is generally related to the beginning of sexual life and to contraceptive use, while for the less schooled the reason is usually linked to pregnancy. Nevertheless, regardless of schooling, age, and how often the interviewees seek this consultation, the great importance they attribute to it was evident. The present study focuses on gynecological consultations. It aims at investigating visits to gynecologists by women ages 18 to 59 living in the city of Belo Horizonte, Brazil, and their perceptions of the visit as a whole. The research was carried out in two stages. The first was a quantitative analysis using Grade of Membership (GoM) to define profiles of women who had been to a gynecologist in the 12 months prior to the research, compared to those who had not. The second stage consisted of a qualitative analysis of 33 semi-structured interviews with women whose characteristics are similar to the profiles defined in the first stage, in order to capture their perceptions regarding the visit. Quantitative data were obtained from the SRSR Project (Reproductive Health, Sexuality, and Race/Color), carried out by Cedeplar in 2002. The qualitative data were taken from the project entitled Quantitative and Qualitative Aspects of Access to Contraception, Diagnosis, and Treatment of Uterine Cancer: a proposal for analysis in the city of Belo Horizonte, MG, Brazil, carried out in 2005 and 2006 by Cedeplar. The quantitative results suggest that having made a gynecological visit during the previous year is strongly correlated with the socioeconomic and demographic characteristics of the women. Those who visit a gynecologist regularly and either pay for the appointment or have some
Collett, Timothy S.; Lee, Myung W.; Zyrianova, Margarita V.; Mrozewski, Stefan A.; Guerin, Gilles; Cook, Ann E.; Goldberg, Dave S.
2012-01-01
One of the objectives of the Gulf of Mexico Gas Hydrate Joint Industry Project Leg II (GOM JIP Leg II) was the collection of a comprehensive suite of logging-while-drilling (LWD) data within gas-hydrate-bearing sand reservoirs, in order to make accurate estimates of the concentration of gas hydrates under various geologic conditions and to understand the geologic controls on the occurrence of gas hydrate at each of the sites drilled during the expedition. The LWD sensors just above the drill bit provided important information on the nature of the sediments and the occurrence of gas hydrate. There have been significant advancements in the use of downhole well-logging tools to acquire detailed information on the occurrence of gas hydrate in nature: from the use of electrical resistivity and acoustic logs to identify gas hydrate occurrences in wells, to the routine use of wireline and advanced logging-while-drilling tools to examine the petrophysical nature of gas hydrate reservoirs and the distribution and concentration of gas hydrates within various complex reservoir systems. Recent integrated sediment coring and well-log studies have confirmed that electrical resistivity and acoustic velocity data can yield accurate gas hydrate saturations in sediment grain-supported (isotropic) systems such as sand reservoirs, but more advanced log-analysis models are required to characterize gas hydrate in fractured (anisotropic) reservoir systems. In support of the GOM JIP Leg II effort, well-log data montages have been compiled and presented in this report, including downhole logs obtained from all seven wells drilled during the expedition, with a focus on identifying and characterizing the potential gas-hydrate-bearing sedimentary section in each well. Also presented and reviewed are the gas-hydrate saturation and sediment porosity logs for each well, as calculated from the available downhole well logs.
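Resistivity-based gas hydrate saturation estimates of the kind described here are commonly made with Archie's equation, treating hydrate as a resistive pore fill. A sketch with illustrative Archie constants and log readings (the values are not from the JIP Leg II wells):

```python
import numpy as np

def hydrate_saturation(Rt, Rw, phi, a=1.0, m=2.0, n=2.0):
    """Archie-based water saturation; gas hydrate is assumed to fill the rest
    of the pore space. Rt: measured formation resistivity (ohm-m),
    Rw: pore-water resistivity (ohm-m), phi: porosity (fraction).
    a, m, n are Archie constants (illustrative defaults)."""
    Sw = ((a * Rw) / (phi**m * Rt)) ** (1.0 / n)    # Archie water saturation
    return 1.0 - np.clip(Sw, 0.0, 1.0)              # hydrate saturation Sh = 1 - Sw

# Example: a sand at 40% porosity reading 20 ohm-m, where a fully
# water-saturated sand would read about Rw / phi^m = 0.25 / 0.16 ohm-m.
Sh = hydrate_saturation(Rt=20.0, Rw=0.25, phi=0.40)
print(f"estimated hydrate saturation: {Sh:.2f}")
```

This isotropic form is the reason resistivity logs work well in grain-supported sands; in fractured (anisotropic) reservoirs the measured Rt depends on fracture orientation, which is why the report notes that more advanced models are needed there.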
GoM Estuarine Bottlenose Dolphin Photo-identification studies - NRDA
National Oceanic and Atmospheric Administration, Department of Commerce — These data sets include a compilation of small vessel based studies of bottlenose dolphins that reside within Barataria Bay, LA, Mississippi Sound, MS and nearshore...
Empowerment et diversité culturelle : quelques prolégomènes
Gagnon, Alain G.; May, Paul
2010-01-01
This text explores the articulation between empowerment and cultural diversity, and in particular promising ways of pairing these two notions. The contribution of Charles Taylor's philosophy, combined with recent theories of federalism, allows a fresh look at identity claims in urban settings and a sketch of institutional solutions to the dual challenge posed by cultural pluralism and the new economic-political configuration underlying globalization.
Empowerment et diversité culturelle : quelques prolégomènes
Directory of Open Access Journals (Sweden)
Alain G. Gagnon
2010-07-01
Full Text Available This article explores the articulation between empowerment and cultural diversity, and in particular the promising ways of pairing these two notions. Charles Taylor's philosophy, combined with recent theories of federalism, allows a fresh look at identity claims in urban settings and sketches institutional solutions to the dual challenge posed by cultural pluralism and the new economic-political configuration underlying globalization. Despite its positive aspects, and following Bernard Jouve's work, the article shows the limits and the equivocal uses that can be made of empowerment, which sometimes serves to justify a retreat of the welfare state from the most disadvantaged populations, and which does not always offer the guarantees needed to fight discrimination arising within "minorities within minorities".
Usability engineering of "In Vehicle Information Systems" with multi-tasking GOMS
Urbas, L.; Leuchter, S.
2008-01-01
Developments in vehicle electronics and new services promise more convenience in driving. The offers and ideas range from vehicle-related installations, such as accident alert, petrol station assistance, dynamic navigation, and travel guides, to communication and entertainment services. One central design problem is essential for achieving the main objective of safe motor vehicle driving: the use of a new service must not unduly distract the driver. In th...
Directory of Open Access Journals (Sweden)
Diana Oya Sawyer
2002-01-01
Full Text Available Health care services must respond to population demands that result from the combination of social, individual, and cultural factors, which requires knowledge of health service consumption patterns. In this article, four health consumption profiles were generated through the Grade of Membership (GoM) technique. Andersen's theoretical model of health service utilization served as the frame of reference, allowing health service demand to be estimated according to high and low levels of enabling, need, and predisposing factors. Special attention should be given to the group with high need and predisposition but low enabling resources, which represents 14% of the Brazilian population over 14 years of age (excluding the Northern region) and consists predominantly of elderly people who live alone and have a high need for specialized services.
Directory of Open Access Journals (Sweden)
Palmira de Fátima Bonolo
2008-11-01
Full Text Available The aim of the present study was to describe vulnerability profiles and to verify their association with non-adherence to antiretroviral therapy (ART) among 295 HIV patients receiving their first prescription in two public referral centers in Minas Gerais State, Brazil. The cumulative incidence of non-adherence was 36.9%. Three pure vulnerability profiles (lower, medium, and higher) were identified with the Grade of Membership (GoM) method. Pure-type patients of the "higher vulnerability" profile had, compared to the overall sample, an increased probability of being younger, not understanding the need for ART, having a personal reason to be HIV-tested, not disclosing their HIV status, having more than one (non-regular) sexual partner, reporting use of alcohol, tobacco, and illicit drugs, and reporting sex between men. Non-adherence to ART was statistically associated with this profile (p < 0.001). Sample heterogeneity was high, since more than 40% of the patients were mixed types.
DEFF Research Database (Denmark)
Mathiesen, Brian Vad; Liu, Wen; Zhang, Xiliang
2014-01-01
three major technological changes: energy savings on the demand side, efficiency improvements in energy production, and the replacement of fossil fuels by various sources of renewable energy. Consequently, the analysis of these systems must include strategies for integrating renewable sources...
Transferências intergeracionais privadas na Amazônia rural brasileira
Directory of Open Access Journals (Sweden)
Gilvan Ramalho Guedes
2009-08-01
Full Text Available What motivates family members to share resources? Past research argues, on the one hand, for love and altruism and, on the other, for the expectation of reciprocity. Drawing on this literature, this paper examines intergenerational transfers between small farmers and their non-coresident children in the rural area around the city of Altamira, Pará, Brazil. We apply Grade of Membership (GoM) models to create profiles of private transfers, using data collected in 2005 by a team from Indiana University. The results show three profiles: low intergenerational transfers, high levels of transfers of visits and help, and high levels of transfers of visits and money. There is no clear difference in profile by birth order, but we do find sex differences: men are more likely to send money, while women provide time transfers (work and visits). Upward transfers are most common from children with high levels of education or living in urban areas, suggesting a repayment of prior investments made by parents. Thus, our empirical evidence supports theories arguing that transfers are motivated by intertemporal contracts between parents and children, and that altruistic theories of family transfers should be rethought among rural agricultural populations in contexts characterized by many environmental and institutional challenges.
Directory of Open Access Journals (Sweden)
Haroldo da Gama Torres
2010-01-01
Full Text Available Unlike classic public policy analyses that emphasize the role of policy designers, this article seeks to show that the choices made by those directly responsible for implementing primary education policy have a significant bearing on the impact of those policies. The analysis, based on a survey of 800 teachers from the first to fourth years of primary education in state and municipal schools in São Paulo, was organized around a typology of teachers built with the Grade of Membership (GoM) technique. The analysis showed that the types of teachers in post differ between the school networks and according to the school's location (in a poor area or not).
Directory of Open Access Journals (Sweden)
Selvin J. PITCHAIKANI
2017-06-01
Full Text Available Principal component analysis (PCA) is a technique used to emphasize variation and bring out strong patterns in a dataset, and it is often used to make data easier to explore and visualize. The primary objective of the present study was to record information on zooplankton diversity in a systematic way and to study the variability and relationships among the seasons prevailing in the Gulf of Mannar. The PCA of seasonal zooplankton diversity was investigated using the four seasonal datasets to understand the statistical significance among the four seasons. Two different principal components (PCs) were segregated homogeneously in all seasons. The PCA revealed that Temora turbinata is an opportunistic species, that zooplankton diversity differed significantly from season to season, and that zooplankton abundance and its dynamics in the Gulf of Mannar are principally structured by seasonal current patterns. The factor loadings of zooplankton for the different seasons in Tiruchendur coastal waters (GoM) differ from those on the Southwest coast of India; in particular, routine and opportunistic species were found within the positive and negative factors. The copepods Acrocalanus gracilis and Acartia erythrea were dominant in summer and the Southwest monsoon owing to rainfall and freshwater discharge during the summer season; however, these species were replaced by Temora turbinata during the Northeast monsoon season.
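The PCA workflow described above, centering an abundance matrix, extracting components, and inspecting the variance explained and the loadings, can be sketched with an SVD on a synthetic seasons-by-taxa dataset (the data are invented, not the Gulf of Mannar counts):

```python
import numpy as np

# Synthetic "abundance" matrix: rows = sampling occasions (4 seasons x 6 samples),
# columns = zooplankton taxa; a shared seasonal signal plus noise.
rng = np.random.default_rng(2)
season = np.repeat(np.array([-3.0, -1.0, 1.0, 3.0]), 6)[:, None]  # seasonal levels
X = season * rng.normal(1.0, 0.1, (1, 8)) + rng.normal(0, 0.3, (24, 8))

Xc = X - X.mean(axis=0)                  # center each taxon
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
explained = s**2 / np.sum(s**2)          # fraction of variance per component
scores = Xc @ Vt.T                        # sample scores on the PCs
print(f"PC1 explains {explained[0]:.0%} of the variance")
```

Because all taxa here share one seasonal driver, PC1 captures most of the variance; in the study's terms, the loadings in `Vt` are what separate routine from opportunistic species on the positive and negative factors.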
Wei, Shih-Chun; Fan, Shen; Lien, Chia-Wen; Unnikrishnan, Binesh; Wang, Yi-Sheng; Chu, Han-Wei; Huang, Chih-Ching; Hsu, Pang-Hung; Chang, Huan-Tsung
2018-03-20
A graphene oxide (GO) nanosheet-modified N+-nylon membrane (GOM) has been prepared and used as an extraction and spray-ionization substrate for robust mass spectrometric detection of malachite green (MG), a highly toxic disinfectant, in liquid samples and fish meat. The GOM is prepared by self-deposition of a GO thin film onto an N+-nylon membrane, which is used for efficient extraction of MG from aquaculture water samples or homogenized fish meat samples. With a dissociation constant of 2.17 × 10−9 M−1, the GOM allows extraction of approximately 98% of 100 nM MG. Coupling of the GOM-spray with an ion-trap mass spectrometer allows quantitation of MG in aquaculture freshwater and seawater samples down to nanomolar levels. Furthermore, the system possesses high selectivity and sensitivity for the quantitation of MG and its metabolite (leucomalachite green) in fish meat samples. With the easy extraction and efficient spray-ionization properties of the GOM, this membrane-spray mass spectrometry technique is relatively simple and fast in comparison to traditional LC-MS/MS methods for the quantitation of MG and its metabolite in aquaculture products. Copyright © 2017 Elsevier B.V. All rights reserved.
Directory of Open Access Journals (Sweden)
Xinrong Ren
2014-04-01
Full Text Available During two intensive studies in summer 2010 and spring 2011, measurements of mercury species including gaseous elemental mercury (GEM), gaseous oxidized mercury (GOM), and particulate-bound mercury (PBM), trace chemical species including O3, SO2, CO, NO, NOy, and black carbon, and meteorological parameters were made at an Atmospheric Mercury Network (AMNet) site at the Grand Bay National Estuarine Research Reserve (NERR) in Moss Point, Mississippi. Surface measurements indicate that the mean mercury concentrations were 1.42 ± 0.12 ng∙m−3 for GEM, 5.4 ± 10.2 pg∙m−3 for GOM, and 3.1 ± 1.9 pg∙m−3 for PBM during the summer 2010 intensive and 1.53 ± 0.11 ng∙m−3 for GEM, 5.3 ± 10.2 pg∙m−3 for GOM, and 5.7 ± 6.2 pg∙m−3 for PBM during the spring 2011 intensive. Elevated daytime GOM levels (>20 pg∙m−3) were observed on a few days in each study and were usually associated with either elevated O3 (>50 ppbv), BrO, and solar radiation or elevated SO2 (>a few ppbv) but lower O3 (~20–40 ppbv). This behavior suggests two potential sources of GOM: photochemical oxidation of GEM and direct emissions of GOM from nearby local sources. The lack of correlation between GOM and beryllium-7 (7Be) suggests little influence on surface GOM from downward mixing of GOM from the upper troposphere. These data were analyzed using the HYSPLIT back trajectory model and principal component analysis in order to develop source-receptor relationships for mercury species in this coastal environment. Trajectory frequency analysis shows that high GOM events were generally associated with high frequencies of trajectories passing through areas with high mercury emissions, while low GOM levels were largely associated with trajectories passing through relatively clean areas. Principal component analysis also reveals two main factors, direct emission and photochemical processes, that were clustered with high GOM and PBM. This study indicates that the receptor site
Huang, J.; Miller, M. B.; Edgerton, E.; Gustin, M. S.
2015-04-01
The highest mercury (Hg) wet deposition in the United States (US) occurs along the Gulf of Mexico, and in the southern and central Mississippi River Valley. Gaseous oxidized Hg (GOM) is thought to be a major contributor due to its high water solubility and reactivity. Therefore, it is critical to understand the concentrations, potential for wet and dry deposition, and GOM compounds present in the air. Concentrations and dry deposition fluxes of GOM were measured at Outlying Landing Field (OLF), Florida, using a Tekran® 2537/1130/1135, and active and passive samplers using cation-exchange and nylon membranes. Relationships with Tekran®-derived data must be interpreted with caution, since GOM concentrations can be biased low depending on the chemical compounds in air, and interferences with water vapor and ozone. Only gaseous elemental Hg and GOM are discussed here since the PBM measurement uncertainties are higher. Criteria air pollutants were concurrently measured and Tekran® data were assessed along with these using Principal Component Analysis to identify associations among air pollutants. Based on the diel pattern, high GOM concentrations at this site were associated with fossil fuel combustion and gas phase oxidation during the day, and gas phase oxidation and transport in the free troposphere. The ratio of GEM/CO at OLF (0.008 ng m-3 ppbv-1) was much higher than the numbers reported for the Western United States and central New York for domestic emissions or biomass burning (0.001 ng m-3 ppbv-1), which we suggest is indicative of a marine boundary layer source. Results from nylon membranes with thermal desorption analyses suggest five potential GOM compounds exist in this area, including HgBr2, HgO, Hg(NO3)2, HgSO4, and an unknown compound. This indicates that the site is influenced by different gaseous phase reactions and sources. A high GOM event related to high CO but average SO2 suggests the air parcels moved from the free troposphere and across
2013-03-04
... a limited number of paper copies. In keeping with the Department of the Interior's mission to.... However, if you require a paper copy, BOEM will provide one upon request if copies are still available. 1...:00 p.m. EDT; Panama City Beach, Florida: Wednesday, March 27, 2013, Wyndham Bay Point Resort, 4114...
2011-04-20
... EIS and the 2009-2012 SEIS and to consider the Deepwater Horizon event. This Draft SEIS provides... activities and accidental events, including a possible large-scale event, associated with the proposed WPA...
2012-11-09
... information in light of the Deepwater Horizon event. This Draft Supplemental EIS provides updates on the... consideration of the Deepwater Horizon event, reviewing scientific journals, available scientific data, and... impacts of routine activities and accidental events, and the proposed lease sales' incremental...
2013-08-23
... Prepare a Supplemental Environmental Impact Statement (EIS). SUMMARY: Consistent with the regulations... Supplemental EIS will update the environmental and socioeconomic analyses in the Gulf of Mexico OCS Oil and Gas... Area Lease Sales 227, 231, 235, 241, and 247, Final Environmental Impact Statement (OCS EIS/EA BOEM...
2013-10-28
... (NOA) of the Draft Supplemental Environmental Impact Statement (EIS) and Public Meetings. SUMMARY: BOEM... Impact Statement (OCS EIS/EA BOEM 2012-019) (2012- 2017 WPA/CPA Multisale EIS) and in the Gulf of Mexico... Lease Sale 231, Final Supplemental Environmental Impact Statement (OCS EIS/EA BOEM 2013-0118) (WPA 233...
2013-04-12
.... ACTION: Notice of Availability (NOA) of the Final Supplemental Environmental Impact Statement (EIS... Environmental Impact Statement (OCS EIS/EA BOEM 2012-019) (2012-2017 Multisale EIS), completed in July 2012, in... to the cumulative impacts on environmental and socioeconomic resources. The oil and gas resource...
2013-07-16
... Prepare a Supplemental Environmental Impact Statement (EIS) SUMMARY: Consistent with the regulations... 248; Central Planning Area (CPA) Lease Sales 227, 231, 235, 241, and 247, Final Environmental Impact... Supplemental Environmental Impact Statement (OCS EIS/EA BOEM 2013-0118) (WPA 233/CPA 231 Supplemental EIS). The...
2010-11-10
... focus on updating the baseline conditions and potential environmental effects of oil and natural gas.... Comments Public meetings will be held in locations near these areas in early to mid November 2010. The...
Tyagi, M.; Zulqarnain, M.
2017-12-01
Offshore oil and gas exploration and production involve some of the most advanced and challenging technologies of our time. These technologically complex operations also carry the risk of major accidents, as demonstrated by disasters such as the explosion and fire on the UK production platform Piper Alpha, the loss of the Canadian semi-submersible drilling rig Ocean Ranger, and the explosion and capsizing of the Deepwater Horizon rig in the Gulf of Mexico. Quantitative Risk Assessment (QRA) makes it possible to estimate quantitatively the safety of various operations, and their associated risks and significance, over the entire life of an offshore project. In an underground blowout, uncontrolled formation fluids from a higher-pressure formation may charge shallower, overlying low-pressure formations or may migrate to the sea floor. Consequences range from no visible damage at the surface to complete loss of the well, loss of the drilling rig, seafloor subsidence, or hydrocarbons discharged to the environment. Such blowouts may go unnoticed until over-pressured sands, created by charging from the higher-pressure reservoir, are encountered. Moreover, the engineering formulas used to estimate fault permeability and thickness are very simple in nature and add uncertainty to the estimated parameters. In this study, the potential for a deepwater underground blowout during the drilling phase of a well in the Popeye-Genesis field in the Gulf of Mexico is assessed by estimating the time taken to charge a shallower zone to its leak-off test (LOT) value. Parametric simulation results for the selected field case show that for a relatively high-permeability (k = 40 mD) fault connecting a deep over-pressured zone to a shallower low-pressure zone of similar reservoir volume, the time to recharge the shallower zone to its threshold LOT value is about 135 years.
If the ratio of the reservoir volumes of the shallower to the deeper zone is about 0.1, the recharge time decreases significantly, to about 24 years. Hydrocarbons might also migrate through the casing-wellbore annulus via delamination fractures at the cement-rock or cement-casing interfaces, or through any other micro-annulus gap not isolated by cement.
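The recharge-time results above come from parametric simulation; the same physics can be sketched at order-of-magnitude level with single-phase Darcy flow through the fault and a compressibility-based storage term. Every parameter value below is an illustrative assumption, none is taken from the study, and the result is not expected to match the 135-year figure:

```python
# Order-of-magnitude recharge-time sketch: Darcy rate through a fault,
# divided into the fluid volume needed to pressurize the shallow zone.
MD_TO_M2 = 9.869e-16          # millidarcy -> m^2

k = 40 * MD_TO_M2             # fault permeability (m^2), illustrative
A = 1.0e4                     # fault cross-sectional flow area (m^2), assumed
L = 1500.0                    # flow-path length along the fault (m), assumed
mu = 5.0e-4                   # fluid viscosity (Pa*s), assumed
dP = 2.0e7                    # overpressure driving the flow (Pa), assumed

q = k * A * dP / (mu * L)     # Darcy volumetric rate (m^3/s)

ct = 1.0e-9                   # total compressibility of shallow zone (1/Pa), assumed
V = 1.0e8                     # shallow-zone pore volume (m^3), assumed
dP_target = 1.0e7             # build-up needed to reach the LOT threshold (Pa), assumed

dV = ct * V * dP_target       # fluid volume required to raise the pressure
t_years = dV / q / (365.25 * 24 * 3600)
print(f"recharge time ~ {t_years:.1f} years")
```

Shrinking the shallow-zone volume `V` shortens the recharge time proportionally, which is the qualitative effect the abstract reports for the 0.1 volume ratio.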
2012-10-26
... leases in depths less than 400 meters with an initial period longer than 5 years, royalty rates, minimum... $25.00 per acre or fraction thereof for blocks in water depths of less than 400 meters. $100.00 per acre or fraction thereof for blocks in water depths of 400 meters or deeper. Rental Rates Annual rental...
2011-11-14
... period of the lease term for blocks in water depths of 400 meters to less than 1,600 meters, (2) the minimum bonus bid has increased for blocks in water depths of 400 meters or deeper, (3) no deepwater... meters and (2) 400 meters or more. Successful Bidders: The BOEM requires each company that has been...
2010-02-12
... for blocks in water depths of 400 meters to less than 1,600 meters. Blocks in 400 to less than 800... resulting from this lease sale. Leases in water depths of 400 meters to less than 800 meters will be offered... still may require the full 10-year term. In both the 400-800 and 800-1,600 meter cases, the lease...
AMMI and GGE biplot analysis for yield stability of promising bread wheat genotypes in Bangladesh
International Nuclear Information System (INIS)
Ashrafulalam, M.; Li, M.; Farhad, M.; Hakim, M. A.
2017-01-01
Identifying stable, high-yielding varieties under different environmental conditions prior to release is a major step in plant breeding. Eight promising wheat genotypes were evaluated against two standard checks across five locations under terminal heat stress. The experimental design was an RCBD with three replications over one year. AMMI analyses showed significant (p<0.01) variation in genotype, location and genotype-by-location interaction with respect to grain yield. The ASV values indicated that GEN4, GEN9 and GEN8 were stable, while GEN5, GEN1 and GEN6 were the most sensitive genotypes. The GGE results likewise confirmed GEN3, GEN7, GEN8, GEN9 and GEN4 as the most stable cultivars. Five distinct mega-environments were identified, including Dinajpur and Jamalpur, where GEN3, GEN7 and GEN8 were most favorable, and Joydebpur, Rajshahi and Jessore, where GEN4 and GEN9 were most favorable. Genotypes GEN7 and GEN8 were highly resistant to BpLB, GEN3 and GEN4 moderately resistant, and GEN9 moderately susceptible; all five genotypes were resistant to leaf rust. Genotype GEN7 (BAW 1202) was released as BARI Gom 32. Considering all analyses, GEN3 (BAW 1194), GEN7 (BAW 1202) and GEN8 (BAW 1203) proved the most stable genotypes, combining high mean yield with resistance to BpLB and leaf rust, and can therefore serve as suitable plant material for future breeding programs. (author)
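The ASV ranking above follows the usual AMMI Stability Value formula (commonly attributed to Purchase et al., 1997): ASV = sqrt(((SS_IPCA1/SS_IPCA2) * IPCA1)^2 + IPCA2^2), where a smaller value means a more stable genotype. The interaction scores and sum-of-squares ratio below are invented for illustration, not the paper's estimates:

```python
import numpy as np

def asv(ipca1, ipca2, ss_ratio):
    """AMMI Stability Value from the first two interaction PCA scores."""
    ipca1 = np.asarray(ipca1, dtype=float)
    ipca2 = np.asarray(ipca2, dtype=float)
    return np.sqrt((ss_ratio * ipca1) ** 2 + ipca2 ** 2)

ipca1 = [0.2, -1.1, 0.05, 0.9]   # hypothetical IPCA1 scores for 4 genotypes
ipca2 = [0.4, 0.3, -0.1, -0.8]   # hypothetical IPCA2 scores
values = asv(ipca1, ipca2, ss_ratio=2.5)  # assumed SS_IPCA1/SS_IPCA2
ranking = np.argsort(values)      # most stable (smallest ASV) first
print(values.round(3), ranking)
```

The genotype with near-zero scores on both interaction axes comes out first in the ranking, which is exactly the behavior exploited when shortlisting stable entries.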
Speciated atmospheric mercury on haze and non-haze days in an inland city in China
Directory of Open Access Journals (Sweden)
Q. Hong
2016-11-01
Full Text Available Long-term continuous measurements of speciated atmospheric mercury were conducted from July 2013 to June 2014 in Hefei, a midlatitude inland city in eastern central China that experiences frequent haze pollution. The mean concentrations (± standard deviation) of gaseous elemental mercury (GEM), gaseous oxidized mercury (GOM) and particle-bound mercury (PBM) were 3.95 ± 1.93 ng m−3, 2.49 ± 2.41 pg m−3 and 23.3 ± 90.8 pg m−3, respectively, on non-haze days, and 4.74 ± 1.62 ng m−3, 4.32 ± 8.36 pg m−3 and 60.2 ± 131.4 pg m−3, respectively, on haze days. Potential source contribution function (PSCF) analysis suggested that atmospheric mercury pollution on haze days was caused primarily by local emissions, rather than by long-range transport. The poorer mixing conditions on haze days also favored the accumulation of atmospheric mercury. Compared to GEM and GOM, PBM was especially sensitive to haze pollution. The mean PBM concentration on haze days was 2.5 times that on non-haze days due to elevated concentrations of particulate matter. PBM also showed a clear seasonal trend; its concentration was highest in fall and winter, decreased rapidly in spring and was lowest in summer, following the same order as the frequency of haze days in the different seasons. On both non-haze and haze days, GOM concentrations remained low at night but increased rapidly just before sunrise, which could be due to diurnal variation in air exchange between the boundary layer and free troposphere. However, non-haze and haze days showed different trends in daytime GEM and GOM concentrations. On non-haze days, GEM and GOM declined synchronously through the afternoon, probably due to the retreat of the free tropospheric air as the height of the atmospheric boundary layer increases. In contrast, on haze days, GOM and GEM showed opposite trends, with the highest GOM and lowest GEM observed in the afternoon, suggesting the occurrence of
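The PSCF analysis used above has a simple core: each grid cell's value is the fraction of trajectory endpoints in that cell that belong to "polluted" arrivals (receptor concentrations above some threshold, often the 75th percentile). A minimal sketch with synthetic counts and an assumed endpoint-count weighting scheme:

```python
import numpy as np

# PSCF(i,j) = m(i,j) / n(i,j):
#   n counts all back-trajectory endpoints falling in cell (i,j);
#   m counts endpoints from trajectories with above-threshold concentrations.
# Both count grids below are made up for illustration.
n = np.array([[40, 10], [5, 25]], dtype=float)   # all endpoints per cell
m = np.array([[30,  2], [1, 20]], dtype=float)   # high-concentration endpoints

pscf = np.where(n > 0, m / n, 0.0)

# A common weighting damps cells visited by few trajectories to reduce noise.
weight = np.where(n >= 20, 1.0, np.where(n >= 10, 0.7, 0.4))
pscf_weighted = pscf * weight
print(pscf_weighted)
```

Cells with a high weighted value are interpreted as likely source regions; the weighting keeps sparsely sampled cells from dominating the map.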
Ultrastructural analysis of small blood vessels in skin biopsies in CADASIL
Directory of Open Access Journals (Sweden)
Lačković Vesna
2008-01-01
Full Text Available Cerebral autosomal dominant arteriopathy with subcortical infarcts and leukoencephalopathy (CADASIL is an inherited small- and medium-artery disease of the brain caused by mutation of the Notch3 gene. Very often, this disease is misdiagnosed. We examined skin biopsies in two members of the first discovered Serbian family affected by CADASIL. Electron microscopy showed that skin blood vessels of both patients contain numerous deposits of granular osmiophilic material (GOM around vascular smooth muscle cells (VSMCs. We observed degeneration of VSMCs, reorganization of their cytoskeleton and dense bodies, disruption of myoendothelial contacts, and apoptosis. Our results suggest that the presence of GOM in small skin arteries represents a specific marker in diagnosis of CADASIL.
Deep-water oilfield development cost analysis and forecasting: the Gulf of Mexico as an example
Shi, Mingyu; Wang, Jianjun; Yi, Chenggao; Bai, Jianhui; Wang, Jing
2017-11-01
The Gulf of Mexico (GoM) is the earliest offshore oilfield area to have been developed, and it continues to produce increasingly valuable efficient, secure and low-cost key technologies for deep-water development. An analysis of development expenditure in this area is therefore of great importance to the evaluation of deep-water oilfield concepts worldwide. This article focuses on deep-water development concepts and EPC contract values in the GoM over the past 10 years, comparing and selecting options for economic efficiency. In addition, QUETOR, which processes the largest upstream cost database, was used in this research to simulate and calculate the expenditure of the worked examples. By analyzing and forecasting deep-water oilfield development expenditure, the article explores the relationship between the expenditure index and the oil price.
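The expenditure-index versus oil-price relationship can be explored with an ordinary least-squares fit. A sketch with synthetic numbers; neither the prices nor the cost indices come from the article:

```python
import numpy as np

# Hypothetical yearly observations: oil price vs. a relative cost index.
oil_price = np.array([45.0, 60.0, 75.0, 90.0, 105.0])   # USD/bbl
cost_index = np.array([0.82, 0.95, 1.02, 1.18, 1.30])   # relative cost

# Fit cost_index ~ slope * oil_price + intercept by least squares.
slope, intercept = np.polyfit(oil_price, cost_index, 1)
predicted = slope * oil_price + intercept
r2 = 1 - np.sum((cost_index - predicted) ** 2) / np.sum(
    (cost_index - cost_index.mean()) ** 2)
print(f"slope={slope:.4f}, intercept={intercept:.3f}, R^2={r2:.3f}")
```

A high R-squared on real data would support the kind of price-linked expenditure forecast the article describes; a low one would argue for adding other drivers (water depth, rig rates, contract structure).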
Pia Miglietta, Maria; Hourdez, Stephane; Cowart, Dominique A.; Schaeffer, Stephen W.; Fisher, Charles
2010-11-01
At least six morphospecies of vestimentiferan tubeworms are associated with cold seeps in the Gulf of Mexico (GOM). The physiology and ecology of the two best-studied species from depths above 1000 m in the upper Louisiana slope (Lamellibrachia luymesi and Seepiophila jonesi) are relatively well understood. The biology of one rare species from the upper slope (escarpiid sp. nov.) and three morphospecies found at greater depths in the GOM (Lamellibrachia sp. 1, L. sp. 2, and Escarpia laminata) are not as well understood. Here we address species distributions and boundaries of cold-seep tubeworms using phylogenetic hypotheses based on two mitochondrial genes. Fragments of the mitochondrial large ribosomal subunit rDNA (16S) and cytochrome oxidase subunit I (COI) genes were sequenced for 167 vestimentiferans collected from the GOM and analyzed in the context of other seep vestimentiferans for which sequence data were available. The analysis supported five monophyletic clades of vestimentiferans in the GOM. Intra-clade variation in both genes was very low, and there was no apparent correlation between the within-clade diversity and collection depth or location. Two of the morphospecies of Lamellibrachia from different depths in the GOM could not be distinguished by either mitochondrial gene. Similarly, E. laminata could not be distinguished from other described species of Escarpia from either the west coast of Africa or the eastern Pacific using COI. We suggest that the mitochondrial COI and 16S genes have little utility as barcoding markers for seep vestimentiferan tubeworms.
The metropolisation-regionalisation relationship: prolegomena for a paradigmatic renewal
Directory of Open Access Journals (Sweden)
Jean-Marc Fontan
2002-01-01
Full Text Available The idea of devoting a reflection to the link between metropolises and their peripheral regions emerged from the work presented at the 1998 conference of the Association d'économie politique (Fontan, Klein, Tremblay, 1999). Several speakers, including Veltz and Jalabert, made a fundamental observation: the analysis of metropolitan socio-economic reality cannot be confined to the referential framework of the regional development paradigm. This article pursues three goals. First, I present the reasons that guided the production of this issue of Interventions économiques. Second, drawing on empirical observations and theoretical developments, I raise some questions about the nature of past and future links between the Greater Montreal Region and the other regions of Quebec. Third, I introduce the contributions of the various authors to this issue.
International Nuclear Information System (INIS)
Sunith Shine, S.R.; Feroz Khan, M.; Godwin Wesley, S.
2013-01-01
Highlights: • Polonium-210 was quantified in the periwinkle Littorina undulata. • Smaller-sized periwinkles displayed higher polonium-210. • Marked variation in 210Po activity between seasons and sampling sites. • The internal dose rate was estimated using the ERICA Assessment Tool. • The daily intake and committed effective dose were estimated. -- Abstract: The polonium-210 activity concentration was analysed in whole-body tissue of the periwinkle Littorina undulata collected from the intertidal rocky shore along the Kudankulam coast. The study was carried out over a period of 12 months (2011–2012), covering three seasons. 210Po was distributed non-uniformly among the periwinkles depending on allometry. 210Po accumulation differed significantly between seasons, and smaller-sized periwinkles accumulated more 210Po than larger ones. 210Po activity varied from 13.5 to 58.9 Bq/kg (wet). The activity of 210Po was also quantified in seawater and intertidal sediments to calculate the biological concentration factor (BCF) and the radiation dose rate. The dose rate to the winkles, estimated with the ERICA Assessment Tool, was within the prescribed limit. The intake of 210Po through periwinkles delivered an effective dose in the range of 2.2–9.6 μSv/y to human beings
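The committed effective dose from ingestion follows the standard relation dose = activity concentration x annual intake x dose coefficient. The sketch below reproduces the form of the calculation, not the study's inputs: the intake rate is an assumption, and the dose coefficient is the ICRP adult ingestion value for 210Po (about 1.2e-6 Sv/Bq):

```python
# Ingestion dose sketch for 210Po in periwinkle tissue.
activity = 58.9          # Bq/kg wet weight, upper value quoted in the abstract
intake = 0.05            # kg/y of tissue consumed (assumed for illustration)
dose_coeff = 1.2e-6      # Sv/Bq, ICRP adult ingestion coefficient for 210Po

dose_usv = activity * intake * dose_coeff * 1e6   # microsievert per year
print(f"committed effective dose ~ {dose_usv:.2f} uSv/y")
```

With these assumed inputs the result falls inside the 2.2–9.6 μSv/y band reported, but that agreement is incidental to the illustration.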
Fuzzy linear programming approach for solving transportation ...
Indian Academy of Sciences (India)
ALI EBRAHIMNEJAD
Department of Mathematics, Qaemshahr Branch, Islamic Azad University, Qaemshahr, Iran e-mail: ..... est grade of membership at x are μ ˜AL (x) and μ ˜AU (x), respectively. ..... trapezoidal fuzzy numbers transportation problem (12) are.
Energy Technology Data Exchange (ETDEWEB)
Muntean, Marilena, E-mail: marilena.muntean@jrc.ec.europa.eu [European Commission, Joint Research Centre, Institute for Environment and Sustainability, Ispra (Italy); Janssens-Maenhout, Greet [European Commission, Joint Research Centre, Institute for Environment and Sustainability, Ispra (Italy); Song, Shaojie; Selin, Noelle E. [Massachusetts Institute of Technology, Cambridge, MA (United States); Olivier, Jos G.J. [PBL Netherlands Environment Assessment Agency, Bilthoven (Netherlands); Guizzardi, Diego [European Commission, Joint Research Centre, Institute for Environment and Sustainability, Ispra (Italy); Maas, Rob [RIVM National Institute for Public Health and Environment, Bilthoven (Netherlands); Dentener, Frank [European Commission, Joint Research Centre, Institute for Environment and Sustainability, Ispra (Italy)
2014-10-01
The Emission Database for Global Atmospheric Research (EDGAR) provides a time-series of man-made emissions of greenhouse gases and short-lived atmospheric pollutants from 1970 to 2008. Mercury is included in EDGARv4.tox1, thereby enriching the spectrum of multi-pollutant sources in the database. With an average annual growth rate of 1.3% since 1970, EDGARv4 estimates that the global mercury emissions reached 1287 tonnes in 2008. Specifically, gaseous elemental mercury (GEM) (Hg{sup 0}) accounted for 72% of the global total emissions, while gaseous oxidised mercury (GOM) (Hg{sup 2+}) and particle bound mercury (PBM) (Hg-P) accounted for only 22% and 6%, respectively. The less reactive form, i.e., Hg{sup 0}, has a long atmospheric residence time and can be transported long distances from the emission sources. Artisanal and small-scale gold production accounted for approximately half of the global Hg{sup 0} emissions in 2008, followed by combustion (29%), cement production (12%) and other metal industry (10%). Given the local-scale impacts of mercury, special attention was given to the spatial distribution showing the emission hot-spots on gridded 0.1° × 0.1° resolution maps using detailed proxy data. The comprehensive ex-post analysis of the mitigation of mercury emissions by end-of-pipe abatement measures in the power generation sector and technology changes in the chlor-alkali industry over four decades indicates reductions of 46% and 93%, respectively. Combined, the improved technologies and mitigation measures in these sectors accounted for 401.7 tonnes of avoided mercury emissions in 2008. A comparison shows that EDGARv4 anthropogenic emissions are nearly equivalent to the lower estimates of the United Nations Environment Programme (UNEP)'s mercury emissions inventory for 2005 for most sectors. An evaluation of the EDGARv4 global mercury emission inventory, including mercury speciation, was performed using the GEOS-Chem global 3-D mercury model. The
Pan, Ning; Li, Long; Ding, Jie; Li, Shengke; Wang, Ruibing; Jin, Yongdong; Wang, Xiangke; Xia, Chuanqin
2016-05-15
Manganese dioxide decorated graphene oxide (GOM) was prepared via fixation of crystallographic MnO2 (α, γ) on the surface of graphene oxide (GO) and was explored as an adsorbent material for simultaneous removal of thorium/uranium ions from aqueous solutions. In single-component systems (Th(IV) or U(VI)), α-GOM2 (GO/α-MnO2 weight ratio of 2) exhibited higher maximum adsorption capacities toward both Th(IV) (497.5 mg/g) and U(VI) (185.2 mg/g) than GO. In the binary-component system (Th(IV)/U(VI)), the saturated adsorption capacities of Th(IV) (408.8 mg/g) and U(VI) (66.8 mg/g) on α-GOM2 were also higher than those on GO. Based on the analysis of various data, it is proposed that the adsorption process may involve four types of molecular interactions between Th(IV)/U(VI) and α-GOM2: coordination, electrostatic interaction, cation-pi interaction, and Lewis acid-base interaction. Finally, the Th(IV)/U(VI) ions on α-GOM2 can be separated by a two-stage desorption process with Na2CO3/EDTA. These results show that α-GOM2 may serve as a potential adsorbent for removing and separating Th(IV)/U(VI) ions from aqueous solutions. Copyright © 2016 Elsevier B.V. All rights reserved.
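Maximum adsorption capacities like those quoted for α-GOM2 are typically obtained by fitting an isotherm such as the Langmuir model, q = q_max·K·C/(1 + K·C). A sketch in which q_max is taken from the abstract's Th(IV) figure while the affinity constant K and the concentration grid are assumptions:

```python
import numpy as np

def langmuir(C, q_max, K):
    """Langmuir isotherm: adsorbed amount q (mg/g) vs. equilibrium conc. C (mg/L)."""
    return q_max * K * C / (1.0 + K * C)

q_max = 497.5                              # mg/g, Th(IV) on alpha-GOM2 (abstract)
K = 0.05                                   # L/mg, hypothetical affinity constant
C = np.array([1.0, 10.0, 100.0, 1000.0])   # equilibrium concentrations, mg/L

q = langmuir(C, q_max, K)
print(q.round(1))   # adsorption rises monotonically and saturates near q_max
```

In practice q_max and K are estimated by fitting measured (C, q) pairs, often via the linearized form C/q = C/q_max + 1/(K·q_max).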
2013-06-06
...) Georges Bank winter flounder. Sec. 648.86(l) Zero retention of Atlantic wolffish. Sec. 648.86(o..., fecundity, bioelectrical impedance analysis (BIA), food habits, and genetic research. The yellowtail... (GOM), Georges Bank (GB), Southern New England/Mid-Atlantic (SNE/MA)), or 1,200 fish total from each...
Decision analysis multicriteria analysis
International Nuclear Information System (INIS)
Lombard, J.
1986-09-01
The ALARA procedure covers a wide range of decisions, from the simplest to the most complex. For the simplest, engineering judgement is generally sufficient and a decision-aiding technique is not necessary. For some decisions, the available protection options can be compared on two or a few criteria (or attributes) (protection cost, collective dose, ...), and rather simple decision-aiding techniques, such as cost-effectiveness analysis or cost-benefit analysis, are quite adequate. For the more complex decisions, involving numerous criteria, large uncertainties or qualitative judgement, these techniques, even extended cost-benefit analysis, are not recommended, and appropriate techniques such as multi-attribute decision-aiding methods are more relevant. There are many such techniques and it is not possible to present them all; only two broad categories of multi-attribute decision-aiding techniques are presented here: decision analysis and outranking analysis.
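A minimal weighted-sum multi-attribute score illustrates the kind of technique the text contrasts with cost-benefit analysis. The protection options, criterion values and weights below are all invented for illustration:

```python
import numpy as np

# Options scored on two criteria: protection cost (minimize) and
# collective dose averted (maximize). Values are hypothetical.
options = {"A": (100.0, 2.0), "B": (250.0, 5.0), "C": (400.0, 5.5)}  # (k$, person-Sv)
weights = np.array([0.4, 0.6])   # assumed relative importance: cost vs. dose averted

costs = np.array([v[0] for v in options.values()])
doses = np.array([v[1] for v in options.values()])

# Normalize each criterion to [0, 1]; invert cost so that higher is better.
cost_score = 1 - (costs - costs.min()) / (costs.max() - costs.min())
dose_score = (doses - doses.min()) / (doses.max() - doses.min())

scores = weights[0] * cost_score + weights[1] * dose_score
best = list(options)[int(np.argmax(scores))]
print(dict(zip(options, scores.round(3))), "best:", best)
```

Real multi-attribute methods add value functions and weight-elicitation steps, but the aggregation shown here is the core of the simplest variant.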
Knowledge gained from analyzing mercury speciation data monitored in North America
Zhang, L.; Cheng, I.; Gay, D. A.; Xu, X.; Wu, Z.
2017-12-01
This presentation summarizes knowledge gained in several recent studies through analysis and application of mercury (Hg) speciation data monitored in North America. Annual Hg dry deposition to vegetated surfaces in rural or remote environments in North America was dominated by leaf uptake of gaseous elemental mercury (GEM), contrary to what was commonly assumed in earlier studies, which frequently omitted GEM dry deposition as an important process (Zhang et al., EST, 2016). Dry deposition exceeded wet deposition by a large margin in all seasons except summer at the majority of the sites. Based on the gaseous oxidized mercury (GOM) concentrations predicted from measured Hg wet deposition using a scavenging-ratio method, multi-year average GOM concentrations collected using the Tekran speciation instrument were likely biased low by a factor of 2 at about half of the studied sites (Cheng and Zhang, EST, 2017). A decline in the number of source regions impacting ambient GEM and GOM was found from 2005-2014 at an eastern U.S. site through concentration-weighted trajectory (CWT) analysis (Cheng et al., JAS, 2017). Source contributions in 2011-2014 were lower than in 2006-2008 by up to 20% for GEM, more than 60% for GOM, and 20-60% for PBM, largely due to power plant Hg emission reductions since 2009. A study comparing Positive Matrix Factorization (PMF) and Principal Components Analysis (PCA) receptor methods identified similar sources impacting Kejimkujik National Park, Canada, including combustion, industrial sulfur, photochemistry and re-emissions, and oceanic sea-salt emissions. Improving the quality of the Hg data used in receptor methods by imputation did not improve the PMF results, but reducing the fraction of below-detection-limit data was effective (Xu et al., ACP, 2017). PCA results using reactive mercury (RM=GOM+PBM) or excluding low GOM values were similar to those using the original data. Source contributions from CWT analysis were more
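The scavenging-ratio method mentioned above back-calculates an air concentration from wet deposition via W = C_precip / C_air, so C_air = C_precip / W. A minimal sketch of the arithmetic; the deposition, precipitation and W values are assumptions chosen only for illustration, not values from the cited study:

```python
# Back-calculate a GOM air concentration from annual wet deposition.
wet_dep = 10.0            # ug m-2 yr-1, annual Hg wet deposition (assumed)
precip = 1.2              # m yr-1 of precipitation (assumed)
W = 1.0e6                 # dimensionless scavenging ratio (assumed)

c_precip = wet_dep / precip          # ug per m^3 of rainwater
c_air_pg = c_precip / W * 1e6        # implied air concentration, pg m-3
print(f"implied GOM ~ {c_air_pg:.1f} pg m-3")
```

Comparing such an implied concentration with the Tekran measurement at the same site is the basis for the low-bias estimate cited above; the conclusion is sensitive to the assumed W.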
Occupational vulnerabilities and health perception among SUS workers
Directory of Open Access Journals (Sweden)
Ada Ávila Assunção
2012-06-01
Full Text Available This article develops a typology of precariousness in employment relationships, considering dimensions beyond the traditional ones, such as type of contract, working hours and earnings. Other aspects of the workplace, and of the time allocated to activities and tasks outside the work environment, affect how individuals are inserted into the labour market. Given the flexibility of the method and the relatively large amount of information available on each individual and his or her socioeconomic situation, this investigation applied the Grade of Membership (GoM) method to data on 1,808 workers of the municipal health network of Belo Horizonte who took part in an epidemiological survey in 2009. All professionals linked to the municipal public health service were considered eligible, regardless of employment relationship (permanent, temporary, internship). The study was approved by the Research Ethics Committee of the Universidade Federal de Minas Gerais (opinion no. 542/07). The results converge with reflections on the growing evidence of the vulnerability of health workers as a result of precarious working conditions, and also indicate the relevance of approaching work activity itself, in order to identify stressors and other environmental factors related to harmful situations and illness. Some characteristics of the individuals that make up the target population (age, sex, length of service, etc.) are not amenable to external action; policies can, however, modify the factors surrounding this individual core.
International Nuclear Information System (INIS)
2008-05-01
This book presents the energy and resource technology development programme together with a performance analysis. It covers the division and definition of the programme, an analysis of the current state of support, the substance of the national basic plan for energy and resource technology development, the selection of analysis indices, performance analysis results by index, survey results, and the analysis and appraisal of the 2007 energy and resource technology development programme.
Wang, Cuiping; Jia, Weili; Wang, Dong; Song, Zhiguang
2017-07-15
Sediments from the Gulf of Mexico (GOM) and the South China Sea (SCS) were analyzed. Low δ13C values of pentamethylicosanes (PMIs) and fatty acids (-81.3 to -85.2‰) were found only in the S-1 sample collected from the GOM, indicating that methanogenic archaea associated with gas hydrate formation contributed to the sediment organic matter. Principal component analysis of fatty acids suggested that similar microbial biomass was found in the S-1, S-9, O-3 and O-5 samples. However, a comparison of the alkanes, fatty acids, and alcohols indicated that the percentage of n-alkan-2-ols in the S-1 sample from the GOM was the highest, while n-alkanes and n-fatty acids had the highest percentages in the other samples from the GOM and SCS. This finding suggests that the microbial species or the oxidation/reduction environment of the S-1 sample site differed from those of the other samples. The present study provides a basis for detecting gas hydrate sites on the seafloor of the SCS. Copyright © 2017. Published by Elsevier Ltd.
International Nuclear Information System (INIS)
Jae, Myeong Gi; Lee, Won Seong; Kim, Ha Hyeok
1989-02-01
This book gives a description of electronic engineering (circuit elements and devices, circuit analysis, and logic and digital circuits); electrochemical methods such as conductometry, potentiometry and current measurement; spectrochemical analysis with electromagnetic radiation, covering optical components, absorption spectroscopy, X-ray analysis and atomic absorption spectrometry, with references; chromatography, including gas chromatography and liquid chromatography; and automated analysis, covering control-system evaluation of automated analysis and automated analysis systems, with references.
Energy Technology Data Exchange (ETDEWEB)
Kim, Seung Jae; Seo, Seong Gyu
1995-03-15
This textbook deals with instrumental analysis and consists of nine chapters: an introduction to analytical chemistry, covering the analytical process and the types and forms of analysis; electrochemistry, covering basic theory, potentiometry and conductometry; electromagnetic radiation and optical components, with an introduction and applications; ultraviolet and visible spectrophotometry; atomic absorption spectrophotometry, with an introduction, flame emission spectrometry and plasma emission spectrometry; and further chapters on infrared spectrophotometry, X-ray spectrophotometry and mass spectrometry, chromatography, and other instrumental methods such as radiochemistry.
Sensitivity analysis determines the effectiveness of antibiotics against microorganisms (germs) ...
McShane, Edward James
2013-01-01
This text surveys practical elements of real function theory, general topology, and functional analysis. Discusses the maximality principle, the notion of convergence, the Lebesgue-Stieltjes integral, function spaces and harmonic analysis. Includes exercises. 1959 edition.
Cerebrospinal fluid analysis ... Analysis of CSF can help detect certain conditions and diseases. All of the following can be, but ... An abnormal CSF analysis result may be due to many different causes, ... Encephalitis (such as West Nile and Eastern Equine) Hepatic ...
Semen analysis measures the amount and quality of a man's semen and sperm. Semen is ...
Kantorovich, L V
1982-01-01
Functional Analysis examines trends in functional analysis as a mathematical discipline and the ever-increasing role played by its techniques in applications. The theory of topological vector spaces is emphasized, along with the applications of functional analysis to applied analysis. Some topics of functional analysis connected with applications to mathematical economics and control theory are also discussed. Comprised of 18 chapters, this book begins with an introduction to the elements of the theory of topological spaces, the theory of metric spaces, and the theory of abstract measure space
International Nuclear Information System (INIS)
Berman, M.; Bischof, L.M.; Breen, E.J.; Peden, G.M.
1994-01-01
This paper provides an overview of modern image analysis techniques pertinent to materials science. The usual approach in image analysis contains two basic steps: first, the image is segmented into its constituent components (e.g. individual grains), and second, measurement and quantitative analysis are performed. Usually, the segmentation part of the process is the harder of the two. Consequently, much of the paper concentrates on this aspect, reviewing both fundamental segmentation tools (commonly found in commercial image analysis packages) and more advanced segmentation tools. There is also a review of the most widely used quantitative analysis methods for measuring the size, shape and spatial arrangements of objects. Many of the segmentation and analysis methods are demonstrated using complex real-world examples. Finally, there is a discussion of hardware and software issues. 42 refs., 17 figs
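The two-step workflow described above (segment first, then measure) can be sketched in a few lines. This is a toy illustration with made-up data, not code from the paper; commercial image-analysis packages implement the labelling step far more efficiently.

```python
# Toy segment-then-measure pipeline: threshold a grayscale grid into a
# binary mask (segmentation), then label 4-connected foreground regions
# and record their areas (quantitative analysis). Pure stdlib.
from collections import deque

def label_components(mask):
    """Label 4-connected True regions in a 2-D list; return (labels, sizes)."""
    rows, cols = len(mask), len(mask[0])
    labels = [[0] * cols for _ in range(rows)]
    sizes, current = {}, 0
    for i in range(rows):
        for j in range(cols):
            if not mask[i][j] or labels[i][j]:
                continue
            current += 1                       # start a new region
            labels[i][j] = current
            queue, count = deque([(i, j)]), 0
            while queue:                       # breadth-first flood fill
                y, x = queue.popleft()
                count += 1
                for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    ny, nx = y + dy, x + dx
                    if (0 <= ny < rows and 0 <= nx < cols
                            and mask[ny][nx] and not labels[ny][nx]):
                        labels[ny][nx] = current
                        queue.append((ny, nx))
            sizes[current] = count             # region area in pixels
    return labels, sizes

image = [[0, 9, 9, 0],
         [0, 9, 0, 0],
         [0, 0, 0, 8],
         [0, 0, 8, 8]]
mask = [[px > 5 for px in row] for row in image]   # segmentation step
labels, sizes = label_components(mask)             # measurement step
print(len(sizes), sizes)
```

Real segmentation (e.g. of grain boundaries) is much harder than thresholding, which is why the paper devotes most of its attention to that step.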
Thiemann, Francis C.
Semiotic analysis is a method of analyzing signs (e.g., words) to reduce non-numeric data to their component parts without losing essential meanings. Semiotics dates back to Aristotle's analysis of language; it was much advanced by nineteenth-century analyses of style and logic and by Whitehead and Russell's description in this century of the role…
Indian Academy of Sciences (India)
Dimensional analysis is a useful tool which finds important applications in physics and engineering. It is most effective when there exist a maximal number of dimensionless quantities constructed out of the relevant physical variables. Though a complete theory of dimensional analysis was developed way back in 1914 in a.
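A textbook illustration of the method (not taken from the article above) is the period of a simple pendulum: assuming only the length $\ell$, the mass $m$ and the gravitational acceleration $g$ are relevant, the units force the familiar form.

```latex
% [T] = \mathrm{s},\; [\ell] = \mathrm{m},\; [g] = \mathrm{m\,s^{-2}}:
% no dimensionless combination can involve the mass (kg), so m drops out,
% and the only dimensionally consistent form is
T = C\,\sqrt{\frac{\ell}{g}}, \qquad C \text{ a dimensionless constant (in fact } 2\pi \text{).}
```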
Bravená, Helena
2009-01-01
This bachelor's thesis deals with the importance of job analysis for personnel activities in the company. The aim of this work is to find the most suitable method of job analysis in a particular enterprise, and then to create descriptions and specifications for each job.
International Nuclear Information System (INIS)
Francois, P.
1996-01-01
We undertook a study programme at the end of 1991. To start with, we performed some exploratory studies aimed at learning some preliminary lessons on this type of analysis: assessment of the interest of probabilistic incident analysis; possibility of using PSA scenarios; skills and resources required. At the same time, EPN created a working group whose assignment was to define a new approach for the analysis of incidents on NPPs. This working group gave thought to both aspects of Operating Feedback that EPN wished to improve: analysis of significant incidents; analysis of potential consequences. We took part in the work of this group, and for the second aspect, we proposed a method based on an adaptation of the event-tree method in order to establish a link between existing PSA models and actual incidents. Since PSA provides an exhaustive database of accident scenarios applicable to the two most common types of units in France, they are obviously of interest for this sort of analysis. With this method we performed some incident analyses, and at the same time explored some methods employed abroad, particularly ASP (Accident Sequence Precursor, a method used by the NRC). Early in 1994 EDF began a systematic analysis programme. The first, transient phase will set up methods and an organizational structure. 7 figs
Khabaza, I M
1960-01-01
Numerical Analysis is an elementary introduction to numerical analysis, its applications, limitations, and pitfalls. Methods suitable for digital computers are emphasized, but some desk computations are also described. Topics covered range from the use of digital computers in numerical work to errors in computations using desk machines, finite difference methods, and numerical solution of ordinary differential equations. This book is comprised of eight chapters and begins with an overview of the importance of digital computers in numerical analysis, followed by a discussion on errors in comput
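As a minimal illustration of the numerical solution of ordinary differential equations mentioned above (a sketch of the forward-Euler idea, not material from the book):

```python
# Forward Euler for y' = -y, y(0) = 1, whose exact solution is exp(-t).
# Step size and test problem are chosen only for illustration.
import math

def euler(f, y0, t0, t1, n):
    """Integrate y' = f(t, y) from t0 to t1 with n forward-Euler steps."""
    h = (t1 - t0) / n
    t, y = t0, y0
    for _ in range(n):
        y += h * f(t, y)   # one explicit Euler step
        t += h
    return y

approx = euler(lambda t, y: -y, 1.0, 0.0, 1.0, 1000)
print(abs(approx - math.exp(-1.0)))  # global error shrinks roughly like h
```

Halving the step size roughly halves the error, which is exactly the kind of error behaviour an elementary numerical-analysis text examines.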
International Nuclear Information System (INIS)
Warner, M.
1987-01-01
What is the current state of quantitative trace analytical chemistry? What are today's research efforts? And what challenges does the future hold? These are some of the questions addressed at a recent four-day symposium sponsored by the National Bureau of Standards (NBS) entitled Accuracy in Trace Analysis - Accomplishments, Goals, Challenges. The two plenary sessions held on the first day of the symposium reviewed the history of quantitative trace analysis, discussed the present situation from academic and industrial perspectives, and summarized future needs. The remaining three days of the symposium consisted of parallel sessions dealing with the measurement process; quantitation in materials; environmental, clinical, and nutrient analysis; and advances in analytical techniques
Goodstein, R L
2010-01-01
Recursive analysis develops natural number computations into a framework appropriate for real numbers. This text is based upon primary recursive arithmetic and presents a unique combination of classical analysis and intuitional analysis. Written by a master in the field, it is suitable for graduate students of mathematics and computer science and can be read without a detailed knowledge of recursive arithmetic.Introductory chapters on recursive convergence and recursive and relative continuity are succeeded by explorations of recursive and relative differentiability, the relative integral, and
Tao, Terence
2016-01-01
This is part one of a two-volume book on real analysis and is intended for senior undergraduate students of mathematics who have already been exposed to calculus. The emphasis is on rigour and foundations of analysis. Beginning with the construction of the number systems and set theory, the book discusses the basics of analysis (limits, series, continuity, differentiation, Riemann integration), through to power series, several variable calculus and Fourier analysis, and then finally the Lebesgue integral. These are almost entirely set in the concrete setting of the real line and Euclidean spaces, although there is some material on abstract metric and topological spaces. The book also has appendices on mathematical logic and the decimal system. The entire text (omitting some less central topics) can be taught in two quarters of 25–30 lectures each. The course material is deeply intertwined with the exercises, as it is intended that the student actively learn the material (and practice thinking and writing ri...
Tao, Terence
2016-01-01
This is part two of a two-volume book on real analysis and is intended for senior undergraduate students of mathematics who have already been exposed to calculus. The emphasis is on rigour and foundations of analysis. Beginning with the construction of the number systems and set theory, the book discusses the basics of analysis (limits, series, continuity, differentiation, Riemann integration), through to power series, several variable calculus and Fourier analysis, and then finally the Lebesgue integral. These are almost entirely set in the concrete setting of the real line and Euclidean spaces, although there is some material on abstract metric and topological spaces. The book also has appendices on mathematical logic and the decimal system. The entire text (omitting some less central topics) can be taught in two quarters of 25–30 lectures each. The course material is deeply intertwined with the exercises, as it is intended that the student actively learn the material (and practice thinking and writing ri...
International Nuclear Information System (INIS)
1988-01-01
Basic studies in nuclear analytical techniques include the examination of underlying assumptions and the development and extension of techniques involving the use of ion beams for elemental and mass analysis. 1 ref., 1 tab
Energy Technology Data Exchange (ETDEWEB)
2016-06-01
Fact sheet summarizing NREL's techno-economic analysis and life-cycle assessment capabilities to connect research with future commercial process integration, a critical step in the scale-up of biomass conversion technologies.
Gasinski, Leszek
2005-01-01
Hausdorff Measures and Capacity. Lebesgue-Bochner and Sobolev Spaces. Nonlinear Operators and Young Measures. Smooth and Nonsmooth Analysis and Variational Principles. Critical Point Theory. Eigenvalue Problems and Maximum Principles. Fixed Point Theory.
DEFF Research Database (Denmark)
Bauer-Gottwein, Peter; Riegels, Niels; Pulido-Velazquez, Manuel
2017-01-01
Hydroeconomic analysis and modeling provides a consistent and quantitative framework to assess the links between water resources systems and economic activities related to water use, simultaneously modeling water supply and water demand. It supports water managers and decision makers in assessing trade-offs between different water uses, different geographic regions, and various economic sectors and between the present and the future. Hydroeconomic analysis provides consistent economic performance criteria for infrastructure development and institutional reform in water policies and management organizations. This chapter presents an introduction to hydroeconomic analysis and modeling, and reviews the state of the art in the field. We review available economic water-valuation techniques and summarize the main types of decision problems encountered in hydroeconomic analysis. Popular solution strategies ...
Schiffrin, Deborah
1990-01-01
Summarizes the current state of research in conversation analysis, referring primarily to six different perspectives that have developed from the philosophy, sociology, anthropology, and linguistics disciplines. These include pragmatics; speech act theory; interactional sociolinguistics; ethnomethodology; ethnography of communication; and…
Gorsuch, Richard L
2013-01-01
Comprehensive and comprehensible, this classic covers the basic and advanced topics essential for using factor analysis as a scientific tool in psychology, education, sociology, and related areas. Emphasizing the usefulness of the techniques, it presents sufficient mathematical background for understanding and sufficient discussion of applications for effective use. This includes not only theory but also the empirical evaluations of the importance of mathematical distinctions for applied scientific analysis.
Adjoint sensitivity studies of loop current and eddy shedding in the Gulf of Mexico
Gopalakrishnan, Ganesh; Cornuelle, Bruce D.; Hoteit, Ibrahim
2013-01-01
the current, while sensitivities to SSH generally extend to deeper layers and propagate more slowly. The adjoint sensitivity to relative vorticity deduced from the sensitivities to velocity fields suggests that advection of cyclonic (positive) relative vorticity anomalies from the YC or the LCFEs accelerate the LC eddy separation. Forward model perturbation experiments were performed to complement and check the adjoint sensitivity analysis as well as sampling the predictability and nonlinearity of the LC evolution. The model and its adjoint can be used in four-dimensional variational assimilation (4D-VAR) to produce dynamically consistent ocean state estimates for analysis and forecasts of the circulation of the GoM.
International Nuclear Information System (INIS)
1959-01-01
Radioactivation analysis is the technique of analyzing the constituents of a very small sample of matter by making the sample artificially radioactive. The first stage is to make the sample radioactive by artificial means, e.g. by subjecting it to neutron bombardment. Once the sample has been activated, or made radioactive, the next task is to analyze the radiations given off by the sample. This analysis indicates the nature and quantities of the various elements present in the sample, because the radiation from a particular radioisotope is characteristic of that isotope. In 1959 a symposium on 'Radioactivation Analysis' was organized in Vienna by the IAEA and the Joint Commission on Applied Radioactivity (ICSU). It was pointed out that certain factors create uncertainties, and ways to overcome them were elaborated. Attention was drawn to the fact that radioactivation analysis had proven a powerful tool for tackling fundamental problems in geo- and cosmochemistry, and a review was given of the recent work in this field. Because of its extreme sensitivity, radioactivation analysis had been principally employed for trace detection, and its most extensive use had been in the control of semiconductors and very pure metals. An account of the experience gained in the USA was given, where radioactivation analysis was being used by many investigators in various scientific fields as a practical and useful tool for elemental analyses. Much of this work had been concerned with determining sub-microgramme and microgramme concentrations of many different elements in samples of biological materials, drugs, fertilizers, fine chemicals, foods, fuels, glass, ceramic materials, metals, minerals, paints, petroleum products, resinous materials, soils, toxicants, water and other materials. In addition to these studies, radioactivation analysis had been used by other investigators to determine isotopic ratios of the stable isotopes of some of the elements. Another paper dealt with radioactivation
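The identification step rests on elementary decay arithmetic: an induced radioisotope's activity falls off with its characteristic half-life. A hedged sketch with invented numbers:

```python
# Illustrative decay calculation behind activation analysis: after
# irradiation, each induced radioisotope decays with its own half-life,
# so following the activity over time helps pin down which nuclide
# (and hence which element) is present. Numbers are made up.
import math

def activity(a0, half_life, t):
    """Activity remaining after time t, given initial activity a0.

    half_life and t must be in the same time units.
    """
    lam = math.log(2) / half_life   # decay constant
    return a0 * math.exp(-lam * t)

print(activity(1000.0, 15.0, 30.0))  # two half-lives -> about 250.0
```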
DEFF Research Database (Denmark)
Brænder, Morten; Andersen, Lotte Bøgh
2014-01-01
Based on our 2013 article, ”Does Deployment to War Affect Soldiers' Public Service Motivation – A Panel Study of Soldiers Before and After their Service in Afghanistan”, we present panel analysis as a methodological discipline. Panels consist of multiple units of analysis, observed at two or more points in time. In comparison with traditional cross-sectional studies, the advantage of using panel studies is that the time dimension enables us to study effects. Whereas experimental designs may have a clear advantage in regard to causal inference, the strength of panel studies is difficult to match in research settings where it is not possible to distribute units of analysis randomly or where the independent variables cannot be manipulated. The greatest disadvantage in regard to using panel studies is that data may be difficult to obtain. This is most clearly vivid in regard to the use of panel surveys.
Loeb, Peter A
2016-01-01
This textbook is designed for a year-long course in real analysis taken by beginning graduate and advanced undergraduate students in mathematics and other areas such as statistics, engineering, and economics. Written by one of the leading scholars in the field, it elegantly explores the core concepts in real analysis and introduces new, accessible methods for both students and instructors. The first half of the book develops both Lebesgue measure and, with essentially no additional work for the student, general Borel measures for the real line. Notation indicates when a result holds only for Lebesgue measure. Differentiation and absolute continuity are presented using a local maximal function, resulting in an exposition that is both simpler and more general than the traditional approach. The second half deals with general measures and functional analysis, including Hilbert spaces, Fourier series, and the Riesz representation theorem for positive linear functionals on continuous functions with compact support....
Scott, L Ridgway
2011-01-01
Computational science is fundamentally changing how technological questions are addressed. The design of aircraft, automobiles, and even racing sailboats is now done by computational simulation. The mathematical foundation of this new approach is numerical analysis, which studies algorithms for computing expressions defined with real numbers. Emphasizing the theory behind the computation, this book provides a rigorous and self-contained introduction to numerical analysis and presents the advanced mathematics that underpin industrial software, including complete details that are missing from most textbooks. Using an inquiry-based learning approach, Numerical Analysis is written in a narrative style, provides historical background, and includes many of the proofs and technical details in exercises. Students will be able to go beyond an elementary understanding of numerical simulation and develop deep insights into the foundations of the subject. They will no longer have to accept the mathematical gaps that ex...
Rao, G Shanker
2006-01-01
About the Book: This book provides an introduction to Numerical Analysis for the students of Mathematics and Engineering. The book is designed in accordance with the common core syllabus of Numerical Analysis of Universities of Andhra Pradesh and also the syllabus prescribed in most of the Indian Universities. Salient features: Approximate and Numerical Solutions of Algebraic and Transcendental Equation Interpolation of Functions Numerical Differentiation and Integration and Numerical Solution of Ordinary Differential Equations The last three chapters deal with Curve Fitting, Eigen Values and Eigen Vectors of a Matrix and Regression Analysis. Each chapter is supplemented with a number of worked-out examples as well as number of problems to be solved by the students. This would help in the better understanding of the subject. Contents: Errors Solution of Algebraic and Transcendental Equations Finite Differences Interpolation with Equal Intervals Interpolation with Unequal Int...
Jacques, Ian
1987-01-01
This book is primarily intended for undergraduates in mathematics, the physical sciences and engineering. It introduces students to most of the techniques forming the core component of courses in numerical analysis. The text is divided into eight chapters which are largely self-contained. However, with a subject as intricately woven as mathematics, there is inevitably some interdependence between them. The level of difficulty varies and, although emphasis is firmly placed on the methods themselves rather than their analysis, we have not hesitated to include theoretical material when we consider it to be sufficiently interesting. However, it should be possible to omit those parts that do seem daunting while still being able to follow the worked examples and to tackle the exercises accompanying each section. Familiarity with the basic results of analysis and linear algebra is assumed since these are normally taught in first courses on mathematical methods. For reference purposes a list of theorems used in the t...
DiBenedetto, Emmanuele
2016-01-01
The second edition of this classic textbook presents a rigorous and self-contained introduction to real analysis with the goal of providing a solid foundation for future coursework and research in applied mathematics. Written in a clear and concise style, it covers all of the necessary subjects as well as those often absent from standard introductory texts. Each chapter features a “Problems and Complements” section that includes additional material that briefly expands on certain topics within the chapter and numerous exercises for practicing the key concepts. The first eight chapters explore all of the basic topics for training in real analysis, beginning with a review of countable sets before moving on to detailed discussions of measure theory, Lebesgue integration, Banach spaces, functional analysis, and weakly differentiable functions. More topical applications are discussed in the remaining chapters, such as maximal functions, functions of bounded mean oscillation, rearrangements, potential theory, a...
International Nuclear Information System (INIS)
Romli
1997-01-01
Cluster analysis is the name of a group of multivariate techniques whose principal purpose is to distinguish similar entities by the characteristics they possess. Several algorithms can be used for this analysis, and this topic therefore focuses on them: similarity measures, and hierarchical clustering, which includes the single linkage, complete linkage and average linkage methods. The non-hierarchical clustering method, popularly known as the K-means method, will also be discussed. Finally, this paper describes the advantages and disadvantages of each method.
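A minimal sketch of the K-means idea on one-dimensional toy data (an illustration under assumed data and an assumed k, not the paper's implementation):

```python
# Lloyd's algorithm for K-means on 1-D data: alternate between assigning
# each point to its nearest center and moving each center to the mean of
# its assigned points. Toy data; distance is plain absolute difference.
import random

def kmeans_1d(points, k, iters=50, seed=0):
    rng = random.Random(seed)
    centers = rng.sample(points, k)           # pick k initial centers
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:                      # assignment step
            idx = min(range(k), key=lambda c: abs(p - centers[c]))
            clusters[idx].append(p)
        centers = [sum(c) / len(c) if c else centers[i]   # update step
                   for i, c in enumerate(clusters)]
    return sorted(centers)

data = [1.0, 1.2, 0.8, 10.0, 10.3, 9.7]
print(kmeans_1d(data, 2))   # two well-separated groups -> centers near 1 and 10
```

Hierarchical methods differ in that they never reassign: single, complete and average linkage only change how the distance between two merged groups is defined.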
Rockafellar, Ralph Tyrell
2015-01-01
Available for the first time in paperback, R. Tyrrell Rockafellar's classic study presents readers with a coherent branch of nonlinear mathematical analysis that is especially suited to the study of optimization problems. Rockafellar's theory differs from classical analysis in that differentiability assumptions are replaced by convexity assumptions. The topics treated in this volume include: systems of inequalities, the minimum or maximum of a convex function over a convex set, Lagrange multipliers, minimax theorems and duality, as well as basic results about the structure of convex sets and
Brezinski, C
2012-01-01
Numerical analysis has witnessed many significant developments in the 20th century. This book brings together 16 papers dealing with historical developments, survey papers and papers on recent trends in selected areas of numerical analysis, such as: approximation and interpolation, solution of linear systems and eigenvalue problems, iterative methods, quadrature rules, solution of ordinary, partial and integral equations. The papers are reprinted from the 7-volume project of the Journal of Computational and Applied Mathematics ('/homepage/sac/cam/na2000/index.html').
International Nuclear Information System (INIS)
Biehl, F.A.
1984-05-01
This paper presents the criteria, previous nuclear experience in space, analysis techniques, and possible breakup enhancement devices applicable to an acceptable SP-100 reentry from space. Reactor operation in nuclear-safe orbit will minimize the radiological risk; the remaining safeguards criteria need to be defined. A simple analytical point mass reentry technique and a more comprehensive analysis method that considers vehicle dynamics and orbit insertion malfunctions are presented. Vehicle trajectory, attitude, and possible breakup enhancement devices will be integrated in the simulation as required to ensure an adequate representation of the reentry process.
Aggarwal, Charu C
2013-01-01
With increasing advances in hardware technology for data collection, and advances in software technology (databases) for data organization, computer scientists have increasingly participated in the latest advancements of the outlier analysis field. Computer scientists, specifically, approach this field based on their practical experience in managing large amounts of data, and with far fewer assumptions: the data can be of any type, structured or unstructured, and may be extremely large. Outlier Analysis is a comprehensive exposition, as understood by data mining experts, statisticians and
Everitt, Brian S; Leese, Morven; Stahl, Daniel
2011-01-01
Cluster analysis comprises a range of methods for classifying multivariate data into subgroups. By organizing multivariate data into such subgroups, clustering can help reveal the characteristics of any structure or patterns present. These techniques have proven useful in a wide range of areas such as medicine, psychology, market research and bioinformatics. This fifth edition of the highly successful Cluster Analysis includes coverage of the latest developments in the field and a new chapter dealing with finite mixture models for structured data. Real life examples are used throughout to demons
Snell, K S; Langford, W J; Maxwell, E A
1966-01-01
Elementary Analysis, Volume 2 introduces several of the ideas of modern mathematics in a casual manner and provides the practical experience in algebraic and analytic operations that lays a sound foundation of basic skills. This book focuses on the nature of number, algebraic and logical structure, groups, rings, fields, vector spaces, matrices, sequences, limits, functions and inverse functions, complex numbers, and probability. The logical structure of analysis given through the treatment of differentiation and integration, with applications to the trigonometric and logarithmic functions, is
International Nuclear Information System (INIS)
Baron, J.H.; Nunez McLeod, J.; Rivera, S.S.
1997-01-01
This book contains a selection of research works performed at the CEDIAC Institute (Cuyo National University) in the area of risk analysis, with specific orientation to the subjects of uncertainty and sensitivity studies, software reliability, severe accident modeling, etc. This volume presents important material for all those researchers who want insight into the risk analysis field as a tool for solving several problems frequently found in engineering and the applied sciences, as well as for academic teachers who want to keep up to date with the new developments and improvements continuously arising in this field.
Alan Gallegos
2002-01-01
Watershed analyses and assessments for the Kings River Sustainable Forest Ecosystems Project were done on about 33,000 acres of the 45,500-acre Big Creek watershed and 32,000 acres of the 85,100-acre Dinkey Creek watershed. Following procedures developed for analysis of cumulative watershed effects (CWE) in the Pacific Northwest Region of the USDA Forest Service, the...
Freund, Rudolf J; Sa, Ping
2006-01-01
The book provides complete coverage of the classical methods of statistical analysis. It is designed to give students an understanding of the purpose of statistical analyses, to allow them to determine, at least to some degree, the correct type of statistical analysis to be performed in a given situation, and to give them some appreciation of what constitutes good experimental design.
International Nuclear Information System (INIS)
Unterberger, A.
1987-01-01
We study the Klein-Gordon symbolic calculus of operators acting on solutions of the free Klein-Gordon equation. It contracts to the Weyl calculus as c→∞. Mathematically, it may also be considered as a pseudodifferential analysis on the unit ball of R^n.
International Nuclear Information System (INIS)
Woodard, K.
1985-01-01
The objectives of this paper are to: provide a realistic assessment of consequences; account for plant- and site-specific characteristics; adjust accident release characteristics to account for results of plant-containment analysis; produce conditional risk curves for each of five health effects; and estimate uncertainties.
DEFF Research Database (Denmark)
Hjørland, Birger
2017-01-01
The domain-analytic approach to knowledge organization (KO) (and to the broader field of library and information science, LIS) is outlined. The article reviews the discussions and proposals on the definition of domains, and provides an example of a domain-analytic study in the field of art studies. Varieties of domain analysis as well as criticism and controversies are presented and discussed.
International Nuclear Information System (INIS)
Rhoades, W.A.; Dray, B.J.
1970-01-01
The effect of Gadolinium-155 on the prompt kinetic behavior of a zirconium hydride reactor has been deduced, using experimental data from the SNAPTRAN machine. The poison material makes the temperature coefficient more positive, and the Type IV sleeves were deduced to give a positive coefficient above 1100 °F. A thorough discussion of the data and analysis is included. (U.S.)
International Nuclear Information System (INIS)
Saadi, Radouan; Marah, Hamid
2014-01-01
This report presents results of the Tritium analysis carried out at the CNESTEN DASTE in Rabat (Morocco), on behalf of Senegal, within the framework of the RAF7011 project. It describes the analytical method and instrumentation, including general uncertainty estimation: electrolytic enrichment and liquid scintillation counting. The results are expressed in Tritium Units (TU); lower limit of detection: 0.02 TU.
Miller, Rupert G
2011-01-01
A concise summary of the statistical methods used in the analysis of survival data with censoring. Emphasizes recently developed nonparametric techniques. Outlines methods in detail and illustrates them with actual data. Discusses the theory behind each method. Includes numerous worked problems and numerical exercises.
Koornneef, M.; Alonso-Blanco, C.; Stam, P.
2006-01-01
The Mendelian analysis of genetic variation, available as induced mutants or as natural variation, requires a number of steps that are described in this chapter. These include the determination of the number of genes involved in the observed trait's variation, the determination of dominance
DEFF Research Database (Denmark)
Nielsen, Kirsten
2010-01-01
The first part of this article presents the characteristics of Hebrew poetry: features associated with rhythm and phonology, grammatical features, structural elements like parallelism, and imagery and intertextuality. The second part consists of an analysis of Psalm 121. It is argued that assonan...
Adrian Ioana; Tiberiu Socaciu
2013-01-01
The article presents specific aspects of management and models for economic analysis. It presents the main types of economic analysis: statistical analysis, dynamic analysis, static analysis, mathematical analysis, and psychological analysis. It also presents the main objects of the analysis: the technological activity analysis of a company, the analysis of production costs, the economic activity analysis of a company, the analysis of equipment, the analysis of labor productivity, the anal...
International Nuclear Information System (INIS)
Smith, M.; Jones, D.R.
1991-01-01
The goal of exploration is to find reserves that will earn an adequate rate of return on the capital invested. Neither exploration nor economics is an exact science. The authors must therefore explore in those trends (plays) that have the highest probability of achieving this goal. Trend analysis is a technique for organizing the available data to make these strategic exploration decisions objectively, in conformance with the explorer's goals and risk attitudes. Trend analysis differs from resource estimation in its purpose. It seeks to determine the probability of economic success for an exploration program, not the ultimate results of the total industry effort. Thus the recent past is assumed to be the best estimate of the exploration probabilities for the near future. This information is combined with economic forecasts. The computer software tools necessary for trend analysis are (1) Information data base - requirements and sources. (2) Data conditioning program - assignment to trends, correction of errors, and conversion into usable form. (3) Statistical processing program - calculation of probability of success and discovery size probability distribution. (4) Analytical processing - Monte Carlo simulation to develop the probability distribution of the economic return/investment ratio for a trend. Limited capital (short-run) effects are analyzed using the Gambler's Ruin concept in the Monte Carlo simulation and by a short-cut method. Multiple trend analysis is concerned with comparing and ranking trends, allocating funds among acceptable trends, and characterizing program risk by using risk profiles. In summary, trend analysis is a reality check for long-range exploration planning.
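The Gambler's Ruin treatment of limited capital described above can be sketched as a Monte Carlo simulation; the capital, well cost, payoff, and success rate below are invented for illustration:

```python
import random

def ruin_probability(capital, cost, payoff, p_success,
                     n_wells=20, trials=10000, seed=42):
    """Estimate the chance an exploration program exhausts its capital
    (Gambler's Ruin) before completing n_wells, via Monte Carlo."""
    rng = random.Random(seed)
    ruined = 0
    for _ in range(trials):
        cash = capital
        for _ in range(n_wells):
            cash -= cost                    # drill one well
            if rng.random() < p_success:
                cash += payoff              # discovery pays off
            if cash < cost:                 # cannot afford the next well
                ruined += 1
                break
    return ruined / trials

# Hypothetical program: 10 units capital, 2 per well, 8 payoff, 25% success
print(ruin_probability(10, 2, 8, 0.25))
```

In practice the payoff would itself be drawn from a discovery-size distribution estimated from the trend data, rather than being a constant.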
Medina Micolta, Martha; Luna Merchán, Rómulo
2013-01-01
This thesis was carried out with the objective of analyzing the actors in the coffee value chain in order to establish improvement strategies promoting the productivity, competitiveness and sustainability of the coffee-growing sector in the province of Manabí, Jipijapa canton, Pedro Pablo Gómez parish. The methodology used is based on the inductive method; after a first stage of observation, analysis and classification of the facts of the coffee sector, we obtain the solution of the problem posed...
Damour, M.; Hamdan, L. J.; Salerno, J. L.; McGown, C.; Blackwell, C. A.; Church, R.; Warren, D.; Horrell, C.; Jordan, B.; Moore, J.
2016-02-01
Historic shipwrecks and other archaeological sites are protected by a well-established body of historic preservation laws intended to preserve these sensitive, non-renewable resources. While the cultural, historical, and archaeological value of historic shipwrecks is unequivocal, their function and value as ecosystem monitoring platforms following a major environmental disaster is becoming apparent. Shipwrecks have been found in previous studies to serve as artificial reefs and hotspots of biodiversity, essentially providing the basis for an intact ecosystem. This is especially true in the deepwater marine environment where natural hard-bottom is sparse. Micro- and macro-infaunal diversity on shipwrecks and their sensitivity to environmental change demonstrates the suitability of these platforms for monitoring ecosystem impact and recovery. After the 2010 Deepwater Horizon oil spill, the Bureau of Ocean Energy Management (BOEM) and partners initiated a multidisciplinary study to examine spill effects on shipwrecks and their associated microbial communities. To assess these impacts and to perform comparative analyses, the team collected microbiological, geochemical, and archaeological data at wooden- and metal-hulled shipwrecks within and outside of the subsurface spill-impacted area. Microbial community biodiversity informs us of micro-scale changes while 3D laser and sonar data reveal macro-scale changes. A multidisciplinary approach informs us of the roles microorganisms have in shipwreck degradation and corrosion as well as their response to ecosystem impacts. Results of the study identified multiple lines of evidence that sites were impacted by exposure to spill-related contaminants. Future multidisciplinary studies at these sites, as part of a long-term monitoring program, should inform on ecosystem recovery.
Accurate phenotyping: Reconciling approaches through Bayesian model averaging.
Directory of Open Access Journals (Sweden)
Carla Chia-Ming Chen
Full Text Available Genetic research into complex diseases is frequently hindered by a lack of clear biomarkers for phenotype ascertainment. Phenotypes for such diseases are often identified on the basis of clinically defined criteria; however such criteria may not be suitable for understanding the genetic composition of the diseases. Various statistical approaches have been proposed for phenotype definition; however our previous studies have shown that differences in phenotypes estimated using different approaches have substantial impact on subsequent analyses. Instead of obtaining results based upon a single model, we propose a new method, using Bayesian model averaging to overcome problems associated with phenotype definition. Although Bayesian model averaging has been used in other fields of research, this is the first study that uses Bayesian model averaging to reconcile phenotypes obtained using multiple models. We illustrate the new method by applying it to simulated genetic and phenotypic data for Kofendred personality disorder, an imaginary disease with several sub-types. Two separate statistical methods were used to identify clusters of individuals with distinct phenotypes: latent class analysis and grade of membership. Bayesian model averaging was then used to combine the two clusterings for the purpose of subsequent linkage analyses. We found that causative genetic loci for the disease produced higher LOD scores using model averaging than under either individual model separately. We attribute this improvement to consolidation of the cores of phenotype clusters identified using each individual method.
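The reconciliation step can be sketched roughly as Bayesian model averaging over clusterings: each model's class-membership matrix is averaged with weights proportional to exp(-BIC/2), a common approximation to the posterior model probability. The BIC scores and membership matrices below are invented for illustration:

```python
import math

def bma_weights(bics):
    """Approximate posterior model probabilities from BIC scores:
    w_k proportional to exp(-BIC_k / 2), normalised to sum to 1."""
    best = min(bics)
    raw = [math.exp(-(b - best) / 2) for b in bics]
    total = sum(raw)
    return [r / total for r in raw]

def average_memberships(memberships, weights):
    """Weighted average of per-model membership matrices
    (rows = subjects, columns = phenotype classes)."""
    n, k = len(memberships[0]), len(memberships[0][0])
    return [[sum(w * m[i][j] for m, w in zip(memberships, weights))
             for j in range(k)] for i in range(n)]

# Two hypothetical models (e.g. latent class analysis vs. grade of membership)
bics = [100.0, 102.0]
w = bma_weights(bics)
lca = [[0.9, 0.1], [0.2, 0.8]]   # invented membership probabilities
gom = [[0.7, 0.3], [0.4, 0.6]]
print(w)
print(average_memberships([lca, gom], w))
```

Because each input row sums to 1 and the weights sum to 1, every averaged row is again a proper probability vector over the phenotype classes.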
Petroleum hydrocarbons in sediment from the northern Gulf of Mexico shoreline, Texas to Florida
Rosenbauer, Robert J.; Campbell, Pamela L.; Lam, Angela; Lorenson, T.D.; Hostettler, Frances D.; Thomas, Burt; Wong, Florence L.
2011-01-01
Petroleum hydrocarbons were extracted and analyzed from shoreline sediment collected from the northern Gulf of Mexico (nGOM) coastline that could potentially be impacted by Macondo-1 (M-1) well oil. Sediment was collected before M-1 well oil made significant local landfall and analyzed for baseline conditions by a suite of diagnostic petroleum biomarkers. Oil residue in trace quantities was detected in 45 of 69 samples. With the aid of multivariate statistical analysis, three different oil groups, based on biomarker similarity, were identified that were distributed geographically along the nGOM from Texas to Florida. None of the sediment hydrocarbon extracts correlated with the M-1 well oil extract, however, the similarity of tarballs collected at one site (FL-18) with the M-1 well oil suggests that some oil from the Deepwater Horizon spill may have been transported to this site in the Florida Keys, perhaps by a loop current, before that site was sampled.
DEFF Research Database (Denmark)
The 19th Scandinavian Conference on Image Analysis was held at the IT University of Copenhagen in Denmark during June 15-17, 2015. The SCIA conference series has been an ongoing biannual event for more than 30 years and over the years it has nurtured a world-class regional research and development area within the four participating Nordic countries. It is a regional meeting of the International Association for Pattern Recognition (IAPR). We would like to thank all authors who submitted works to this year's SCIA, the invited speakers, and our Program Committee. In total 67 papers were submitted. The topics of the accepted papers range from novel applications of vision systems, pattern recognition, machine learning, feature extraction, segmentation, 3D vision, to medical and biomedical image analysis. The papers originate from all the Scandinavian countries and several other European countries...
Helson, Henry
2010-01-01
This second edition has been enlarged and considerably rewritten. Among the new topics are infinite product spaces with applications to probability, disintegration of measures on product spaces, positive definite functions on the line, and additional information about Weyl's theorems on equidistribution. Topics that have continued from the first edition include Minkowski's theorem, measures with bounded powers, idempotent measures, spectral sets of bounded functions and a theorem of Szego, and the Wiener Tauberian theorem. Readers of the book should have studied the Lebesgue integral, the elementary theory of analytic and harmonic functions, and the basic theory of Banach spaces. The treatment is classical and as simple as possible. This is an instructional book, not a treatise. Mathematics students interested in analysis will find here what they need to know about Fourier analysis. Physicists and others can use the book as a reference for more advanced topics.
Bray, Hubert L; Mazzeo, Rafe; Sesum, Natasa
2015-01-01
This volume includes expanded versions of the lectures delivered in the Graduate Minicourse portion of the 2013 Park City Mathematics Institute session on Geometric Analysis. The papers give excellent high-level introductions, suitable for graduate students wishing to enter the field and experienced researchers alike, to a range of the most important areas of geometric analysis. These include: the general issue of geometric evolution, with more detailed lectures on Ricci flow and Kähler-Ricci flow, new progress on the analytic aspects of the Willmore equation as well as an introduction to the recent proof of the Willmore conjecture and new directions in min-max theory for geometric variational problems, the current state of the art regarding minimal surfaces in R^3, the role of critical metrics in Riemannian geometry, and the modern perspective on the study of eigenfunctions and eigenvalues for Laplace-Beltrami operators.
Freitag, Eberhard
2005-01-01
The guiding principle of this presentation of ``Classical Complex Analysis'' is to proceed as quickly as possible to the central results while using a small number of notions and concepts from other fields. Thus the prerequisites for understanding this book are minimal; only elementary facts of calculus and algebra are required. The first four chapters cover the essential core of complex analysis: - differentiation in C (including elementary facts about conformal mappings) - integration in C (including complex line integrals, Cauchy's Integral Theorem, and the Integral Formulas) - sequences and series of analytic functions, (isolated) singularities, Laurent series, calculus of residues - construction of analytic functions: the gamma function, Weierstrass' Factorization Theorem, Mittag-Leffler Partial Fraction Decomposition, and -as a particular highlight- the Riemann Mapping Theorem, which characterizes the simply connected domains in C. Further topics included are: - the theory of elliptic functions based on...
International Nuclear Information System (INIS)
Quinn, C.A.
1983-01-01
The article deals with spectrographic analysis and the analytical methods based on it. The theory of spectrographic analysis is discussed, as well as the layout of a spectrometer system. The infrared absorption spectrum of a compound is probably its most unique property. The absorption of infrared radiation depends on increasing the energy of vibration and rotation associated with a covalent bond. The infrared region is intrinsically low in energy; thus the design of infrared spectrometers is always directed toward maximising energy throughput. The article also considers atomic absorption: flame atomizers, non-flame atomizers and the source of radiation. Under the section on emission spectroscopy, non-electrical energy sources, electrical energy sources and electrical flames are discussed. Digital computers form a part of the development of spectrographic instrumentation.
Cheng, Lizhi; Luo, Yong; Chen, Bo
2014-01-01
This book could be divided into two parts i.e. fundamental wavelet transform theory and method and some important applications of wavelet transform. In the first part, as preliminary knowledge, the Fourier analysis, inner product space, the characteristics of Haar functions, and concepts of multi-resolution analysis, are introduced followed by a description on how to construct wavelet functions both multi-band and multi wavelets, and finally introduces the design of integer wavelets via lifting schemes and its application to integer transform algorithm. In the second part, many applications are discussed in the field of image and signal processing by introducing other wavelet variants such as complex wavelets, ridgelets, and curvelets. Important application examples include image compression, image denoising/restoration, image enhancement, digital watermarking, numerical solution of partial differential equations, and solving ill-conditioned Toeplitz system. The book is intended for senior undergraduate stude...
International Nuclear Information System (INIS)
Hwang, Hun
2007-02-01
This book explains potentiometry, voltammetry, amperometry and the basic concepts of conductometry in eleven chapters. It gives specific descriptions of the electrochemical cell and its modes, the basic concepts of electrochemical analysis of oxidation-reduction reactions, standard electrode potential, formal potential, faradaic current and faradaic processes, mass transfer and overvoltage, potentiometry and indirect potentiometry, polarography with TAST, normal pulse and differential pulse modes, voltammetry, and conductometry and conductometric titration.
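Potentiometry rests on the Nernst equation, which relates the measured electrode potential to the concentrations (strictly, activities) of the redox couple; a minimal sketch, assuming a hypothetical Fe3+/Fe2+ half-cell with invented concentrations:

```python
import math

R = 8.314462618      # gas constant, J/(mol*K)
F = 96485.33212      # Faraday constant, C/mol

def nernst(e_standard, n, ox, red, temp_k=298.15):
    """Electrode potential E = E0 - (RT/nF) * ln([red]/[ox])
    for the reduction Ox + n e- -> Red."""
    return e_standard - (R * temp_k) / (n * F) * math.log(red / ox)

# Hypothetical Fe3+/Fe2+ couple, E0 = 0.771 V, one-electron transfer
print(round(nernst(0.771, 1, ox=0.01, red=0.10), 4))  # about 0.7118 V
```

At 25 °C the factor RT/F is about 0.0257 V, which gives the familiar 59 mV shift per decade of concentration ratio for a one-electron couple.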
International Nuclear Information System (INIS)
Badwe, R.A.
1999-01-01
The primary endpoint in the majority of the studies has been either disease recurrence or death. This kind of analysis requires a special method since not all patients in the study experience the endpoint. The standard method for estimating such a survival distribution is the Kaplan-Meier method. The survival function is defined as the proportion of individuals who survive beyond a certain time. Multivariate comparison of survival has been carried out with Cox's proportional hazards model.
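The Kaplan-Meier estimator mentioned above multiplies, over the observed event times, the conditional probabilities of surviving each time, which is how censored subjects contribute information without biasing the estimate. A minimal sketch with invented follow-up data:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimate.  events[i] is 1 if subject i
    experienced the endpoint at times[i], 0 if censored then.
    Returns (event_time, S(t)) pairs at each event time."""
    data = sorted(zip(times, events))
    s, curve = 1.0, []
    at_risk = len(data)
    for t, e in data:
        if e:                        # event: multiply in (1 - 1/n_at_risk)
            s *= 1 - 1 / at_risk
            curve.append((t, s))
        at_risk -= 1                 # subject leaves the risk set either way
    return curve

# Hypothetical follow-up: events at t=1 and t=3, one censoring at t=2
print(kaplan_meier([1, 2, 3], [1, 0, 1]))  # S drops to 2/3 at t=1, 0 at t=3
```

Note that the censored subject at t=2 lowers the number at risk for the later event without itself producing a step in the curve.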
DEFF Research Database (Denmark)
Andersen, Lars
This book contains the lecture notes for the 9th semester course on elastodynamics. The first chapter gives an overview of the basic theory of stress waves propagating in viscoelastic media. In particular, the effect of surfaces and interfaces in a viscoelastic material is studied. In Chapter 3, an alternative semi-analytic method is derived, which may be applied for the analysis of layered half-spaces subject to moving or stationary loads.
Mucha, Hans-Joachim; Sofyan, Hizir
2000-01-01
As an explorative technique, cluster analysis provides a description or a reduction in the dimension of the data. It classifies a set of observations into two or more mutually exclusive unknown groups based on combinations of many variables. Its aim is to construct groups in such a way that the profiles of objects in the same group are relatively homogeneous whereas the profiles of objects in different groups are relatively heterogeneous. Clustering is distinct from classification techniques, ...
International Nuclear Information System (INIS)
Garbarino, J.R.; Steinheimer, T.R.; Taylor, H.E.
1985-01-01
This is the twenty-first biennial review of the inorganic and organic analytical chemistry of water. The format of this review differs somewhat from previous reviews in this series - the most recent of which appeared in Analytical Chemistry in April 1983. Changes in format have occurred in the presentation of material concerning review articles and the inorganic analysis of water sections. Organic analysis of water sections are organized as in previous reviews. Review articles have been compiled and tabulated in an Appendix with respect to subject, title, author(s), citation, and number of references cited. The inorganic water analysis sections are now grouped by constituent using the periodic chart; for example, alkali, alkaline earth, 1st series transition metals, etc. Within these groupings the references are roughly grouped by instrumental technique; for example, spectrophotometry, atomic absorption spectrometry, etc. Multiconstituent methods for determining analytes that cannot be grouped in this manner are compiled into a separate section sorted by instrumental technique. References used in preparing this review were compiled from nearly 60 major journals published during the period from October 1982 through September 1984. Conference proceedings, most foreign journals, most trade journals, and most government publications are excluded. References cited were obtained using the American Chemical Society's Chemical Abstracts for sections on inorganic analytical chemistry, organic analytical chemistry, water, and sewage waste. Cross-references of these sections were also included. 860 references
Energy Technology Data Exchange (ETDEWEB)
None
1980-06-01
The Energy Policy and Conservation Act (EPCA) mandated that minimum energy efficiency standards be established for classes of refrigerators and refrigerator-freezers, freezers, clothes dryers, water heaters, room air conditioners, home heating equipment, kitchen ranges and ovens, central air conditioners, and furnaces. EPCA requires that standards be designed to achieve the maximum improvement in energy efficiency that is technologically feasible and economically justified. Following the introductory chapter, Chapter Two describes the methodology used in the economic analysis and its relationship to legislative criteria for consumer product efficiency assessment; details how the CPES Value Model systematically compared and evaluated the economic impacts of regulation on the consumer, manufacturer and Nation. Chapter Three briefly displays the results of the analysis and lists the proposed performance standards by product class. Chapter Four describes the reasons for developing a baseline forecast, characterizes the baseline scenario from which regulatory impacts were calculated and summarizes the primary models, data sources and assumptions used in the baseline formulations. Chapter Five summarizes the methodology used to calculate regulatory impacts; describes the impacts of energy performance standards relative to the baseline discussed in Chapter Four. Also discussed are regional standards and other program alternatives to performance standards. Chapter Six describes the procedure for balancing consumer, manufacturer, and national impacts to select standard levels. Details of models and data bases used in the analysis are included in Appendices A through K.
Newell, Homer E
2006-01-01
When employed with skill and understanding, vector analysis can be a practical and powerful tool. This text develops the algebra and calculus of vectors in a manner useful to physicists and engineers. Numerous exercises (with answers) not only provide practice in manipulation but also help establish students' physical and geometric intuition in regard to vectors and vector concepts.Part I, the basic portion of the text, consists of a thorough treatment of vector algebra and the vector calculus. Part II presents the illustrative matter, demonstrating applications to kinematics, mechanics, and e
Brand, Louis
2006-01-01
The use of vectors not only simplifies treatments of differential geometry, mechanics, hydrodynamics, and electrodynamics, but also makes mathematical and physical concepts more tangible and easy to grasp. This text for undergraduates was designed as a short introductory course to give students the tools of vector algebra and calculus, as well as a brief glimpse into these subjects' manifold applications. The applications are developed to the extent that the uses of the potential function, both scalar and vector, are fully illustrated. Moreover, the basic postulates of vector analysis are brou
Large-scale deposition of weathered oil in the Gulf of Mexico following a deep-water oil spill.
Romero, Isabel C; Toro-Farmer, Gerardo; Diercks, Arne-R; Schwing, Patrick; Muller-Karger, Frank; Murawski, Steven; Hollander, David J
2017-09-01
The blowout of the Deepwater Horizon (DWH) drilling rig in 2010 released an unprecedented amount of oil at depth (1,500 m) into the Gulf of Mexico (GoM). Sedimentary geochemical data from an extensive area (~194,000 km²) were used to characterize the amount, chemical signature, distribution, and extent of the DWH oil deposited on the seafloor in 2010-2011 from coastal to deep-sea areas in the GoM. The analysis of numerous hydrocarbon compounds (N = 158) and sediment cores (N = 2,613) suggests that 1.9 ± 0.9 × 10⁴ metric tons of hydrocarbons (>C9 saturated and aromatic fractions) were deposited in 56% of the studied area, containing 21 ± 10% (up to 47%) of the total amount of oil discharged and not recovered from the DWH spill. Examination of the spatial trends and chemical diagnostic ratios indicates large deposition of weathered DWH oil in coastal and deep-sea areas and negligible deposition on the continental shelf (behaving as a transition zone in the northern GoM). The large-scale analysis of deposited hydrocarbons following the DWH spill helps in understanding the possible long-term fate of the oil released in 2010, including sedimentary transformation processes, redistribution of deposited hydrocarbons, and persistence in the environment as recycled petrocarbon.
Abbott, Stephen
2015-01-01
This lively introductory text exposes the student to the rewards of a rigorous study of functions of a real variable. In each chapter, informal discussions of questions that give analysis its inherent fascination are followed by precise, but not overly formal, developments of the techniques needed to make sense of them. By focusing on the unifying themes of approximation and the resolution of paradoxes that arise in the transition from the finite to the infinite, the text turns what could be a daunting cascade of definitions and theorems into a coherent and engaging progression of ideas. Acutely aware of the need for rigor, the student is much better prepared to understand what constitutes a proper mathematical proof and how to write one. Fifteen years of classroom experience with the first edition of Understanding Analysis have solidified and refined the central narrative of the second edition. Roughly 150 new exercises join a selection of the best exercises from the first edition, and three more project-sty...
DEFF Research Database (Denmark)
Moore, R; Brødsgaard, I; Miller, ML
1997-01-01
A quantitative method for validating qualitative interview results and checking sample parameters is described and illustrated using common pain descriptions among a sample of Anglo-American and Mandarin Chinese patients and dentists matched by age and gender. Assumptions were that subjects were ... Measures of consistency in use of descriptors within groups, validity of description, accuracy of individuals compared with others in their group, and minimum required sample size were calculated using Cronbach's alpha, factor analysis, and Bayesian probability. Ethnic and professional differences within and across groups were examined. Use of covalidating questionnaires that reflect results of qualitative interviews is recommended in order to estimate sample parameters such as intersubject agreement, individual subject accuracy, and minimum required sample sizes.
International Nuclear Information System (INIS)
Iorio, A.F.; Crespi, J.C.
1987-01-01
After ten years of operation at the Atucha I Nuclear Power Station, a gear belonging to a pressurized heavy-water reactor refuelling machine failed. The gearbox was used to operate the inlet-outlet heavy-water valve of the machine. Visual examination of the gear device showed an absence of lubricant and that several gear teeth were broken at the root. Motion was transmitted through a speed-reducing device with controlled adjustable times in order to produce a proper fit of the valve closure. The aim of this paper is to discuss the results of the gear failure analysis in order to recommend the proper solution to prevent further failures. (Author)
International Nuclear Information System (INIS)
1988-01-01
In a search for correlations between the elemental composition of trace elements in human stones and the stone types with relation to their growth pattern, a combined PIXE and x-ray diffraction spectrometry approach was implemented. The combination of scanning PIXE and XRD has proved to be an advance in the methodology of stone analysis and may point to the growth pattern in the body. The exact role of trace elements in the formation and growth of urinary stones is not fully understood. Efforts are thus continuing firstly to solve the analytical problems concerned and secondly to design suitable experiments that would provide information about the occurrence and distribution of trace elements in urine. 1 fig., 1 ref
Bianchi, Thomas S; Osburn, Christopher; Shields, Michael R; Yvon-Lewis, Shari; Young, Jordan; Guo, Laodong; Zhou, Zhengzhen
2014-08-19
Recent work has shown the presence of anomalous dissolved organic matter (DOM), with high optical yields, in deep waters 15 months after the Deepwater Horizon (DWH) oil spill in the Gulf of Mexico (GOM). Here, we continue to use the fluorescence excitation-emission matrix (EEM) technique coupled with parallel factor analysis (PARAFAC) modeling, measurements of bulk organic carbon, dissolved inorganic carbon (DIC), oil indices, and other optical properties to examine the chemical evolution and transformation of oil components derived from the DWH in the water column of the GOM. Seawater samples were collected from the GOM during July 2012, 2 years after the oil spill. This study shows that, while dissolved organic carbon (DOC) values have decreased since just after the DWH spill, they remain higher at some stations than typical deep-water values for the GOM. Moreover, we continue to observe fluorescent DOM components in deep waters, similar to those of degraded oil observed in lab and field experiments, which suggest that oil-related fluorescence signatures, as part of the DOM pool, have persisted for 2 years in the deep waters. This supports the notion that some oil-derived chromophoric dissolved organic matter (CDOM) components could still be identified in deep waters after 2 years of degradation, which is further supported by the lower DIC and partial pressure of carbon dioxide (pCO2) associated with greater amounts of these oil-derived components in deep waters, assuming microbial activity on DOM in the current water masses is the only controlling factor of DIC and pCO2 concentrations.
Schrandt, Meagan N; Andres, Michael J; Powers, Sean P; Overstreet, Robin M
2016-06-01
An undescribed, cryptic species of Didymocystis, as determined from sequences of 2 ribosomal genes and superficially similar to Didymocystis scomberomori (MacCallum and MacCallum, 1916), infected the skin of the Spanish mackerel, Scomberomorus maculatus, in the north-central Gulf of Mexico (GOM). An analysis of 558 fish from 2011 to 2013 from Louisiana, Mississippi, Alabama, and the Florida panhandle showed the prevalence of the trematode varied both spatially and temporally but not with sex of the fish host. Month, year, and geographic location were identified by a negative binomial generalized linear model as indicators of the abundance and intensity of infection. Prevalence, abundance, and intensity of infection were greatest in spring and fall months off the Florida panhandle. Furthermore, the abundance and intensity of infection correlated negatively with fork length, weight, and gonad weight of mature fish but positively with longitude. Therefore, smaller adult fish tended to be more infected than larger adults, and prevalence and intensity increased from west to east (Louisiana to Florida). Spatial and temporal trends seemed to result from physical factors (e.g., water temperature, salinity, bottom type), but they also coincided with the annual migration of S. maculatus as fish moved northward along the GOM coastline from the southern tip of Florida in the spring months and returned in the fall, being present in the north-central GOM from late spring through fall. This pattern suggests the possibility that acquisition of infections occurred from a molluscan host in waters off the Florida panhandle.
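A negative binomial generalized linear model of the kind used above can be sketched as follows. The covariates (seasonal month terms and a centred longitude) and the fixed dispersion value are illustrative assumptions for the sketch, not the study's actual design, which also included year and sampling site.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import gammaln

rng = np.random.default_rng(0)

# Hypothetical covariates: seasonal month terms and centred longitude
# (stand-ins for the month/year/location predictors in the study).
n = 500
month = rng.integers(1, 13, n)
lon = rng.uniform(-94.0, -85.0, n)            # roughly Louisiana to Florida
X = np.column_stack([np.ones(n),
                     np.sin(2 * np.pi * month / 12),
                     np.cos(2 * np.pi * month / 12),
                     lon + 90.0])

beta_true = np.array([1.0, 0.5, -0.3, 0.2])
alpha = 0.5                                    # NB2 dispersion, assumed known here
mu_true = np.exp(X @ beta_true)
counts = rng.negative_binomial(1.0 / alpha, 1.0 / (1.0 + alpha * mu_true))

def nb2_negloglik(beta):
    """Negative NB2 log-likelihood with fixed dispersion alpha."""
    mu = np.exp(np.clip(X @ beta, -20.0, 20.0))
    r = 1.0 / alpha
    ll = (gammaln(counts + r) - gammaln(r) - gammaln(counts + 1)
          + r * np.log(r / (r + mu)) + counts * np.log(mu / (r + mu)))
    return -ll.sum()

fit = minimize(nb2_negloglik, np.zeros(4), method="BFGS")
beta_hat = fit.x
```

A positive longitude coefficient in such a fit would mirror the reported west-to-east increase in infection.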
International Nuclear Information System (INIS)
Straub, W.A.
1987-01-01
This review is the seventh in the series compiled by using the Dialog on-line CA Search facilities at the Information Resource Center of USS Technical Center covering the period from Oct. 1984 to Nov. 1, 1986. The quest for better surface properties, through the application of various electrochemical and other coating techniques, seems to have increased and reinforces the notion that only through the value added to a steel by proper finishing steps can a major supplier hope to compete profitably. The detection, determination, and control of microalloying constituents has also been generating a lot of interest, as evidenced by the number of publications devoted to this subject in the last few years. Several recent review articles amplify on the recent trends in the application of modern analytical technology to steelmaking. Another review has been devoted to the determination of trace elements and the simultaneous determination of elements in metals by mass spectrometry, atomic absorption spectrometry, and multielement emission spectrometry. Problems associated with the analysis of electroplating wastewaters have been reviewed in a recent publication that has described the use of various spectrophotometric methods for this purpose. The collection and treatment of analytical data in the modern steelmaking environment have been extensively reviewed, with emphasis on the interaction of the providers and users of the analytical data, its quality, and the cost of its collection. Raw material treatment and beneficiation was the dominant theme.
Bhatia, Rajendra
1997-01-01
A good part of matrix theory is functional analytic in spirit. This statement can be turned around. There are many problems in operator theory, where most of the complexities and subtleties are present in the finite-dimensional case. My purpose in writing this book is to present a systematic treatment of methods that are useful in the study of such problems. This book is intended for use as a text for upper division and graduate courses. Courses based on parts of the material have been given by me at the Indian Statistical Institute and at the University of Toronto (in collaboration with Chandler Davis). The book should also be useful as a reference for research workers in linear algebra, operator theory, mathematical physics and numerical analysis. A possible subtitle of this book could be Matrix Inequalities. A reader who works through the book should expect to become proficient in the art of deriving such inequalities. Other authors have compared this art to that of cutting diamonds. One first has to...
International Nuclear Information System (INIS)
Thomas, R.E.
1982-03-01
An evaluation is made of the suitability of analytical and statistical sampling methods for making uncertainty analyses. The adjoint method is found to be well-suited for obtaining sensitivity coefficients for computer programs involving large numbers of equations and input parameters. For this purpose, the Latin Hypercube Sampling method is found to be inferior to conventional experimental designs. The Latin hypercube method can be used to estimate output probability density functions, but requires supplementary rank transformations followed by stepwise regression to obtain uncertainty information on individual input parameters. A simple Cork and Bottle problem is used to illustrate the efficiency of the adjoint method relative to certain statistical sampling methods. For linear models of the form Ax = b it is shown that a complete adjoint sensitivity analysis can be made without formulating and solving the adjoint problem. This can be done either by using a special type of statistical sampling or by reformulating the primal problem and using suitable linear programming software.
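For linear models Ax = b, the conventional adjoint computation that the report compares against (and shows can even be bypassed) can be illustrated in a few lines: one extra solve with A^T gives the sensitivity of a scalar output c·x to every entry of b and of A at once. This is a generic sketch, not the report's own code; the matrix and vectors are invented.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5
A = rng.normal(size=(n, n)) + n * np.eye(n)   # diagonally dominant, well-conditioned
b = rng.normal(size=n)
c = rng.normal(size=n)                        # scalar output J = c @ x

x = np.linalg.solve(A, b)                     # primal solve
lam = np.linalg.solve(A.T, c)                 # single adjoint solve

dJ_db = lam                                   # dJ/db_i  = lambda_i
dJ_dA = -np.outer(lam, x)                     # dJ/dA_ij = -lambda_i * x_j

# Finite-difference checks of both sensitivities
eps = 1e-6
b_p = b.copy(); b_p[0] += eps
fd_b = (c @ np.linalg.solve(A, b_p) - c @ x) / eps
A_p = A.copy(); A_p[1, 2] += eps
fd_A = (c @ np.linalg.solve(A_p, b) - c @ x) / eps
```

One primal solve plus one adjoint solve yields all n sensitivities to b and all n² sensitivities to A, which is why the adjoint method scales so well with the number of input parameters.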
Professional organisation profile: a faculty of expedition and wilderness medicine for Australasia.
Leggat, Peter A; Shaw, Marc T M
2012-05-01
A profile of the recent genesis of the Sub-Faculty of Expedition Medicine into a Faculty of Expedition and Wilderness Medicine of The Australasian College of Tropical Medicine is presented. Information is given on aims, structure, professional grades of membership, and the various activities of the Faculty, including publications and scientific meetings. Copyright © 2012 Elsevier Ltd. All rights reserved.
Professional organisation profile: a sub-Faculty of expedition medicine for Australasia.
Leggat, Peter A; Shaw, Marc T M
2010-05-01
A review of the recent foundation by The Australasian College of Tropical Medicine of the Sub-Faculty of Expedition Medicine is presented. Information is given on aims, professional grades of membership, and the various activities of the Sub-Faculty, including publications and scientific meetings. Copyright 2010 Elsevier Ltd. All rights reserved.
Energy Technology Data Exchange (ETDEWEB)
Ibsen, Lars Bo; Liingaard, M.
2006-12-15
This technical report concerns the basic theory and principles for experimental modal analysis. The sections within the report are: Output-only modal analysis software, general digital analysis, basics of structural dynamics and modal analysis and system identification. (au)
Theoretical numerical analysis a functional analysis framework
Atkinson, Kendall
2005-01-01
This textbook prepares graduate students for research in numerical analysis/computational mathematics by giving to them a mathematical framework embedded in functional analysis and focused on numerical analysis. This helps the student to move rapidly into a research program. The text covers basic results of functional analysis, approximation theory, Fourier analysis and wavelets, iteration methods for nonlinear equations, finite difference methods, Sobolev spaces and weak formulations of boundary value problems, finite element methods, elliptic variational inequalities and their numerical solu
International Nuclear Information System (INIS)
2003-08-01
This book deals with analysis of heat transfer, including nonlinear analysis examples, radiation heat transfer, analysis of heat transfer in ANSYS, verification of analysis results, transient heat transfer analysis with automatic time stepping and open control, analysis of heat transfer using arrangements in ANSYS, thermal contact resistance, coupled-field analysis such as thermal-structural interaction, cases of coupled-field analysis, and phase change.
Information security risk analysis
Peltier, Thomas R
2001-01-01
Effective Risk Analysis; Qualitative Risk Analysis; Value Analysis; Other Qualitative Methods; Facilitated Risk Analysis Process (FRAP); Other Uses of Qualitative Risk Analysis; Case Study; Appendix A: Questionnaire; Appendix B: Facilitated Risk Analysis Process Forms; Appendix C: Business Impact Analysis Forms; Appendix D: Sample of Report; Appendix E: Threat Definitions; Appendix F: Other Risk Analysis Opinions; Index
International Nuclear Information System (INIS)
Son, Seung Hui
2004-02-01
This book deals with information technology and business processes, information system architecture, methods of system development, planning of system development such as problem analysis and feasibility analysis, cases of system development, comprehension and analysis of users' demands, analysis of users' demands using traditional methods, users' demand analysis using integrated information system architecture, system design using integrated information system architecture, system implementation, and system maintenance.
Louisiana Geographic Information Center — Offshore Minerals Management Pipeline Locations for the Gulf of Mexico (GOM). Contains the lines of the pipeline in the GOM. All pipelines existing in the databases...
Louisiana Geographic Information Center — Offshore Minerals Management Pipeline Locations for the Gulf of Mexico (GOM). Contains the points of the pipeline in the GOM. All pipelines existing in the databases...
Analysis of Project Finance | Energy Analysis | NREL
NREL analysis helps potential renewable energy developers and investors gain insights into the complex world of project finance. Renewable energy project finance is complex, requiring knowledge of federal tax credits, state-level incentives, renewable
International Nuclear Information System (INIS)
Wright, A.C.D.
2002-01-01
This paper discusses the fundamentals of safety analysis in reactor design. This includes safety analysis done to show that the consequences of postulated accidents are acceptable. Safety analysis is also used to set the design of special safety systems, and includes design-assist analysis to support conceptual design. Safety analysis is necessary for licensing a reactor, maintaining an operating license, and supporting changes in plant operations.
An example of multidimensional analysis: Discriminant analysis
International Nuclear Information System (INIS)
Lutz, P.
1990-01-01
Among the approaches to multidimensional data analysis, lectures on discriminant analysis covering theoretical and practical aspects are presented. The discrimination problem, the analysis steps and the discrimination categories are stressed. Examples of descriptive historical analysis, discrimination for decision making, and the demonstration and separation of the top quark are given. In linear discriminant analysis the following subjects are discussed: Huyghens' theorem, projection, discriminant variables, geometrical interpretation, the case g = 2, classification methods, and separation of top events. Criteria allowing relevant results to be obtained are included [fr]
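For the two-class case (g = 2) mentioned above, linear discriminant analysis reduces to Fisher's rule w = Sw⁻¹(m1 − m0) with a midpoint threshold. A minimal sketch on synthetic "background" vs "signal-like" data (the data are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 200
# Two synthetic classes, e.g. "background" vs "signal-like" events
X0 = rng.normal(loc=[0.0, 0.0], scale=1.0, size=(n, 2))
X1 = rng.normal(loc=[2.0, 2.0], scale=1.0, size=(n, 2))

m0, m1 = X0.mean(axis=0), X1.mean(axis=0)
# Pooled within-class scatter matrix
Sw = (X0 - m0).T @ (X0 - m0) + (X1 - m1).T @ (X1 - m1)
w = np.linalg.solve(Sw, m1 - m0)          # Fisher discriminant direction
threshold = w @ (m0 + m1) / 2.0           # midpoint decision rule

correct = np.sum(X0 @ w <= threshold) + np.sum(X1 @ w > threshold)
accuracy = correct / (2 * n)
```

Projecting onto the single direction w maximizes the ratio of between-class to within-class scatter, which is the geometrical interpretation the lectures describe.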
Also known as: Sperm Analysis, Sperm Count, Seminal Fluid Analysis. Formal name: Semen Analysis. This ... semen. Viscosity: consistency or thickness of the semen. Sperm count: total number of sperm. Sperm concentration (density): number ...
Papageorgiou, Nikolaos S
2009-01-01
Offers an examination of important theoretical methods and procedures in applied analysis. This book details the important theoretical trends in nonlinear analysis and applications to different fields. It is suitable for those working on nonlinear analysis.
Shape analysis in medical image analysis
Tavares, João
2014-01-01
This book contains thirteen contributions from invited experts of international recognition addressing important issues in shape analysis in medical image analysis, including techniques for image segmentation, registration, modelling and classification, and applications in biology, as well as in cardiac, brain, spine, chest, lung and clinical practice. This volume treats topics such as, anatomic and functional shape representation and matching; shape-based medical image segmentation; shape registration; statistical shape analysis; shape deformation; shape-based abnormity detection; shape tracking and longitudinal shape analysis; machine learning for shape modeling and analysis; shape-based computer-aided-diagnosis; shape-based medical navigation; benchmark and validation of shape representation, analysis and modeling algorithms. This work will be of interest to researchers, students, and manufacturers in the fields of artificial intelligence, bioengineering, biomechanics, computational mechanics, computationa...
MMAB SST Analysis (RTG_SST_HR), Marine Modeling and Analysis Branch, EMC. For a regional map, click the desired area in the global SST analysis and anomaly maps.
Foundations of factor analysis
Mulaik, Stanley A
2009-01-01
Introduction; Factor Analysis and Structural Theories; Brief History of Factor Analysis as a Linear Model; Example of Factor Analysis; Mathematical Foundations for Factor Analysis; Introduction; Scalar Algebra; Vectors; Matrix Algebra; Determinants; Treatment of Variables as Vectors; Maxima and Minima of Functions; Composite Variables and Linear Transformations; Introduction; Composite Variables; Unweighted Composite Variables; Differentially Weighted Composites; Matrix Equations; Multi...
International Nuclear Information System (INIS)
PECH, S.H.
2000-01-01
This report describes the methodology used in conducting the K Basins Hazard Analysis, which provides the foundation for the K Basins Final Safety Analysis Report. This hazard analysis was performed in accordance with guidance provided by DOE-STD-3009-94, Preparation Guide for U. S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports and implements the requirements of DOE Order 5480.23, Nuclear Safety Analysis Report
Quantitative analysis chemistry
International Nuclear Information System (INIS)
Ko, Wansuk; Lee, Choongyoung; Jun, Kwangsik; Hwang, Taeksung
1995-02-01
This book is about quantitative analytical chemistry. It is divided into ten chapters, which deal with basic concepts of materials and the meaning of analytical chemistry together with SI units, chemical equilibrium, basic preparations for quantitative analysis, an introduction to volumetric analysis, an outline of acid-base titration with experimental examples, chelate titration, oxidation-reduction titration (introduction, titration curves, and diazotization titration), precipitation titration, electrometric titration, and quantitative analysis.
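As a worked instance of the volumetric (acid-base) analysis the book covers, the pH along a strong-acid/strong-base titration follows directly from the charge balance. The concentrations and volumes below are illustrative values, not taken from the book:

```python
import math

# 50.00 mL of 0.1000 M HCl titrated with 0.1000 M NaOH (illustrative values)
Va, Ca, Cb = 50.0, 0.1000, 0.1000
Kw = 1.0e-14

def titration_pH(Vb):
    """pH after adding Vb mL of titrant, from the charge balance
    [H+] + [Na+] = [Cl-] + [OH-]  =>  [H+]^2 + d*[H+] - Kw = 0,
    where d = [Na+] - [Cl-] after dilution."""
    na = Cb * Vb / (Va + Vb)            # Na+ from the base, diluted
    cl = Ca * Va / (Va + Vb)            # Cl- from the acid, diluted
    d = na - cl
    h = (-d + math.sqrt(d * d + 4.0 * Kw)) / 2.0   # positive root for [H+]
    return -math.log10(h)
```

Before titrant is added the pH is 1.00; at the 50.00 mL equivalence point it is exactly 7.00; past equivalence the excess base dominates and the pH climbs toward 12.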
Energy Technology Data Exchange (ETDEWEB)
PECH, S.H.
2000-08-23
This report describes the methodology used in conducting the K Basins Hazard Analysis, which provides the foundation for the K Basins Final Safety Analysis Report. This hazard analysis was performed in accordance with guidance provided by DOE-STD-3009-94, Preparation Guide for U. S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports and implements the requirements of DOE Order 5480.23, Nuclear Safety Analysis Report.
International Nuclear Information System (INIS)
WEBB, R.H.
1999-01-01
This report describes the methodology used in conducting the K Basins Hazard Analysis, which provides the foundation for the K Basins Safety Analysis Report (HNF-SD-WM-SAR-062/Rev.4). This hazard analysis was performed in accordance with guidance provided by DOE-STD-3009-94, Preparation Guide for U. S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports and implements the requirements of DOE Order 5480.23, Nuclear Safety Analysis Report
Philipp Mayring
2000-01-01
The article describes an approach to systematic, rule-guided qualitative text analysis, which tries to preserve some methodological strengths of quantitative content analysis and widen them into a concept of qualitative procedure. First the development of content analysis is delineated and the basic principles are explained (units of analysis, step models, working with categories, validity and reliability). Then the central procedures of qualitative content analysis, inductive development of ca...
RELIABILITY ANALYSIS OF BENDING ...
African Journals Online (AJOL)
Reliability analysis of the safety levels of the criteria slabs has been ... It was also noted [2] that if the risk level β < 3.1, the ... reliability analysis. A study [6] has shown that all geometric variables ...
DTI analysis methods : Voxel-based analysis
Van Hecke, Wim; Leemans, Alexander; Emsell, Louise
2016-01-01
Voxel-based analysis (VBA) of diffusion tensor imaging (DTI) data permits the investigation of voxel-wise differences or changes in DTI metrics in every voxel of a brain dataset. It is applied primarily in the exploratory analysis of hypothesized group-level alterations in DTI parameters, as it does
Analysis of Precision of Activation Analysis Method
DEFF Research Database (Denmark)
Heydorn, Kaj; Nørgaard, K.
1973-01-01
The precision of an activation-analysis method prescribes the estimation of the precision of a single analytical result. The adequacy of these estimates to account for the observed variation between duplicate results from the analysis of different samples and materials, is tested by the statistic T...
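The statistic referred to above (often written T in Heydorn's analysis of precision) compares the observed differences between duplicate results with the precision claimed a priori: if the claimed standard deviations are adequate, T follows a chi-square distribution with one degree of freedom per duplicate pair. A sketch with simulated duplicates; the σ and the number of pairs are invented:

```python
import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(3)
k = 40                                   # number of duplicate pairs
sigma = 2.0                              # a-priori standard deviation of one result
true_values = rng.uniform(50.0, 150.0, k)
x1 = true_values + rng.normal(0.0, sigma, k)
x2 = true_values + rng.normal(0.0, sigma, k)

# Differences of duplicates have variance 2*sigma^2 if the claimed precision holds
T = np.sum((x1 - x2) ** 2 / (2.0 * sigma ** 2))
p_value = chi2.sf(T, df=k)               # small p => precision was understated
```

A very small p-value signals that the stated uncertainties do not account for the observed variation between duplicates, which is exactly the adequacy test the abstract describes.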
Hazard Analysis Database Report
Grams, W H
2000-01-01
The Hazard Analysis Database was developed in conjunction with the hazard analysis activities conducted in accordance with DOE-STD-3009-94, Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports, for HNF-SD-WM-SAR-067, Tank Farms Final Safety Analysis Report (FSAR). The FSAR is part of the approved Authorization Basis (AB) for the River Protection Project (RPP). This document describes, identifies, and defines the contents and structure of the Tank Farms FSAR Hazard Analysis Database and documents the configuration control changes made to the database. The Hazard Analysis Database contains the collection of information generated during the initial hazard evaluations and the subsequent hazard and accident analysis activities. The Hazard Analysis Database supports the preparation of Chapters 3, 4, and 5 of the Tank Farms FSAR and the Unreviewed Safety Question (USQ) process and consists of two major, interrelated data sets: (1) Hazard Analysis Database: Data from t...
Santiago, John
2013-01-01
Circuits overloaded from electric circuit analysis? Many universities require that students pursuing a degree in electrical or computer engineering take an Electric Circuit Analysis course to determine who will "make the cut" and continue in the degree program. Circuit Analysis For Dummies will help these students to better understand electric circuit analysis by presenting the information in an effective and straightforward manner. Circuit Analysis For Dummies gives you clear-cut information about the topics covered in electric circuit analysis courses to help
Cluster analysis for applications
Anderberg, Michael R
1973-01-01
Cluster Analysis for Applications deals with methods and various applications of cluster analysis. Topics covered range from variables and scales to measures of association among variables and among data units. Conceptual problems in cluster analysis are discussed, along with hierarchical and non-hierarchical clustering methods. The necessary elements of data analysis, statistics, cluster analysis, and computer implementation are integrated vertically to cover the complete path from raw data to a finished analysis.Comprised of 10 chapters, this book begins with an introduction to the subject o
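The complete path from raw data to a finished hierarchical clustering, as the book describes it, can be sketched with SciPy's agglomerative implementation; the three synthetic groups below are invented for illustration:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

rng = np.random.default_rng(4)
# Three well-separated synthetic groups of 20 data units each
centers = [(0.0, 0.0), (5.0, 0.0), (0.0, 5.0)]
X = np.vstack([rng.normal(c, 0.3, size=(20, 2)) for c in centers])

# Measure of association among data units -> hierarchical merge tree -> clusters
Z = linkage(pdist(X), method="ward")     # agglomerative (Ward) dendrogram data
labels = fcluster(Z, t=3, criterion="maxclust")
```

Swapping `method="ward"` for `"single"` or `"complete"` exercises the different hierarchical strategies the book compares, while non-hierarchical methods (e.g. k-means) start from a fixed number of clusters instead of a merge tree.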
Activation analysis in food analysis. Pt. 9
International Nuclear Information System (INIS)
Szabo, S.A.
1992-01-01
An overview is presented on the application of activation analysis (AA) techniques for food analysis, as reflected at a recent international conference titled Activation Analysis and its Applications. The most popular analytical techniques include instrumental neutron AA (INAA or NAA), radiochemical NAA (RNAA), X-ray fluorescence analysis and mass spectrometry. Data are presented for the multielemental NAA of instant soups, for the elemental composition of drinking water in Iraq, for the Na, K and Mn contents of various Indian rices, for As, Hg, Sb and Se determination in various seafoods, for daily microelement uptake in China, and for the elemental composition of Chinese teas. Expected development trends in AA are outlined. (R.P.) 24 refs.; 8 tabs
Joint fluid analysis; Joint fluid aspiration ... El-Gabalawy HS. Synovial fluid analysis, synovial biopsy, and synovial pathology. In: Firestein GS, Budd RC, Gabriel SE, McInnes IB, O'Dell JR, eds. Kelly's Textbook of ...
International Nuclear Information System (INIS)
Burgess, R.L.
1978-01-01
Progress is reported on the following research programs: analysis and modeling of ecosystems; EDFB/IBP data center; biome analysis studies; land/water interaction studies; and computer programs for development of models
Confirmatory Composite Analysis
Schuberth, Florian; Henseler, Jörg; Dijkstra, Theo K.
2018-01-01
We introduce confirmatory composite analysis (CCA) as a structural equation modeling technique that aims at testing composite models. CCA entails the same steps as confirmatory factor analysis: model specification, model identification, model estimation, and model testing. Composite models are
Introductory numerical analysis
Pettofrezzo, Anthony J
2006-01-01
Written for undergraduates who require a familiarity with the principles behind numerical analysis, this classical treatment encompasses finite differences, least squares theory, and harmonic analysis. Over 70 examples and 280 exercises. 1967 edition.
Gap Analysis: Application to Earned Value Analysis
Langford, Gary O.; Franck, Raymond (Chip)
2008-01-01
Sponsored Report (for Acquisition Research Program) Earned Value is regarded as a useful tool to monitor commercial and defense system acquisitions. This paper applies the theoretical foundations and systematics of Gap Analysis to improve Earned Value Management. As currently implemented, Earned Value inaccurately provides a higher value for the work performed. This preliminary research indicates that Earned Value calculations can be corrected. Value Analysis, properly defined and enacted,...
Importance-performance analysis based SWOT analysis
Phadermrod, Boonyarat; Crowder, Richard M.; Wills, Gary B.
2016-01-01
SWOT analysis, a commonly used tool for strategic planning, is traditionally a form of brainstorming. Hence, it has been criticised that it is likely to hold subjective views of the individuals who participate in a brainstorming session and that SWOT factors are not prioritized by their significance thus it may result in an improper strategic action. While most studies of SWOT analysis have only focused on solving these shortcomings separately, this study offers an approach to diminish both s...
Discourse analysis and Foucault's
Directory of Open Access Journals (Sweden)
Jansen I.
2008-01-01
Full Text Available Discourse analysis is a method which up to now has been less recognized in nursing science, although more recently nursing scientists are discovering it for their purposes. However, several authors have criticized that discourse analysis is often misinterpreted because of a lack of understanding of its theoretical backgrounds. In this article, I reconstruct Foucault's writings in his "Archaeology of Knowledge" to provide a theoretical base for future archaeological discourse analysis, which can be categorized as a socio-linguistic discourse analysis.
Yuan, Ying; MacKinnon, David P.
2009-01-01
This article proposes Bayesian analysis of mediation effects. Compared to conventional frequentist mediation analysis, the Bayesian approach has several advantages. First, it allows researchers to incorporate prior information into the mediation analysis, thus potentially improving the efficiency of estimates. Second, under the Bayesian mediation analysis, inference is straightforward and exact, which makes it appealing for studies with small samples. Third, the Bayesian approach is conceptua...
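The core idea, posterior inference on the product of the two path coefficients a (predictor to mediator) and b (mediator to outcome), can be sketched with a crude normal approximation to the posteriors under flat priors. The simulated data and the approximation are illustrative only, not the article's actual algorithm:

```python
import numpy as np

rng = np.random.default_rng(5)
n = 300
x = rng.normal(size=n)
m = 0.5 * x + rng.normal(size=n)               # mediator model, true a = 0.5
y = 0.4 * m + 0.2 * x + rng.normal(size=n)     # outcome model, true b = 0.4

def ols(X, target):
    """OLS estimate and its covariance matrix."""
    XtX_inv = np.linalg.inv(X.T @ X)
    beta = XtX_inv @ X.T @ target
    resid = target - X @ beta
    s2 = resid @ resid / (len(target) - X.shape[1])
    return beta, s2 * XtX_inv

beta_a, cov_a = ols(np.column_stack([np.ones(n), x]), m)
beta_b, cov_b = ols(np.column_stack([np.ones(n), m, x]), y)

# Approximate posteriors (flat priors, normal approximation), then Monte Carlo
# draws of the mediated effect a*b, whose posterior need not be normal.
draws = 5000
a_post = rng.normal(beta_a[1], np.sqrt(cov_a[1, 1]), draws)
b_post = rng.normal(beta_b[1], np.sqrt(cov_b[1, 1]), draws)
ab_post = a_post * b_post
ci = np.percentile(ab_post, [2.5, 97.5])       # credible interval for a*b
```

Because inference proceeds directly from draws of a*b, no normality assumption on the product is needed, which is one of the advantages claimed for the Bayesian approach in small samples.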
Fitzmaurice, Garrett M; Ware, James H
2012-01-01
Praise for the First Edition: ". . . [this book] should be on the shelf of everyone interested in . . . longitudinal data analysis." -Journal of the American Statistical Association Features newly developed topics and applications of the analysis of longitudinal data Applied Longitudinal Analysis, Second Edition presents modern methods for analyzing data from longitudinal studies and now features the latest state-of-the-art techniques. The book emphasizes practical, rather than theoretical, aspects of methods for the analysis of diverse types of lo
Regression analysis by example
Chatterjee, Samprit
2012-01-01
Praise for the Fourth Edition: "This book is . . . an excellent source of examples for regression analysis. It has been and still is readily readable and understandable." -Journal of the American Statistical Association Regression analysis is a conceptually simple method for investigating relationships among variables. Carrying out a successful application of regression analysis, however, requires a balance of theoretical results, empirical rules, and subjective judgment. Regression Analysis by Example, Fifth Edition has been expanded
2014-01-01
M.Ing. (Electrical & Electronic Engineering) One of the most important steps to be taken before a site is selected for the extraction of wind energy is the analysis of the energy within the wind on that particular site. No wind energy analysis system exists for the measurement and analysis of wind power. This dissertation documents the design and development of a Wind Energy Analysis System (WEAS). Using a micro-controller-based design in conjunction with sensors, WEAS measures, calcu...
Slice hyperholomorphic Schur analysis
Alpay, Daniel; Sabadini, Irene
2016-01-01
This book defines and examines the counterpart of Schur functions and Schur analysis in the slice hyperholomorphic setting. It is organized into three parts: the first introduces readers to classical Schur analysis, while the second offers background material on quaternions, slice hyperholomorphic functions, and quaternionic functional analysis. The third part represents the core of the book and explores quaternionic Schur analysis and its various applications. The book includes previously unpublished results and provides the basis for new directions of research.
Computational movement analysis
Laube, Patrick
2014-01-01
This SpringerBrief discusses the characteristics of spatiotemporal movement data, including uncertainty and scale. It investigates three core aspects of Computational Movement Analysis: Conceptual modeling of movement and movement spaces, spatiotemporal analysis methods aiming at a better understanding of movement processes (with a focus on data mining for movement patterns), and using decentralized spatial computing methods in movement analysis. The author presents Computational Movement Analysis as an interdisciplinary umbrella for analyzing movement processes with methods from a range of fi
Directory of Open Access Journals (Sweden)
Marcia Caldas de Castro
2007-12-01
Full Text Available Malaria transmission in settlement projects in the Amazon, known as frontier malaria, results from an intricate process involving biological, ecological, socioeconomic, and behavioral factors, with a temporal transition from high to low transmission rates over roughly eight years. One of the major challenges is therefore to understand this process by identifying the variables that determine transmission, considering both temporal and spatial dimensions. This article presents a methodological approach that characterizes malaria risk profiles in settlement projects through a multidisciplinary analysis. Comprising three stages, the approach combines spatial analysis, geostatistics, and Grade of Membership models. The results highlight the importance of control measures differentiated according to the stage of the settlement project (recent or long-established implementation) and the level of transmission in each locality.
Thompson, Cheryl Bagley
2009-01-01
This 13th article of the Basics of Research series is first in a short series on statistical analysis. These articles will discuss creating your statistical analysis plan, levels of measurement, descriptive statistics, probability theory, inferential statistics, and general considerations for interpretation of the results of a statistical analysis.
Trend Analysis Using Microcomputers.
Berger, Carl F.
A trend analysis statistical package and additional programs for the Apple microcomputer are presented. They illustrate strategies of data analysis suitable to the graphics and processing capabilities of the microcomputer. The programs analyze data sets using examples of: (1) analysis of variance with multiple linear regression; (2) exponential…
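The ANOVA-with-regression strategy the package illustrates reduces, in its simplest form, to fitting a least-squares line through observations ordered in time. A minimal sketch in Python (the data and variable names are invented, not taken from the Apple programs described above):

```python
# Minimal regression-based trend analysis: fit y = a + b*t by ordinary
# least squares over time points t = 0..n-1. Hypothetical data only.
from statistics import mean

def linear_trend(y):
    """Return (intercept, slope) of the least-squares trend line."""
    n = len(y)
    t = list(range(n))
    tbar, ybar = mean(t), mean(y)
    b = sum((ti - tbar) * (yi - ybar) for ti, yi in zip(t, y)) / \
        sum((ti - tbar) ** 2 for ti in t)
    a = ybar - b * tbar
    return a, b

scores = [10.0, 12.0, 13.5, 15.0, 17.5]   # one observation per time point
a, b = linear_trend(scores)
print(f"intercept={a:.2f}, slope={b:.2f} per time step")
```

A positive slope indicates an upward trend; significance testing would additionally need the residual variance.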
Yuan, Ying; MacKinnon, David P.
2009-01-01
In this article, we propose Bayesian analysis of mediation effects. Compared with conventional frequentist mediation analysis, the Bayesian approach has several advantages. First, it allows researchers to incorporate prior information into the mediation analysis, thus potentially improving the efficiency of estimates. Second, under the Bayesian…
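One practical attraction of the Bayesian approach the authors describe is that the mediated effect a*b gets a credible interval directly from posterior draws. A hedged sketch, assuming approximately normal posteriors for the two paths (all numbers are hypothetical, not taken from the article):

```python
# Monte Carlo credible interval for the mediated effect a*b, assuming
# approximately normal posteriors for path a (X->M) and path b (M->Y).
# The posterior means and SDs below are invented for illustration.
import random

random.seed(42)
a_mean, a_sd = 0.40, 0.10   # assumed posterior for path a
b_mean, b_sd = 0.50, 0.12   # assumed posterior for path b

draws = sorted(random.gauss(a_mean, a_sd) * random.gauss(b_mean, b_sd)
               for _ in range(20000))
lo, hi = draws[int(0.025 * len(draws))], draws[int(0.975 * len(draws))]
print(f"95% credible interval for a*b: ({lo:.3f}, {hi:.3f})")
```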
Automation of activation analysis
International Nuclear Information System (INIS)
Ivanov, I.N.; Ivanets, V.N.; Filippov, V.V.
1985-01-01
The basic data on the methods and equipment of activation analysis are presented. Recommendations on the selection of activation analysis techniques, and especially techniques envisaging the use of short-lived isotopes, are given. The possibilities of the equipment for increasing data-channel throughput, using modern computers for the automation of the analysis and of the data-processing procedure, are shown
Cuesta, Hector
2013-01-01
Each chapter of the book quickly introduces a key 'theme' of Data Analysis, before immersing you in the practical aspects of each theme. You'll learn quickly how to perform all aspects of Data Analysis. Practical Data Analysis is a book ideal for home and small business users who want to slice & dice the data they have on hand with minimum hassle.
Mathematical analysis fundamentals
Bashirov, Agamirza
2014-01-01
The author's goal is a rigorous presentation of the fundamentals of analysis, starting from the elementary level and moving to advanced coursework. The curriculum of all mathematics (pure or applied) and physics programs includes a compulsory course in mathematical analysis. This book can serve as the main textbook for such (one-semester) courses. The book can also serve as additional reading for courses such as real analysis, functional analysis, harmonic analysis, etc. For non-math-major students requiring math beyond calculus, this is a more friendly approach than many math-centric o
Foundations of mathematical analysis
Johnsonbaugh, Richard
2010-01-01
This classroom-tested volume offers a definitive look at modern analysis, with views of applications to statistics, numerical analysis, Fourier series, differential equations, mathematical analysis, and functional analysis. Upper-level undergraduate students with a background in calculus will benefit from its teachings, along with beginning graduate students seeking a firm grounding in modern analysis. A self-contained text, it presents the necessary background on the limit concept, and the first seven chapters could constitute a one-semester introduction to limits. Subsequent chapters discuss
Analysis in usability evaluations
DEFF Research Database (Denmark)
Følstad, Asbjørn; Lai-Chong Law, Effie; Hornbæk, Kasper
2010-01-01
While the planning and implementation of usability evaluations are well described in the literature, the analysis of the evaluation data is not. We present interviews with 11 usability professionals on how they conduct analysis, describing the resources, collaboration, creation of recommendations, and prioritization involved. The interviews indicate a lack of structure in the analysis process and suggest activities, such as generating recommendations, that are unsupported by existing methods. We discuss how to better support analysis, and propose four themes for future research on analysis in usability...
DEFF Research Database (Denmark)
Bøving, Kristian Billeskov; Simonsen, Jesper
2004-01-01
This article documents how log analysis can inform qualitative studies concerning the usage of web-based information systems (WIS). No prior research has used http log files as data to study collaboration between multiple users in organisational settings. We investigate how to perform http log analysis; what http log analysis says about the nature of collaborative WIS use; and how results from http log analysis may support other data collection methods such as surveys, interviews, and observation. The analysis of log files initially lends itself to research designs which serve to test hypotheses using a quantitative methodology. We show that http log analysis can also be valuable in qualitative research such as case studies. The results from http log analysis can be triangulated with other data sources and, for example, serve as a means of supporting the interpretation of interview data...
Amir Farbin
The ATLAS Analysis Model is a continually developing vision of how to reconcile physics analysis requirements with the ATLAS offline software and computing model constraints. In the past year this vision has influenced the evolution of the ATLAS Event Data Model, the Athena software framework, and physics analysis tools. These developments, along with the October Analysis Model Workshop and the planning for CSC analyses have led to a rapid refinement of the ATLAS Analysis Model in the past few months. This article introduces some of the relevant issues and presents the current vision of the future ATLAS Analysis Model. Event Data Model The ATLAS Event Data Model (EDM) consists of several levels of details, each targeted for a specific set of tasks. For example the Event Summary Data (ESD) stores calorimeter cells and tracking system hits thereby permitting many calibration and alignment tasks, but will be only accessible at particular computing sites with potentially large latency. In contrast, the Analysis...
Multivariate analysis with LISREL
Jöreskog, Karl G; Y Wallentin, Fan
2016-01-01
This book traces the theory and methodology of multivariate statistical analysis and shows how it can be conducted in practice using the LISREL computer program. It presents not only the typical uses of LISREL, such as confirmatory factor analysis and structural equation models, but also several other multivariate analysis topics, including regression (univariate, multivariate, censored, logistic, and probit), generalized linear models, multilevel analysis, and principal component analysis. It provides numerous examples from several disciplines and discusses and interprets the results, illustrated with sections of output from the LISREL program, in the context of the example. The book is intended for masters and PhD students and researchers in the social, behavioral, economic and many other sciences who require a basic understanding of multivariate statistical theory and methods for their analysis of multivariate data. It can also be used as a textbook on various topics of multivariate statistical analysis.
International Nuclear Information System (INIS)
Actis, Oxana; Brodski, Michael; Erdmann, Martin; Fischer, Robert; Hinzmann, Andreas; Mueller, Gero; Muenzer, Thomas; Plum, Matthias; Steggemann, Jan; Winchen, Tobias; Klimkovich, Tatsiana
2010-01-01
VISPA is a development environment for high energy physics analyses which enables physicists to combine graphical and textual work. A physics analysis cycle consists of prototyping, performing, and verifying the analysis. The main feature of VISPA is a multipurpose window for visual steering of analysis steps, creation of analysis templates, and browsing physics event data at different steps of an analysis. VISPA follows an experiment-independent approach and incorporates various tools for steering and controlling required in a typical analysis. Connection to different frameworks of high energy physics experiments is achieved by using different types of interfaces. We present the look-and-feel for an example physics analysis at the LHC and explain the underlying software concepts of VISPA.
Cost benefit analysis cost effectiveness analysis
International Nuclear Information System (INIS)
Lombard, J.
1986-09-01
The comparison of various protection options in order to determine which is the best compromise between the cost of protection and the residual risk is the purpose of the ALARA procedure. The use of decision-aiding techniques is valuable as an aid to selection procedures. The purpose of this study is to introduce two rather simple and well-known decision-aiding techniques: cost-effectiveness analysis and cost-benefit analysis. These two techniques are relevant to the majority of ALARA decisions that require a quantitative technique. The study is based on a hypothetical case of 10 protection options. Four methods are applied to the data
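The cost-effectiveness side of such a comparison can be sketched as ranking options by cost per unit of dose averted. A toy example in Python (the option names, costs, and dose reductions are invented, not the study's 10-option data):

```python
# Illustrative cost-effectiveness comparison of protection options, in the
# spirit of the ALARA study described above (all numbers are invented).
options = {                      # option: (cost in k$, dose averted, person-Sv)
    "A": (50.0, 0.8),
    "B": (120.0, 1.5),
    "C": (200.0, 1.9),
}

# Cost-effectiveness ratio: cost per unit of dose averted (lower is better).
ratios = {name: cost / averted for name, (cost, averted) in options.items()}
best = min(ratios, key=ratios.get)
print(f"best option by cost-effectiveness: {best} "
      f"({ratios[best]:.1f} k$ per person-Sv averted)")
```

A full cost-benefit analysis would instead monetize the averted dose and maximize net benefit rather than minimize the ratio.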
International Nuclear Information System (INIS)
Sommer, S; Tinh Tran, T.
2008-01-01
Washington Safety Management Solutions, LLC developed web-based software to improve the efficiency and consistency of hazard identification and analysis, control selection and classification, and to standardize analysis reporting at the Savannah River Site. In the new nuclear age, information technology provides methods to improve the efficiency of the documented safety analysis development process, which includes hazard analysis activities. This software provides a web interface that interacts with a relational database to support analysis, record data, and ensure reporting consistency. A team of subject matter experts participated in a series of meetings to review the associated processes and procedures for requirements and standard practices. Through these meetings, a set of software requirements was developed and compiled into a requirements traceability matrix from which the software could be developed. The software was tested to ensure compliance with the requirements. Training was provided to the hazard analysis leads. Hazard analysis teams using the software have verified its operability. The software has been classified as NQA-1, Level D, as it supports the analysis team but does not perform the analysis. The software can be transported to other sites with alternate risk schemes. It is being used to support the development of 14 hazard analyses. User responses have been positive, with a number of suggestions for improvement that are being incorporated as time permits. The software has enforced a uniform implementation of the site procedures and has significantly improved the efficiency and standardization of the hazard analysis process
Functional analysis and applications
Siddiqi, Abul Hasan
2018-01-01
This self-contained textbook discusses all major topics in functional analysis. Combining classical materials with new methods, it supplies numerous relevant solved examples and problems and discusses the applications of functional analysis in diverse fields. The book is unique in its scope, and a variety of applications of functional analysis and operator-theoretic methods are devoted to each area of application. Each chapter includes a set of problems, some of which are routine and elementary, and some of which are more advanced. The book is primarily intended as a textbook for graduate and advanced undergraduate students in applied mathematics and engineering. It offers several attractive features making it ideally suited for courses on functional analysis intended to provide a basic introduction to the subject and the impact of functional analysis on applied and computational mathematics, nonlinear functional analysis and optimization. It introduces emerging topics like wavelets, Gabor system, inverse pro...
DEFF Research Database (Denmark)
Bemman, Brian; Meredith, David
In recent years, a significant body of research has focused on developing algorithms for computing analyses of musical works automatically from encodings of these works' surfaces [3,4,7,10,11]. The quality of the output of such analysis algorithms is typically evaluated by comparing it with a “ground truth” analysis of the same music produced by a human expert (see, in particular, [5]). In this paper, we explore the problem of generating an encoding of the musical surface of a work automatically from a systematic encoding of an analysis. The ability to do this depends on one having an effective (i.e., computable), correct and complete description of some aspect of the structure of the music. Generating the surface structure of a piece from an analysis in this manner serves as a proof of the analysis' correctness, effectiveness and completeness. We present a reductive analysis...
Fundamentals of functional analysis
Farenick, Douglas
2016-01-01
This book provides a unique path for graduate or advanced undergraduate students to begin studying the rich subject of functional analysis with fewer prerequisites than is normally required. The text begins with a self-contained and highly efficient introduction to topology and measure theory, which focuses on the essential notions required for the study of functional analysis, and which are often buried within full-length overviews of the subjects. This is particularly useful for those in applied mathematics, engineering, or physics who need to have a firm grasp of functional analysis, but not necessarily some of the more abstruse aspects of topology and measure theory normally encountered. The reader is assumed to only have knowledge of basic real analysis, complex analysis, and algebra. The latter part of the text provides an outstanding treatment of Banach space theory and operator theory, covering topics not usually found together in other books on functional analysis. Written in a clear, concise manner,...
DEFF Research Database (Denmark)
This book provides an in-depth introduction and overview of current research in computational music analysis. Its seventeen chapters, written by leading researchers, collectively represent the diversity as well as the technical and philosophical sophistication of the work being done today … on well-established theories in music theory and analysis, such as Forte's pitch-class set theory, Schenkerian analysis, the methods of semiotic analysis developed by Ruwet and Nattiez, and Lerdahl and Jackendoff's Generative Theory of Tonal Music. The book is divided into six parts, covering … music analysis, the book provides an invaluable resource for researchers, teachers and students in music theory and analysis, computer science, music information retrieval and related disciplines. It also provides a state-of-the-art reference for practitioners in the music technology industry.
Analysis apparatus and method of analysis
International Nuclear Information System (INIS)
1976-01-01
A continuous-streaming method developed for the execution of immunoassays is described in this patent. In addition, a suitable apparatus for the method was developed, whereby magnetic particles are automatically employed for the consecutive analysis of a series of liquid samples via the RIA technique
International Nuclear Information System (INIS)
Dougherty, E.M.; Fragola, J.R.
1988-01-01
The authors present a treatment of human reliability analysis, incorporating an introduction to probabilistic risk assessment for nuclear power generating stations. They treat the subject within the framework established for general systems theory, drawing upon reliability analysis, psychology, human factors engineering, and statistics, and integrating elements of these fields within a systems framework. The book provides a history of human reliability analysis and includes examples of the application of the systems approach
Emission spectrochemical analysis
International Nuclear Information System (INIS)
Rives, R.D.; Bruks, R.R.
1983-01-01
The emission spectrochemical method of analysis, based on the fact that atoms of elements can be excited in an electric arc or in a laser beam and will emit radiation with characteristic wavelengths, is considered. The review contains data on the spectrochemical analysis of liquids and geological materials, and a scheme of a laser microprobe. The main characteristics of emission spectroscopy, atomic absorption spectroscopy, and X-ray fluorescence analysis are generalized
International Nuclear Information System (INIS)
Crawford, H.J.; Lindstrom, P.J.
1983-06-01
Our analysis program LULU has proven very useful in all stages of experiment analysis, from prerun detector debugging through final data reduction. It has solved our problem of having arbitrary word length events and is easy enough to use that many separate experimenters are now analyzing with LULU. The ability to use the same software for all stages of experiment analysis greatly eases the programming burden. We may even get around to making the graphics elegant someday
Mastering Clojure data analysis
Rochester, Eric
2014-01-01
This book consists of a practical, example-oriented approach that aims to help you learn how to use Clojure for data analysis quickly and efficiently. This book is great for those who have experience with Clojure and who need to use it to perform data analysis. This book will also be hugely beneficial for readers with basic experience in data analysis and statistics.
Fast neutron activation analysis
International Nuclear Information System (INIS)
Pepelnik, R.
1986-01-01
Since 1981, numerous 14 MeV neutron activation analyses have been performed at Korona. On the basis of that work, the advantages of this analysis technique and the results obtained with it are compared with those of other analytical methods. The procedure of activation analysis, the characteristics of Korona, some analytical investigations in environmental research and materials physics, as well as sources of systematic errors in trace analysis, are described. (orig.)
Crisan, Dan
2011-01-01
"Stochastic Analysis" aims to provide mathematical tools to describe and model high dimensional random systems. Such tools arise in the study of Stochastic Differential Equations and Stochastic Partial Differential Equations, Infinite Dimensional Stochastic Geometry, Random Media and Interacting Particle Systems, Super-processes, Stochastic Filtering, Mathematical Finance, etc. Stochastic Analysis has emerged as a core area of late 20th century Mathematics and is currently undergoing a rapid scientific development. The special volume "Stochastic Analysis 2010" provides a sa
The ATLAS Analysis Architecture
International Nuclear Information System (INIS)
Cranmer, K.S.
2008-01-01
We present an overview of the ATLAS analysis architecture including the relevant aspects of the computing model and the major architectural aspects of the Athena framework. Emphasis will be given to the interplay between the analysis use cases and the technical aspects of the architecture including the design of the event data model, transient-persistent separation, data reduction strategies, analysis tools, and ROOT interoperability
Circuit analysis with Multisim
Baez-Lopez, David
2011-01-01
This book is concerned with circuit simulation using National Instruments Multisim. It focuses on the use and comprehension of the working techniques for electrical and electronic circuit simulation. The first chapters are devoted to basic circuit analysis. It starts by describing in detail how to perform a DC analysis using only resistors and independent and controlled sources. Then, it introduces capacitors and inductors to make a transient analysis. In the case of transient analysis, it is possible to have an initial condition either in the capacitor voltage or in the inductor current, or both
Textile Technology Analysis Lab
Federal Laboratory Consortium — The Textile Analysis Lab is built for evaluating and characterizing the physical properties of an array of textile materials, but specifically those used in aircrew...
DEFF Research Database (Denmark)
Sørensen, Olav Jull
2009-01-01
The review presents the book International Market Analysis: Theories and Methods, written by John Kuiada, professor at the Centre of International Business, Department of Business Studies, Aalborg University. The book is refreshingly new in its way of looking at a classical problem: it looks at market analysis from the point of view of ways of thinking about markets. Furthermore, the book includes the concept of learning in the analysis of markets and how the way we understand business reality influences our choice of methodology for market analysis.
Chemical Security Analysis Center
Federal Laboratory Consortium — In 2006, by Presidential Directive, DHS established the Chemical Security Analysis Center (CSAC) to identify and assess chemical threats and vulnerabilities in the...
Geospatial Data Analysis Facility
Federal Laboratory Consortium — Geospatial application development, location-based services, spatial modeling, and spatial analysis are examples of the many research applications that this facility...
National Research Council Canada - National Science Library
Gilbert, John
1984-01-01
... quantification methods used in the analysis of mycotoxins in foods - Confirmation and quantification of trace organic food contaminants by mass spectrometry-selected ion monitoring - Chemiluminescence...
Federal Laboratory Consortium — FUNCTION: Uses state-of-the-art instrumentation for qualitative and quantitative analysis of organic and inorganic compounds, and biomolecules from gas, liquid, and...
Thermogravimetric Analysis Laboratory
Federal Laboratory Consortium — At NETL’s Thermogravimetric Analysis Laboratory in Morgantown, WV, researchers study how chemical looping combustion (CLC) can be applied to fossil energy systems....
Sensitivity and uncertainty analysis
Cacuci, Dan G; Navon, Ionel Michael
2005-01-01
As computer-assisted modeling and analysis of physical processes have continued to grow and diversify, sensitivity and uncertainty analyses have become indispensable scientific tools. Sensitivity and Uncertainty Analysis. Volume I: Theory focused on the mathematical underpinnings of two important methods for such analyses: the Adjoint Sensitivity Analysis Procedure and the Global Adjoint Sensitivity Analysis Procedure. This volume concentrates on the practical aspects of performing these analyses for large-scale systems. The applications addressed include two-phase flow problems, a radiative c
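For readers without the adjoint machinery the book develops, local sensitivities can always be checked by brute-force forward finite differences; the adjoint procedures obtain the same derivatives far more cheaply when there are many parameters. A sketch on an invented two-parameter model:

```python
# Brute-force local sensitivity check by forward finite differences.
# The model and its parameters are invented; this is purely illustrative.
def model(p):
    k, q = p                      # two hypothetical model parameters
    return k ** 2 * q + 3.0 * q   # scalar response of interest

def sensitivities(f, p, h=1e-6):
    """Approximate dR/dp_i for each parameter by a one-sided difference."""
    base = f(p)
    grads = []
    for i in range(len(p)):
        pert = list(p)
        pert[i] += h
        grads.append((f(pert) - base) / h)
    return grads

g = sensitivities(model, [2.0, 5.0])
print(g)   # analytically [2*k*q, k**2 + 3] = [20, 7] at (k, q) = (2, 5)
```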
Rosenbauer, Robert J.; Campbell, Pamela L.; Lam, Angela; Lorenson, T.D.; Hostettler, Frances D.; Thomas, Burt; Wong, Florence L.
2010-01-01
Hydrocarbons were extracted and analyzed from sediment and tarballs collected from the northern Gulf of Mexico (nGOM) coast that is potentially impacted by Macondo-1 (M-1) well oil. The samples were analyzed for a suite of diagnostic geochemical biomarkers. Aided by multivariate statistical analysis, the M-1 well oil has been identified in sediment and tarballs collected from Louisiana, Alabama, Mississippi, and Florida. None of the sediment hydrocarbon extracts from Texas correlated with the M-1 well oil. Oil-impacted sediments are confined to the shoreline adjacent to the cumulative oil slick of the Deepwater Horizon oil spill, and no impact was observed outside of this area.
Hox, J.J.; Maas, C.J.M.; Lensvelt-Mulders, G.J.L.M.
2004-01-01
The goal of meta-analysis is to integrate the research results of a number of studies on a specific topic. Characteristic for meta-analysis is that in general only the summary statistics of the studies are used and not the original data. When the published research results to be integrated
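The summary-statistics-only character of meta-analysis can be illustrated with fixed-effect, inverse-variance pooling, which needs nothing but each study's effect estimate and sampling variance. A sketch with invented study data (this is the generic fixed-effect estimator, not necessarily the model the authors use):

```python
# Minimal fixed-effect meta-analysis from summary statistics only.
# Effect sizes and variances below are invented for illustration.
import math

effects   = [0.30, 0.45, 0.10, 0.25]   # per-study effect sizes (assumed)
variances = [0.02, 0.05, 0.03, 0.04]   # per-study sampling variances (assumed)

weights = [1.0 / v for v in variances]           # inverse-variance weights
pooled  = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
se      = math.sqrt(1.0 / sum(weights))
print(f"pooled effect = {pooled:.3f} (SE {se:.3f})")
```

Random-effects models add a between-study variance component to the weights; the inputs remain summary statistics.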
International Nuclear Information System (INIS)
Hahn, A.A.
1994-11-01
The complexity of instrumentation sometimes requires data analysis to be done before the result is presented to the control room. This tutorial reviews some of the theoretical assumptions underlying the more popular forms of data analysis and presents simple examples to illuminate the advantages and hazards of different techniques
Activation analysis. Detection limits
International Nuclear Information System (INIS)
Revel, G.
1999-01-01
Numerical data and limits of detection related to the four irradiation modes often used in activation analysis (reactor neutrons, 14 MeV neutrons, gamma photons, and charged particles) are presented here. The technical presentation of activation analysis is detailed in paper P 2565 of Techniques de l'Ingenieur. (A.L.B.)
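As an illustration of how such detection limits are typically computed for counting measurements, the classic Currie approximation can be sketched as follows (this is the generic formula for a paired blank, not a reproduction of the paper's numerical data):

```python
# Currie-style a-priori detection limit for a counting measurement:
# L_D ~= 2.71 + 4.65*sqrt(B) counts, for a paired blank with B background
# counts and 5% false-positive / false-negative risks. Hedged sketch only.
import math

def detection_limit(background_counts):
    """Approximate detection limit in counts (95%/95% criterion)."""
    return 2.71 + 4.65 * math.sqrt(background_counts)

B = 400.0                          # hypothetical background counts
print(f"L_D = {detection_limit(B):.2f} counts")
```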
SMART performance analysis methodology
International Nuclear Information System (INIS)
Lim, H. S.; Kim, H. C.; Lee, D. J.
2001-04-01
To ensure the required and desired operation over the plant lifetime, the performance analysis for the SMART NSSS design is carried out by means of the analysis methodologies specified for the performance-related design basis events (PRDBE). A PRDBE is an occurrence (event) that shall be accommodated in the design of the plant and whose consequences would be no more severe than the normal service effects on the plant equipment. The performance analysis methodology, which systematizes the methods and procedures for analyzing the PRDBEs, is as follows. Based on the operation modes suited to the characteristics of the SMART NSSS, the corresponding PRDBEs and the allowable ranges of process parameters for these events are deduced. With the control logic developed for each operation mode, the system thermal-hydraulics are analyzed for the chosen PRDBEs using the system analysis code. In particular, because the system characteristics of SMART differ from those of existing commercial nuclear power plants, the operation modes, PRDBEs, control logic, and analysis code must be consistent with the SMART design. This report presents the categories of PRDBEs chosen for each operation mode, the transitions among them, and the acceptance criteria for each PRDBE. It also includes the analysis methods and procedures for each PRDBE and the concept of the control logic for each operation mode. This report, in which the overall details of SMART performance analysis are specified based on the current SMART design, can therefore serve as a guide for the detailed performance analysis
Contrast analysis : A tutorial
Haans, A.
2018-01-01
Contrast analysis is a relatively simple but effective statistical method for testing theoretical predictions about differences between group means against the empirical data. Despite its advantages, contrast analysis is hardly used to date, perhaps because it is not implemented in a convenient
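The core computation is simple: a contrast is a weighted sum of group means, with weights chosen to encode the theoretical prediction and constrained to sum to zero. A minimal sketch with invented data and a linear-trend contrast:

```python
# Minimal contrast analysis: test a linear-trend prediction across three
# group means with zero-sum contrast weights. All data are invented.
from statistics import mean, variance

groups = {                                    # hypothetical group scores
    "low":    [2.0, 3.0, 2.5, 3.5],
    "medium": [4.0, 4.5, 5.0, 4.5],
    "high":   [6.0, 6.5, 7.0, 6.5],
}
weights = {"low": -1.0, "medium": 0.0, "high": 1.0}   # linear-trend contrast

L = sum(w * mean(groups[g]) for g, w in weights.items())   # contrast value

# Standard error from the pooled within-group variance (equal n per group).
n = 4
sp2 = mean([variance(scores) for scores in groups.values()])
se = (sp2 * sum(w ** 2 / n for w in weights.values())) ** 0.5
print(f"contrast L = {L:.2f}, t = {L / se:.2f}")
```

The t statistic is referred to a t distribution with the within-group degrees of freedom; a large value supports the predicted ordering of means.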
Interactive Controls Analysis (INCA)
Bauer, Frank H.
1989-01-01
Version 3.12 of INCA provides user-friendly environment for design and analysis of linear control systems. System configuration and parameters easily adjusted, enabling INCA user to create compensation networks and perform sensitivity analysis in convenient manner. Full complement of graphical routines makes output easy to understand. Written in Pascal and FORTRAN.
Marketing research cluster analysis
Directory of Open Access Journals (Sweden)
Marić Nebojša
2002-01-01
Full Text Available One area of application of cluster analysis in marketing is the identification of groups of cities and towns with similar demographic profiles. This paper considers the main aspects of cluster analysis through an example of clustering 12 cities using Minitab software.
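The kind of clustering the paper performs in Minitab can be sketched in pure Python as a small k-means grouping cities by a two-variable demographic profile. The six cities and their (median age, % growth) values are invented for illustration, not the paper's 12-city data:

```python
# Toy k-means clustering of cities by demographic profile (invented data).
import math

cities = {
    "A": (28.0, 3.0), "B": (30.0, 2.8), "C": (29.0, 3.2),
    "D": (44.0, 0.5), "E": (46.0, 0.4), "F": (45.0, 0.6),
}

def assign(points, centers):
    """Assign each point to its nearest center (Euclidean distance)."""
    clusters = [[] for _ in centers]
    for name, p in points.items():
        i = min(range(len(centers)), key=lambda c: math.dist(p, centers[c]))
        clusters[i].append(name)
    return clusters

def kmeans(points, k=2, iters=10):
    centers = list(points.values())[:k]    # crude seeding: first k points
    for _ in range(iters):
        clusters = assign(points, centers)
        centers = [
            tuple(sum(points[n][d] for n in names) / len(names) for d in (0, 1))
            if names else centers[i]       # keep old center if cluster empties
            for i, names in enumerate(clusters)
        ]
    return assign(points, centers)

groups = kmeans(cities)
print(groups)   # young/growing cities vs older/slower-growing ones
```

A real analysis would standardize many more demographic variables before clustering, as clustering is scale-sensitive.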
SWOT ANALYSIS - CHINESE PETROLEUM
Directory of Open Access Journals (Sweden)
Chunlan Wang
2014-01-01
Full Text Available This article was written in early December 2013 and combines the historical development of and the latest data on Chinese Petroleum in a SWOT analysis. The paper discusses corporate resources, cost, and management, as well as external factors such as the political environment and market supply and demand, and conducts a comprehensive and profound analysis.
de Roon, F.A.; Nijman, T.E.; Ter Horst, J.R.
2000-01-01
In this paper we evaluate applications of (return-based) style analysis. The portfolio and positivity constraints imposed by style analysis are useful in constructing mimicking portfolios without short positions. Such mimicking portfolios can be used, e.g., to construct efficient portfolios of mutual
F.A. de Roon (Frans); T.E. Nijman (Theo); B.J.M. Werker
2000-01-01
In this paper we evaluate applications of (return-based) style analysis. The portfolio and positivity constraints imposed by style analysis are useful in constructing mimicking portfolios without short positions. Such mimicking portfolios can be used e.g. to construct efficient
Directory of Open Access Journals (Sweden)
Satu Elo
2014-02-01
Full Text Available Qualitative content analysis is commonly used for analyzing qualitative data. However, few articles have examined the trustworthiness of its use in nursing science studies. The trustworthiness of qualitative content analysis is often presented by using terms such as credibility, dependability, confirmability, transferability, and authenticity. This article focuses on trustworthiness based on a review of previous studies, our own experiences, and methodological textbooks. Trustworthiness was described for the main qualitative content analysis phases, from data collection to reporting of the results. We concluded that it is important to scrutinize the trustworthiness of every phase of the analysis process, including the preparation, organization, and reporting of results. Together, these phases should give a reader a clear indication of the overall trustworthiness of the study. Based on our findings, we compiled a checklist for researchers attempting to improve the trustworthiness of a content analysis study. The discussion in this article helps to clarify how content analysis should be reported in a valid and understandable manner, which would be of particular benefit to reviewers of scientific articles. Furthermore, we discuss that it is often difficult to evaluate the trustworthiness of qualitative content analysis studies because of inadequate descriptions of the data collection method and/or the analysis.
Schraagen, J.M.C.
2000-01-01
Cognitive task analysis is defined as the extension of traditional task analysis techniques to yield information about the knowledge, thought processes and goal structures that underlie observable task performance. Cognitive task analyses are conducted for a wide variety of purposes, including the
DEFF Research Database (Denmark)
Damkilde, Lars
2007-01-01
Limit State analysis has a long history and many prominent researchers have contributed. The theoretical foundation is based on the upper- and lower-bound theorems, which give a very comprehensive and elegant formulation of complicated physical problems. In the pre-computer age Limit State analysis ... also enabled engineers to solve practical problems within reinforced concrete, steel structures and geotechnics...
Verhoosel, C.V.; Scott, M.A.; Borden, M.J.; Borst, de R.; Hughes, T.J.R.; Mueller-Hoeppe, D.; Loehnert, S.; Reese, S.
2011-01-01
Isogeometric analysis is a versatile tool for failure analysis. On the one hand, the excellent control over the inter-element continuity conditions enables a natural incorporation of continuum constitutive relations that incorporate higher-order strain gradients, as in gradient plasticity or damage.
DEFF Research Database (Denmark)
Durbin, Richard; Eddy, Sean; Krogh, Anders Stærmose
This book provides an up-to-date and tutorial-level overview of sequence analysis methods, with particular emphasis on probabilistic modelling. Discussed methods include pairwise alignment, hidden Markov models, multiple alignment, profile searches, RNA secondary structure analysis, and phylogene...
International Nuclear Information System (INIS)
Arien, B.
2000-01-01
The objective of SCK-CEN's programme on reactor safety is to develop expertise in probabilistic and deterministic reactor safety analysis. The research programme consists of two main activities, in particular the development of software for reliability analysis of large systems and participation in the international PHEBUS-FP programme for severe accidents. Main achievements in 1999 are reported
Factorial Analysis of Profitability
Georgeta VINTILA; Ilie GHEORGHE; Ioana Mihaela POCAN; Madalina Gabriela ANGHEL
2012-01-01
The DuPont analysis system is based on decomposing the profitability ratio into factors of influence. This paper describes the factorial analysis of profitability based on the DuPont system. Significant attention is given to the impact of various indicators on share value and profitability.
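The DuPont decomposition described above can be sketched numerically. The figures below are hypothetical and only illustrate the factorization, not data from the paper:

```python
# Hypothetical financial figures (illustration only, not from the paper).
net_income = 120.0
revenue = 1500.0
total_assets = 2000.0
equity = 800.0

# DuPont decomposition: ROE = net profit margin x asset turnover x equity multiplier
net_profit_margin = net_income / revenue       # 120 / 1500 = 0.08
asset_turnover = revenue / total_assets        # 1500 / 2000 = 0.75
equity_multiplier = total_assets / equity      # 2000 / 800 = 2.5

roe = net_profit_margin * asset_turnover * equity_multiplier
# The product equals the direct ratio, which is the point of the factorization:
assert abs(roe - net_income / equity) < 1e-12
print(round(roe, 4))  # 0.15
```

Each factor isolates one driver of profitability (operating efficiency, asset use, leverage), so a change in ROE can be traced to the factor that moved.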
Spool assembly support analysis
International Nuclear Information System (INIS)
Norman, B.F.
1994-01-01
This document provides the wind/seismic analysis and evaluation for the pump pit spool assemblies. Hand calculations were used for the analysis. UBC, AISC, and load factors were used in this evaluation. The results show that the actual loads are under the allowable loads and all requirements are met
International Nuclear Information System (INIS)
Hansen, J.D.
1976-01-01
This article discusses the partial wave analysis of two, three and four meson systems. The difference between the two approaches, referred to as amplitude and Ascoli analysis is discussed. Some of the results obtained with these methods are shown. (B.R.H.)
Enabling interdisciplinary analysis
L. M. Reid
1996-01-01
New requirements for evaluating environmental conditions in the Pacific Northwest have led to increased demands for interdisciplinary analysis of complex environmental problems. Procedures for watershed analysis have been developed for use on public and private lands in Washington State (Washington Forest Practices Board 1993) and for federal lands in the Pacific...
Shot loading platform analysis
International Nuclear Information System (INIS)
Norman, B.F.
1994-01-01
This document provides the wind/seismic analysis and evaluation for the shot loading platform. Hand calculations were used for the analysis. AISC and UBC load factors were used in this evaluation. The results show that the actual loads are under the allowable loads and all requirements are met
Marketing research cluster analysis
Marić Nebojša
2002-01-01
One area of application of cluster analysis in marketing is the identification of groups of cities and towns with similar demographic profiles. This paper considers the main aspects of cluster analysis through an example of clustering 12 cities using Minitab software.
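The kind of clustering described above can be sketched in a few lines. The paper used Minitab; this numpy sketch uses synthetic demographic profiles for 12 hypothetical cities and a plain k-means loop, purely to illustrate the idea:

```python
import numpy as np

# Synthetic demographic profiles (population, growth rate) for 12 hypothetical
# cities: six "small" and six "large". Illustrative data only.
rng = np.random.default_rng(0)
small = rng.normal([30_000, 0.15], [3_000, 0.02], size=(6, 2))
large = rng.normal([250_000, 0.05], [20_000, 0.02], size=(6, 2))
cities = np.vstack([small, large])

# Standardize so population scale does not dominate the distance metric.
z = (cities - cities.mean(axis=0)) / cities.std(axis=0)

# Plain k-means with k = 2, seeded with one city from each synthetic group.
centers = z[[0, 6]]
for _ in range(20):
    d = np.linalg.norm(z[:, None, :] - centers[None, :, :], axis=2)
    labels = d.argmin(axis=1)                 # nearest-center assignment
    centers = np.array([z[labels == k].mean(axis=0) for k in range(2)])

print(labels)  # the two synthetic groups separate into two clusters
```

Standardizing before clustering is the important design choice here: without it, Euclidean distance is driven almost entirely by the population column.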
Towards Cognitive Component Analysis
DEFF Research Database (Denmark)
Hansen, Lars Kai; Ahrendt, Peter; Larsen, Jan
2005-01-01
Cognitive component analysis (COCA) is here defined as the process of unsupervised grouping of data such that the ensuing group structure is well-aligned with that resulting from human cognitive activity. We have earlier demonstrated that independent component analysis is relevant for representing...
Satu Elo; Maria Kääriäinen; Outi Kanste; Tarja Pölkki; Kati Utriainen; Helvi Kyngäs
2014-01-01
Qualitative content analysis is commonly used for analyzing qualitative data. However, few articles have examined the trustworthiness of its use in nursing science studies. The trustworthiness of qualitative content analysis is often presented by using terms such as credibility, dependability, confirmability, transferability, and authenticity. This article focuses on trustworthiness based on a review of previous studie...
Interaction Analysis and Supervision.
Amidon, Edmund
This paper describes a model that uses interaction analysis as a tool to provide feedback to a teacher in a microteaching situation. The author explains how interaction analysis can be used for teacher improvement, describes the category system used in the model, the data collection methods used, and the feedback techniques found in the model. (JF)
Activation analysis. Chapter 4
International Nuclear Information System (INIS)
1976-01-01
The principle, sample and calibration standard preparation, activation by neutrons, charged particles and gamma radiation, sample transport after activation, activity measurement, and chemical sample processing are described for activation analysis. Possible applications are shown of nondestructive activation analysis. (J.P.)
Donahue, Craig J.; Rais, Elizabeth A.
2009-01-01
This lab experiment illustrates the use of thermogravimetric analysis (TGA) to perform proximate analysis on a series of coal samples of different rank. Peat and coke are also examined. A total of four exercises are described. These are dry exercises as students interpret previously recorded scans. The weight percent moisture, volatile matter,…
Ian M. Franks; Mike Hughes
2004-01-01
This book addresses and appropriately explains the notational analysis of technique, tactics, individual athlete/team exercise and work-rate in sport. The book offers guidance in: developing a system, analysis of data, effective coaching using notational performance analysis and modeling sport behaviors. It updates and improves the 1997 edition.
Directory of Open Access Journals (Sweden)
Ian M. Franks
2004-06-01
Full Text Available This book addresses and appropriately explains the notational analysis of technique, tactics, individual athlete/team exercise and work-rate in sport. The book offers guidance in: developing a system, analysis of data, effective coaching using notational performance analysis and modeling sport behaviors. It updates and improves the 1997 edition.
Analysis of Suspended-Sediment Dynamics in Gulf of Mexico Estuaries Using MODIS/Terra 250-m Imagery
McCarthy, M. J.; Otis, D. B.; Muller-Karger, F. E.; Mendez-Lazaro, P.; Chen, F. R.
2016-02-01
Suspended sediments in coastal ecosystems reduce light penetration, degrade water quality, and inhibit primary production. In this study, a 15-year Moderate Resolution Imaging Spectroradiometer (MODIS/Terra) turbidity time-series was developed for use in the estuaries of the Gulf of Mexico (GOM). Remote-sensing reflectance (Rrs) at 645 nm and 250-m resolution was validated with in-situ turbidity measurements in these estuaries: Coastal Bend Bays (TX), Galveston Bay (TX), Barataria and Terrebonne Bays (LA), Mobile Bay (AL), Tampa Bay (FL), Sarasota Bay (FL), and Charlotte Harbor (FL). Mean values of turbidity over the time-series ranged from 2.5 NTU to over 10 NTU. Turbidity patterns exhibited seasonal cycles with peak values generally found during spring months, although there is considerable variability in the timing of peak turbidity. Episodes of elevated turbidity ranged from 6 episodes in Galveston Bay to 15 in Mobile Bay. The spatial extent of elevated turbidity within estuaries, frequency and duration of turbidity events, and potential driving factors behind episodes of elevated turbidity were also examined.
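The episode counting described in the abstract can be sketched with a synthetic series. The study used a 15-year MODIS/Terra Rrs(645) record; the series, threshold rule, and run-counting below are illustrative assumptions, not the study's method:

```python
import numpy as np

# Synthetic monthly turbidity series (NTU) for one hypothetical estuary,
# with a seasonal cycle peaking in "spring" plus skewed noise.
rng = np.random.default_rng(1)
months = np.arange(180)                                    # 15 years
seasonal = 5 + 3 * np.sin(2 * np.pi * (months % 12) / 12)
turbidity = seasonal + rng.gamma(2.0, 0.5, size=180)

# Flag "episodes of elevated turbidity": runs of consecutive months above a
# climatological threshold (here, mean + 1 standard deviation -- an assumption).
threshold = turbidity.mean() + turbidity.std()
above = turbidity > threshold
# An episode starts at a month above threshold whose predecessor was not.
starts = above & ~np.concatenate(([False], above[:-1]))
print(int(starts.sum()), "episodes of elevated turbidity")
```

Counting run *starts* rather than flagged months is what distinguishes the number of episodes from their total duration, the two quantities the study reports separately.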
Allain, Rhett
2016-05-01
We currently live in a world filled with videos. There are videos on YouTube, feature movies and even videos recorded with our own cameras and smartphones. These videos present an excellent opportunity to not only explore physical concepts, but also inspire others to investigate physics ideas. With video analysis, we can explore the fantasy world in science-fiction films. We can also look at online videos to determine if they are genuine or fake. Video analysis can be used in the introductory physics lab and it can even be used to explore the make-believe physics embedded in video games. This book covers the basic ideas behind video analysis along with the fundamental physics principles used in video analysis. The book also includes several examples of the unique situations in which video analysis can be used.
Ramsay, J O
1997-01-01
Scientists today collect samples of curves and other functional observations. This monograph presents many ideas and techniques for such data. Included are expressions in the functional domain of such classics as linear regression, principal components analysis, linear modelling, and canonical correlation analysis, as well as specifically functional techniques such as curve registration and principal differential analysis. Data arising in real applications are used throughout for both motivation and illustration, showing how functional approaches allow us to see new things, especially by exploiting the smoothness of the processes generating the data. The data sets exemplify the wide scope of functional data analysis; they are drawn from growth analysis, meteorology, biomechanics, equine science, economics, and medicine. The book presents novel statistical technology while keeping the mathematical level widely accessible. It is designed to appeal to students, to applied data analysts, and to experienced researc...
Systems engineering and analysis
Blanchard, Benjamin S
2010-01-01
For senior-level undergraduate and first and second year graduate systems engineering and related courses. A total life-cycle approach to systems and their analysis. This practical introduction to systems engineering and analysis provides the concepts, methodologies, models, and tools needed to understand and implement a total life-cycle approach to systems and their analysis. The authors focus first on the process of bringing systems into being--beginning with the identification of a need and extending that need through requirements determination, functional analysis and allocation, design synthesis, evaluation, and validation, operation and support, phase-out, and disposal. Next, the authors discuss the improvement of systems currently in being, showing that by employing the iterative process of analysis, evaluation, feedback, and modification, most systems in existence can be improved in their affordability, effectiveness, and stakeholder satisfaction.
International Nuclear Information System (INIS)
Ishii, Keizo
1997-01-01
Elemental analysis based on particle induced X-ray emission (PIXE) is a novel technique for analyzing trace elements. It is a very simple method, its sensitivity is very high, multiple elements in a sample can be analyzed simultaneously, and a few tens of μg of a sample are enough for analysis. Owing to these characteristics, PIXE analysis is now used in many fields (e.g. biology, medicine, dentistry, environmental pollution, archaeology, cultural assets etc.). Fundamentals of PIXE analysis are described here: the production of characteristic x-rays and inner-shell ionization by heavy charged particles, the continuous background in the PIXE spectrum, quantitative formulae of PIXE analysis, the detection limit of PIXE analysis, etc. (author)
International Nuclear Information System (INIS)
Porten, D.R.; Crowe, R.D.
1994-01-01
The purpose of this accident safety analysis is to document in detail analyses whose results were reported in summary form in the K Basins Safety Analysis Report WHC-SD-SNF-SAR-001. The safety analysis addressed the potential for release of radioactive and non-radioactive hazardous material located in the K Basins and their supporting facilities. The safety analysis covers the hazards associated with normal K Basin fuel storage and handling operations, fuel encapsulation, sludge encapsulation, and canister clean-up and disposal. After a review of the Criticality Safety Evaluation of the K Basin activities, the following postulated events were evaluated: crane failure and casks dropped into the loadout pit; design basis earthquake; hypothetical loss of basin water accident; combustion of uranium fuel following dryout; crane failure and cask dropped onto the floor of the transfer area; spent ion exchange shipment for burial; hydrogen deflagration in ion exchange modules and filters; release of chlorine; power availability and reliability; and ashfall
International Nuclear Information System (INIS)
Goetz, A.; Gerring, M.; Svensson, O.; Brockhauser, S.
2012-01-01
Data Analysis Workbench (DAWB) is a new software tool being developed at the ESRF. Its goal is to provide a tool both for online data analysis, which can be used on the beamlines, and for offline data analysis, which users can use during experiments or take home. The tool includes support for data visualization and work-flows. Work-flows allow algorithms which exploit parallel architectures to be designed from existing high-level modules for data analysis in combination with data collection. The workbench uses Passerelle as the work-flow engine and EDNA plug-ins for data analysis. Actors talking to Tango are used for sending commands to a limited set of hardware to start existing data collection algorithms. A Tango server allows work-flows to be executed from existing applications. There are scripting interfaces to Python, Javascript and SPEC. The workbench is currently in test at the ESRF on a selected number of beamlines. (authors)
J Olive, David
2017-01-01
This text presents methods that are robust to the assumption of a multivariate normal distribution or methods that are robust to certain types of outliers. Instead of using exact theory based on the multivariate normal distribution, the simpler and more applicable large sample theory is given. The text develops among the first practical robust regression and robust multivariate location and dispersion estimators backed by theory. The robust techniques are illustrated for methods such as principal component analysis, canonical correlation analysis, and factor analysis. A simple way to bootstrap confidence regions is also provided. Much of the research on robust multivariate analysis in this book is being published for the first time. The text is suitable for a first course in Multivariate Statistical Analysis or a first course in Robust Statistics. This graduate text is also useful for people who are familiar with the traditional multivariate topics, but want to know more about handling data sets with...
Field, Michael
2017-01-01
This book provides a rigorous introduction to the techniques and results of real analysis, metric spaces and multivariate differentiation, suitable for undergraduate courses. Starting from the very foundations of analysis, it offers a complete first course in real analysis, including topics rarely found in such detail in an undergraduate textbook such as the construction of non-analytic smooth functions, applications of the Euler-Maclaurin formula to estimates, and fractal geometry. Drawing on the author’s extensive teaching and research experience, the exposition is guided by carefully chosen examples and counter-examples, with the emphasis placed on the key ideas underlying the theory. Much of the content is informed by its applicability: Fourier analysis is developed to the point where it can be rigorously applied to partial differential equations or computation, and the theory of metric spaces includes applications to ordinary differential equations and fractals. Essential Real Analysis will appeal t...
Real analysis and applications
Botelho, Fabio Silva
2018-01-01
This textbook introduces readers to real analysis in one and n dimensions. It is divided into two parts: Part I explores real analysis in one variable, starting with key concepts such as the construction of the real number system, metric spaces, and real sequences and series. In turn, Part II addresses the multi-variable aspects of real analysis. Further, the book presents detailed, rigorous proofs of the implicit function theorem for the vectorial case by applying the Banach fixed-point theorem and the differential forms concept to surfaces in Rn. It also provides a brief introduction to Riemannian geometry. With its rigorous, elegant proofs, this self-contained work is easy to read, making it suitable for undergraduate and beginning graduate students seeking a deeper understanding of real analysis and applications, and for all those looking for a well-founded, detailed approach to real analysis.
Nonactivation interaction analysis. Chapter 5
International Nuclear Information System (INIS)
1976-01-01
Analyses are described including the alpha scattering analysis, beta absorption and scattering analysis, gamma and X-ray absorption and scattering analysis, X-ray fluorescence analysis, neutron absorption and scattering analysis, Moessbauer effect application and an analysis based on the application of radiation ionizing effects. (J.P.)
Is activation analysis still active?
International Nuclear Information System (INIS)
Chai Zhifang
2001-01-01
This paper reviews some aspects of neutron activation analysis (NAA), covering instrumental neutron activation analysis (INAA), the k0 method, prompt gamma-ray neutron activation analysis (PGNAA), radiochemical neutron activation analysis (RNAA) and molecular activation analysis (MAA). A comparison of neutron activation analysis with other analytical techniques is also made. (author)
International Nuclear Information System (INIS)
González Caballero, I; Cuesta Noriega, A; Rodríguez Marrero, A; Fernández del Castillo, E
2012-01-01
The analysis of the complex LHC data usually follows a standard path that aims at minimizing not only the amount of data but also the number of observables used. After a number of steps of slimming and skimming the data, the remaining few terabytes of ROOT files hold a selection of the events and a flat structure for the variables needed that can be more easily inspected and traversed in the final stages of the analysis. PROOF arises at this point as an efficient mechanism to distribute the analysis load by taking advantage of all the cores in modern CPUs through PROOF Lite, or by using PROOF Cluster or PROOF on Demand tools to build dynamic PROOF clusters on computing facilities with spare CPUs. However, using PROOF at the level required for a serious analysis introduces some difficulties that may scare new adopters. We have developed the PROOF Analysis Framework (PAF) to facilitate the development of new analyses by uniformly exposing the PROOF-related configurations across technologies and by taking care of routine tasks as much as possible. We describe the details of the PAF implementation as well as how we succeeded in engaging a group of CMS physicists to use PAF as their daily analysis framework.
Hazard Analysis Database Report
Energy Technology Data Exchange (ETDEWEB)
GAULT, G.W.
1999-10-13
The Hazard Analysis Database was developed in conjunction with the hazard analysis activities conducted in accordance with DOE-STD-3009-94, Preparation Guide for US Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports, for the Tank Waste Remediation System (TWRS) Final Safety Analysis Report (FSAR). The FSAR is part of the approved TWRS Authorization Basis (AB). This document describes, identifies, and defines the contents and structure of the TWRS FSAR Hazard Analysis Database and documents the configuration control changes made to the database. The TWRS Hazard Analysis Database contains the collection of information generated during the initial hazard evaluations and the subsequent hazard and accident analysis activities. The database supports the preparation of Chapters 3, 4, and 5 of the TWRS FSAR and the USQ process and consists of two major, interrelated data sets: (1) Hazard Evaluation Database--Data from the results of the hazard evaluations; and (2) Hazard Topography Database--Data from the system familiarization and hazard identification.
Containment vessel stability analysis
International Nuclear Information System (INIS)
Harstead, G.A.; Morris, N.F.; Unsal, A.I.
1983-01-01
The stability analysis for a steel containment shell is presented herein. The containment is a freestanding shell consisting of a vertical cylinder with a hemispherical dome. It is stiffened by large ring stiffeners and relatively small longitudinal stiffeners. The containment vessel is subjected to both static and dynamic loads which can cause buckling. These loads must be combined prior to their use in a stability analysis. The buckling loads were computed with the aid of the ASME Code case N-284 used in conjunction with general purpose computer codes and in-house programs. The equations contained in the Code case were used to compute the knockdown factors due to shell imperfections. After these knockdown factors were applied to the critical stress states determined by freezing the maximum dynamic stresses and combining them with other static stresses, a linear bifurcation analysis was carried out with the aid of the BOSOR4 program. Since the containment shell contained large penetrations, the Code case had to be supplemented by a local buckling analysis of the shell area surrounding the largest penetration. This analysis was carried out with the aid of the NASTRAN program. Although the factor of safety against buckling obtained in this analysis was satisfactory, it is claimed that the use of the Code case knockdown factors is unduly conservative when applied to the analysis of buckling around penetrations. (orig.)
International Nuclear Information System (INIS)
Thompson, W.A. Jr.
1979-11-01
This paper briefly describes WASH 1400 and the Lewis report. It attempts to define basic concepts such as risk and risk analysis, common mode failure, and rare event. Several probabilistic models which go beyond the WASH 1400 methodology are introduced; the common characteristic of these models is that they recognize explicitly that risk analysis is time dependent whereas WASH 1400 takes a per demand failure rate approach which obscures the important fact that accidents are time related. Further, the presentation of a realistic risk analysis should recognize that there are various risks which compete with one another for the lives of the individuals at risk. A way of doing this is suggested
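The contrast drawn above between a per-demand failure rate and time-dependent, competing risks can be made concrete with constant-hazard arithmetic. The rates below are hypothetical, chosen only to illustrate the formulas:

```python
import math

# Two competing constant hazards (hypothetical rates, per hour of exposure).
lam_a, lam_b = 1e-4, 3e-4
lam = lam_a + lam_b                  # total hazard when risks compete

t = 10_000.0                         # exposure time (hours)

# Time-dependent view: failure probability accumulates with exposure.
p_any = 1 - math.exp(-lam * t)       # probability that some failure occurs by t
# Competing-risk split: probability that it is risk A that strikes first.
p_a = (lam_a / lam) * p_any

print(round(p_any, 4), round(p_a, 4))
```

A per-demand formulation collapses `t` out of the picture entirely, which is exactly the objection raised against the WASH 1400 approach: the probability of an accident, and which competing risk causes it, both depend on how long the system is exposed.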
International Nuclear Information System (INIS)
Kartiwa Sumadi; Yayah Rohayati
1996-01-01
The 'monazit' analytical program has been set up for routine analysis of rare earth elements in monazite and xenotime mineral samples. The total relative error of the analysis is very low, less than 2.50%, and the counting-statistics reproducibility and instrument stability were excellent. The precision and accuracy of the analytical program are very good, with maximum relative percentages of 5.22% and 1.61%, respectively. The mineral compositions of the 30 monazite samples were also calculated from their chemical constituents, and the results were compared to grain-counting microscopic analysis
Methods of Multivariate Analysis
Rencher, Alvin C
2012-01-01
Praise for the Second Edition "This book is a systematic, well-written, well-organized text on multivariate analysis packed with intuition and insight . . . There is much practical wisdom in this book that is hard to find elsewhere."-IIE Transactions Filled with new and timely content, Methods of Multivariate Analysis, Third Edition provides examples and exercises based on more than sixty real data sets from a wide variety of scientific fields. It takes a "methods" approach to the subject, placing an emphasis on how students and practitioners can employ multivariate analysis in real-life sit
Dunham, Ken
2014-01-01
The rapid growth and development of Android-based devices has resulted in a wealth of sensitive information on mobile devices that offer minimal malware protection. This has created an immediate demand for security professionals that understand how to best approach the subject of Android malware threats and analysis. In Android Malware and Analysis, Ken Dunham, renowned global malware expert and author, teams up with international experts to document the best tools and tactics available for analyzing Android malware. The book covers both methods of malware analysis: dynamic and static. This tact
Aven, Terje
2012-01-01
Foundations of Risk Analysis presents the issues core to risk analysis - understanding what risk means, expressing risk, building risk models, addressing uncertainty, and applying probability models to real problems. The author provides the readers with the knowledge and basic thinking they require to successfully manage risk and uncertainty to support decision making. This updated edition reflects recent developments on risk and uncertainty concepts, representations and treatment. New material in Foundations of Risk Analysis includes: An up-to-date presentation of how to understand, define and
International Nuclear Information System (INIS)
Ko, Myeong Su; Kim, Tae Hwa; Park, Gyu Hyeon; Yang, Jong Beom; Oh, Chang Hwan; Lee, Kyoung Hye
2010-04-01
This textbook describes instrumental analysis in an accessible way across twelve chapters. The contents cover pH measurement (principle, pH meters, measurement procedure and example experiments), centrifugation, absorptiometry, fluorescence methods, atomic absorption analysis, gas chromatography, gas chromatography-mass spectrometry, high performance liquid chromatography and liquid chromatography-mass spectrometry, electrophoresis (practical cases, analysis of results and examples), PCR (principle, devices, applications and examples), and enzyme-linked immunosorbent assay (indirect ELISA, sandwich ELISA and ELISA readers).
International Nuclear Information System (INIS)
Williams, Mike; Egede, Ulrik; Paterson, Stuart
2011-01-01
The distributed analysis experience to date at LHCb has been positive: job success rates are high and wait times for high-priority jobs are low. LHCb users access the grid using the GANGA job-management package, while the LHCb virtual organization manages its resources using the DIRAC package. This clear division of labor has benefitted LHCb and its users greatly; it is a major reason why distributed analysis at LHCb has been so successful. The newly formed LHCb distributed analysis support team has also proved to be a success.
Factor analysis and scintigraphy
International Nuclear Information System (INIS)
Di Paola, R.; Penel, C.; Bazin, J.P.; Berche, C.
1976-01-01
The goal of factor analysis is usually to achieve reduction of a large data set, extracting essential features without a prior hypothesis. Owing to the development of computerized systems, the use of larger samples, the possibility of sequential data acquisition and the increase in dynamic studies, the problem of data compression is now encountered routinely. Thus, results obtained for the compression of scintigraphic images are presented first. Possibilities offered by factor analysis for scan processing are then discussed. Finally, the use of this analysis for multidimensional studies, and especially dynamic studies, is considered for compression and processing [fr]
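The compression idea described above can be sketched with synthetic data: a dynamic study stored as a pixels-by-frames matrix is approximated by a few factors. The two time-activity curves and the truncated SVD below are illustrative assumptions, not the paper's algorithm:

```python
import numpy as np

# Synthetic dynamic study: 400 pixels x 60 time frames, built from two
# underlying time-activity curves plus a little noise (rank ~ 2 by design).
rng = np.random.default_rng(2)
t = np.linspace(0, 1, 60)
curve_a = np.exp(-3 * t)                  # washout-like component
curve_b = 1 - np.exp(-5 * t)              # uptake-like component
loadings = rng.random((400, 2))           # spatial weights per pixel
frames = loadings @ np.vstack([curve_a, curve_b]) \
    + 0.01 * rng.normal(size=(400, 60))

# Compression: keep 2 factors instead of 60 frames (truncated SVD).
U, s, Vt = np.linalg.svd(frames, full_matrices=False)
approx = U[:, :2] * s[:2] @ Vt[:2]

rel_err = np.linalg.norm(frames - approx) / np.linalg.norm(frames)
print(round(rel_err, 3))  # small: two factors capture the dynamics
```

Storing two spatial factors and two curves replaces 400 x 60 values with 2 x (400 + 60), which is the compression the abstract refers to for dynamic studies.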
Energy Technology Data Exchange (ETDEWEB)
Ko, Myeong Su; Kim, Tae Hwa; Park, Gyu Hyeon; Yang, Jong Beom; Oh, Chang Hwan; Lee, Kyoung Hye
2010-04-15
This textbook describes instrumental analysis in an accessible way across twelve chapters. The contents cover pH measurement (principle, pH meters, measurement procedure and example experiments), centrifugation, absorptiometry, fluorescence methods, atomic absorption analysis, gas chromatography, gas chromatography-mass spectrometry, high performance liquid chromatography and liquid chromatography-mass spectrometry, electrophoresis (practical cases, analysis of results and examples), PCR (principle, devices, applications and examples), and enzyme-linked immunosorbent assay (indirect ELISA, sandwich ELISA and ELISA readers).
DEFF Research Database (Denmark)
Raket, Lars Lau
We propose a direction in the field of statistics which we will call functional object analysis. This subfield considers the analysis of functional objects defined on continuous domains. In this setting we will focus on model-based statistics, with a particular emphasis on mixed-effect formulations, where the observed functional signal is assumed to consist of both fixed and random functional effects. This thesis takes the initial steps toward the development of likelihood-based methodology for functional objects. We first consider analysis of functional data defined on high...
Bayesian nonparametric data analysis
Müller, Peter; Jara, Alejandro; Hanson, Tim
2015-01-01
This book reviews nonparametric Bayesian methods and models that have proven useful in the context of data analysis. Rather than providing an encyclopedic review of probability models, the book’s structure follows a data analysis perspective. As such, the chapters are organized by traditional data analysis problems. In selecting specific nonparametric models, simpler and more traditional models are favored over specialized ones. The discussed methods are illustrated with a wealth of examples, including applications ranging from stylized examples to case studies from recent literature. The book also includes an extensive discussion of computational methods and details on their implementation. R code for many examples is included in on-line software pages.
International Nuclear Information System (INIS)
Ramirez T, J.J.; Lopez M, J.; Sandoval J, A.R.; Villasenor S, P.; Aspiazu F, J.A.
2001-01-01
Elemental, metallographic and phase analyses were carried out in order to determine the oxidation states of Fe contained in three metallic pieces of unknown material: a block, a plate and a cylinder. Results are presented from the elemental analysis, which was carried out in the Tandem Accelerator of ININ by proton induced X-ray emission (PIXE). The phase analysis was carried out by X-ray diffraction, which allowed the type of alloy or alloys formed to be identified. The combined application of nuclear techniques with metallographic techniques allows the integral characterization of industrial metals. (Author)
Ash, Robert B; Lukacs, E
1972-01-01
Real Analysis and Probability provides the background in real analysis needed for the study of probability. Topics covered range from measure and integration theory to functional analysis and basic concepts of probability. The interplay between measure theory and topology is also discussed, along with conditional probability and expectation, the central limit theorem, and strong laws of large numbers with respect to martingale theory.Comprised of eight chapters, this volume begins with an overview of the basic concepts of the theory of measure and integration, followed by a presentation of var
Iremonger, M J
1982-01-01
BASIC Stress Analysis aims to help students become proficient at BASIC programming by actually using it in an important engineering subject. It also enables the student to use computing as a means of learning stress analysis, because writing a program is analogous to teaching: it is necessary to understand the subject matter. The book begins by introducing the BASIC approach and the concept of stress analysis at first- and second-year undergraduate level. Subsequent chapters contain a summary of relevant theory, worked examples containing computer programs, and a set of problems. Topics c
Fundamentals of mathematical analysis
Paul J Sally, Jr
2013-01-01
This is a textbook for a course in Honors Analysis (for freshman/sophomore undergraduates) or Real Analysis (for junior/senior undergraduates) or Analysis-I (beginning graduates). It is intended for students who completed a course in "AP Calculus", possibly followed by a routine course in multivariable calculus and a computational course in linear algebra. There are three features that distinguish this book from many other books of a similar nature and which are important for the use of this book as a text. The first, and most important, feature is the collection of exercises. These are spread
Systems analysis-independent analysis and verification
Energy Technology Data Exchange (ETDEWEB)
Badin, J.S.; DiPietro, J.P. [Energetics, Inc., Columbia, MD (United States)
1995-09-01
The DOE Hydrogen Program is supporting research, development, and demonstration activities to overcome the barriers to the integration of hydrogen into the Nation's energy infrastructure. Much work is required to gain acceptance of hydrogen energy system concepts and to develop them for implementation. A systems analysis database has been created that includes a formal documentation of technology characterization profiles and cost and performance information. Through a systematic and quantitative approach, system developers can understand and address important issues and thereby assure effective and timely commercial implementation. This project builds upon and expands the previously developed and tested pathway model and provides the basis for a consistent and objective analysis of all hydrogen energy concepts considered by the DOE Hydrogen Program Manager. This project can greatly accelerate the development of a system by minimizing the risk of costly design evolutions and by stimulating discussions, feedback, and coordination among key players, allowing them to assess the analysis, evaluate the trade-offs, and address any emerging problem areas. Specific analytical studies will result in the validation of the competitive feasibility of the proposed systems and identify system development needs. Systems investigated include hydrogen bromine electrolysis, municipal solid waste gasification, electro-farming (biomass gasifier and PEM fuel cell), a wind/hydrogen hybrid system for remote sites, home electrolysis and alternate infrastructure options, renewable-based electrolysis to fuel a PEM fuel cell vehicle fleet, and geothermal energy used to produce hydrogen. These systems are compared to conventional and benchmark technologies. Interim results and findings are presented. Independent analyses emphasize quality, integrity, objectivity, a long-term perspective, corporate memory, and the merging of technical, economic, operational, and programmatic expertise.
Gopalakrishnan, Ganesh
2013-07-01
An ocean state estimate has been developed for the Gulf of Mexico (GoM) using the MIT general circulation model and its adjoint. The estimate has been tested by forecasting Loop Current (LC) evolution and eddy shedding in the GoM. The adjoint (or four-dimensional variational) method was used to match the model evolution to observations by adjusting model temperature and salinity initial conditions, open boundary conditions, and atmospheric forcing fields. The model was fit to satellite-derived along-track sea surface height, separated into temporal mean and anomalies, and gridded sea surface temperature for 2-month periods. The optimized state at the end of the assimilation period was used to initialize the forecast for 2 months. Forecasts explore practical LC predictability and provide a cross-validation test of the state estimate by comparing it to independent future observations. The model forecast was tested for several LC eddy separation events, including Eddy Franklin in May 2010 during the Deepwater Horizon oil spill disaster in the GoM. The forecast used monthly climatological open boundary conditions, atmospheric forcing, and run-off fluxes. The model performance was evaluated by computing the model-observation root-mean-square difference (rmsd) during both the hindcast and forecast periods. The rmsd metrics for the forecast generally outperformed persistence (keeping the initial state fixed) and reference (forecast initialized using the assimilated Hybrid Coordinate Ocean Model 1/12° global analysis) model simulations during LC eddy separation events for a period of 1-2 months.
Gopalakrishnan, Ganesh; Cornuelle, Bruce D.; Hoteit, Ibrahim; Rudnick, Daniel L.; Owens, W. Brechner
2013-01-01
An ocean state estimate has been developed for the Gulf of Mexico (GoM) using the MIT general circulation model and its adjoint. The estimate has been tested by forecasting Loop Current (LC) evolution and eddy shedding in the GoM. The adjoint (or four-dimensional variational) method was used to match the model evolution to observations by adjusting model temperature and salinity initial conditions, open boundary conditions, and atmospheric forcing fields. The model was fit to satellite-derived along-track sea surface height, separated into temporal mean and anomalies, and gridded sea surface temperature for 2-month periods. The optimized state at the end of the assimilation period was used to initialize the forecast for 2 months. Forecasts explore practical LC predictability and provide a cross-validation test of the state estimate by comparing it to independent future observations. The model forecast was tested for several LC eddy separation events, including Eddy Franklin in May 2010 during the Deepwater Horizon oil spill disaster in the GoM. The forecast used monthly climatological open boundary conditions, atmospheric forcing, and run-off fluxes. The model performance was evaluated by computing the model-observation root-mean-square difference (rmsd) during both the hindcast and forecast periods. The rmsd metrics for the forecast generally outperformed persistence (keeping the initial state fixed) and reference (forecast initialized using the assimilated Hybrid Coordinate Ocean Model 1/12° global analysis) model simulations during LC eddy separation events for a period of 1-2 months.
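The forecast-skill comparison described above reduces to computing rmsd against future observations for the forecast, a persistence run, and a reference run. A minimal sketch of that metric follows; the sample values and variable names are invented for illustration and are not from the actual MITgcm/GoM setup:

```python
import numpy as np

def rmsd(model, obs):
    """Model-observation root-mean-square difference over valid points."""
    model = np.asarray(model, dtype=float)
    obs = np.asarray(obs, dtype=float)
    mask = ~np.isnan(obs)  # ignore missing observations
    return float(np.sqrt(np.mean((model[mask] - obs[mask]) ** 2)))

# A forecast beats persistence when its rmsd against future observations
# is lower than that of the frozen initial state.
obs = np.array([1.0, 1.2, 1.5, 2.0])          # hypothetical future SSH values
forecast = np.array([1.1, 1.2, 1.4, 1.9])     # hypothetical model forecast
persistence = np.full_like(obs, obs[0])       # initial state held fixed
assert rmsd(forecast, obs) < rmsd(persistence, obs)
```

In the study, the same comparison is made per LC eddy separation event, over both the hindcast and forecast windows.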
Pasqueron de Fommervault, Orens; Perez-Brunius, Paula; Damien, Pierre; Camacho-Ibar, Victor F.; Sheinbaum, Julio
2017-12-01
Chlorophyll concentration is a key oceanic biogeochemical variable. In the Gulf of Mexico (GOM), its distribution, which is mainly obtained from satellite surface observations and scarce in situ experiments, is still poorly understood. In 2011-2012, eight profiling floats equipped with biogeochemical sensors were deployed for the first time in the GOM and generated an unprecedented dataset that significantly increased the number of chlorophyll vertical distribution measurements in the region. The analysis of these data, once calibrated, permits us to reconsider the spatial and temporal variability of the chlorophyll concentration in the water column. At a seasonal scale, results confirm the surface signal seen by satellites, presenting maximum concentrations in winter and low values in summer. It is shown that the deepening of the mixed layer is the primary factor triggering the chlorophyll surface increase in winter. In the GOM, a possible interpretation is that this surface increase corresponds to a biomass increase. However, the present dataset suggests that the basin-scale climatological surface increase in chlorophyll content results from a vertical redistribution of subsurface chlorophyll and/or photoacclimation processes, rather than a net increase of biomass. One plausible explanation for this is the decoupling between the mixed-layer depth and the deep nutrient reservoir since mixed-layer depth only reaches the nitracline in sporadic events in the observations. Float measurements also provide evidence that the depth and the magnitude of the deep chlorophyll maximum is strongly controlled by the mesoscale variability, with higher chlorophyll biomass generally observed in cyclones rather than anticyclones.
Manufacture of a four-sheet complex component from different titanium alloys by superplastic forming
Allazadeh, M. R.; Zuelli, N.
2017-10-01
A superplastic forming (SPF) process was deployed to form a complex eight-pocket component from a four-sheet sandwich panel sheetstock. Six sheetstock packs were composed of two core sheets made of Ti-6Al-4V or Ti-5Al-4Cr-4Mo-2Sn-2Zr titanium alloy and two skin sheets made of Ti-6Al-4V or Ti-6Al-2Sn-4Zr-2Mo titanium alloy, in three different combinations. The sheets were welded with two subsequent welding patterns over the core and skin sheets to meet the required component details. The applied welding methods were intermittent and continuous resistance seam welding for bonding the core sheets to each other and the skin sheets over the core panel, respectively. The final component configuration was predicted based on the die drawings and finite element method (FEM) simulations for the sandwich panels. An SPF system set-up with two inlet gas-pipe feeds facilitated the trials by delivering two simultaneous pressure-time load cycles, extracted from FEM analysis for a specific forming temperature and strain rate. The SPF pressure-time cycles were optimized via GOM scanning and visual inspection of sections of the packs in order to assess the level of core panel formation during the inflation of the sheetstock. Two sets of GOM scan results were compared via GOM software to inspect the surface and internal features of the inflated multisheet packs. The results highlighted the capability of the tested SPF process to form complex components from a flat multisheet pack made of different titanium alloys.
Plasma data analysis using statistical analysis system
International Nuclear Information System (INIS)
Yoshida, Z.; Iwata, Y.; Fukuda, Y.; Inoue, N.
1987-01-01
Multivariate factor analysis has been applied to a plasma database of REPUTE-1. The characteristics of the reverse field pinch plasma in REPUTE-1 are shown to be explained by four independent parameters, which are described in the report. The well-known scaling laws F_chi ∝ I_p, T_e ∝ I_p, and tau_E ∝ N_e are also confirmed. 4 refs., 8 figs., 1 tab
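The kind of factor extraction described above (finding a small number of independent parameters that explain a multivariate shot database) can be sketched with a principal-factor decomposition. This is a hedged, minimal illustration on synthetic stand-in data, not the REPUTE-1 database:

```python
import numpy as np

# Synthetic "discharge database": 200 shots, 6 measured plasma quantities
# driven by 2 latent factors plus noise (invented stand-in data).
rng = np.random.default_rng(0)
latent = rng.normal(size=(200, 2))
loadings = rng.normal(size=(2, 6))
data = latent @ loadings + 0.1 * rng.normal(size=(200, 6))

# Principal-factor extraction via SVD of the standardized data matrix.
z = (data - data.mean(axis=0)) / data.std(axis=0)
u, s, vt = np.linalg.svd(z, full_matrices=False)
explained = s**2 / np.sum(s**2)  # variance share of each candidate factor
print(np.round(explained, 3))    # the first two factors dominate
```

A scree-plot style inspection of `explained` is one common way to decide how many independent parameters the data support; dedicated factor analysis (with rotation) refines this picture.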
Summary Analysis: Hanford Site Composite Analysis Update
Energy Technology Data Exchange (ETDEWEB)
Nichols, W. E. [CH2M HILL Plateau Remediation Company, Richland, WA (United States); Lehman, L. L. [CH2M HILL Plateau Remediation Company, Richland, WA (United States)
2017-06-05
The Hanford Site’s currently maintained Composite Analysis, originally completed in 1998, requires an update. A previous update effort was undertaken by the U.S. Department of Energy (DOE) in 2001-2005, but was ended before completion to allow the Tank Closure & Waste Management Environmental Impact Statement (TC&WM EIS) (DOE/EIS-0391) to be prepared without potential for conflicting sitewide models. This EIS was issued in 2012, and the deferral was ended with guidance in memorandum “Modeling to Support Regulatory Decision Making at Hanford” (Williams, 2012) provided with the aim of ensuring subsequent modeling is consistent with the EIS.
He, Jingrui
2012-01-01
This book focuses on rare category analysis where the majority classes have smooth distributions and the minority classes exhibit the compactness property. It focuses on challenging cases where the support regions of the majority and minority classes overlap.
Longitudinal categorical data analysis
Sutradhar, Brajendra C
2014-01-01
This is the first book in longitudinal categorical data analysis with parametric correlation models developed based on dynamic relationships among repeated categorical responses. This book is a natural generalization of the longitudinal binary data analysis to the multinomial data setup with more than two categories. Thus, unlike the existing books on cross-sectional categorical data analysis using log linear models, this book uses multinomial probability models both in cross-sectional and longitudinal setups. A theoretical foundation is provided for the analysis of univariate multinomial responses, by developing models systematically for the cases with no covariates as well as categorical covariates, both in cross-sectional and longitudinal setups. In the longitudinal setup, both stationary and non-stationary covariates are considered. These models have also been extended to the bivariate multinomial setup along with suitable covariates. For the inferences, the book uses the generalized quasi-likelihood as w...
International Nuclear Information System (INIS)
1981-09-01
Suggestions are made concerning the method of fault tree analysis and the use of certain symbols in the examination of system failures. The purpose of the fault tree analysis is to find logical connections of component or subsystem failures leading to undesirable occurrences. The results of these examinations are part of the system assessment concerning operation and safety. The objectives of the analysis are: systematic identification of all possible failure combinations (causes) leading to a specific undesirable occurrence, and determination of reliability parameters such as the frequency of failure combinations, the frequency of the undesirable occurrence, or the non-availability of the system when required. The fault tree analysis provides a clear and reconstructable documentation of the examination. (orig./HP) [de
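The quantification step described above (from failure combinations to the frequency of the undesirable occurrence) can be sketched with the standard rare-event approximation over minimal cut sets. The component names and probabilities below are invented for illustration, not taken from the report:

```python
# Rare-event (first-order) approximation: P(top event) is approximately the
# sum over minimal cut sets of the product of the basic-event failure
# probabilities in each set. All names and numbers are hypothetical.
failure_prob = {"pump_A": 1e-3, "pump_B": 1e-3, "valve": 5e-4, "power": 1e-4}

# Each minimal cut set is a smallest combination of basic failures that
# causes the undesirable top-level occurrence.
minimal_cut_sets = [
    {"pump_A", "pump_B"},  # both redundant pumps fail
    {"valve", "power"},    # relief valve stuck AND loss of auxiliary power
]

def top_event_probability(cut_sets, p):
    """Sum of cut-set probabilities (valid when all probabilities are small)."""
    total = 0.0
    for cut_set in cut_sets:
        product = 1.0
        for event in cut_set:
            product *= p[event]
        total += product
    return total

print(top_event_probability(minimal_cut_sets, failure_prob))  # about 1.05e-06
```

The same cut-set list also serves the qualitative objective: it is exactly the systematic enumeration of failure combinations the analysis calls for.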
Denker, A; Rauschenberg, J; Röhrich, J; Strub, E
2006-01-01
Materials analysis with ion beams exploits the interaction of ions with the electrons and nuclei in the sample. Among the vast variety of analytical techniques available with ion beams, we restrict ourselves to ion beam analysis in the energy range from one to several MeV per mass unit. It is possible to use either the back-scattered projectiles (RBS, Rutherford Backscattering) or the recoiled atoms themselves (ERDA, Elastic Recoil Detection Analysis) from the elastic scattering processes. These techniques allow the simultaneous and absolute determination of stoichiometry and depth profiles of the detected elements. The interaction of the ions with the electrons in the sample produces holes in the inner electronic shells of the sample atoms, which recombine and emit X-rays characteristic of the element in question. Particle Induced X-ray Emission (PIXE) has proven to be a fast technique for the analysis of elements with an atomic number above 11.
DEFF Research Database (Denmark)
Vatrapu, Ravi; Mukkamala, Raghava Rao; Hussain, Abid
2016-01-01
… automata and agent-based modeling). However, when it comes to organizational and societal units of analysis, there exists no approach to conceptualize, model, analyze, explain, and predict social media interactions as individuals' associations with ideas, values, identities, and so on. To address this, the paper presents social set analysis, comprising conceptual and formal models of social data, and an analytical framework for combining big social data sets with organizational and societal data sets. Three empirical studies of big social data are presented to illustrate and demonstrate social set analysis in terms of fuzzy set-theoretical sentiment analysis, crisp set-theoretical interaction analysis, and event-studies-oriented set-theoretical visualizations. Implications for big data analytics, current limitations of the set-theoretical approach, and future directions are outlined.
PWR systems transient analysis
International Nuclear Information System (INIS)
Kennedy, M.F.; Peeler, G.B.; Abramson, P.B.
1985-01-01
Analysis of transients in pressurized water reactor (PWR) systems involves the assessment of the response of the total plant, including primary and secondary coolant systems, steam piping and turbine (possibly including the complete feedwater train), and various control and safety systems. Transient analysis is performed as part of the plant safety analysis to ensure the adequacy of the reactor design and operating procedures and to verify the applicable plant emergency guidelines. Event sequences which must be examined are developed by considering possible failures or maloperations of plant components. These vary in severity (and calculational difficulty) from normal operational transients, such as minor load changes, reactor trips, and valve and pump malfunctions, up to the double-ended guillotine rupture of a primary reactor coolant system pipe, known as a Large Break Loss of Coolant Accident (LBLOCA). The focus of this paper is the analysis of all those transients and accidents except loss of coolant accidents.
Full closure strategic analysis.
2014-07-01
The full closure strategic analysis was conducted to create a decision process whereby full roadway closures for construction and maintenance activities can be evaluated and approved or denied by CDOT Traffic personnel. The study reviewed current...
Electrical Subsurface Grounding Analysis
International Nuclear Information System (INIS)
J.M. Calle
2000-01-01
The purpose and objective of this analysis is to determine the present grounding requirements of the Exploratory Studies Facility (ESF) subsurface electrical system and to verify that the actual grounding system and devices satisfy the requirements
DEFF Research Database (Denmark)
Skrypnyuk, Nataliya; Nielson, Flemming; Pilegaard, Henrik
2009-01-01
We present ongoing work on the pathway analysis of a stochastic calculus. Firstly we present the particular stochastic calculus that we have chosen for our modeling, the Interactive Markov Chains calculus (IMC for short). After that we specify a few restrictions that we have introduced into the syntax of IMC in order to make our analysis feasible. Finally we describe the analysis itself, together with several theoretical results that we have proved for it.
Canonical Information Analysis
DEFF Research Database (Denmark)
Vestergaard, Jacob Schack; Nielsen, Allan Aasbjerg
2015-01-01
Canonical correlation analysis is an established multivariate statistical method in which correlation between linear combinations of multivariate sets of variables is maximized. In canonical information analysis, introduced here, linear correlation as a measure of association between variables is replaced by the information theoretical, entropy based measure mutual information, which is a much more general measure of association. We make canonical information analysis feasible for large sample problems, including for example multispectral images, due to the use of a fast kernel density estimator for entropy estimation. Canonical information analysis is applied successfully to (1) simple simulated data to illustrate the basic idea and evaluate performance, (2) fusion of weather radar and optical geostationary satellite data in a situation with heavy precipitation, and (3) change detection in optical...
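The linear-correlation baseline that canonical information analysis generalizes can be sketched in a few lines of numpy: classical CCA finds the canonical correlations as the singular values of the whitened cross-covariance. This is a minimal illustration on synthetic data with one shared latent signal, not the entropy-based method itself:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500
shared = rng.normal(size=n)
# Two multivariate sets sharing one latent signal (synthetic stand-in data).
X = np.column_stack([shared + 0.5 * rng.normal(size=n), rng.normal(size=n)])
Y = np.column_stack([rng.normal(size=n), shared + 0.5 * rng.normal(size=n)])

def cca_first_correlation(X, Y):
    """First canonical correlation via whitening and SVD."""
    Xc = X - X.mean(axis=0)
    Yc = Y - Y.mean(axis=0)
    # Whiten each set with the inverse Cholesky factor of its covariance.
    Lx = np.linalg.cholesky(np.cov(Xc.T))
    Ly = np.linalg.cholesky(np.cov(Yc.T))
    Xw = np.linalg.solve(Lx, Xc.T).T
    Yw = np.linalg.solve(Ly, Yc.T).T
    # Singular values of the whitened cross-covariance are the canonical
    # correlations; the largest is the maximized linear association.
    s = np.linalg.svd(Xw.T @ Yw / (len(X) - 1), compute_uv=False)
    return float(s[0])

print(round(cca_first_correlation(X, Y), 3))
```

Canonical information analysis replaces the correlation objective here with mutual information estimated by kernel density methods; the linear sketch above is only the starting point it generalizes.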
Qualitative Data Analysis Strategies
Greaves, Kristoffer
2014-01-01
A set of concept maps for qualitative data analysis strategies, inspired by Corbin, JM & Strauss, AL 2008, Basics of qualitative research: Techniques and procedures for developing grounded theory, 3rd edn, Sage Publications, Inc, Thousand Oaks, California.
Statistical data analysis handbook
National Research Council Canada - National Science Library
Wall, Francis J
1986-01-01
It must be emphasized that this is not a text book on statistics. Instead it is a working tool that presents data analysis in clear, concise terms which can be readily understood even by those without formal training in statistics...
Fanfani, Alessandra; Sanches, Jose Afonso; Andreeva, Julia; Bagliesi, Giusepppe; Bauerdick, Lothar; Belforte, Stefano; Bittencourt Sampaio, Patricia; Bloom, Ken; Blumenfeld, Barry; Bonacorsi, Daniele; Brew, Chris; Calloni, Marco; Cesini, Daniele; Cinquilli, Mattia; Codispoti, Giuseppe; D'Hondt, Jorgen; Dong, Liang; Dongiovanni, Danilo; Donvito, Giacinto; Dykstra, David; Edelmann, Erik; Egeland, Ricky; Elmer, Peter; Eulisse, Giulio; Evans, Dave; Fanzago, Federica; Farina, Fabio; Feichtinger, Derek; Fisk, Ian; Flix, Josep; Grandi, Claudio; Guo, Yuyi; Happonen, Kalle; Hernandez, Jose M; Huang, Chih-Hao; Kang, Kejing; Karavakis, Edward; Kasemann, Matthias; Kavka, Carlos; Khan, Akram; Kim, Bockjoo; Klem, Jukka; Koivumaki, Jesper; Kress, Thomas; Kreuzer, Peter; Kurca, Tibor; Kuznetsov, Valentin; Lacaprara, Stefano; Lassila-Perini, Kati; Letts, James; Linden, Tomas; Lueking, Lee; Maes, Joris; Magini, Nicolo; Maier, Gerhild; McBride, Patricia; Metson, Simon; Miccio, Vincenzo; Padhi, Sanjay; Pi, Haifeng; Riahi, Hassen; Riley, Daniel; Rossman, Paul; Saiz, Pablo; Sartirana, Andrea; Sciaba, Andrea; Sekhri, Vijay; Spiga, Daniele; Tuura, Lassi; Vaandering, Eric; Vanelderen, Lukas; Van Mulders, Petra; Vedaee, Aresh; Villella, Ilaria; Wicklund, Eric; Wildish, Tony; Wissing, Christoph; Wurthwein, Frank
2009-01-01
The CMS experiment expects to manage several petabytes of data each year during the LHC programme, distributing them over many computing sites around the world and enabling data access at those centers for analysis. CMS has identified the distributed sites as the primary location for physics analysis, to support a wide community with thousands of potential users. This represents an unprecedented experimental challenge in terms of the scale of distributed computing resources and the number of users. An overview of the computing architecture, the software tools and the distributed infrastructure is reported. Summaries of the experience in establishing efficient and scalable operations to get prepared for CMS distributed analysis are presented, followed by the user experience in their current analysis activities.
NOAA's Inundation Analysis Tool
National Oceanic and Atmospheric Administration, Department of Commerce — Coastal storms and other meteorological phenomenon can have a significant impact on how high water levels rise and how often. The inundation analysis program is...
Multidimensional nonlinear descriptive analysis
Nishisato, Shizuhiko
2006-01-01
Quantification of categorical, or non-numerical, data is a problem that scientists face across a wide range of disciplines. Exploring data analysis in various areas of research, such as the social sciences and biology, Multidimensional Nonlinear Descriptive Analysis presents methods for analyzing categorical data that are not necessarily sampled randomly from a normal population and often involve nonlinear relations. This reference not only provides an overview of multidimensional nonlinear descriptive analysis (MUNDA) of discrete data, it also offers new results in a variety of fields. The first part of the book covers conceptual and technical preliminaries needed to understand the data analysis in subsequent chapters. The next two parts contain applications of MUNDA to diverse data types, with each chapter devoted to one type of categorical data, a brief historical comment, and basic skills peculiar to the data types. The final part examines several problems and then concludes with suggestions for futu...
Brackney, Larry; Parker, Andrew; Long, Nicholas; Metzger, Ian; Dean, Jesse; Lisell, Lars
2016-04-12
A building energy analysis system includes a building component library configured to store a plurality of building components, a modeling tool configured to access the building component library and create a building model of a building under analysis using building spatial data and using selected building components of the plurality of building components stored in the building component library, a building analysis engine configured to operate the building model and generate a baseline energy model of the building under analysis and further configured to apply one or more energy conservation measures to the baseline energy model in order to generate one or more corresponding optimized energy models, and a recommendation tool configured to assess the one or more optimized energy models against the baseline energy model and generate recommendations for substitute building components or modifications.
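The recommendation step described above (assessing optimized energy models against the baseline and recommending substitutions) can be sketched with a small data structure. The component names and energy figures below are invented, and this is only an illustration of the comparison logic, not the actual system:

```python
from dataclasses import dataclass

@dataclass
class EnergyModel:
    """A named energy model with its simulated annual consumption."""
    name: str
    annual_kwh: float

def recommend(baseline, candidates, min_savings_kwh=0.0):
    """Return candidate models sorted by energy savings over the baseline."""
    scored = [(baseline.annual_kwh - m.annual_kwh, m) for m in candidates]
    ranked = sorted(scored, reverse=True, key=lambda pair: pair[0])
    return [m for savings, m in ranked if savings > min_savings_kwh]

# Hypothetical baseline and energy-conservation-measure (ECM) models.
baseline = EnergyModel("baseline", 120_000)
candidates = [
    EnergyModel("LED retrofit", 108_000),
    EnergyModel("added insulation", 114_000),
    EnergyModel("oversized HVAC", 125_000),  # consumes more: filtered out
]
for model in recommend(baseline, candidates):
    print(model.name)  # LED retrofit first, then added insulation
```

In the system described, the candidate models would come from applying ECMs to the baseline model built from the component library, rather than being declared by hand.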
Water Quality Analysis Simulation
The Water Quality Analysis Simulation Program, an enhancement of the original WASP. This model helps users interpret and predict water quality responses to natural phenomena and man-made pollution for various pollution management decisions.
What is being tested? Synovial fluid is a thick liquid that acts as a lubricant for the body's ...
Hytönen, Tuomas; Veraar, Mark; Weis, Lutz
The present volume develops the theory of integration in Banach spaces, martingales and UMD spaces, and culminates in a treatment of the Hilbert transform, Littlewood-Paley theory and the vector-valued Mihlin multiplier theorem. Over the past fifteen years, motivated by regularity problems in evolution equations, there has been tremendous progress in the analysis of Banach space-valued functions and processes. The contents of this extensive and powerful toolbox have been mostly scattered around in research papers and lecture notes. Collecting this diverse body of material into a unified and accessible presentation fills a gap in the existing literature. The principal audience that we have in mind consists of researchers who need and use Analysis in Banach Spaces as a tool for studying problems in partial differential equations, harmonic analysis, and stochastic analysis. Self-contained and offering complete proofs, this work is accessible to graduate students and researchers with a background in functional an...
Analysis Streamlining in ATLAS
Heinrich, Lukas; The ATLAS collaboration
2018-01-01
We present recent work within the ATLAS collaboration to centrally provide tools that facilitate analysis management and highly automated, container-based analysis execution, in order both to enable non-experts to benefit from these best practices and to allow the collaboration to track and re-execute analyses independently, e.g. during their review phase. Through integration with the ATLAS GLANCE system, users can request a pre-configured but customizable version control setup, including continuous integration for automated build and testing as well as continuous Linux container image building for software preservation purposes. As analyses typically require many individual steps, analysis workflow pipelines can then be defined using such images and the yadage workflow description language. The integration into the workflow execution service REANA allows the interactive or automated reproduction of the main analysis results by orchestrating a large number of container jobs using Kubernetes. For long-term archival,...
Wolff, Thomas H; Shubin, Carol
2003-01-01
This book demonstrates how harmonic analysis can provide penetrating insights into deep aspects of modern analysis. It is both an introduction to the subject as a whole and an overview of those branches of harmonic analysis that are relevant to the Kakeya conjecture. The usual background material is covered in the first few chapters: the Fourier transform, convolution, the inversion theorem, the uncertainty principle and the method of stationary phase. However, the choice of topics is highly selective, with emphasis on those frequently used in research inspired by the problems discussed in the later chapters. These include questions related to the restriction conjecture and the Kakeya conjecture, distance sets, and Fourier transforms of singular measures. These problems are diverse, but often interconnected; they all combine sophisticated Fourier analysis with intriguing links to other areas of mathematics and they continue to stimulate first-rate work. The book focuses on laying out a solid foundation for fu...
Water Quality Analysis Simulation
U.S. Environmental Protection Agency — The Water Quality analysis simulation Program, an enhancement of the original WASP. This model helps users interpret and predict water quality responses to natural...
Federal Laboratory Consortium — Provides engineering design of aircraft components, subsystems and installations using Pro/E, Anvil 1000, CADKEY 97, AutoCAD 13. Engineering analysis tools include...
CSIR Research Space (South Africa)
Khuluse, S
2009-04-01
Full Text Available … (ii) determination of the distribution of the damage, and (iii) preparation of products that enable prediction of future risk events. The methodology provided by extreme value theory can also be a powerful tool in risk analysis...
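The extreme value methodology mentioned above typically involves fitting a distribution from the GEV family to block maxima and reading off return levels for future risk events. A minimal numpy-only sketch, using a method-of-moments Gumbel fit on synthetic stand-in data (not the study's damage data):

```python
import numpy as np

rng = np.random.default_rng(2)
# Hypothetical annual maxima of a damage-related variable (synthetic data).
annual_maxima = rng.gumbel(loc=10.0, scale=2.0, size=200)

# Method-of-moments fit of a Gumbel distribution, the light-tailed member
# of the GEV family: std = beta * pi / sqrt(6), mean = mu + 0.5772 * beta.
beta = annual_maxima.std(ddof=1) * np.sqrt(6) / np.pi
mu = annual_maxima.mean() - 0.5772 * beta  # 0.5772: Euler-Mascheroni constant

def return_level(T):
    """Level exceeded on average once every T blocks under the fitted model."""
    return mu - beta * np.log(-np.log(1 - 1 / T))

print(round(float(return_level(100)), 2))  # the "100-year event" magnitude
```

In practice a full GEV fit (allowing a nonzero shape parameter) by maximum likelihood is preferred; the Gumbel moment fit is only the simplest closed-form member of the approach.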
Ziemer, William P
2017-01-01
This first year graduate text is a comprehensive resource in real analysis based on a modern treatment of measure and integration. Presented in a definitive and self-contained manner, it features a natural progression of concepts from simple to difficult. Several innovative topics are featured, including differentiation of measures, elements of Functional Analysis, the Riesz Representation Theorem, Schwartz distributions, the area formula, Sobolev functions and applications to harmonic functions. Together, the selection of topics forms a sound foundation in real analysis that is particularly suited to students going on to further study in partial differential equations. This second edition of Modern Real Analysis contains many substantial improvements, including the addition of problems for practicing techniques, and an entirely new section devoted to the relationship between Lebesgue and improper integrals. Aimed at graduate students with an understanding of advanced calculus, the text will also appeal to mo...
DEFF Research Database (Denmark)
Fischer, Paul; Hilbert, Astrid
2012-01-01
We introduce a platform which supplies an easy-to-handle, interactive, extendable, and fast analysis tool for time series analysis. In contrast to other software suites like Maple, Matlab, or R, which use a command-line-like interface and where the user has to memorize or look up the appropriate commands, our application is select-and-click-driven. It allows the user to derive many different sequences of deviations for a given time series and to visualize them in different ways in order to judge their expressive power and to reuse the procedure found. For many transformations or model fits, the user may choose between manual and automated parameter selection. The user can define new transformations and add them to the system. The application contains efficient implementations of advanced and recent techniques for time series analysis, including techniques related to extreme value analysis and filtering...
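One example of the "sequences of deviations" such a tool derives is the residual of a moving-average fit. A hedged sketch of that single transformation (the window size is chosen manually here, where the tool would also offer automated selection; the data is synthetic):

```python
import numpy as np

def moving_average_deviations(series, window=5):
    """Deviations of a series from its centered moving-average trend."""
    series = np.asarray(series, dtype=float)
    kernel = np.ones(window) / window
    trend = np.convolve(series, kernel, mode="same")  # edge values are rough
    return series - trend

t = np.arange(100)
series = 0.1 * t + np.sin(t / 3.0)  # linear trend plus oscillation
dev = moving_average_deviations(series)
print(dev.shape)  # one deviation value per sample
```

Plotting `dev` alongside the original series is the kind of visual judgment of expressive power the platform supports interactively.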
International Nuclear Information System (INIS)
Holland, W.E.
1980-02-01
A method was developed to determine whether boron-loaded polymeric material contained enriched boron or natural boron. A prototype analyzer was constructed, and initial planning was done for an actual analysis facility.
Stakeholder Analysis Worksheet
A worksheet that can be used to document potential stakeholder groups, the information or expertise they hold, the role that they can play, and their interests or concerns about the HIA.
Energy Technology Data Exchange (ETDEWEB)
Arent, D.; Benioff, R.; Mosey, G.; Bird, L.; Brown, J.; Brown, E.; Vimmerstedt, L.; Aabakken, J.; Parks, K.; Lapsa, M.; Davis, S.; Olszewski, M.; Cox, D.; McElhaney, K.; Hadley, S.; Hostick, D.; Nicholls, A.; McDonald, S.; Holloman, B.
2006-10-01
This paper presents the results of energy market analysis sponsored by the Department of Energy's (DOE) Weatherization and International Program (WIP) within the Office of Energy Efficiency and Renewable Energy (EERE). The analysis was conducted by a team of DOE laboratory experts from the National Renewable Energy Laboratory (NREL), Oak Ridge National Laboratory (ORNL), and Pacific Northwest National Laboratory (PNNL), with additional input from Lawrence Berkeley National Laboratory (LBNL). The analysis was structured to identify those markets and niches where government can create the biggest impact by informing management decisions in the private and public sectors. The analysis identifies those markets and niches where opportunities exist for increasing energy efficiency and renewable energy use.
Principles of Fourier analysis
Howell, Kenneth B
2001-01-01
Fourier analysis is one of the most useful and widely employed sets of tools for the engineer, the scientist, and the applied mathematician. As such, students and practitioners in these disciplines need a practical and mathematically solid introduction to its principles. They need straightforward verifications of its results and formulas, and they need clear indications of the limitations of those results and formulas. Principles of Fourier Analysis furnishes all this and more. It provides a comprehensive overview of the mathematical theory of Fourier analysis, including the development of Fourier series, "classical" Fourier transforms, generalized Fourier transforms and analysis, and the discrete theory. Much of the author's development is strikingly different from typical presentations. His approach to defining the classical Fourier transform results in a much cleaner, more coherent theory that leads naturally to a starting point for the generalized theory. He also introduces a new generalized theory based ...
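The discrete theory mentioned above centers on the discrete Fourier transform (DFT), which for a length-N sequence x is X[k] = Σ_n x[n]·e^(−2πikn/N). A direct O(N²) sketch, intended only to make the definition concrete:

```python
import cmath

def dft(x):
    """Direct discrete Fourier transform: X[k] = sum_n x[n] * exp(-2j*pi*k*n/N)."""
    N = len(x)
    return [
        sum(x[n] * cmath.exp(-2j * cmath.pi * k * n / N) for n in range(N))
        for k in range(N)
    ]

if __name__ == "__main__":
    # A constant signal concentrates all its energy in the zero-frequency bin.
    print(dft([1.0, 1.0, 1.0, 1.0]))
```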
Quantitative investment analysis
DeFusco, Richard
2007-01-01
In the "Second Edition" of "Quantitative Investment Analysis," financial experts Richard DeFusco, Dennis McLeavey, Jerald Pinto, and David Runkle outline the tools and techniques needed to understand and apply quantitative methods to today's investment process.
Energy Technology Data Exchange (ETDEWEB)
Wood, William Monford [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2015-02-23
This report presents a systematic study of the standard analysis of rod-pinch radiographs for obtaining quantitative measurements of areal mass densities, and makes suggestions for improving the methodology of obtaining quantitative information from radiographed objects.
Biodiesel Emissions Analysis Program
Using existing data, the EPA's biodiesel emissions analysis program sought to quantify the air pollution emission effects of biodiesel for diesel engines that have not been specifically modified to operate on biodiesel.
Introduction to global analysis
Kahn, Donald W
2007-01-01
This text introduces the methods of mathematical analysis as applied to manifolds, including the roles of differentiation and integration, infinite dimensions, Morse theory, Lie groups, and dynamical systems. 1980 edition.
Biorefinery Sustainability Analysis
DEFF Research Database (Denmark)
J. S. M. Silva, Carla; Prunescu, Remus Mihail; Gernaey, Krist
2017-01-01
This chapter deals with sustainability analysis of biorefinery systems in terms of environmental and socio-economic indicators. Life cycle analysis has methodological issues related to the functional unit (FU), allocation, land use and biogenic carbon neutrality of the reference system and of the biorefinery-based system. Socio-economic criteria and indicators used in sustainability assessment frameworks are presented and discussed. There is not one single methodology that can aptly cover the synergies of environmental, economic, social and governance issues required to assess the sustainable...
Pesticide Instrumental Analysis
International Nuclear Information System (INIS)
Samir, E.; Fonseca, E.; Baldyga, N.; Acosta, A.; Gonzalez, F.; Felicita, F.; Tomasso, M.; Esquivel, D.; Parada, A.; Enriquez, P.; Amilibia, M.
2012-01-01
This workshop evaluated the impact of pesticides on vegetable matrices, with the purpose of determining them by GC/MS analysis. The working materials were a lettuce matrix, chard, and a mix of green leaves and pesticides.
Kansas Data Access and Support Center — The Kansas GAP Analysis Land Cover database depicts 43 land cover classes for the state of Kansas. The database was generated using a two-stage hybrid classification...
Perspectives in shape analysis
Bruckstein, Alfred; Maragos, Petros; Wuhrer, Stefanie
2016-01-01
This book presents recent advances in the field of shape analysis. Written by experts in the fields of continuous-scale shape analysis, discrete shape analysis and sparsity, and numerical computing who hail from different communities, it provides a unique view of the topic from a broad range of perspectives. Over the last decade, it has become increasingly affordable to digitize shape information at high resolution. Yet analyzing and processing this data remains challenging because of the large amount of data involved, and because modern applications such as human-computer interaction require real-time processing. Meeting these challenges requires interdisciplinary approaches that combine concepts from a variety of research areas, including numerical computing, differential geometry, deformable shape modeling, sparse data representation, and machine learning. On the algorithmic side, many shape analysis tasks are modeled using partial differential equations, which can be solved using tools from the field of n...
National Research Council Canada - National Science Library
1998-01-01
Establishing proper job procedures is one of the benefits of conducting a job hazard analysis: carefully studying and recording each step of a job, identifying existing or potential job hazards...
Main: Nucleotide Analysis [KOME
Lifescience Database Archive (English)
Full Text Available Nucleotide Analysis. Japonica genome blast search result: result of blastn search against the japonica genome sequence. kome_japonica_genome_blast_search_result.zip kome_japonica_genome_blast_search_result ...
Schuller, Björn W
2013-01-01
This book provides the reader with the knowledge necessary for comprehension of the field of Intelligent Audio Analysis. It first introduces standard methods and discusses the typical Intelligent Audio Analysis chain going from audio data to audio features to audio recognition. Further, introductions to audio source separation, enhancement, and robustness are given. After the introductory parts, the book shows several applications for the three types of audio: speech, music, and general sound. Each task is shortly introduced, followed by a description of the specific data and methods applied, experiments and results, and a conclusion for this specific task. The book provides benchmark results and standardized test-beds for a broader range of audio analysis tasks. The main focus thereby lies on the parallel advancement of realism in audio analysis, as too often today's results are overly optimistic owing to idealized testing conditions, and it serves to stimulate synergies arising from transfer of ...
Scientific stream pollution analysis
National Research Council Canada - National Science Library
Nemerow, Nelson Leonard
1974-01-01
A comprehensive description of the analysis of water pollution that presents a careful balance of the biological,hydrological, chemical and mathematical concepts involved in the evaluation of stream...
International Nuclear Information System (INIS)
Andreeva, J; Maier, G; Spiga, D; Calloni, M; Colling, D; Fanzago, F; D'Hondt, J; Maes, J; Van Mulders, P; Villella, I; Klem, J; Letts, J; Padhi, S; Sarkar, S
2010-01-01
During normal data taking CMS expects to support potentially as many as 2000 analysis users. Since the beginning of 2008 there have been more than 800 individuals who submitted a remote analysis job to the CMS computing infrastructure. The bulk of these users will be supported at the over 40 CMS Tier-2 centres. Supporting a globally distributed community of users on a globally distributed set of computing clusters is a task that requires reconsidering the normal methods of user support for Analysis Operations. In 2008 CMS formed an Analysis Support Task Force in preparation for large-scale physics analysis activities. The charge of the task force was to evaluate the available support tools, the user support techniques, and the direct feedback of users with the goal of improving the success rate and user experience when utilizing the distributed computing environment. The task force determined the tools needed to assess and reduce the number of non-zero exit code applications submitted through the grid interfaces and worked with the CMS experiment dashboard developers to obtain the necessary information to quickly and proactively identify issues with user jobs and data sets hosted at various sites. Results of the analysis group surveys were compiled. Reference platforms for testing and debugging problems were established in various geographic regions. The task force also assessed the resources needed to make the transition to a permanent Analysis Operations task. In this presentation the results of the task force will be discussed as well as the CMS Analysis Operations plans for the start of data taking.
Invitation to classical analysis
Duren, Peter
2012-01-01
This book gives a rigorous treatment of selected topics in classical analysis, with many applications and examples. The exposition is at the undergraduate level, building on basic principles of advanced calculus without appeal to more sophisticated techniques of complex analysis and Lebesgue integration. Among the topics covered are Fourier series and integrals, approximation theory, Stirling's formula, the gamma function, Bernoulli numbers and polynomials, the Riemann zeta function, Tauberian theorems, elliptic integrals, ramifications of the Cantor set, and a theoretical discussion of differ
Analysis of irradiated materials
International Nuclear Information System (INIS)
Bellamy, B.A.
1988-01-01
Papers presented at the UKAEA Conference on Materials Analysis by Physical Techniques (1987) covered a wide range of techniques as applied to the analysis of irradiated materials. These varied from reactor component materials, materials associated with the Authority's radwaste disposal programme, fission products and products associated with the decommissioning of nuclear reactors. An invited paper giving a very comprehensive review of Laser Ablation Microprobe Mass Spectroscopy (LAMMS) was included in the programme. (author)
Oktavianto, Digit
2013-01-01
This book is a step-by-step, practical tutorial for analyzing and detecting malware and performing digital investigations. This book features clear and concise guidance in an easily accessible format. Cuckoo Malware Analysis is great for anyone who wants to analyze malware through programming, networking, disassembling, forensics, and virtualization. Whether you are new to malware analysis or have some experience, this book will help you get started with Cuckoo Sandbox so you can start analyzing malware effectively and efficiently.
International Nuclear Information System (INIS)
Niehaus, F.
1988-01-01
In this paper, the risks of various energy systems are discussed, considering severe accident analysis, particularly probabilistic safety analysis and probabilistic safety criteria, and the applications of these criteria and analyses. Comparative risk analysis has demonstrated that the largest source of risk in every society is from daily small accidents. Nevertheless, we have to be more concerned about severe accidents. The comparative risk analysis of five different energy systems (coal, oil, gas, LWR and STEC (solar)) for the public has shown that the main sources of risk are coal and oil. The latest comparative risk study of various energy sources, conducted in the USA, has revealed that the number of victims from coal is 42 times as many as the victims from nuclear. A study of severe accidents from hydro-dams in the United States has estimated the probability of dam failure at 1 in 10,000 years and the number of victims between 11,000 and 260,000. The average occupational risk from coal is one fatal accident per 1,000 workers per year. Probabilistic safety analysis is a method that can be used to assess nuclear energy risks, to analyze severe accidents, and to model all possible accident sequences and consequences. Fault tree analysis is used to determine the probability of failure of the different systems at each point of the accident sequences and to calculate the probability of risks. After calculating the probability of failure, criteria for judging the numerical results have to be developed, that is, the quantitative and qualitative goals. To achieve these goals, several systems have been devised by various IAEA member countries. The probabilistic safety analysis method has been developed by establishing a computer program permitting access to different categories of safety-related information. 19 tabs. (author)
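The fault tree calculation described here combines basic-event failure probabilities through AND and OR gates. A minimal sketch, assuming independent basic events and purely illustrative probabilities (not values from the paper):

```python
def and_gate(*probs):
    # All inputs must fail: P = product of the independent failure probabilities.
    p = 1.0
    for q in probs:
        p *= q
    return p

def or_gate(*probs):
    # Any one input failing suffices: P = 1 - product of the survival probabilities.
    p = 1.0
    for q in probs:
        p *= 1.0 - q
    return 1.0 - p

if __name__ == "__main__":
    # Hypothetical top event: the main pump fails, OR both a valve and its backup fail.
    top = or_gate(1.0e-3, and_gate(1.0e-2, 5.0e-2))
    print(top)
```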
International Nuclear Information System (INIS)
Arien, B.
1998-01-01
The objective of SCK-CEN's programme on reactor safety is to develop expertise in probabilistic and deterministic reactor safety analysis. The research programme consists of four main activities: the development of software for reliability analysis of large systems; participation in the international PHEBUS-FP programme for severe accidents; the development of an expert system to aid diagnosis; and the development and application of a probabilistic reactor dynamics method. Main achievements in 1999 are reported.
Brady, Sir Michael; Highnam, Ralph; Irving, Benjamin; Schnabel, Julia A
2016-10-01
Cancer is one of the world's major healthcare challenges and, as such, an important application of medical image analysis. After a brief introduction to cancer, we summarise some of the major developments in oncological image analysis over the past 20 years, concentrating on those in the authors' laboratories, and then outline opportunities and challenges for the next decade. Copyright © 2016 Elsevier B.V. All rights reserved.
International Nuclear Information System (INIS)
Tibari, Elghali; Taous, Fouad; Marah, Hamid
2014-01-01
This report presents results related to stable isotope analyses carried out at the CNESTEN DASTE in Rabat (Morocco), on behalf of Senegal. These analyses cover 127 samples. Oxygen-18 and deuterium analyses in water were performed by infrared laser spectroscopy using an LGR DLT-100 with autosampler. The results are expressed as δ values (‰) relative to V-SMOW, with precisions of ±0.3‰ for oxygen-18 and ±1‰ for deuterium.
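The δ notation used here expresses an isotope ratio relative to the V-SMOW standard: δ = (R_sample / R_standard − 1) × 1000, in per mil (‰). A small sketch (the reference ratios are the published V-SMOW values; the sample ratio is hypothetical):

```python
# Published isotope ratios of the V-SMOW reference water.
VSMOW_O18_O16 = 2005.20e-6   # 18O/16O
VSMOW_D_H = 155.76e-6        # 2H/1H (D/H)

def delta_permil(r_sample, r_standard):
    # delta (per mil) = (R_sample / R_standard - 1) * 1000
    return (r_sample / r_standard - 1.0) * 1000.0

if __name__ == "__main__":
    # Hypothetical sample whose D/H ratio is 5% below V-SMOW: deltaD = -50 permil.
    print(delta_permil(0.95 * VSMOW_D_H, VSMOW_D_H))
```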
Oden, J Tinsley
2010-01-01
The textbook is designed to drive a crash course for beginning graduate students majoring in something besides mathematics, introducing mathematical foundations that lead to classical results in functional analysis. More specifically, Oden and Demkowicz want to prepare students to learn the variational theory of partial differential equations, distributions, and Sobolev spaces and numerical analysis with an emphasis on finite element methods. The 1996 first edition has been used in a rather intensive two-semester course. -Book News, June 2010
Griffel, DH
2002-01-01
A stimulating introductory text, this volume examines many important applications of functional analysis to mechanics, fluid mechanics, diffusive growth, and approximation. Detailed enough to impart a thorough understanding, the text is also sufficiently straightforward for those unfamiliar with abstract analysis. Its four-part treatment begins with distribution theory and discussions of Green's functions. Essentially independent of the preceding material, the second and third parts deal with Banach spaces, Hilbert space, spectral theory, and variational techniques. The final part outlines the
Ivan Stosic; Drasko Nikolic; Aleksandar Zdravkovic
2012-01-01
The main purpose of this paper is to examine the impact of the current Serbian macro-environment on businesses through the implementation of PEST analysis as a framework for assessing the general or macro environment in which companies operate. The authors argue that the elements in the presented PEST analysis indicate that the current macro-environment is characterized by the dominance of threats and weaknesses, with few opportunities and strengths. Consequently, there is a strong need for faste...
Forensic neutron activation analysis
International Nuclear Information System (INIS)
Kishi, T.
1987-01-01
The progress of forensic neutron activation analysis (FNAA) in Japan is described. FNAA began in 1965 and during the past 20 years many cases have been handled; these include determination of toxic materials, comparison examination of physical evidences (e.g., paints, metal fragments, plastics and inks) and drug sample differentiation. Neutron activation analysis is applied routinely to the scientific criminal investigation as one of multielement analytical techniques. This paper also discusses these routine works. (author) 14 refs
Probabilistic Structural Analysis Program
Pai, Shantaram S.; Chamis, Christos C.; Murthy, Pappu L. N.; Stefko, George L.; Riha, David S.; Thacker, Ben H.; Nagpal, Vinod K.; Mital, Subodh K.
2010-01-01
NASA/NESSUS 6.2c is a general-purpose, probabilistic analysis program that computes probability of failure and probabilistic sensitivity measures of engineered systems. Because NASA/NESSUS uses highly computationally efficient and accurate analysis techniques, probabilistic solutions can be obtained even for extremely large and complex models. Once the probabilistic response is quantified, the results can be used to support risk-informed decisions regarding reliability for safety-critical and one-of-a-kind systems, as well as for maintaining a level of quality while reducing manufacturing costs for larger-quantity products. NASA/NESSUS has been successfully applied to a diverse range of problems in aerospace, gas turbine engines, biomechanics, pipelines, defense, weaponry, and infrastructure. This program combines state-of-the-art probabilistic algorithms with general-purpose structural analysis and lifting methods to compute the probabilistic response and reliability of engineered structures. Uncertainties in load, material properties, geometry, boundary conditions, and initial conditions can be simulated. The structural analysis methods include non-linear finite-element methods, heat-transfer analysis, polymer/ceramic matrix composite analysis, monolithic (conventional metallic) materials life-prediction methodologies, boundary element methods, and user-written subroutines. Several probabilistic algorithms are available such as the advanced mean value method and the adaptive importance sampling method. NASA/NESSUS 6.2c is structured in a modular format with 15 elements.
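The probability-of-failure computation that a code like NESSUS performs can be illustrated, in greatly simplified form, by plain Monte Carlo sampling of a limit state g = strength − stress. The distributions and parameters below are illustrative assumptions, not NESSUS inputs, and NESSUS itself relies on far more efficient algorithms such as the advanced mean value and adaptive importance sampling methods:

```python
import random

def prob_failure_mc(trials=100_000, seed=42):
    # Monte Carlo estimate of P(g < 0) for the limit state g = strength - stress,
    # with independent normally distributed strength and stress (illustrative MPa values).
    rng = random.Random(seed)
    failures = sum(
        1
        for _ in range(trials)
        if rng.gauss(300.0, 20.0) - rng.gauss(220.0, 25.0) < 0.0
    )
    return failures / trials

if __name__ == "__main__":
    # g ~ Normal(80, sqrt(20**2 + 25**2)), so the failure probability is small but nonzero.
    print(prob_failure_mc())
```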
Directory of Open Access Journals (Sweden)
Iulian N. BUJOREANU
2011-01-01
Full Text Available Sensitivity analysis represents such a well-known and deeply analyzed subject that anyone entering the field may feel unable to add anything new. Still, there are many facets to be taken into consideration. The paper introduces the reader to the various ways sensitivity analysis is implemented and the reasons for which it has to be implemented in most analyses in decision making processes. Risk analysis is of utmost importance in dealing with resource allocation and is presented at the beginning of the paper as the initial cause to implement sensitivity analysis. Different views and approaches are added during the discussion of sensitivity analysis so that the reader develops as thorough an opinion as possible on the use and utility of sensitivity analysis. Finally, a round-up conclusion brings us to the question of the possibility of generating the future and analyzing it before it unfolds so that, when it happens, it brings less uncertainty.
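One of the simplest ways sensitivity analysis is implemented is the one-at-a-time (OAT) approach: perturb each input slightly while holding the others fixed, and report the relative output change per relative input change (an elasticity). A minimal sketch with a hypothetical cost model (names and numbers are illustrative, not from the paper):

```python
def oat_sensitivity(model, base, rel_step=0.01):
    # One-at-a-time sensitivity: relative output change per relative input change
    # (an elasticity), perturbing one input while the others stay at base values.
    y0 = model(base)
    result = {}
    for name, value in base.items():
        perturbed = dict(base)
        perturbed[name] = value * (1.0 + rel_step)
        result[name] = (model(perturbed) - y0) / (y0 * rel_step)
    return result

if __name__ == "__main__":
    # Hypothetical resource-allocation model: total cost = units * price + fixed overhead.
    def cost(p):
        return p["units"] * p["price"] + p["fixed"]

    base = {"units": 100.0, "price": 2.0, "fixed": 50.0}
    print(oat_sensitivity(cost, base))  # 'units' and 'price' dominate 'fixed'
```

Inputs with the largest elasticities are the ones whose uncertainty most deserves further analysis, which is exactly the link to risk analysis the paper draws.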
International Nuclear Information System (INIS)
Grimanis, A.P.
1985-01-01
A review of research and development on NAA as well as examples of applications of this method are presented, taken from work carried out over the last 21 years at the Radioanalytical Laboratory of the Department of Chemistry in the Greek Nuclear Research Center "Demokritos". Improved and faster radiochemical NAA methods have been developed for the determination of Au, Ni, Cl, As, Cu, U, Cr, Eu, Hg and Mo in several materials, and for the simultaneous determination of Br and I; Mg, Sr and Ni; As and Cu; As, Sb and Hg; Mn, Sr and Ba; Cd and Zn; Se and As; and Mo and Cr in biological materials. Instrumental NAA methods have also been developed for the determination of Ag, Cl and Na in lake waters, Al, Ca, Mg and V in wines, 7 trace elements in biological materials, 17 trace elements in sediments and 20 minor and trace elements in ceramics. A comprehensive computer program for routine activation analysis using Ge(Li) detectors has been worked out. A rather extensive charged-particle activation analysis programme has been carried out for the last 10 years, including particle-induced X-ray emission (PIXE) analysis, particle-induced prompt gamma-ray emission (PIGE) analysis, other nuclear reactions and proton activation analysis. A special neutron activation method, the delayed fission neutron counting method, is used for the analysis of fissionable elements, such as U, Th and Pu, in samples of the whole nuclear fuel cycle, including geological, enriched and nuclear safeguards samples.
Integrated genetic analysis microsystems
International Nuclear Information System (INIS)
Lagally, Eric T; Mathies, Richard A
2004-01-01
With the completion of the Human Genome Project and the ongoing DNA sequencing of the genomes of other animals, bacteria, plants and others, a wealth of new information about the genetic composition of organisms has become available. However, as the demand for sequence information grows, so does the workload required both to generate this sequence and to use it for targeted genetic analysis. Microfabricated genetic analysis systems are well poised to assist in the collection and use of these data through increased analysis speed, lower analysis cost and higher parallelism leading to increased assay throughput. In addition, such integrated microsystems may point the way to targeted genetic experiments on single cells and in other areas that are otherwise very difficult. Concomitant with these advantages, such systems, when fully integrated, should be capable of forming portable systems for high-speed in situ analyses, enabling a new standard in disciplines such as clinical chemistry, forensics, biowarfare detection and epidemiology. This review will discuss the various technologies available for genetic analysis on the microscale, and efforts to integrate them to form fully functional robust analysis devices. (topical review)
INTERNAL ENVIRONMENT ANALYSIS TECHNIQUES
Directory of Open Access Journals (Sweden)
Caescu Stefan Claudiu
2011-12-01
Full Text Available Theme The situation analysis, as a separate component of strategic planning, involves collecting and analysing relevant types of information on the components of the marketing environment and their evolution on the one hand, and on the organization’s resources and capabilities on the other. Objectives of the Research The main purpose of the study of the analysis techniques of the internal environment is to provide insight on those aspects that are of strategic importance to the organization. Literature Review The marketing environment consists of two distinct components: the internal environment, made up of specific variables within the organization, and the external environment, made up of variables external to the organization. Although analysing the external environment is essential for corporate success, it is not enough unless it is backed by a detailed analysis of the internal environment of the organization. The internal environment includes all elements that are endogenous to the organization, which are influenced to a great extent and totally controlled by it. The study of the internal environment must answer all resource-related questions, solve all resource management issues and represents the first step in drawing up the marketing strategy. Research Methodology The present paper accomplished a documentary study of the main techniques used for the analysis of the internal environment. Results The special literature emphasizes that the differences in performance from one organization to another are primarily dependent not on the differences between the fields of activity, but especially on the differences between the resources and capabilities and the ways these are capitalized on. The main methods of analysing the internal environment addressed in this paper are: the analysis of the organizational resources, the performance analysis, the value chain analysis and the functional analysis. Implications Basically such
Macondo-1 well oil in sediment and tarballs from the northern Gulf of Mexico shoreline
Wong, Florence L.; Rosenbauer, Robert J.; Campbell, Pamela L.; Lam, Angela; Lorenson, T.D.; Hostettler, Frances D.; Thomas, Burt
2011-01-01
From April 20 through July 15, 2010, an estimated 4.4 million barrels (1 barrel = 42 gallons [~700,000 cu m]) of crude oil spilled into the northern Gulf of Mexico (nGOM) from the ruptured British Petroleum (BP) Macondo-1 (M-1) well after the explosion of the drilling platform Deepwater Horizon. In addition, ~1.84 million gallons (~7,000 cu m) of hydrocarbon-based Corexit dispersants were applied to the oil both on and below the sea surface (Operational Science Advisory Team, 2010). An estimate of the total extent of the surface oil slick, derived from wind, ocean currents, aerial photography, and satellite imagery, was 68,000 square miles (~180,000 sq km; Amos and Norse, 2010). Spilled oil from this event impacted sensitive habitat along the shores of the nGOM. In response to this environmental catastrophe, the U.S. Geological Survey (USGS) collected coastal sediment and tarball samples along the shores of the nGOM from Texas to Florida before and after oil made landfall. These sites included priority areas of the nGOM at highest risk for oil contamination. These areas included coastal wetlands, shorelines, and barrier islands that could suffer severe environmental damage if a significant amount of oil came ashore. Samples were collected before oil reached land from 69 sites; 49 were revisited to collect samples after oil landfall. This poster focuses on the samples from locations that were sampled on both occasions. The USGS samples and one M-1 well-oil sample provided by BP were analyzed for a suite of diagnostic geochemical biomarkers. Aided by multivariate statistical analysis, the M-1 well oil was not detected in the samples collected before landfall but have been identified in sediment and tarballs collected from Louisiana, Alabama, Mississippi, and Florida after landfall. None of the sediment hydrocarbon extracts from Texas correlated with the M-1 well oil. Oil-impacted sediment is confined to the shoreline adjacent to the cumulative oil slick of the
Directory of Open Access Journals (Sweden)
Christina A. Kellogg
2017-05-01
Full Text Available Over the last decade, publications on deep-sea corals have tripled. Most attention has been paid to Lophelia pertusa, a globally distributed scleractinian coral that creates critical three-dimensional habitat in the deep ocean. The bacterial community associated with L. pertusa has been previously described by a number of studies at sites in the Mediterranean Sea, Norwegian fjords, off Great Britain, and in the Gulf of Mexico (GOM. However, use of different methodologies prevents direct comparisons in most cases. Our objectives were to address intra-regional variation and to identify any conserved bacterial core community. We collected samples from three distinct colonies of L. pertusa at each of four locations within the western Atlantic: three sites within the GOM and one off the east coast of the United States. Amplicon libraries of 16S rRNA genes were generated using primers targeting the V4–V5 hypervariable region and 454 pyrosequencing. The dominant phylum was Proteobacteria (75–96%. At the family level, 80–95% of each sample was comprised of five groups: Pirellulaceae, Pseudonocardiaceae, Rhodobacteraceae, Sphingomonadaceae, and unclassified Oceanospirillales. Principal coordinate analysis based on weighted UniFrac distances showed a clear distinction between the GOM and Atlantic samples. Interestingly, the replicate samples from each location did not always cluster together, indicating there is not a strong site-specific influence. The core bacterial community, conserved in 100% of the samples, was dominated by the operational taxonomic units of genera Novosphingobium and Pseudonocardia, both known degraders of aromatic hydrocarbons. The sequence of another core member, Propionibacterium, was also found in prior studies of L. pertusa from Norway and Great Britain, suggesting a role as a conserved symbiont. By examining more than 40,000 sequences per sample, we found that GOM samples were dominated by the identified conserved core
Professionalizing Intelligence Analysis
Directory of Open Access Journals (Sweden)
James B. Bruce
2015-09-01
Full Text Available This article examines the current state of professionalism in national security intelligence analysis in the U.S. Government. Since the introduction of major intelligence reforms directed by the Intelligence Reform and Terrorism Prevention Act (IRTPA in December, 2004, we have seen notable strides in many aspects of intelligence professionalization, including in analysis. But progress is halting, uneven, and by no means permanent. To consolidate its gains, and if it is to continue improving, the U.S. intelligence community (IC should commit itself to accomplishing a new program of further professionalization of analysis to ensure that it will develop an analytic cadre that is fully prepared to deal with the complexities of an emerging multipolar and highly dynamic world that the IC itself is forecasting. Some recent reforms in intelligence analysis can be assessed against established standards of more fully developed professions; these may well fall short of moving the IC closer to the more fully professionalized analytical capability required for producing the kind of analysis needed now by the United States.
Harmonic and geometric analysis
Citti, Giovanna; Pérez, Carlos; Sarti, Alessandro; Zhong, Xiao
2015-01-01
This book presents an expanded version of four series of lectures delivered by the authors at the CRM. Harmonic analysis, understood in a broad sense, has a very wide interplay with partial differential equations and in particular with the theory of quasiconformal mappings and its applications. Some areas in which real analysis has been extremely influential are PDE's and geometric analysis. Their foundations and subsequent developments made extensive use of the Calderón–Zygmund theory, especially the Lp inequalities for Calderón–Zygmund operators (Beurling transform and Riesz transform, among others) and the theory of Muckenhoupt weights. The first chapter is an application of harmonic analysis and the Heisenberg group to understanding human vision, while the second and third chapters cover some of the main topics on linear and multilinear harmonic analysis. The last serves as a comprehensive introduction to a deep result from De Giorgi, Moser and Nash on the regularity of elliptic partial differen...
Hansson, Sven Ove; Aven, Terje
2014-07-01
This article discusses to what extent risk analysis is scientific in view of a set of commonly used definitions and criteria. We consider scientific knowledge to be characterized by its subject matter, its success in developing the best available knowledge in its fields of study, and the epistemic norms and values that guide scientific investigations. We proceed to assess the field of risk analysis according to these criteria. For this purpose, we use a model for risk analysis in which science is used as a base for decision making on risks, which covers five elements: evidence, knowledge base, broad risk evaluation, managerial review and judgment, and the decision; and which relates these elements to the domains of experts and decision-makers, and to the fact-based and value-based domains. We conclude that risk analysis is a scientific field of study, when understood as consisting primarily of (i) knowledge about risk-related phenomena, processes, events, etc., and (ii) concepts, theories, frameworks, approaches, principles, methods and models to understand, assess, characterize, communicate, and manage risk, in general and for specific applications (the instrumental part). © 2014 Society for Risk Analysis.
Zhou, Qing; Son, Kyungjin; Liu, Ying; Revzin, Alexander
2015-01-01
Biosensors first appeared several decades ago to address the need for monitoring physiological parameters such as oxygen or glucose in biological fluids such as blood. More recently, a new wave of biosensors has emerged in order to provide more nuanced and granular information about the composition and function of living cells. Such biosensors exist at the confluence of technology and medicine and often strive to connect cell phenotype or function to physiological or pathophysiological processes. Our review aims to describe some of the key technological aspects of biosensors being developed for cell analysis. The technological aspects covered in our review include biorecognition elements used for biosensor construction, methods for integrating cells with biosensors, approaches to single-cell analysis, and the use of nanostructured biosensors for cell analysis. Our hope is that the spectrum of possibilities for cell analysis described in this review may pique the interest of biomedical scientists and engineers and may spur new collaborations in the area of using biosensors for cell analysis.
Clinical reasoning: concept analysis.
Simmons, Barbara
2010-05-01
This paper is a report of a concept analysis of clinical reasoning in nursing. Clinical reasoning is an ambiguous term that is often used synonymously with decision-making and clinical judgment, and it has not been clearly defined in the literature. Healthcare settings are increasingly filled with uncertainty, risk and complexity due to increased patient acuity, multiple comorbidities, and enhanced use of technology, all of which require clinical reasoning. Literature for this concept analysis was retrieved from several databases (CINAHL, PubMed, PsycINFO, ERIC and OvidMEDLINE) for the years 1980 to 2008. Rodgers's evolutionary method of concept analysis was used because of its applicability to concepts that are still evolving. Multiple terms have been used synonymously to describe the thinking skills that nurses use. Research in the past 20 years has elucidated differences among these terms and identified the cognitive processes that precede judgment and decision-making. Our concept analysis defines one of these terms, 'clinical reasoning,' as a complex process that uses cognition, metacognition, and discipline-specific knowledge to gather and analyse patient information, evaluate its significance, and weigh alternative actions. This concept analysis provides a middle-range descriptive theory of clinical reasoning in nursing that helps clarify meaning and gives direction for future research. Appropriate instruments to operationalize the concept need to be developed. Research is needed to identify additional variables that have an impact on clinical reasoning, and to determine the consequences of clinical reasoning in specific situations.
International Nuclear Information System (INIS)
Sitek, J.; Degmova, J.; Dekan, J.
2011-01-01
The Kosice meteorite fell on 28 February 2010 near Kosice and represents a unique find, as the last observed meteorite fall in Slovakia occurred in 1895. Because the fall was observed, the meteorite's orbit in cosmic space can be calculated; this is especially important, since only 13 meteorite finds worldwide have had their cosmic orbits determined. Slovakia is a member of the international bolide network dealing with meteorite analysis in Central Europe. Analysis of the Kosice meteorite will also address long-lived and short-lived nuclides; the results should contribute to determining its radiation and formation ages. Structural analysis will make it possible to compare it with similar types of meteorites. In this work, Moessbauer spectroscopy is used for phase analysis of the iron-containing components, with the aim of identifying magnetic and non-magnetic fractions. From the analysis of the magnetic part we find that the first sextet, with a hyperfine magnetic field of 33.5 T, corresponds to the bcc Fe-Ni alloy (kamacite), and the second, with a field of 31.5 T, to FeS (troilite). Meteorites with this composition belong to the mineral group of chondrites. Comparing our parameters with measurements on similar meteorites, we conclude that the Kosice meteorite contains the same components; according to all Moessbauer parameters, it can likewise be classified in the mineral group of chondrites. (authors)
Foundations of VISAR analysis.
Energy Technology Data Exchange (ETDEWEB)
Dolan, Daniel H.
2006-06-01
The Velocity Interferometer System for Any Reflector (VISAR) is a widely used diagnostic at Sandia National Laboratories. Although the operating principles of the VISAR are well established, recently deployed systems (such as the fast push-pull and air delay VISAR) require more careful consideration, and many common assumptions about VISAR are coming into question. This report presents a comprehensive review of VISAR analysis to address these issues. Detailed treatment of several interferometer configurations is given to identify important aspects of the operation and characterization of VISAR systems. The calculation of velocity from interferometer measurements is also described. The goal is to derive the standard VISAR analysis relationships, indicate when these relationships are valid, and provide alternative methods when the standard analysis fails.
Pugh, Charles C
2015-01-01
Based on an honors course taught by the author at UC Berkeley, this introduction to undergraduate real analysis gives a different emphasis by stressing the importance of pictures and hard problems. Topics include: a natural construction of the real numbers, four-dimensional visualization, basic point-set topology, function spaces, multivariable calculus via differential forms (leading to a simple proof of the Brouwer Fixed Point Theorem), and a pictorial treatment of Lebesgue theory. Over 150 detailed illustrations elucidate abstract concepts and salient points in proofs. The exposition is informal and relaxed, with many helpful asides, examples, some jokes, and occasional comments from mathematicians such as Littlewood, Dieudonné, and Osserman. This book thus succeeds in being more comprehensive, more comprehensible, and more enjoyable than standard introductions to analysis. New to the second edition of Real Mathematical Analysis is a presentation of Lebesgue integration done almost entirely using the un...
Digital Fourier analysis fundamentals
Kido, Ken'iti
2015-01-01
This textbook is a thorough, accessible introduction to digital Fourier analysis for undergraduate students in the sciences. Beginning with the principles of sine/cosine decomposition, the reader walks through the principles of discrete Fourier analysis before reaching the cornerstone of signal processing: the Fast Fourier Transform. Saturated with clear, coherent illustrations, "Digital Fourier Analysis - Fundamentals" includes practice problems and thorough Appendices for the advanced reader. As a special feature, the book includes interactive applets (available online) that mirror the illustrations. These user-friendly applets animate concepts interactively, allowing the user to experiment with the underlying mathematics. For example, a real sine signal can be treated as a sum of clockwise and counter-clockwise rotating vectors. The applet illustration included with the book animates the rotating vectors and the resulting sine signal. By changing parameters such as amplitude and frequency, the reader ca...
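The rotating-vector decomposition mentioned in the abstract can be sketched numerically. The following minimal illustration (ours, not the book's applet) uses Euler's identity, sin x = (e^{ix} − e^{−ix})/2i, so that a real sine emerges as the sum of a counter-clockwise and a clockwise rotating vector:

```python
import numpy as np

def sine_from_rotating_vectors(freq, t, amplitude=1.0):
    """Reconstruct a real sine as the combination of two counter-rotating phasors."""
    ccw = np.exp(2j * np.pi * freq * t)   # counter-clockwise rotating vector
    cw = np.exp(-2j * np.pi * freq * t)   # clockwise rotating vector
    return amplitude * (ccw - cw) / 2j    # Euler: sin x = (e^{ix} - e^{-ix}) / 2i

t = np.linspace(0.0, 1.0, 16, endpoint=False)
reconstructed = sine_from_rotating_vectors(2.0, t)
direct = np.sin(2 * np.pi * 2.0 * t)
print(np.allclose(reconstructed.real, direct))  # True: the phasor sum is the sine
print(np.allclose(reconstructed.imag, 0.0))     # True: imaginary parts cancel
```

Varying `freq` and `amplitude` reproduces, in code, the same experiment the book's interactive applets offer graphically.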
Frank, IE
1994-01-01
Analyzing observed or measured data is an important step in applied sciences. The recent increase in computer capacity has resulted in a revolution both in data collection and data analysis. An increasing number of scientists, researchers and students are venturing into statistical data analysis; hence the need for more guidance in this field, which was previously dominated mainly by statisticians. This handbook fills the gap in the range of textbooks on data analysis. Written in a dictionary format, it will serve as a comprehensive reference book in a rapidly growing field. However, this book is more structured than an ordinary dictionary, where each entry is a separate, self-contained entity. The authors provide not only definitions and short descriptions, but also offer an overview of the different topics. Therefore, the handbook can also be used as a companion to textbooks for undergraduate or graduate courses. 1700 entries are given in alphabetical order grouped into 20 topics and each topic is organized...
International Nuclear Information System (INIS)
Arien, B.
1998-01-01
Risk assessments of nuclear installations require accurate safety and reliability analyses to estimate the consequences of accidental events and their probability of occurrence. The objective of the work performed in this field at the Belgian Nuclear Research Centre SCK-CEN is to develop expertise in probabilistic and deterministic reactor safety analysis. The four main activities of the research project on reactor safety analysis are: (1) the development of software for the reliable analysis of large systems; (2) the development of an expert system to aid diagnosis; (3) the development and application of a probabilistic reactor-dynamics method; and (4) participation in the international PHEBUS-FP programme on severe accidents. Progress in research during 1997 is described.
Energy Technology Data Exchange (ETDEWEB)
Haurykiewicz, John Paul [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Dinehart, Timothy Grant [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Parker, Robert Young [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2016-05-12
The purpose of this process analysis was to analyze the Badge Offices’ current processes from a systems perspective and consider ways of pursuing objectives set forth by SEC-PS, namely increased customer flow (throughput) and reduced customer wait times. Information for the analysis was gathered for the project primarily through Badge Office Subject Matter Experts (SMEs), and in-person observation of prevailing processes. Using the information gathered, a process simulation model was constructed to represent current operations and allow assessment of potential process changes relative to factors mentioned previously. The overall purpose of the analysis was to provide SEC-PS management with information and recommendations to serve as a basis for additional focused study and areas for potential process improvements in the future.
International Nuclear Information System (INIS)
Deville, J.P.
1998-01-01
Nowadays there are many surface analysis methods, each with its own specificity, strengths, constraints (for instance, the need for vacuum) and limits. Costly in time and investment, these methods must be chosen deliberately. This article is addressed to non-specialists. It offers some elements of choice according to the information sought, the sensitivity required, the constraints of use, or the precise question to be answered. After recalling the fundamental principles governing these analysis methods, based on the interaction of radiation (ultraviolet, X-rays) or particles (ions, electrons) with matter, two methods are described in more detail: Auger electron spectroscopy (AES) and X-ray photoemission spectroscopy (ESCA or XPS). They are indeed the most widespread methods in laboratories, the easiest to use, and probably the most productive for analysing the surfaces of industrial materials or samples subjected to treatments in aggressive media. (O.M.)
Power electronics reliability analysis.
Energy Technology Data Exchange (ETDEWEB)
Smith, Mark A.; Atcitty, Stanley
2009-12-01
This report provides the DOE and industry with a general process for analyzing power electronics reliability. The analysis can help with understanding the main causes of failures, downtime, and cost and how to reduce them. One approach is to collect field maintenance data and use it directly to calculate reliability metrics related to each cause. Another approach is to model the functional structure of the equipment using a fault tree to derive system reliability from component reliability. Analysis of a fictitious device demonstrates the latter process. Optimization can use the resulting baseline model to decide how to improve reliability and/or lower costs. It is recommended that both electric utilities and equipment manufacturers make provisions to collect and share data in order to lay the groundwork for improving reliability into the future. Reliability analysis helps guide reliability improvements in hardware and software technology including condition monitoring and prognostics and health management.
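The fault-tree approach described above can be sketched in a few lines. This is a hypothetical illustration, not the report's actual model: component failure probabilities combine through AND gates (all redundant inputs must fail) and OR gates (any one input failing causes failure), assuming independent failures.

```python
def and_gate(failure_probs):
    """AND gate: the subsystem fails only if all inputs fail (redundancy)."""
    p = 1.0
    for q in failure_probs:
        p *= q
    return p

def or_gate(failure_probs):
    """OR gate: the subsystem fails if any input fails (series dependence)."""
    p_all_survive = 1.0
    for q in failure_probs:
        p_all_survive *= (1.0 - q)
    return 1.0 - p_all_survive

# Hypothetical converter: fails if its controller fails OR both redundant
# switches fail. Component failure probabilities are made-up placeholders.
p_fail = or_gate([0.01, and_gate([0.05, 0.05])])
print(round(p_fail, 6))  # 0.012475
```

Note how the redundant switch pair contributes only 0.0025 to the system failure probability, far less than either switch alone; this is the kind of baseline comparison the report's optimization step builds on.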
International Nuclear Information System (INIS)
Gregg, H.R.; Meltzer, M.P.
1996-01-01
The portable Contamination Analysis Unit (CAU) measures trace quantities of surface contamination in real time. The detector head of the portable contamination analysis unit has an opening with an O-ring seal, one or more vacuum valves and a small mass spectrometer. With the valve closed, the mass spectrometer is evacuated with one or more pumps. The O-ring seal is placed against a surface to be tested and the vacuum valve is opened. Data is collected from the mass spectrometer and a portable computer provides contamination analysis. The CAU can be used to decontaminate and decommission hazardous and radioactive surfaces by measuring residual hazardous surface contamination, such as tritium and trace organics. It provides surface contamination data for research and development applications as well as real-time process control feedback for industrial cleaning operations and can be used to determine the readiness of a surface to accept bonding or coatings. 1 fig
Jorgensen, Palle
2017-01-01
The book features new directions in analysis, with an emphasis on Hilbert space, mathematical physics, and stochastic processes. We interpret 'non-commutative analysis' broadly to include representations of non-Abelian groups and non-Abelian algebras, with emphasis on Lie groups and operator algebras (C*-algebras and von Neumann algebras). A second theme is commutative and non-commutative harmonic analysis, spectral theory, operator theory and their applications. The list of topics includes shift-invariant spaces, group action in differential geometry, and frame theory (over-complete bases) and their applications to engineering (signal processing and multiplexing), projective multi-resolutions, and free probability algebras. The book serves as an accessible introduction, offering a timeless presentation that is attractive and accessible to students, both in mathematics and in neighboring fields.
DEFF Research Database (Denmark)
Frigaard, Peter; Andersen, Thomas Lykke
The present book describes the most important aspects of wave analysis techniques applied to physical model tests. Moreover, the book serves as technical documentation for the wave analysis software WaveLab 3, cf. Aalborg University (2012). In that respect it should be mentioned that, supplementary to the present technical documentation, there exists also an online help document describing the WaveLab software in detail, including all the input and output fields. In addition to the two main authors, Tue Hald, Jacob Helm-Petersen and Morten Møller Jakobsen have also contributed to the note; their input is highly acknowledged. The outline of the book is as follows: • Chapters 2 and 3 describe the analysis of waves in the time and frequency domains. • Chapters 4 and 5 describe the separation of incident and reflected waves for the two-dimensional case. • Chapter 6 describes the estimation of the directional spectra, which also…
Canuto, Claudio
2015-01-01
The purpose of the volume is to provide a support textbook for a second lecture course on Mathematical Analysis. The contents are organised to suit, in particular, students of Engineering, Computer Science and Physics, all areas in which mathematical tools play a crucial role. The basic notions and methods concerning integral and differential calculus for multivariable functions, series of functions and ordinary differential equations are presented in a manner that elicits critical reading and prompts a hands-on approach to concrete applications. The pedagogical layout echoes the one used in the companion text Mathematical Analysis I. The book’s structure has a specifically-designed modular nature, which allows for great flexibility in the preparation of a lecture course on Mathematical Analysis. The style privileges clarity in the exposition and a linear progression through the theory. The material is organised on two levels. The first, reflected in this book, allows students to grasp the essential ideas, ...
Bandemer, Hans
1992-01-01
Fuzzy data such as marks, scores, verbal evaluations, imprecise observations, experts' opinions and grey tone pictures, are quite common. In Fuzzy Data Analysis the authors collect their recent results providing the reader with ideas, approaches and methods for processing such data when looking for sub-structures in knowledge bases for an evaluation of functional relationship, e.g. in order to specify diagnostic or control systems. The modelling presented uses ideas from fuzzy set theory and the suggested methods solve problems usually tackled by data analysis if the data are real numbers. Fuzzy Data Analysis is self-contained and is addressed to mathematicians oriented towards applications and to practitioners in any field of application who have some background in mathematics and statistics.
Zorich, Vladimir A
2015-01-01
VLADIMIR A. ZORICH is professor of mathematics at Moscow State University. His areas of specialization are analysis, conformal geometry, quasiconformal mappings, and mathematical aspects of thermodynamics. He solved the problem of global homeomorphism for space quasiconformal mappings. He holds a patent in the technology of mechanical engineering, and he is also known by his book Mathematical Analysis of Problems in the Natural Sciences . This second English edition of a very popular two-volume work presents a thorough first course in analysis, leading from real numbers to such advanced topics as differential forms on manifolds; asymptotic methods; Fourier, Laplace, and Legendre transforms; elliptic functions; and distributions. Especially notable in this course are the clearly expressed orientation toward the natural sciences and the informal exploration of the essence and the roots of the basic concepts and theorems of calculus. Clarity of exposition is matched by a wealth of instructive exercises, problems...
DEFF Research Database (Denmark)
Christensen, Ole; Feichtinger, Hans G.; Paukner, Stephan
2015-01-01
The Gabor transform characterizes a function by its transform over phase space, which is the time–frequency plane (TF-plane) in a musical context or the location–wave-number domain in the context of image processing. The transition from the signal domain to the phase-space domain introduces an enormous amount of data… We give an overview of the generalities relevant for an understanding of Gabor analysis of functions on Rd, paying special attention to the case d = 2, which is the most important case for image processing and image analysis applications. The chapter is organized as follows. Section 2 presents central tools from functional analysis…, and the application of Gabor expansions to image representation is considered in Sect. 6.
Sohrab, Houshang H
2014-01-01
This expanded second edition presents the fundamentals and touchstone results of real analysis in full rigor, but in a style that requires little prior familiarity with proofs or mathematical language. The text is a comprehensive and largely self-contained introduction to the theory of real-valued functions of a real variable. The chapters on Lebesgue measure and integral have been rewritten entirely and greatly improved. They now contain Lebesgue's differentiation theorem as well as his versions of the Fundamental Theorem(s) of Calculus. With expanded chapters, additional problems, and an expansive solutions manual, Basic Real Analysis, Second Edition, is ideal for senior undergraduates and first-year graduate students, both as a classroom text and a self-study guide. Reviews of the first edition: The book is a clear and well-structured introduction to real analysis aimed at senior undergraduate and beginning graduate students. The prerequisites are few, but a certain mathematical sophistication is required…
Software safety hazard analysis
International Nuclear Information System (INIS)
Lawrence, J.D.
1996-02-01
Techniques for analyzing the safety and reliability of analog-based electronic protection systems that serve to mitigate hazards in process control systems have been developed over many years, and are reasonably well understood. An example is the protection system in a nuclear power plant. The extension of these techniques to systems which include digital computers is not well developed, and there is little consensus among software engineering experts and safety experts on how to analyze such systems. One possible technique is to extend hazard analysis to include digital computer-based systems. Software is frequently overlooked during system hazard analyses, but this is unacceptable when the software is in control of a potentially hazardous operation. In such cases, hazard analysis should be extended to fully cover the software. A method for performing software hazard analysis is proposed in this paper
Vágner, Petr; Pavelka, Michal; Maršík, František
2017-04-01
The well-known Gouy-Stodola theorem states that a device produces maximum useful power when working reversibly, that is, with no entropy production inside the device. This statement leads to a method of thermodynamic optimization based on entropy production minimization. Exergy destruction (the difference between the exergy of fuel and exhausts) is also given by the entropy production inside the device. Therefore, assessing the efficiency of a device by exergy analysis is also based on the Gouy-Stodola theorem. However, the assumptions that led to the Gouy-Stodola theorem are not satisfied in several optimization scenarios, e.g. non-isothermal steady-state fuel cells, where both entropy production minimization and exergy analysis should be used with caution. We demonstrate, using non-equilibrium thermodynamics, a few cases where entropy production minimization and exergy analysis should not be applied.
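As a minimal numerical illustration of the Gouy-Stodola relation discussed above (our own sketch, not the paper's model), consider heat flowing irreversibly between two reservoirs: the exergy destroyed equals the ambient (dead-state) temperature times the entropy produced.

```python
def entropy_production_heat_transfer(q, t_hot, t_cold):
    """Entropy produced (W/K) when heat flow q (W) passes irreversibly
    from a reservoir at t_hot (K) to one at t_cold (K)."""
    return q * (1.0 / t_cold - 1.0 / t_hot)

def exergy_destruction(t0, s_gen):
    """Gouy-Stodola: destroyed exergy (W) = ambient temperature (K)
    times the entropy production rate (W/K)."""
    return t0 * s_gen

# 1 kW of heat dropping from 600 K to 300 K, ambient at 298.15 K
s_gen = entropy_production_heat_transfer(1000.0, 600.0, 300.0)
print(round(exergy_destruction(298.15, s_gen), 2))  # 496.92
```

Here roughly half the transferred heat's work potential is lost, which is consistent with the Carnot factor between the two reservoir temperatures.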
Directory of Open Access Journals (Sweden)
Sutawanir Darwis
2012-05-01
Full Text Available Empirical decline curve analysis of oil production data gives reasonable answers in hyperbolic type-curve situations; however, the methodology has limitations in fitting real historical production data in the presence of unusual observations caused by treatments applied to the well to increase production capacity. The development of robust least squares offers new possibilities for better fitting production data using decline curve analysis by down-weighting the unusual observations. This paper proposes a robust least squares fitting (lmRobMM) approach to estimate the decline rate of daily production data and compares the results with reservoir simulation results. As a case study, we use the oil production data at TBA Field, West Java. The results demonstrate that the approach is suitable for decline curve fitting and offers new insight into decline curve analysis in the presence of unusual observations.
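The down-weighting idea can be sketched with a generic robust fit. The paper uses the lmRobMM estimator in R; the sketch below substitutes SciPy's soft-L1 loss as a stand-in, applied to synthetic data (not the TBA Field data) following an Arps hyperbolic decline with a few treatment-like spikes:

```python
import numpy as np
from scipy.optimize import least_squares

def hyperbolic_decline(t, qi, di, b):
    """Arps hyperbolic decline curve: production rate versus time."""
    return qi / (1.0 + b * di * t) ** (1.0 / b)

rng = np.random.default_rng(42)
t = np.linspace(0.0, 10.0, 120)
q_obs = hyperbolic_decline(t, 1000.0, 0.5, 0.8) + rng.normal(0.0, 5.0, t.size)
q_obs[5::20] += 300.0  # unusual observations, e.g. spikes after well treatments

def residuals(params):
    qi, di, b = params
    return hyperbolic_decline(t, qi, di, b) - q_obs

# soft_l1 loss down-weights large residuals, so the spikes barely bias the fit
fit = least_squares(residuals, x0=[800.0, 0.3, 0.5],
                    bounds=([1.0, 1e-3, 1e-3], [1e4, 5.0, 2.0]),
                    loss="soft_l1", f_scale=10.0)
qi_hat, di_hat, b_hat = fit.x
print(qi_hat, di_hat, b_hat)  # close to the true (1000, 0.5, 0.8)
```

An ordinary least squares fit on the same data (loss="linear") would be pulled visibly upward by the spikes, which is the limitation the robust approach addresses.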
Applied multivariate statistical analysis
Härdle, Wolfgang Karl
2015-01-01
Focusing on high-dimensional applications, this 4th edition presents the tools and concepts used in multivariate data analysis in a style that is also accessible for non-mathematicians and practitioners. It surveys the basic principles and emphasizes both exploratory and inferential statistics; a new chapter on Variable Selection (Lasso, SCAD and Elastic Net) has also been added. All chapters include practical exercises that highlight applications in different multivariate data analysis fields: in quantitative financial studies, where the joint dynamics of assets are observed; in medicine, where recorded observations of subjects in different locations form the basis for reliable diagnoses and medication; and in quantitative marketing, where consumers’ preferences are collected in order to construct models of consumer behavior. All of these examples involve high to ultra-high dimensions and represent a number of major fields in big data analysis. The fourth edition of this book on Applied Multivariate ...
Trajectory Based Traffic Analysis
DEFF Research Database (Denmark)
Krogh, Benjamin Bjerre; Andersen, Ove; Lewis-Kelham, Edwin
2013-01-01
We present the INTRA system for interactive path-based traffic analysis. The analyses are developed in collaboration with traffic researchers and provide novel insights into conditions such as congestion, travel time, choice of route, and traffic flow. INTRA supports interactive point-and-click analysis, due to a novel and efficient indexing structure. With the web site daisy.aau.dk/its/spqdemo/ we will demonstrate several analyses, using a very large real-world data set consisting of 1.9 billion GPS records (1.5 million trajectories) recorded from more than 13000 vehicles, and touching most…
International Nuclear Information System (INIS)
Johnstad, H.
1989-06-01
The Physics Analysis Workstation (PAW) is a high-level program providing data presentation and statistical or mathematical analysis. PAW has been developed at CERN as an instrument to assist physicists in the analysis and presentation of their data. The program is interfaced to a high-level graphics package, built on basic underlying graphics; 3-D graphics capabilities are being implemented. The major objects in PAW are 1- or 2-dimensional binned event data with a fixed number of entries per event, vectors, functions, graphics pictures, and macros. Command input is handled by an integrated user interface package, which allows for a variety of choices for input, either with typed commands or in a tree-structured, menu-driven mode. 6 refs., 1 fig
Choudary, A D R
2014-01-01
The book targets undergraduate and postgraduate mathematics students and helps them develop a deep understanding of mathematical analysis. Designed as a first course in real analysis, it helps students learn how abstract mathematical analysis solves mathematical problems that relate to the real world. As well as providing a valuable source of inspiration for contemporary research in mathematics, the book helps students read, understand and construct mathematical proofs, develop their problem-solving abilities and comprehend the importance and frontiers of computer facilities and much more. It offers comprehensive material for both seminars and independent study for readers with a basic knowledge of calculus and linear algebra. The first nine chapters followed by the appendix on the Stieltjes integral are recommended for graduate students studying probability and statistics, while the first eight chapters followed by the appendix on dynamical systems will be of use to students of biology and environmental scie...
Banks, David L; Rios Insua, David
2015-01-01
Flexible Models to Analyze Opponent Behavior A relatively new area of research, adversarial risk analysis (ARA) informs decision making when there are intelligent opponents and uncertain outcomes. Adversarial Risk Analysis develops methods for allocating defensive or offensive resources against intelligent adversaries. Many examples throughout illustrate the application of the ARA approach to a variety of games and strategic situations. The book shows decision makers how to build Bayesian models for the strategic calculation of their opponents, enabling decision makers to maximize their expected utility or minimize their expected loss. This new approach to risk analysis asserts that analysts should use Bayesian thinking to describe their beliefs about an opponent's goals, resources, optimism, and type of strategic calculation, such as minimax and level-k thinking. Within that framework, analysts then solve the problem from the perspective of the opponent while placing subjective probability distributions on a...
COMPUTER METHODS OF GENETIC ANALYSIS.
Directory of Open Access Journals (Sweden)
A. L. Osipov
2017-02-01
Full Text Available This article considers the basic statistical methods used in the genetic analysis of human traits: segregation analysis, linkage analysis, and allelic association studies. Software supporting the implementation of these methods was developed.
International Nuclear Information System (INIS)
Strait, R.S.
1996-01-01
The first phase of the Depleted Uranium Hexafluoride Management Program (Program)--management strategy selection--consists of several program elements: Technology Assessment, Engineering Analysis, Cost Analysis, and preparation of an Environmental Impact Statement (EIS). Cost Analysis will estimate the life-cycle costs associated with each of the long-term management strategy alternatives for depleted uranium hexafluoride (UF6). The scope of Cost Analysis will include all major expenditures, from the planning and design stages through decontamination and decommissioning. The costs will be estimated at a scoping or preconceptual design level and are intended to assist decision makers in comparing alternatives for further consideration. They will not be absolute costs or bid-document costs. The purpose of the Cost Analysis Guidelines is to establish a consistent approach to the analysis of cost alternatives for managing the Department of Energy's (DOE's) stocks of depleted uranium hexafluoride (DUF6). The component modules that make up the DUF6 management program differ substantially in process options, requirements for R and D, equipment, facilities, regulatory compliance, operations and maintenance (O and M), and operations risk. To facilitate a consistent and equitable comparison of costs, the guidelines offer common definitions, assumptions or bases, and limitations integrated with a standard approach to the analysis. Further, the goal is to evaluate total net life-cycle costs and display them in a way that gives DOE the capability to evaluate a variety of overall DUF6 management strategies, including commercial potential. The cost estimates reflect the preconceptual level of the designs. They will be appropriate for distinguishing among management strategies.
Schramm, Michael J
2008-01-01
This text forms a bridge between courses in calculus and real analysis. It focuses on the construction of mathematical proofs as well as their final content. Suitable for upper-level undergraduates and graduate students of real analysis, it also provides a vital reference book for advanced courses in mathematics.The four-part treatment begins with an introduction to basic logical structures and techniques of proof, including discussions of the cardinality concept and the algebraic and order structures of the real and rational number systems. Part Two presents in-depth examinations of the compl
Fourier analysis an introduction
Stein, Elias M
2003-01-01
This first volume, a three-part introduction to the subject, is intended for students with a beginning knowledge of mathematical analysis who are motivated to discover the ideas that shape Fourier analysis. It begins with the simple conviction that Fourier arrived at in the early nineteenth century when studying problems in the physical sciences--that an arbitrary function can be written as an infinite sum of the most basic trigonometric functions.The first part implements this idea in terms of notions of convergence and summability of Fourier series, while highlighting applications such as th
DEFF Research Database (Denmark)
Reinau, Kristian Hegner
Traditionally, focus in the transport field, both politically and scientifically, has been on private cars and public transport. Freight transport has been a neglected topic. Recent years have seen an increased focus upon congestion as a core issue across Europe, resulting in a great need for know...... speed data for freight. Secondly, the analytical methods used, space-time cubes and emerging hot spot analysis, are also new in the freight transport field. The analysis thus estimates precisely how fast freight moves on the roads in Northern Jutland and how this has evolved over time....
Abrahams, J R; Hiller, N
1965-01-01
Signal Flow Analysis provides information pertinent to the fundamental aspects of signal flow analysis. This book discusses the basic theory of signal flow graphs and shows their relation to the usual algebraic equations.Organized into seven chapters, this book begins with an overview of properties of a flow graph. This text then demonstrates how flow graphs can be applied to a wide range of electrical circuits that do not involve amplification. Other chapters deal with the parameters as well as circuit applications of transistors. This book discusses as well the variety of circuits using ther
Automated Software Vulnerability Analysis
Sezer, Emre C.; Kil, Chongkyung; Ning, Peng
Despite decades of research, software continues to have vulnerabilities. Successful exploitations of these vulnerabilities by attackers cost millions of dollars to businesses and individuals. Unfortunately, most effective defensive measures, such as patching and intrusion prevention systems, require an intimate knowledge of the vulnerabilities. Many systems for detecting attacks have been proposed. However, the analysis of the exploited vulnerabilities is left to security experts and programmers. Both the human effort involved and the slow analysis process are unfavorable for the timely deployment of defensive measures. The problem is exacerbated by zero-day attacks.
Spectral analysis by correlation
International Nuclear Information System (INIS)
Fauque, J.M.; Berthier, D.; Max, J.; Bonnet, G.
1969-01-01
The spectral density of a signal, which represents its power distribution along the frequency axis, is a function of great importance, finding many uses in all fields concerned with the processing of the signal (process identification, vibrational analysis, etc.). Amongst all the possible methods for calculating this function, the correlation method (correlation function calculation + Fourier transformation) is the most promising, mainly because of its simplicity and of the results it yields. The study carried out here will lead to the construction of an apparatus which, coupled with a correlator, will constitute a set of equipment for spectral analysis in real time covering the frequency range 0 to 5 MHz. (author) [fr
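The correlation route to the spectral density that this abstract outlines (an autocorrelation estimate followed by a Fourier transform, i.e. the Wiener-Khinchin relation) can be sketched numerically; the test signal, sampling rate and NumPy-based implementation below are illustrative assumptions, not a model of the authors' real-time apparatus.

```python
import numpy as np

fs, n = 256, 256                      # sampling rate (Hz) and record length
t = np.arange(n) / fs
x = np.cos(2 * np.pi * 32.0 * t)      # test signal: a 32 Hz tone

# Step 1: biased autocorrelation estimate for lags 0..n-1
acf = np.correlate(x, x, mode="full")[n - 1:] / n

# Step 2: Fourier transform of the autocorrelation (Wiener-Khinchin)
# yields an estimate of the power spectral density
psd = np.abs(np.fft.rfft(acf))
freqs = np.fft.rfftfreq(n, d=1.0 / fs)

peak_hz = freqs[np.argmax(psd)]       # the peak should sit at the tone frequency
```

With a pure tone as input, the estimated spectrum peaks at the tone's frequency, which is the behaviour a correlator-plus-transform instrument exploits.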
DEFF Research Database (Denmark)
Josefsen, Knud; Nielsen, Henrik
2011-01-01
Northern blotting analysis is a classical method for analysis of the size and steady-state level of a specific RNA in a complex sample. In short, the RNA is size-fractionated by gel electrophoresis and transferred by blotting onto a membrane to which the RNA is covalently bound. Then, the membrane...... is analysed by hybridization to one or more specific probes that are labelled for subsequent detection. Northern blotting is relatively simple to perform, inexpensive, and not plagued by artefacts. Recent developments of hybridization membranes and buffers have resulted in increased sensitivity closing...
Subseabed disposal safety analysis
International Nuclear Information System (INIS)
Koplick, C.M.; Kabele, T.J.
1982-01-01
This report summarizes the status of work performed by Analytic Sciences Corporation (TASC) in FY'81 on subseabed disposal safety analysis. Safety analysis for subseabed disposal is divided into two phases: pre-emplacement which includes all transportation, handling, and emplacement activities; and long-term (post-emplacement), which is concerned with the potential hazard after waste is safely emplaced. Details of TASC work in these two areas are provided in two technical reports. The work to date, while preliminary, supports the technical and environmental feasibility of subseabed disposal of HLW
Sprecher, David A
2010-01-01
This classic text in introductory analysis delineates and explores the intermediate steps between the basics of calculus and the ultimate stage of mathematics: abstraction and generalization.Since many abstractions and generalizations originate with the real line, the author has made it the unifying theme of the text, constructing the real number system from the point of view of a Cauchy sequence (a step which Dr. Sprecher feels is essential to learn what the real number system is).The material covered in Elements of Real Analysis should be accessible to those who have completed a course in
Energy Technology Data Exchange (ETDEWEB)
Johnson, M A
1983-03-01
Energy analysis contributed to the public debate on the gasohol programme in the U.S., where this analysis became a legal requirement. The published energy analyses for gasohol are reviewed and we assess their inherent assumptions and data sources. The analyses are normalised to S.I. units to facilitate comparisons. The process of rationalising the various treatments uncovered areas of uncertainty, particularly in the methodologies which could be used to analyse some parts of the process. Although the definitive study has still to be written, the consensus is that maize to fuel ethanol via the traditional fermentation route is a net consumer of energy. (Refs. 13).
Bhatia, Rajendra
2009-01-01
These notes are a record of a one semester course on Functional Analysis given by the author to second year Master of Statistics students at the Indian Statistical Institute, New Delhi. Students taking this course have a strong background in real analysis, linear algebra, measure theory and probability, and the course proceeds rapidly from the definition of a normed linear space to the spectral theorem for bounded selfadjoint operators in a Hilbert space. The book is organised as twenty six lectures, each corresponding to a ninety minute class session. This may be helpful to teachers planning a course on this topic. Well prepared students can read it on their own.
Analysis of maintenance strategies
International Nuclear Information System (INIS)
Laakso, K.; Simola, K.
1998-01-01
The main topics of the presentation include: (1) an analysis model and methods to evaluate maintenance action programs and to support decisions on changes to them, and (2) understanding maintenance strategies in a systems perspective as a basis for future developments. The subproject showed how systematic models for maintenance analysis and decision support, utilising computerised and statistical tool packages, can be taken into use for the evaluation and optimisation of maintenance of active systems from a safety and economic point of view
Foundations of stochastic analysis
Rao, M M; Lukacs, E
1981-01-01
Foundations of Stochastic Analysis deals with the foundations of the theory of Kolmogorov and Bochner and its impact on the growth of stochastic analysis. Topics covered range from conditional expectations and probabilities to projective and direct limits, as well as martingales and likelihood ratios. Abstract martingales and their applications are also discussed. Comprised of five chapters, this volume begins with an overview of the basic Kolmogorov-Bochner theorem, followed by a discussion on conditional expectations and probabilities containing several characterizations of operators and mea
International Nuclear Information System (INIS)
Chatelus, R.; Schot, P.M.
2010-01-01
In order to verify compliance with safeguards and draw conclusions on the absence of undeclared nuclear material and activities, the International Atomic Energy Agency (IAEA) collects and analyses trade information that it receives from open sources as well as from Member States. Although the IAEA does not intervene in national export controls, it has to monitor the trade of dual use items. Trade analysis helps the IAEA to evaluate global proliferation threats, to understand States' ability to report exports according to additional protocols but also to compare against State declarations. Consequently, the IAEA has explored sources of trade-related information and has developed analysis methodologies beyond its traditional safeguards approaches. (author)
Cai, Tony
2010-01-01
Over the last few years, significant developments have been taking place in high-dimensional data analysis, driven primarily by a wide range of applications in many fields such as genomics and signal processing. In particular, substantial advances have been made in the areas of feature selection, covariance estimation, classification and regression. This book intends to examine important issues arising from high-dimensional data analysis to explore key ideas for statistical inference and prediction. It is structured around topics on multiple hypothesis testing, feature selection, regression, cla
Senthilkumar, K.; Ruchika Mehra Vijayan, E.
2017-11-01
This paper aims to illustrate real-time analysis of large-scale data. For practical implementation we perform sentiment analysis on live Twitter feeds, scoring each individual tweet. To analyze sentiments we train our data model on SentiWordNet, a polarity-assigned WordNet sample by Princeton University. Our main objective is to efficiently analyze large-scale data on the fly using distributed computation. Apache Spark and the Apache Hadoop ecosystem are used as the distributed computation platform, with Java as the development language
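The per-tweet polarity scoring described above can be sketched in miniature. SentiWordNet assigns positive and negative scores to WordNet synsets; the four-word lexicon below is a hypothetical stand-in for it, and the plain list comprehension mirrors what each distributed map task (e.g. a Spark RDD/DStream map) would do per tweet.

```python
# Hypothetical mini-lexicon standing in for SentiWordNet, which assigns
# positive/negative polarity scores to WordNet synsets.
LEXICON = {"good": 0.6, "great": 0.8, "bad": -0.7, "terrible": -0.9}

def score_tweet(text):
    """Sum word polarities; > 0 means positive, < 0 negative."""
    return sum(LEXICON.get(w, 0.0) for w in text.lower().split())

# In the distributed setting each tweet would be scored inside a map
# task; here a plain comprehension over a tiny sample suffices.
tweets = ["Spark is great", "terrible latency today", "neutral remark"]
scores = [score_tweet(t) for t in tweets]
```

The approach is embarrassingly parallel, which is why it suits a map-style distributed computation: each tweet is scored independently of all others.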
DEFF Research Database (Denmark)
Vatrapu, Ravi; Hussain, Abid; Buus Lassen, Niels
2015-01-01
This paper argues that the basic premise of Social Network Analysis (SNA) -- namely that social reality is constituted by dyadic relations and that social interactions are determined by structural properties of networks -- is neither necessary nor sufficient for Big Social Data analytics...... of Facebook or Twitter data. However, there exists no other holistic computational social science approach beyond the relational sociology and graph theory of SNA. To address this limitation, this paper presents an alternative holistic approach to Big Social Data analytics called Social Set Analysis (SSA......
Hoffman, Kenneth
2007-01-01
Developed for an introductory course in mathematical analysis at MIT, this text focuses on concepts, principles, and methods. Its introductions to real and complex analysis are closely formulated, and they constitute a natural introduction to complex function theory.Starting with an overview of the real number system, the text presents results for subsets and functions related to Euclidean space of n dimensions. It offers a rigorous review of the fundamentals of calculus, emphasizing power series expansions and introducing the theory of complex-analytic functions. Subsequent chapters cover seq
International Nuclear Information System (INIS)
Clark, R.D.
1996-01-01
This analysis defines and evaluates the surface water supply system from the existing J-13 well to the North Portal. This system includes the pipe running from J-13 to a proposed Booster Pump Station at the intersection of H Road and the North Portal access road. Contained herein is an analysis of the proposed Booster Pump Station with a brief description of the system that could be installed to the South Portal and the optional shaft. The tanks that supply the water to the North Portal are sized, and the supply system to the North Portal facilities and up to Topopah Spring North Ramp is defined
Structural analysis for Diagnosis
DEFF Research Database (Denmark)
Izadi-Zamanabadi, Roozbeh; Blanke, M.
2001-01-01
Aiming at design of algorithms for fault diagnosis, structural analysis of systems offers concise yet easy overall analysis. Graph-based matching, which is the essential technique to obtain redundant information for diagnosis, is re-considered in this paper. Matching is re-formulated as a problem...... of relating faults to known parameters and measurements of a system. Using explicit fault modelling, minimal over-determined subsystems are shown to provide necessary redundancy relations from the matching. Details of the method are presented and a realistic example used to clearly describe individual steps...
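The graph-based matching step central to the structural analysis described above can be illustrated with a small augmenting-path maximum matching on a bipartite constraint/variable graph. The four constraints and three unknowns below are an invented example, not the authors' system; an unmatched constraint corresponds to an over-determined subsystem, i.e. a redundancy relation usable for residual generation.

```python
def max_bipartite_matching(adj):
    """adj maps each left vertex (constraint/equation) to the list of
    right vertices (unknown variables) it involves."""
    match_r = {}  # right vertex -> matched left vertex

    def try_augment(u, seen):
        # Depth-first search for an augmenting path starting at u.
        for v in adj[u]:
            if v in seen:
                continue
            seen.add(v)
            if v not in match_r or try_augment(match_r[v], seen):
                match_r[v] = u
                return True
        return False

    size = sum(try_augment(u, set()) for u in adj)
    return size, match_r

# Hypothetical structure: 4 constraints c1..c4 over 3 unknowns x1..x3.
adj = {"c1": ["x1", "x2"], "c2": ["x2"], "c3": ["x1", "x3"], "c4": ["x3"]}
size, match_r = max_bipartite_matching(adj)
# One constraint must remain unmatched: it supplies a redundancy relation.
```

Because there are more constraints than unknowns, the matching leaves one constraint over, which is exactly the redundancy the diagnosis design exploits.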
Structural analysis for diagnosis
DEFF Research Database (Denmark)
Izadi-Zamanabadi, Roozbeh; Blanke, M.
2002-01-01
Aiming at design of algorithms for fault diagnosis, structural analysis of systems offers concise yet easy overall analysis. Graph-based matching, which is the essential technique to obtain redundant information for diagnosis, is reconsidered in this paper. Matching is reformulated as a problem...... of relating faults to known parameters and measurements of a system. Using explicit fault modelling, minimal overdetermined subsystems are shown to provide necessary redundancy relations from the matching. Details of the method are presented and a realistic example used to clearly describe individual steps....
International Nuclear Information System (INIS)
Bergman, R.
1980-12-01
The report describes the development of a method for in vivo Cd analysis. The method is based on analysis of the prompt gamma radiation emitted upon neutron capture in the isotope Cd-113. Different parts of the body can be analysed selectively using neutrons in the interval of 1 to 100 keV. The results show that the Cd level in the kidneys can be measured without exceeding a dose of 40 mrad, and that only 20% uncertainty is introduced when analysing Cd. The development was carried out at the R2 reactor in Studsvik using 25 keV neutrons. (G.B.)
Brieda, Lubos
2015-01-01
This talk presents three different tools developed recently for contamination analysis:
- HTML QCM analyzer: runs in a web browser and allows for data analysis of QCM log files
- Java RGA extractor: can load in multiple SRS.ana files and extract pressure vs. time data
- C++ Contamination Simulation code: 3D particle tracing code for modeling transport of dust particulates and molecules. Uses residence time to determine if molecules stick. Particulates can be sampled from IEST-STD-1246 and be accelerated by aerodynamic forces.
Beginning statistics with data analysis
Mosteller, Frederick; Rourke, Robert EK
2013-01-01
This introduction to the world of statistics covers exploratory data analysis, methods for collecting data, formal statistical inference, and techniques of regression and analysis of variance. 1983 edition.
Electronic Circuit Analysis Language (ECAL)
Chenghang, C.
1983-03-01
The computer-aided design technique is an important development in computer applications and an important component of computer science. A special language for electronic circuit analysis is the foundation of computer-aided design or computer-aided circuit analysis (abbreviated as CACD and CACA) of simulated circuits. Electronic circuit analysis language (ECAL) is a comparatively simple and easy-to-use circuit analysis special language which uses the FORTRAN language to carry out the explanatory executions. It is capable of conducting dc analysis, ac analysis, and transient analysis of a circuit. Furthermore, the results of the dc analysis can be used directly as the initial conditions for the ac and transient analyses.
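For a linear resistive circuit, the dc analysis that such a language performs reduces to solving the nodal equations G v = i, where G is the conductance matrix and i the vector of injected node currents. The two-node circuit below (a 1 A current source feeding R1 = R2 = 1 ohm) is an illustrative stand-in and is not ECAL syntax.

```python
import numpy as np

# Nodal (dc) analysis of a two-node resistive circuit:
# a 1 A current source drives node 1; R1 links node 1 to node 2,
# and R2 links node 2 to ground.
r1, r2 = 1.0, 1.0
G = np.array([[1 / r1, -1 / r1],
              [-1 / r1, 1 / r1 + 1 / r2]])  # conductance (nodal) matrix
i = np.array([1.0, 0.0])                    # injected node currents (A)

v = np.linalg.solve(G, i)                   # node voltages (V)
```

With both resistors equal, 1 A through the series pair drops 1 V across each resistor, so node 1 sits at 2 V and node 2 at 1 V; the ac and transient analyses the abstract mentions generalize this same linear-system core.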
Prehistory analysis using photon activation analysis
International Nuclear Information System (INIS)
Krausova, I.; Chvatil, D.; Tajer, J.
2017-01-01
Instrumental photon activation analysis (IPAA) is a suitable radio-analytical method for non-destructive determination of total nitrogen in various matrices. IPAA determination of nitrogen is based on the 14 N (γ, n) 13 N nuclear reaction after high-energy photon irradiation. The analytically usable product of this photo-nuclear reaction is a positron emitter whose only emission is the non-specific 511 keV annihilation radiation, which can also be emitted by other radionuclides present in the sample. Some of them, besides the non-specific 511 keV line, also emit specific lines that allow their contribution to the analytical radionuclide 13 N to be subtracted. An efficient source of high-energy photon radiation is the secondary bremsstrahlung generated by the conversion of an electron beam accelerated by a high-frequency circular accelerator - a microtron. The non-destructive IPAA contributed to the clarification of the origins of a precious bracelet from a fortified settlement in the area of Karlovy Vary - Drahovice, dating from the late Bronze Age. (authors)
Systems analysis - independent analysis and verification
Energy Technology Data Exchange (ETDEWEB)
DiPietro, J.P.; Skolnik, E.G.; Badin, J.S. [Energetics, Inc., Columbia, MD (United States)
1996-10-01
The Hydrogen Program of the U.S. Department of Energy (DOE) funds a portfolio of activities ranging from conceptual research to pilot plant testing. The long-term research projects support DOE's goal of a sustainable, domestically based energy system, and the development activities are focused on hydrogen-based energy systems that can be commercially viable in the near-term. Energetics develops analytic products that enable the Hydrogen Program Manager to assess the potential for near- and long-term R&D activities to satisfy DOE and energy market criteria. This work is based on a pathway analysis methodology. The authors consider an energy component (e.g., hydrogen production from biomass gasification, hybrid hydrogen internal combustion engine (ICE) vehicle) within a complete energy system. The work involves close interaction with the principal investigators to ensure accurate representation of the component technology. Comparisons are made with the current cost and performance of fossil-based and alternative renewable energy systems, and sensitivity analyses are conducted to determine the effect of changes in cost and performance parameters on the projects' viability.
Learning Haskell data analysis
Church, James
2015-01-01
If you are a developer, analyst, or data scientist who wants to learn data analysis methods using Haskell and its libraries, then this book is for you. Prior experience with Haskell and a basic knowledge of data science will be beneficial.
International Nuclear Information System (INIS)
Malik, S; Bloom, K; Shipsey, I; Cavanaugh, R; Klima, B; Chan, Kai-Feng; D'Hondt, J; Narain, M; Palla, F; Rolandi, G; Schörner-Sadenius, T
2014-01-01
To impart hands-on training in physics analysis, the CMS experiment initiated the concept of the CMS Data Analysis School (CMSDAS). It was born over three years ago at the LPC (LHC Physics Centre), Fermilab, and is based on earlier workshops held at the LPC and the CLEO experiment. As CMS transitioned from construction to data-taking mode, the nature of the earlier training also evolved to include more analysis tools, software tutorials and physics analysis. This effort, epitomized as CMSDAS, has proven to be key for new and young physicists to jump-start and contribute to the physics goals of CMS by looking for new physics with the collision data. With over 400 physicists trained in six CMSDAS schools around the globe, CMS is trying to engage the collaboration in its discovery potential and maximize physics output. As a bigger goal, CMS is striving to nurture and increase engagement of its myriad talents in the development of physics, service, upgrades, the education of those new to CMS and the career development of younger members. An extension of the concept to dedicated software and hardware schools is also planned, keeping in mind the ensuing upgrade phase.
International Nuclear Information System (INIS)
2002-01-01
This document is one in a series of publications known as the ETDE/INIS Joint Reference Series and also constitutes a part of the ETDE Procedures Manual. It presents the rules, guidelines and procedures to be adopted by centers submitting input to the International Nuclear Information System (INIS) or the Energy Technology Data Exchange (ETDE). It is a manual for the subject analysis part of input preparation, meaning the selection, subject classification, abstracting and subject indexing of relevant publications, and is to be used in conjunction with the Thesauruses, Subject Categories documents and the documents providing guidelines for the preparation of abstracts. The concept and structure of the new manual are intended to describe in a logical and efficient sequence all the steps comprising the subject analysis of documents to be reported to INIS or ETDE. The manual includes new chapters on preparatory analysis, subject classification, abstracting and subject indexing, as well as rules, guidelines, procedures, examples and a special chapter on guidelines and examples for subject analysis in particular subject fields. (g.t.; a.n.)
Bedrossian, Nazareth; Jang, Jiann-Woei; McCants, Edward; Omohundro, Zachary; Ring, Tom; Templeton, Jeremy; Zoss, Jeremy; Wallace, Jonathan; Ziegler, Philip
2011-01-01
Draper Station Analysis Tool (DSAT) is a computer program, built on commercially available software, for simulating and analyzing complex dynamic systems. Heretofore used in designing and verifying guidance, navigation, and control systems of the International Space Station, DSAT has a modular architecture that lends itself to modification for application to spacecraft or terrestrial systems. DSAT consists of user-interface, data-structures, simulation-generation, analysis, plotting, documentation, and help components. DSAT automates the construction of simulations and the process of analysis. DSAT provides a graphical user interface (GUI), plus a Web-enabled interface, similar to the GUI, that enables a remotely located user to gain access to the full capabilities of DSAT via the Internet and Web-browser software. Data structures are used to define the GUI, the Web-enabled interface, simulations, and analyses. Three data structures define the type of analysis to be performed: closed-loop simulation, frequency response, and/or stability margins. DSAT can be executed on almost any workstation, desktop, or laptop computer. DSAT provides better than an order of magnitude improvement in cost, schedule, and risk assessment for simulation based design and verification of complex dynamic systems.
Israel, Carsten W; Ekosso-Ejangue, Lucy; Sheta, Mohamed-Karim
2015-09-01
The key to a successful analysis of a pacemaker electrocardiogram (ECG) is the application of the systematic approach used for any other ECG without a pacemaker: analysis of (1) basic rhythm and rate, (2) QRS axis, (3) PQ, QRS and QT intervals, (4) morphology of P waves, QRS, ST segments and T(U) waves and (5) the presence of arrhythmias. If only the most obvious abnormality of a pacemaker ECG is considered, wrong conclusions can easily be drawn. If a systematic approach is skipped it may be overlooked that e.g. atrial pacing is ineffective, the left ventricle is paced instead of the right ventricle, pacing competes with intrinsic conduction or that the atrioventricular (AV) conduction time is programmed too long. Apart from this analysis, a pacemaker ECG which is not clear should be checked for the presence of arrhythmias (e.g. atrial fibrillation, atrial flutter, junctional escape rhythm and endless loop tachycardia), pacemaker malfunction (e.g. atrial or ventricular undersensing or oversensing, atrial or ventricular loss of capture) and activity of specific pacing algorithms, such as automatic mode switching, rate adaptation, AV delay modifying algorithms, reaction to premature ventricular contractions (PVC), safety window pacing, hysteresis and noise mode. A systematic analysis of the pacemaker ECG almost always allows a probable diagnosis of arrhythmias and malfunctions to be made, which can be confirmed by pacemaker control and can often be corrected at the touch of the right button to the patient's benefit.
Nancy E. Fleenor
2002-01-01
A Landscape Analysis Plan (LAP) sets out broad guidelines for project development within boundaries of the Kings River Sustainable Forest Ecosystems Project. The plan must be a dynamic, living document, subject to change as new information arises over the course of this very long-term project (several decades). Two watersheds, each of 32,000 acres, were dedicated to...
Bayesian logistic regression analysis
Van Erp, H.R.N.; Van Gelder, P.H.A.J.M.
2012-01-01
In this paper we present a Bayesian logistic regression analysis. It is found that if one wishes to derive the posterior distribution of the probability of some event, then, together with the traditional Bayes Theorem and the integrating out of nuissance parameters, the Jacobian transformation is an
Monotowns: A Quantitative Analysis
Directory of Open Access Journals (Sweden)
Shastitko Andrei
2016-06-01
Full Text Available The authors propose an empirical analysis of the current situation in monotowns. The study questions the perceived seriousness of the ‘monotown problem’ as well as the actual challenges it presents. The authors use a cluster analysis to divide monotowns into groups for further structural comparison. The structural differences in the available databases limit the possibilities of empirical analysis. Hence, alternative approaches are required. The authors consider possible reasons for the limitations identified. Special attention is paid to the monotowns that were granted the status of advanced development territories. A comparative analysis makes it possible to study their general characteristics and socioeconomic indicators. The authors apply the theory of opportunistic behaviour to describe potential problems caused by the lack of unified criteria for granting monotowns the status of advanced development territories. The article identifies the main stakeholders and the character of their interaction; it describes a conceptual model built on the principal/agent interactions, and identifies the parametric space of mutually beneficial cooperation. The solution to the principal/agent problem suggested in the article contributes to the development of an alternative approach to the current situation and a rational approach to overcoming the ‘monotown problem’.
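A cluster analysis like the one used above to group monotowns can be sketched with a simple k-means on standardized socioeconomic indicators. The abstract does not say which clustering algorithm the authors used, so k-means, the two invented indicators, and the toy data below are all illustrative assumptions.

```python
# Toy k-means grouping of monotowns by two standardized indicators
# (hypothetically: unemployment rate, output per capita); data invented.
towns = [(0.10, 0.20), (0.20, 0.10), (0.15, 0.25),   # better-off cluster
         (0.90, 0.80), (0.80, 0.95), (0.85, 0.90)]   # depressed cluster

def kmeans(points, centers, iters=10):
    for _ in range(iters):
        groups = [[] for _ in centers]
        for p in points:
            # Assign each town to its nearest centroid (squared distance).
            j = min(range(len(centers)),
                    key=lambda c: sum((a - b) ** 2
                                      for a, b in zip(p, centers[c])))
            groups[j].append(p)
        # Recompute centroids as the mean of each group.
        centers = [tuple(sum(xs) / len(xs) for xs in zip(*g)) if g else c
                   for g, c in zip(groups, centers)]
    return centers, groups

centers, groups = kmeans(towns, centers=[(0.0, 0.0), (1.0, 1.0)])
```

The resulting groups can then be compared structurally, as the article does, on their general characteristics and socioeconomic indicators.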
SWOT ANALYSIS - CHINESE PETROLEUM
Chunlan Wang; Lei Zhang; Qi Zhong
2014-01-01
This article, written in early December 2013, combines the historical development with the latest data to carry out a SWOT analysis of Chinese Petroleum. The paper discusses corporate resources, cost and management, together with external factors such as the political environment and market supply and demand, and conducts a comprehensive and profound analysis.
Sensitivity Analysis Without Assumptions.
Ding, Peng; VanderWeele, Tyler J
2016-05-01
Unmeasured confounding may undermine the validity of causal inference with observational studies. Sensitivity analysis provides an attractive way to partially circumvent this issue by assessing the potential influence of unmeasured confounding on causal conclusions. However, previous sensitivity analysis approaches often make strong and untestable assumptions such as having an unmeasured confounder that is binary, or having no interaction between the effects of the exposure and the confounder on the outcome, or having only one unmeasured confounder. Without imposing any assumptions on the unmeasured confounder or confounders, we derive a bounding factor and a sharp inequality such that the sensitivity analysis parameters must satisfy the inequality if an unmeasured confounder is to explain away the observed effect estimate or reduce it to a particular level. Our approach is easy to implement and involves only two sensitivity parameters. Surprisingly, our bounding factor, which makes no simplifying assumptions, is no more conservative than a number of previous sensitivity analysis techniques that do make assumptions. Our new bounding factor implies not only the traditional Cornfield conditions that both the relative risk of the exposure on the confounder and that of the confounder on the outcome must satisfy but also a high threshold that the maximum of these relative risks must satisfy. Furthermore, this new bounding factor can be viewed as a measure of the strength of confounding between the exposure and the outcome induced by a confounder.
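The bounding factor described above (Ding and VanderWeele, 2016) is BF = RR_EU * RR_UD / (RR_EU + RR_UD - 1), where RR_EU is the confounder-exposure relative risk and RR_UD the confounder-outcome relative risk; an unmeasured confounder can explain away an observed risk ratio only if BF reaches it. A minimal sketch, with illustrative parameter values:

```python
def bounding_factor(rr_eu, rr_ud):
    """Sharp bound on how far an unmeasured confounder with
    exposure-confounder RR rr_eu and confounder-outcome RR rr_ud
    can shift an observed risk ratio."""
    return rr_eu * rr_ud / (rr_eu + rr_ud - 1.0)

def adjusted_rr(observed_rr, rr_eu, rr_ud):
    """Smallest true risk ratio consistent with the observed one,
    given the two sensitivity parameters."""
    return observed_rr / bounding_factor(rr_eu, rr_ud)

bf = bounding_factor(2.0, 2.0)    # both sensitivity parameters set to 2
low = adjusted_rr(2.0, 2.0, 2.0)  # observed risk ratio of 2.0
```

With both sensitivity parameters at 2, the bound is 4/3, so an observed risk ratio of 2.0 could be reduced at most to 1.5; a confounder of that strength cannot explain the association away entirely.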
Seber, George A F
2012-01-01
Concise, mathematically clear, and comprehensive treatment of the subject.* Expanded coverage of diagnostics and methods of model fitting.* Requires no specialized knowledge beyond a good grasp of matrix algebra and some acquaintance with straight-line regression and simple analysis of variance models.* More than 200 problems throughout the book plus outline solutions for the exercises.* This revision has been extensively class-tested.
DEFF Research Database (Denmark)
Kirchmeier-Andersen, Sabine; Møller Christensen, Jakob; Lihn Jensen, Bente
2004-01-01
This article presents the latest version of VIA (version 3.0). The development of the program was initiated by a demand for more systematic training of language analysis in high schools and universities. The system is now web-based, which enables teachers and students to share exercises across...
Shabalin, P L; Yakubenko, A A; Pokhilevich, VA; Krein, M G
1986-01-01
This collection of eleven papers covers a broad spectrum of topics in analysis, from the study of certain classes of analytic functions to the solvability of singular problems for differential and integral equations to computational schemes for the partial differential equations and singular integral equations.
Lyman L. McDonald; Christina D. Vojta; Kevin S. McKelvey
2013-01-01
Perhaps the greatest barrier between monitoring and management is data analysis. Data languish in drawers and spreadsheets because those who collect or maintain monitoring data lack training in how to effectively summarize and analyze their findings. This chapter serves as a first step to surmounting that barrier by empowering any monitoring team with the basic...
Idris, Ivan
2014-01-01
This book is for programmers, scientists, and engineers who have knowledge of the Python language and know the basics of data science. It is for those who wish to learn different data analysis methods using Python and its libraries. This book contains all the basic ingredients you need to become an expert data analyst.
Haskell data analysis cookbook
Shukla, Nishant
2014-01-01
Step-by-step recipes filled with practical code samples and engaging examples demonstrate Haskell in practice, and then the concepts behind the code. This book shows functional developers and analysts how to leverage their existing knowledge of Haskell specifically for high-quality data analysis. A good understanding of data sets and functional programming is assumed.
Energy Technology Data Exchange (ETDEWEB)
Frame, Katherine Chiyoko [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)]
2017-06-28
Neutron multiplicity measurements are widely used for nondestructive assay (NDA) of special nuclear material (SNM). When combined with isotopic composition information, neutron multiplicity analysis can be used to estimate the spontaneous fission rate and leakage multiplication of SNM, and the total mass of fissile material can also be determined. This presentation provides an overview of the technique.
Greenberg, Marc W.; Laing, William
2013-01-01
An Economic Analysis (EA) is a systematic approach to the problem of choosing the best method of allocating scarce resources to achieve a given objective. An EA helps guide decisions on the "worth" of pursuing an action that departs from status quo ... an EA is the crux of decision-support.
Activation Analysis of Aluminium
Energy Technology Data Exchange (ETDEWEB)
Brune, Dag
1961-01-15
An analysis of pure aluminium alloyed with magnesium was performed by means of gamma spectrometry; chemical separations were not employed. The isotopes to be determined were brought to conditions of optimum activity by suitably choosing the times of irradiation and decay. The following elements were detected and measured quantitatively: iron, zinc, copper, gallium, manganese, chromium, scandium and hafnium.
VENTILATION TECHNOLOGY SYSTEMS ANALYSIS
The report gives results of a project to develop a systems analysis of ventilation technology and provide a state-of-the-art assessment of ventilation and indoor air quality (IAQ) research needs. (NOTE: Ventilation technology is defined as the hardware necessary to bring outdoor ...
International Nuclear Information System (INIS)
Kaiser, V.
1993-01-01
In Chapter 2 process energy cost analysis for chemical processing is treated in a general way, independent of the specific form of energy and power production. Especially, energy data collection and data treatment, energy accounting (metering, balance setting), specific energy input, and utility energy costs and prices are discussed. (R.P.) 14 refs., 4 figs., 16 tabs
DEFF Research Database (Denmark)
Mai, Jens Erik
2005-01-01
is presented as an alternative, and the paper discusses how this approach includes a broader range of analyses and how it requires a new set of actions: analysis of the domain, the users and the indexers. The paper concludes that the two-step procedure to indexing is insufficient to explain...
On frame multiresolution analysis
DEFF Research Database (Denmark)
Christensen, Ole
2003-01-01
We use the freedom in frame multiresolution analysis to construct tight wavelet frames (even in the case where the refinable function does not generate a tight frame). In cases where a frame multiresolution does not lead to a construction of a wavelet frame we show how one can nevertheless...
Methods in algorithmic analysis
Dobrushkin, Vladimir A
2009-01-01
…helpful to any mathematics student who wishes to acquire a background in classical probability and analysis … This is a remarkably beautiful book that would be a pleasure for a student to read, or for a teacher to make into a year's course.-Harvey Cohn, Computing Reviews, May 2010
Quantitative Mössbauer analysis
International Nuclear Information System (INIS)
Collins, R.L.
1978-01-01
The quantitative analysis of Mössbauer data, as in the measurement of Fe³⁺/Fe²⁺ concentration, has not been possible because of the different mean square velocities ⟨x²⟩ of Mössbauer nuclei at chemically different sites. A method is now described which, based on Mössbauer data at several temperatures, permits the comparison of absorption areas at ⟨x²⟩=0. (Auth.)
DEFF Research Database (Denmark)
Ris Hansen, Inge; Søgaard, Karen; Gram, Bibi
2015-01-01
This is the analysis plan for the multicentre randomised control study looking at the effect of training and exercises in chronic neck pain patients that is being conducted in Jutland and Funen, Denmark. This plan will be used as a work description for the analyses of the data collected....
Energy Technology Data Exchange (ETDEWEB)
Malik, S. [Nebraska U.]; Shipsey, I. [Purdue U.]; Cavanaugh, R. [Illinois U., Chicago]; Bloom, K. [Nebraska U.]; Chan, Kai-Feng [Taiwan, Natl. Taiwan U.]; D'Hondt, J. [Vrije U., Brussels]; Klima, B. [Fermilab]; Narain, M. [Brown U.]; Palla, F. [INFN, Pisa]; Rolandi, G. [CERN]; Schörner-Sadenius, T. [DESY]
2014-01-01
To impart hands-on training in physics analysis, the CMS experiment initiated the concept of the CMS Data Analysis School (CMSDAS). It was born over three years ago at the LPC (LHC Physics Centre) at Fermilab and is based on earlier workshops held at the LPC and the CLEO experiment. As CMS transitioned from construction to data taking, the nature of the training also evolved to include more analysis tools, software tutorials and physics analysis. This effort, epitomized as CMSDAS, has proven to be key for new and young physicists to jump-start their contributions to the physics goals of CMS by looking for new physics in the collision data. With over 400 physicists trained in six CMSDAS schools around the globe, CMS is engaging the collaboration in its discovery potential and maximizing physics output. As a larger goal, CMS is striving to nurture and increase the engagement of a myriad of talents in the development of physics, service, upgrades, the education of those new to CMS, and the career development of younger members. An extension of the concept to dedicated software and hardware schools is also planned, keeping in mind the ensuing upgrade phase.
DEFF Research Database (Denmark)
Conti, Roberto; Hong, Jeong Hee; Szymanski, Wojciech
2012-01-01
of such an algebra. Then we outline a powerful combinatorial approach to analysis of endomorphisms arising from permutation unitaries. The restricted Weyl group consists of automorphisms of this type. We also discuss the action of the restricted Weyl group on the diagonal MASA and its relationship...
International Nuclear Information System (INIS)
Kelsey, C.A.; Mettler, F.A.
1988-01-01
An elementary introduction to ROC analysis illustrates how ROC curves depend on observer threshold levels and discusses the relation between ROC curve parameters and other measures of observer performance, including accuracy, sensitivity, specificity, true positive fraction, true negative fraction, false positive fraction and false negative fraction.
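The observer-performance measures listed in this abstract are simple functions of the four confusion counts. The sketch below is illustrative (the function name and example counts are assumptions, not from the original text):

```python
def roc_metrics(tp, fn, fp, tn):
    """Observer-performance measures from confusion counts.

    tp/fn: actually-positive cases called positive/negative;
    fp/tn: actually-negative cases called positive/negative.
    """
    tpf = tp / (tp + fn)   # true positive fraction = sensitivity
    fnf = fn / (tp + fn)   # false negative fraction = 1 - sensitivity
    fpf = fp / (fp + tn)   # false positive fraction = 1 - specificity
    tnf = tn / (fp + tn)   # true negative fraction = specificity
    accuracy = (tp + tn) / (tp + fn + fp + tn)
    return {"sensitivity": tpf, "specificity": tnf, "accuracy": accuracy,
            "tpf": tpf, "fnf": fnf, "fpf": fpf, "tnf": tnf}

# Raising the observer's threshold trades sensitivity for specificity:
# each threshold yields one (fpf, tpf) operating point on the ROC curve.
```

For example, counts of 8 true positives, 2 false negatives, 1 false positive and 9 true negatives give a sensitivity of 0.8 and a specificity of 0.9.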
Information Security Risk Analysis
Peltier, Thomas R
2010-01-01
Offers readers the knowledge and skill set needed to achieve a highly effective risk analysis assessment. This title demonstrates how to identify threats and then determine whether those threats pose a real risk. It is suitable for industry and academia professionals.
Lubsch, A.; Timmermans, K.
2017-01-01
Texture analysis is a method to test the physical properties of a material by tension and compression. The growing interest in commercialisation of seaweeds for human food has stimulated research into the physical properties of seaweed tissue. These are important parameters for the survival of
Shifted Independent Component Analysis
DEFF Research Database (Denmark)
Mørup, Morten; Madsen, Kristoffer Hougaard; Hansen, Lars Kai
2007-01-01
Delayed mixing is a problem of theoretical interest and practical importance, e.g., in speech processing, bio-medical signal analysis and financial data modelling. Most previous analyses have been based on models with integer shifts, i.e., shifts by a number of samples, and have often been carried...
Multiscale principal component analysis
International Nuclear Information System (INIS)
Akinduko, A A; Gorban, A N
2014-01-01
Principal component analysis (PCA) is an important tool in exploring data. The conventional approach to PCA leads to a solution which favours the structures with large variances. This is sensitive to outliers and could obfuscate interesting underlying structures. One of the equivalent definitions of PCA is that it seeks the subspaces that maximize the sum of squared pairwise distances between data projections. This definition opens up more flexibility in the analysis of principal components, which is useful in enhancing PCA. In this paper we introduce scales into PCA by maximizing only the sum of pairwise distances between projections for pairs of datapoints with distances within a chosen interval of values [l,u]. The resulting principal component decompositions in Multiscale PCA depend on the point (l,u) in the plane, and for each point we define projectors onto principal components. Cluster analysis of these projectors reveals the structures in the data at various scales. Each structure is described by the eigenvectors at the medoid point of its cluster. We also use the distortion of projections as a criterion for choosing an appropriate scale, especially for data with outliers. The method was tested on both artificial and real data. For data with multiscale structure, it was able to reveal the different structures of the data and also to reduce the effect of outliers in the principal component analysis.
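The scale-restricted pairwise-distance definition described above can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: the function name and the eigendecomposition route (scatter matrix of the selected pair differences) are assumptions.

```python
import numpy as np

def multiscale_pca(X, l, u, n_components=2):
    """Keep only pairs of points whose original distance lies in [l, u],
    then find the directions maximizing the sum of squared projected
    distances over those pairs."""
    diff = X[:, None, :] - X[None, :, :]        # all pairwise differences, (n, n, d)
    dist = np.linalg.norm(diff, axis=2)
    i, j = np.triu_indices(X.shape[0], k=1)     # each unordered pair once
    keep = (dist[i, j] >= l) & (dist[i, j] <= u)
    d = diff[i[keep], j[keep], :]
    S = d.T @ d                                 # scatter of the selected pair differences
    _, V = np.linalg.eigh(S)                    # eigenvectors in ascending order
    comps = V[:, ::-1][:, :n_components]        # top directions
    return comps, X @ comps
```

With l=0 and u=∞ on centered data this reduces to ordinary PCA, since the all-pairs scatter of differences equals n times the covariance scatter; shrinking [l,u] restricts the analysis to structures at the chosen scale.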
Euler principal component analysis
Liwicki, Stephan; Tzimiropoulos, Georgios; Zafeiriou, Stefanos; Pantic, Maja
Principal Component Analysis (PCA) is perhaps the most prominent learning tool for dimensionality reduction in pattern recognition and computer vision. However, the ℓ2-norm employed by standard PCA is not robust to outliers. In this paper, we propose a kernel PCA method for fast and robust PCA,
International Nuclear Information System (INIS)
Santoro, R.T.; Iida, H.; Khripunov, V.; Petrizzi, L.; Sato, S.; Sawan, M.; Shatalov, G.; Schipakin, O.
2001-01-01
This paper summarizes the main results of nuclear analysis calculations performed during the International Thermonuclear Experimental Reactor (ITER) Engineering Design Activity (EDA). Major efforts were devoted to fulfilling the General Design Requirements to minimize the nuclear heating rate in the superconducting magnets and to ensuring that radiation conditions at the cryostat are suitable for hands-on maintenance after reactor shutdown. (author)
Elementary functional analysis
Shilov, Georgi E
1996-01-01
Introductory text covers basic structures of mathematical analysis (linear spaces, metric spaces, normed linear spaces, etc.), differential equations, orthogonal expansions, Fourier transforms - including problems in the complex domain, especially involving the Laplace transform - and more. Each chapter includes a set of problems, with hints and answers. Bibliography. 1974 edition.
Computer aided safety analysis
International Nuclear Information System (INIS)
1988-05-01
The document reproduces 20 selected papers from the 38 papers presented at the Technical Committee/Workshop on Computer Aided Safety Analysis organized by the IAEA in co-operation with the Institute of Atomic Energy in Otwock-Swierk, Poland on 25-29 May 1987. A separate abstract was prepared for each of these 20 technical papers. Refs, figs and tabs
1979-01-31
but expands 'accordionlike.' (4) The height-integrated intensity ratio of the red (6300 Å) to green (5577 Å) emissions of atomic oxygen is a good... molecular ion: Analysis of two rocket experiments, Planet. Space Sci. 16, 737, 1968. Hays, P. B. and C. D. Anger, The influence of ground scattering on
Communication Network Analysis Methods.
Farace, Richard V.; Mabee, Timothy
This paper reviews a variety of analytic procedures that can be applied to network data, discussing the assumptions and usefulness of each procedure when applied to the complexity of human communication. Special attention is paid to the network properties measured or implied by each procedure. Factor analysis and multidimensional scaling are among…
Making Strategic Analysis Matter
2012-01-01
Bryan Gabbard, Assessing the Tradecraft of Intelligence Analysis, Santa Monica, Calif.: RAND Corporation, TR-293, 2008. 4 See The Commission on the... July 7, 2011: http://www.rand.org/pubs/occasional_papers/OP152.html Treverton, Gregory F., and C. Bryan Gabbard, Assessing the Tradecraft of
Instrumental analysis, second edition
International Nuclear Information System (INIS)
Christian, G.D.; O'Reilly, J.E.
1988-01-01
The second edition of Instrumental Analysis is a survey of the major instrument-based methods of chemical analysis. It appears to be aimed at undergraduates but would be equally useful in a graduate course. The volume explores all of the classical quantitative methods and contains sections on techniques that usually are not included in a semester course in instrumentation (such as electron spectroscopy and the kinetic methods). Adequate coverage of all of the methods contained in this book would require several semesters of focused study. The 25 chapters were written by different authors, yet the style throughout the book is more uniform than in the earlier edition. With the exception of a two-chapter course in analog and digital circuits, the book purports to de-emphasize instrumentation, focusing more on the theory behind the methods and the application of the methods to analytical problems. However, a detailed analysis of the instruments used in each method is by no means absent. The book has the flavor of a user's guide to analysis.
Kolmogorov, A N; Silverman, Richard A
1975-01-01
Self-contained and comprehensive, this elementary introduction to real and functional analysis is readily accessible to those with background in advanced calculus. It covers basic concepts and introductory principles in set theory, metric spaces, topological and linear spaces, linear functionals and linear operators, and much more. 350 problems. 1970 edition.
Rotation in correspondence analysis
van de Velden, Michel; Kiers, Henk A.L.
2005-01-01
In correspondence analysis rows and columns of a nonnegative data matrix are depicted as points in a, usually, two-dimensional plot. Although such a two-dimensional plot often provides a reasonable approximation, the situation can occur that an approximation of higher dimensionality is required.
2016-04-01
Expand Childcare Center hours; Dual-military Co-location Policy; Maternity, Paternity, and Adoption leave; Women in Service Increase... Distribution unlimited. Analysis of Undesignated Work, Karan A. Schriver, Edward J. Schmitz, Greggory J. Schell, Hoda Parvin, April 2016... designated and undesignated work requirements. Over time, this mix fluctuates, causing changes to the force profile. Undesignated workload has
Indian Academy of Sciences (India)
Chrissa G. Tsiara
2018-03-13
Mar 13, 2018 ... a meta-analysis of case–control studies was conducted. Univariate and ... recent hepatitis C virus: potential benefit for ribavirin use in HCV/HIV ... C/G polymorphism in breast pathologies and in HIV-infected patients.
International Nuclear Information System (INIS)
Abe, Toshinori
2001-01-01
The North American Linear Collider Detector group has developed simulation and analysis program packages. LCDROOT is one of these packages; it is based on ROOT and the C++ programming language to benefit maximally from object-oriented programming techniques. LCDROOT is constantly improved and now has a new topological vertex finder, ZVTOP3. In this paper, the features of the LCDROOT simulation are briefly described.
Communication Analysis of Environment.
Malik, M. F.; Thwaites, H. M.
This textbook was developed for use in a Concordia University (Quebec) course entitled "Communication Analysis of Environment." Designed as a practical application of information theory and cybernetics in the field of communication studies, the course is intended to be a self-instructional process, whereby each student chooses one…
Learning: An Evolutionary Analysis
Swann, Joanna
2009-01-01
This paper draws on the philosophy of Karl Popper to present a descriptive evolutionary epistemology that offers philosophical solutions to the following related problems: "What happens when learning takes place?" and "What happens in human learning?" It provides a detailed analysis of how learning takes place without any direct transfer of…
Stress Analysis of Composites.
1981-01-01
Finite Elements in Nonlinear Mechanics, 1, 109-130, Tapir Publishers, Norway (1978). 9. A.J. Barnard and P.W. Sharman, 'Elastic-Plastic Analysis Using... Hybrid Stress Finite Elements,' Finite Elements in Nonlinear Mechanics, 1, 131-148, Tapir Publishers, Norway (1978). ... Pian, 'Variational
Pavlovic, Dusko; Domenach, Florent; Ignatov, Dmitry I.; Poelmans, Jonas
2012-01-01
Formal Concept Analysis (FCA) begins from a context, given as a binary relation between some objects and some attributes, and derives a lattice of concepts, where each concept is given as a set of objects and a set of attributes, such that the first set consists of all objects that satisfy all
Multidisciplinary System Reliability Analysis
Mahadevan, Sankaran; Han, Song; Chamis, Christos C. (Technical Monitor)
2001-01-01
The objective of this study is to develop a new methodology for estimating the reliability of engineering systems that encompass multiple disciplines. The methodology is formulated in the context of the NESSUS probabilistic structural analysis code, developed under the leadership of NASA Glenn Research Center. The NESSUS code has been successfully applied to the reliability estimation of a variety of structural engineering systems. This study examines whether the features of NESSUS could be used to investigate the reliability of systems in other disciplines, such as heat transfer, fluid mechanics and electrical circuits, without considerable programming effort specific to each discipline. To this end, the mechanical equivalence between system behavior models in different disciplines is investigated. A new methodology is presented for the analysis of heat transfer, fluid flow, and electrical circuit problems using the structural analysis routines within NESSUS, by utilizing the equivalence between the computational quantities in different disciplines. This technique is integrated with the fast probability integration and system reliability techniques within the NESSUS code to compute the system reliability of multidisciplinary systems. Traditional as well as progressive failure analysis methods for system reliability estimation are demonstrated through a numerical example of a heat exchanger system involving failure modes in the structural, heat transfer and fluid flow disciplines.
Deterministic uncertainty analysis
International Nuclear Information System (INIS)
Worley, B.A.
1987-01-01
Uncertainties of computer results are of primary interest in applications such as high-level waste (HLW) repository performance assessment in which experimental validation is not possible or practical. This work presents an alternate deterministic approach for calculating uncertainties that has the potential to significantly reduce the number of computer runs required for conventional statistical analysis. 7 refs., 1 fig
International Nuclear Information System (INIS)
1997-10-01
The improvement of safety in nuclear power stations is an important task. Safety evaluation must therefore be carried out comprehensively and systematically, drawing on operational experience and new safety-relevant knowledge throughout the period of use as well as before construction and the start of operation. This report describes the results of a safety analysis of ''Fugen'' carried out in light of the latest technical knowledge. As a result, it was confirmed that the safety of ''Fugen'' is secured by its inherent safety and by the facilities designed to secure safety. The basic approach to the safety analysis, including the guidelines to be conformed to, is described. For abnormal transient changes in operation and for accidents, the definitions, the events to be evaluated and the standards for judgement are reported, together with the matters taken into consideration in the analysis. The computation programs used for the analysis were REACT, HEATUP, LAYMON, FATRAC, SENHOR, LOTRAC, FLOOD and CONPOL. The analyses of abnormal transient changes in operation and of accidents are reported with respect to causes, countermeasures, protective functions and results. (K.I.)
International Nuclear Information System (INIS)
Lima-e-Silva, Pedro Paulo de
1996-01-01
Conventional Risk Analysis (RA) usually relates the frequency of an undesired event to its consequences. This technique is used today in Brazil to analyze accidents and their consequences strictly from the human standpoint, valuing losses of equipment, structures and human lives, without considering the damage caused to the natural resources that keep life possible on Earth. This paradigm developed primarily because of Homo sapiens' lack of perception of the natural web needed to sustain his own life. In practice, the Brazilian professionals responsible today for licensing, auditing and inspecting the environmental aspects of human activities face huge difficulties in producing technical specifications and procedures that lead to acceptable levels of impact, all the more so given the intrinsic difficulty of defining those levels. In Brazil the RA technique is therefore a weak tool for licensing, for many reasons, among them its narrow scope (only accidents are considered) and its flawed paradigm (only direct human damages are valued). A paper by the author on the former was submitted to the 7th International Conference on Environmetrics, July 1996, USP-SP. This one discusses extending the risk analysis concept to take environmental consequences into account, transforming the conventional analysis into a broader methodology named here Environmental Risk Analysis. (author)
Energy Technology Data Exchange (ETDEWEB)
Fudge, A.
1978-12-15
The following aspects of isotope dilution analysis are covered in this report: fundamental aspects of the technique; elements of interest in the nuclear field, choice and standardization of spike nuclide; pre-treatment to achieve isotopic exchange and chemical separation; sensitivity; selectivity; and accuracy.
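The quantitation step behind isotope dilution analysis is a single closed-form mass balance. The sketch below is illustrative (the function name and the two-isotope simplification are assumptions, not from the report): amounts are in moles of the spike isotope B, and each R is the measured ratio of the analyte's reference isotope A to B.

```python
def isotope_dilution(b_spike, r_spike, r_sample, r_mix):
    """Moles of isotope B contributed by the sample, from measured ratios.

    Mass balance on isotope A over the spiked mixture:
        r_mix * (b_x + b_spike) = r_sample * b_x + r_spike * b_spike
    solved for the sample's b_x. Requires isotopic exchange between
    spike and sample before measurement.
    """
    b_x = b_spike * (r_spike - r_mix) / (r_mix - r_sample)
    total_sample_atoms = b_x * (1.0 + r_sample)   # A + B atoms from the sample
    return b_x, total_sample_atoms
```

For instance, a 1.0 mol spike with r_spike = 10, a natural-sample ratio r_sample = 0.1 and a measured mixture ratio r_mix = 1.75 give b_x = 5.0 mol. Sensitivity is best when r_mix falls well between r_spike and r_sample.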
Szapacs, Cindy
2006-01-01
Teaching strategies that work for typically developing children often do not work for those diagnosed with an autism spectrum disorder. However, teaching strategies that work for children with autism do work for typically developing children. In this article, the author explains how the principles and concepts of Applied Behavior Analysis can be…
Perfusion dyssynchrony analysis
Chiribiri, A.; Villa, A.D.M.; Sammut, E.; Breeuwer, M.; Nagel, E.
2015-01-01
AIMS: We sought to describe perfusion dyssynchrony analysis specifically to exploit the high temporal resolution of stress perfusion CMR. This novel approach detects differences in the temporal distribution of the wash-in of contrast agent across the left ventricular wall. METHODS AND RESULTS:
Proteoglycan isolation and analysis
DEFF Research Database (Denmark)
Woods, A; Couchman, J R
2001-01-01
Proteoglycans can be difficult molecules to isolate and analyze due to large mass, charge, and tendency to aggregate or form macromolecular complexes. This unit describes detailed methods for purification of matrix, cell surface, and cytoskeleton-linked proteoglycans. Methods for analysis...
Uranium and transuranium analysis
International Nuclear Information System (INIS)
Regnaud, F.
1989-01-01
Analytical chemistry of uranium, neptunium, plutonium, americium and curium is reviewed. Uranium and neptunium receive the most attention, and curium is only briefly mentioned. Analysis methods include coulometry, titration, mass spectrometry, absorption spectrometry, spectrofluorometry, X-ray spectrometry, nuclear methods and radiation spectrometry [fr
International Nuclear Information System (INIS)
Preyssl, C.
1986-01-01
Safety analysis provides the only tool for evaluation and quantification of rare or hypothetical events leading to system failure. So far, probability theory has been used for the fault- and event-tree methodology. Uncertainty constitutes an important aspect of risk analysis. Uncertainties can be classified as originating from 'randomness' or 'fuzziness'. Probability theory addresses randomness only. The use of 'fuzzy set theory' makes it possible to include both types of uncertainty in the mathematical model of risk analysis. Thus the 'fuzzy fault tree' is expressed in 'possibilistic' terms, implying a range of simplifications and improvements. 'Human failure' and 'conditionality' can be treated correctly. Only minimum-maximum relations are used to combine the possibility distributions of events. Various event classifications facilitate the interpretation of the results. The method is demonstrated by application to a TRIGA research reactor. Uncertainty, as an implicit part of 'fuzzy risk', can be quantified explicitly using an 'uncertainty measure'. Based on this, the 'degree of relative compliance' with a quantitative safety goal can be defined for a particular risk. The introduction of 'weighting functionals' guarantees consideration of the importance attached to different parts of the risk exceeding or complying with the standard. The comparison of two reference systems is demonstrated in a case study. It is concluded that any application of the 'fuzzy risk analysis' has to be free of any hypostatization when reducing subjective to objective information. (Author)
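The minimum-maximum combination rule mentioned in this abstract can be sketched for the simplified case of one crisp possibility value per basic event (the paper works with full possibility distributions; the event names and tree here are hypothetical):

```python
def possibility(node, poss):
    """Possibility of the top event of a fault tree under min-max rules.

    node: a basic-event name, or a tuple ('AND'|'OR', [children]).
    poss: mapping from basic-event name to its possibility in [0, 1].
    AND gates take the minimum of their inputs, OR gates the maximum.
    """
    if isinstance(node, str):
        return poss[node]
    op, children = node
    values = [possibility(child, poss) for child in children]
    return min(values) if op == 'AND' else max(values)

# Hypothetical tree: the top event occurs if pump A fails,
# or if both valve B and sensor C fail.
tree = ('OR', ['pump_A', ('AND', ['valve_B', 'sensor_C'])])
```

With possibilities {pump_A: 0.2, valve_B: 0.9, sensor_C: 0.5}, the AND gate yields min(0.9, 0.5) = 0.5 and the OR gate max(0.2, 0.5) = 0.5, so only min-max operations, never products, propagate up the tree.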
Bayesian Independent Component Analysis
DEFF Research Database (Denmark)
Winther, Ole; Petersen, Kaare Brandt
2007-01-01
In this paper we present an empirical Bayesian framework for independent component analysis. The framework provides estimates of the sources, the mixing matrix and the noise parameters, and is flexible with respect to choice of source prior and the number of sources and sensors. Inside the engine...
Ignalina Safety Analysis Group
International Nuclear Information System (INIS)
Ushpuras, E.
1995-01-01
The article describes the fields of activity of the Ignalina NPP Safety Analysis Group (ISAG) in the Lithuanian Energy Institute and overviews the main achievements since the group's establishment in 1992. The group works under the following guidelines: in-depth analysis of the fundamental physical processes of RBMK-1500 reactors; collection, systematization and verification of design and operational data; simulation and analysis of potential accident consequences; analysis of thermohydraulic and neutronic characteristics of the plant; and provision of technical and scientific consultations to VATESI, governmental authorities, and international institutions participating in various projects aimed at Ignalina NPP safety enhancement. The ISAG carries out broad scientific co-operation programs with both Eastern and Western scientific groups, supplying engineering assistance for Ignalina NPP. ISAG also participates in the joint Lithuanian-Swedish-Russian project Barselina, the first Probabilistic Safety Assessment (PSA) study of Ignalina NPP. Work is under way together with Maryland University (USA) to assess the accident confinement system for a range of breaks in the primary circuit. At present the ISAG personnel is also involved in the project, under a grant from the Nuclear Safety Account administered by the European Bank for Reconstruction and Development, for the preparation and review of an in-depth safety assessment of the Ignalina plant
Kane, Jonathan M
2016-01-01
This is a textbook on proof writing in the area of analysis, balancing a survey of the core concepts of mathematical proof with a tight, rigorous examination of the specific tools needed for an understanding of analysis. Instead of the standard "transition" approach to teaching proofs, wherein students are taught fundamentals of logic, given some common proof strategies such as mathematical induction, and presented with a series of well-written proofs to mimic, this textbook teaches what a student needs to be thinking about when trying to construct a proof. Covering the fundamentals of analysis sufficient for a typical beginning Real Analysis course, it never loses sight of the fact that its primary focus is about proof writing skills. This book aims to give the student precise training in the writing of proofs by explaining exactly what elements make up a correct proof, how one goes about constructing an acceptable proof, and, by learning to recognize a correct proof, how to avoid writing incorrect proofs. T...
Russian Language Analysis Project
Serianni, Barbara; Rethwisch, Carolyn
2011-01-01
This paper is the result of a language analysis research project focused on the Russian Language. The study included a diverse literature review that included published materials as well as online sources in addition to an interview with a native Russian speaker residing in the United States. Areas of study include the origin and history of the…
Douglas, David
2016-01-01
Doxing is the intentional public release onto the Internet of personal information about an individual by a third party, often with the intent to humiliate, threaten, intimidate, or punish the identified individual. In this paper I present a conceptual analysis of the practice of doxing and how it
Polysome Profile Analysis - Yeast
Czech Academy of Sciences Publication Activity Database
Pospíšek, M.; Valášek, Leoš Shivaya
2013-01-01
Vol. 530, No. 2013 (2013), pp. 173-181. ISSN 0076-6879. Institutional support: RVO:61388971. Keywords: grow yeast cultures * polysome profile analysis * sucrose density gradient centrifugation. Subject RIV: CE - Biochemistry. Impact factor: 2.194, year: 2013