WorldWideScience

Sample records for source receptor methodology

  1. Developing a source-receptor methodology for the characterization of VOC sources in ambient air

    International Nuclear Information System (INIS)

    Borbon, A.; Badol, C.; Locoge, N.

    2005-01-01

    Since 2001, continuous monitoring of about thirty ozone-precursor non-methane hydrocarbons (NMHC) has been carried out in several French urban areas. The automated NMHC monitoring system consists of sub-ambient preconcentration on a cooled multi-sorbent trap, followed by thermal desorption and two-dimensional gas chromatography with flame ionisation detection. The large number of data collected, and their exploitation, should provide a qualitative and quantitative assessment of hydrocarbon sources. This should help in defining relevant emission-regulation strategies, as required by the European Directive on ozone in ambient air (2002/3/EC). The purpose of this work is to present the basis and contributions of an original source-receptor methodology for the characterization of NMHC sources. It is a statistical and diagnostic approach, adaptable and transposable to any urban site, which integrates the spatial and temporal dynamics of the emissions. The methods for source identification combine complementary approaches of increasing complexity: 1) a univariate approach, through the analysis of NMHC time series and concentration roses; 2) a bivariate approach, through graphical ratio analysis and the characterization of scatterplot distributions of hydrocarbon pairs; 3) a multivariate approach, with principal component analyses on various time bases. A linear regression model is finally developed to estimate the spatial and temporal source contributions. Apart from vehicle exhaust emissions, the sources of interest are combustion and fossil-fuel-related activities, petrol and/or solvent evaporation, the dual anthropogenic and biogenic origin of isoprene, and other industrial activities depending on local parameters. (author)
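
    The final regression step of such a source-receptor methodology can be sketched with synthetic numbers: given source profiles (the fraction of each hydrocarbon emitted per unit contribution of each source) and ambient concentrations, least squares recovers the source contributions. The species, profiles, and contribution values below are invented for illustration and are not the study's data.

```python
import numpy as np

# Hypothetical species-by-source profile matrix: rows are NMHC species
# (ethylene, acetylene, isoprene, toluene), columns are sources
# (vehicle exhaust, evaporation, biogenic). All numbers are illustrative.
profiles = np.array([
    [0.40, 0.05, 0.00],   # ethylene
    [0.35, 0.02, 0.00],   # acetylene
    [0.05, 0.03, 0.90],   # isoprene
    [0.20, 0.90, 0.10],   # toluene
])

# Synthetic "measured" ambient mix built from known contributions (ppbC)
true_contrib = np.array([10.0, 2.0, 1.0])
ambient = profiles @ true_contrib

# Linear regression (ordinary least squares): ambient ~ profiles @ x
contributions, residuals, rank, _ = np.linalg.lstsq(profiles, ambient, rcond=None)
print(contributions.round(2))   # recovers the known contributions
```

    In practice the profiles themselves are uncertain, which is why the paper combines this regression step with the univariate, bivariate and multivariate diagnostics listed above.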

  3. Testing earthquake source inversion methodologies

    KAUST Repository

    Page, Morgan T.

    2011-01-01

    Source Inversion Validation Workshop; Palm Springs, California, 11-12 September 2010; Nowadays, earthquake source inversions are routinely performed after large earthquakes and represent a key connection between recorded seismic and geodetic data and the complex rupture process at depth. The resulting earthquake source models quantify the spatiotemporal evolution of ruptures. They are also used to provide a rapid assessment of the severity of an earthquake and to estimate losses. However, because of uncertainties in the data, the assumed fault geometry and velocity structure, and the chosen rupture parameterization, it is not clear which features of these source models are robust. Improved understanding of the uncertainty and reliability of earthquake source inversions will allow the scientific community to use the robust features of kinematic inversions to more thoroughly investigate the complexity of the rupture process and to better constrain other earthquake-related computations, such as ground motion simulations and static stress change calculations.

  4. Solution to the inversely stated transient source-receptor problem

    International Nuclear Information System (INIS)

    Sajo, E.; Sheff, J.R.

    1995-01-01

    Transient source-receptor problems are traditionally handled via the Boltzmann equation or through one of its variants. In the atmospheric transport of pollutants, meteorological uncertainties in the planetary boundary layer render only a few approximations to the Boltzmann equation useful. Often, due to the high number of unknowns, the atmospheric source-receptor problem is ill-posed. Moreover, models to estimate downwind concentration invariably assume that the source term is known. In this paper, an inverse methodology is developed, based on downwind measurements of concentration and of meteorological parameters, to estimate the source term.
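
    The essence of such an inverse calculation can be sketched as follows. Because a dispersion model is linear in the source strength, a single least-squares coefficient recovers the source term from downwind measurements. The simplified Gaussian plume, wind speed, and dispersion rates below are illustrative stand-ins, not the paper's transport model.

```python
import numpy as np

def gaussian_plume(q, x, y, u=3.0, sy=0.10, sz=0.08):
    """Very simplified ground-level Gaussian plume: concentration at
    downwind distance x, crosswind offset y, for source strength q (g/s),
    wind speed u (m/s), and linear dispersion growth rates sy, sz."""
    sigma_y, sigma_z = sy * x, sz * x
    return (q / (2 * np.pi * u * sigma_y * sigma_z)) * np.exp(-y**2 / (2 * sigma_y**2))

# "Measured" downwind concentrations generated from a true source of 5 g/s
x_obs = np.array([200.0, 400.0, 800.0])
y_obs = np.array([10.0, -20.0, 5.0])
c_obs = gaussian_plume(5.0, x_obs, y_obs)

# Inverse step: the plume is linear in q, so projecting the measurements
# onto the unit-source responses yields the source-term estimate.
g = gaussian_plume(1.0, x_obs, y_obs)   # unit-source responses
q_est = float(g @ c_obs / (g @ g))      # normal-equation solution
print(round(q_est, 3))                   # recovers 5.0 in this noise-free case
```

    With measurement noise or several unknown sources the same projection becomes a (possibly ill-posed) least-squares system, which is where the regularization concerns discussed in the abstract arise.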

  5. Methodological aspects on drug receptor binding analysis

    International Nuclear Information System (INIS)

    Wahlstroem, A.

    1978-01-01

    Although drug receptors occur in relatively low concentrations, they can be visualized by the use of appropriate radioindicators. In most cases the procedure is rapid and can reach a high degree of accuracy. Specificity of the interaction is studied by competition analysis. The necessity of using several radioindicators to define a receptor population is emphasized. It may be possible to define isoreceptors and drugs with selectivity for one isoreceptor. (Author)

  6. Development of a novel methodology for indoor emission source identification

    DEFF Research Database (Denmark)

    Han, K.H.; Zhang, J.S.; Knudsen, H.N.

    2011-01-01

    The objective of this study was to develop and evaluate a methodology to identify individual sources of emissions based on measurements of mixed air samples and the emission signatures of individual materials previously determined by Proton Transfer Reaction-Mass Spectrometry (PTR-MS), an on-line analytical device. The methodology, based on signal-processing principles, was developed by employing the method of multiple regression least squares (MRLS) and a normalization technique. Samples of nine typical building materials were tested individually and in combination, including carpet, ceiling material ... experiments and investigation are needed for cases where the relative emission rates among different compounds may change over a long-term period.
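
    The MRLS idea can be illustrated with a toy example: each material's normalized PTR-MS signature is a column of a matrix, and regressing a mixed-air measurement onto those columns recovers each material's emission strength. The ion channels, materials, and numbers below are invented for illustration.

```python
import numpy as np

# Hypothetical PTR-MS emission signatures (ion intensities per material);
# columns correspond to three materials, e.g. carpet, ceiling tile, linoleum.
signatures = np.array([
    [0.7, 0.1, 0.2],
    [0.2, 0.8, 0.1],
    [0.1, 0.1, 0.7],
], dtype=float)
signatures /= signatures.sum(axis=0)   # normalization step: each column sums to 1

# Synthetic mixed-air sample: carpet strong, ceiling tile weak, linoleum moderate
true_strengths = np.array([2.0, 0.5, 1.0])
mixed = signatures @ true_strengths

# Multiple regression least squares (MRLS): recover per-material strengths
weights, *_ = np.linalg.lstsq(signatures, mixed, rcond=None)
present = [w > 0.1 for w in weights]   # crude presence threshold, for illustration
print(weights.round(2), present)
```

    Real measurements add noise and near-collinear signatures, which is why the study evaluates the method on deliberately mixed samples of known composition.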

  7. Development of methodology for the characterization of radioactive sealed sources

    International Nuclear Information System (INIS)

    Ferreira, Robson de Jesus

    2010-01-01

    Sealed radioactive sources are widely used in many applications of nuclear technology in industry, medicine, research and other fields. The International Atomic Energy Agency (IAEA) estimates there are tens of millions of sources in the world. In Brazil, the number is about 500 thousand sources, if the americium-241 sources present in radioactive lightning rods and smoke detectors are included in the inventory. At the end of their useful life, most sources become disused, constitute radioactive waste, and are then termed spent sealed radioactive sources (SSRS). In Brazil, this waste is collected by the research institutes of the National Commission of Nuclear Energy and kept under centralized storage, awaiting definition of the final disposal route. The Waste Management Laboratory (WML) at the Nuclear and Energy Research Institute is the main storage center, having received, by July 2010, about 14,000 disused sources, not including the tens of thousands of lightning rod and smoke detector sources. A program is underway in the WML to replace the original shieldings with a standard disposal package and to determine the radioisotope content and activity of each source. The identification of the radionuclides and the measurement of activities will be carried out with a well-type ionization chamber. This work aims to develop a methodology for measuring or determining the activity of the SSRS stored in the WML, in accordance with their geometry, and for determining the associated uncertainties. (author)

  8. Operational source receptor calculations for large agglomerations

    Science.gov (United States)

    Gauss, Michael; Shamsudheen, Semeena V.; Valdebenito, Alvaro; Pommier, Matthieu; Schulz, Michael

    2016-04-01

    For air quality policy, an important question is how much of the air pollution within an urbanized region can be attributed to local sources and how much is imported through long-range transport. This is critical information for a correct assessment of the effectiveness of potential emission measures. The ratio between indigenous and long-range-transported air pollution for a given region depends on its geographic location, the size of its area, the strength and spatial distribution of emission sources, and the time of the year, but also, very strongly, on the current meteorological conditions, which change from day to day and thus make it important to provide such calculations in near-real time to support short-term legislation. Similarly, long-term analyses over longer periods (e.g. one year), or of specific air quality episodes in the past, can help to scientifically underpin multi-regional agreements and long-term legislation. Within the European MACC projects (Monitoring Atmospheric Composition and Climate) and the transition to the operational CAMS service (Copernicus Atmosphere Monitoring Service), the computationally efficient EMEP MSC-W air quality model has been applied with detailed emission data and comprehensive calculations of chemistry and microphysics, driven by high-quality meteorological forecast data (up to 96-hour forecasts), to provide source-receptor calculations on a regular basis in forecast mode. In its current state, the product allows the user to choose among different regions and regulatory pollutants (e.g. ozone and PM) to assess the effectiveness of hypothetical reductions in air pollutant emissions that are implemented immediately, either within the agglomeration or outside. The effects are visualized as bar charts, showing resulting changes in air pollution levels within the agglomeration as a function of time (hourly resolution, 0 to 4 days into the future). The bar charts not only allow assessing the effects of emission

  9. Nonpoint source pollution of urban stormwater runoff: a methodology for source analysis.

    Science.gov (United States)

    Petrucci, Guido; Gromaire, Marie-Christine; Shorshani, Masoud Fallah; Chebbo, Ghassan

    2014-09-01

    The characterization and control of runoff pollution from nonpoint sources in urban areas are a major issue for the protection of aquatic environments. We propose a methodology to quantify the sources of pollutants in an urban catchment and to analyze the associated uncertainties. After describing the methodology, we illustrate it through an application to the sources of Cu, Pb, Zn, and polycyclic aromatic hydrocarbons (PAH) in a residential catchment (228 ha) in the Paris region. In this application, we suggest several procedures that can be applied to the analysis of other pollutants in different catchments, including an estimation of the total extent of roof accessories (gutters and downspouts, watertight joints and valleys) in a catchment. These accessories emerge as the major source of Pb and an important source of Zn in the example catchment, while activity-related sources (traffic, heating) are dominant for Cu (brake pad wear) and PAH (tire wear, atmospheric deposition).
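
    A source-analysis-with-uncertainty calculation of this kind is often done by Monte Carlo propagation: each source load is the product of an uncertain extent and an uncertain emission factor, and the resulting distribution of source shares is summarized by percentiles. The areas, traffic volumes, emission factors, and distributions below are invented for illustration and are not the catchment's data.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000  # Monte Carlo draws

# Hypothetical annual Zn loads (kg/yr) from two source types in a catchment.
roof_area  = rng.normal(5.0e4, 0.5e4, n)            # m2 of roof accessories
roof_ef    = rng.lognormal(np.log(2e-3), 0.3, n)    # kg Zn per m2 per yr
traffic_km = rng.normal(1.2e7, 1.0e6, n)            # vehicle-km per yr
traffic_ef = rng.lognormal(np.log(5e-6), 0.4, n)    # kg Zn per vehicle-km

roof_load = roof_area * roof_ef
traffic_load = traffic_km * traffic_ef
share_roof = roof_load / (roof_load + traffic_load)

print(f"roof share of Zn: {np.median(share_roof):.2f} "
      f"[{np.percentile(share_roof, 5):.2f}-{np.percentile(share_roof, 95):.2f}]")
```

    The width of the resulting interval makes explicit how emission-factor uncertainty dominates the source attribution.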

  10. Application of source-receptor models to determine source areas of biological components (pollen and butterflies)

    OpenAIRE

    M. Alarcón; M. Àvila; J. Belmonte; C. Stefanescu; R. Izquierdo

    2010-01-01

    Source-receptor models allow relationships to be established between a receptor point (sampling point) and the probable source areas (regions of emission) by associating concentration values at the receptor point with the corresponding atmospheric back-trajectories and, together with other techniques, allow transport phenomena to be interpreted on a synoptic scale. These models are generally used in air pollution studies to determine the areas of origin of chemical compounds measured...
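
    One widely used model of this family is the potential source contribution function (PSCF): for each grid cell, it is the fraction of trajectory endpoints crossing that cell that are associated with high concentrations at the receptor. The toy trajectories, grid, and threshold below are synthetic illustrations, not the pollen or butterfly data of the study.

```python
import numpy as np
from collections import defaultdict

rng = np.random.default_rng(1)

# Synthetic back-trajectories: each is 8 endpoints on a 5x5 grid, paired
# with the concentration observed at the receptor on arrival.
trajectories = []
for _ in range(200):
    cells = [tuple(map(int, rng.integers(0, 5, size=2))) for _ in range(8)]
    conc = rng.exponential(10.0)
    trajectories.append((cells, conc))

threshold = np.median([c for _, c in trajectories])   # "high concentration" cutoff
m = defaultdict(int)   # endpoints in cell during high-concentration arrivals
n = defaultdict(int)   # all endpoints in cell
for cells, conc in trajectories:
    for cell in cells:
        n[cell] += 1
        m[cell] += conc > threshold

pscf = {cell: m[cell] / n[cell] for cell in n}   # PSCF per grid cell
hotspot = max(pscf, key=pscf.get)                # most probable source area
print(hotspot, round(pscf[hotspot], 2))
```

    Real applications weight cells by endpoint counts to damp spurious values in rarely visited cells.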

  11. Overview of receptor-based source apportionment studies for speciated atmospheric mercury

    OpenAIRE

    Cheng, I.; Xu, X.; Zhang, L.

    2015-01-01

    Receptor-based source apportionment studies of speciated atmospheric mercury are not only concerned with source contributions but also with the influence of transport, transformation, and deposition processes on speciated atmospheric mercury concentrations at receptor locations. Previous studies applied multivariate receptor models including principal components analysis and positive matrix factorization, and back trajectory receptor models including potential source contri...

  12. Source-receptor matrix calculation with a Lagrangian particle dispersion model in backward mode

    Directory of Open Access Journals (Sweden)

    P. Seibert

    2004-01-01

    The possibility of calculating linear source-receptor relationships for the transport of atmospheric trace substances with a Lagrangian particle dispersion model (LPDM) running in backward mode is presented, together with many tests and examples. This mode requires only minor modifications of the forward LPDM. The derivation includes the action of sources and of any first-order processes (transformation with prescribed rates, dry and wet deposition, radioactive decay, etc.). The backward mode is computationally advantageous if the number of receptors is less than the number of sources considered. The combination of an LPDM with the backward (adjoint) methodology is especially attractive for application to point measurements, which can be handled without artificial numerical diffusion. Practical hints are provided for source-receptor calculations with different settings, both in forward and backward mode. The equivalence of forward and backward calculations is shown in simple tests for the release and sampling of particles, pure wet deposition, pure convective redistribution and realistic transport over a short distance. Furthermore, an application example explaining measurements of Cs-137 in Stockholm as transport from areas contaminated heavily in the Chernobyl disaster is included.
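
    The computational argument can be made concrete with a toy linear source-receptor matrix M (all numbers synthetic). Conceptually, each forward run fills one column of M (one run per source), while each backward run retrieves one full row (one run per receptor), so the backward mode wins whenever receptors are fewer than sources; either way the receptor concentrations are identical.

```python
import numpy as np

rng = np.random.default_rng(2)
n_src, n_rec = 1000, 3   # many sources, few receptors

# Linear source-receptor sensitivity matrix: receptor = M @ emissions
M = rng.random((n_rec, n_src)) * 1e-3
emissions = rng.random(n_src)

# Forward mode: conceptually one model run per source (n_src runs),
# each contributing one column of M.
forward = M @ emissions

# Backward (adjoint) mode: conceptually one run per receptor (n_rec runs),
# each retrieving a full row of M directly.
backward = np.array([M[j, :] @ emissions for j in range(n_rec)])

print(np.allclose(forward, backward))   # the two modes agree
```

    For a linear model the two orderings are exactly equivalent, which mirrors the forward/backward equivalence tests reported in the paper.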

  13. ANL calculational methodologies for determining spent nuclear fuel source term

    International Nuclear Information System (INIS)

    McKnight, R. D.

    2000-01-01

    Over the last decade, Argonne National Laboratory has developed reactor depletion methods and models to determine radionuclide inventories of irradiated EBR-II fuels. Predicted masses based on these calculational methodologies have been validated using available data from destructive measurements, first from measurements of lead EBR-II experimental test assemblies and later using data obtained from processing irradiated EBR-II fuel assemblies in the Fuel Conditioning Facility. Details of these generic methodologies are described herein. Validation results demonstrate that these methods meet the FCF operations and material control and accountancy requirements.

  14. Sources of particulate matter components in the Athabasca oil sands region: investigation through a comparison of trace element measurement methodologies

    Science.gov (United States)

    Phillips-Smith, Catherine; Jeong, Cheol-Heon; Healy, Robert M.; Dabek-Zlotorzynska, Ewa; Celo, Valbona; Brook, Jeffrey R.; Evans, Greg

    2017-08-01

    The province of Alberta, Canada, is home to three oil sands regions which, combined, contain the third largest deposit of oil in the world. Of these, the Athabasca oil sands region is the largest. As part of Environment and Climate Change Canada's program in support of the Joint Canada-Alberta Implementation Plan for Oil Sands Monitoring, concentrations of trace elements in PM2.5 (particulate matter smaller than 2.5 µm in diameter) were measured through two campaigns that involved different methodologies: a long-term filter campaign and a short-term intensive campaign. In the long-term campaign, 24 h filter samples were collected once every 6 days over a 2-year period (December 2010-November 2012) at three air monitoring stations in the regional municipality of Wood Buffalo. For the intensive campaign (August 2013), hourly measurements were made with an online instrument at one air monitoring station; daily filter samples were also collected. The hourly and 24 h filter data were analyzed individually using positive matrix factorization. Seven emission sources of PM2.5 trace elements were thereby identified: two types of upgrader emissions, soil, haul road dust, biomass burning, and two sources of mixed origin. The upgrader emissions, soil, and haul road dust sources were identified by both methodologies, and both methodologies identified a mixed source, but these exhibited more differences than similarities. The second upgrader emissions and biomass burning sources were only resolved by the hourly and filter methodologies, respectively. The similarity of the receptor modeling results from the two methodologies provided reassurance as to the identity of the sources. Overall, much of the PM2.5-related trace elements were found to be anthropogenic, or at least to be aerosolized through anthropogenic activities. These emissions may in part explain the previously reported higher levels of trace elements in snow, water, and biota samples collected

  15. Volatility in the California power market: source, methodology and recommendations

    International Nuclear Information System (INIS)

    Dahlgren, R.W.; Liu, C.-C.; Lawarree, J.

    2001-01-01

    Extreme short-term price volatility in competitive electricity markets creates the need for price risk management by electric utilities. Recent experience in California provides lessons that can be applied to other markets worldwide. Value-at-Risk (VAR), a method for quantifying risk exposure in the financial industry, is introduced as a technique applicable to quantifying price risk exposure in power systems. The methodology applies VAR to changes in prices from corresponding hours in previous periods, to understand the hourly exposure of an entity that is obligated to serve a load but does not have a contract for supply. The VAR methodology is then applied to a sample company in California serving a 100 MW load. Proposed remedies for the problems observed in the competitive California electric power industry are introduced. (Author)
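
    A minimal historical-simulation sketch of the hourly VAR idea follows. The price-change distribution, load size, and confidence level below are invented for illustration; the skewed tail crudely mimics spot-price spikes.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical hourly price changes ($/MWh) for the same hour on previous
# days: a normal core plus a heavy right tail standing in for price spikes.
price_changes = rng.normal(0.0, 5.0, 500) + rng.exponential(3.0, 500)

load_mw = 100.0                   # unhedged load obligation (MW)
pnl = -load_mw * price_changes    # hourly P&L of buying the load at spot

# Historical-simulation 95% one-hour Value-at-Risk: the loss that is not
# exceeded with 95% confidence (5th percentile of the P&L, sign flipped).
var_95 = -np.percentile(pnl, 5)
print(f"1-hour 95% VAR: ${var_95:,.0f}")
```

    An entity holding a supply contract for the load would see these price changes net out, which is exactly the hedging argument the paper develops.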

  16. Receptor models for source apportionment of remote aerosols in Brazil

    International Nuclear Information System (INIS)

    Artaxo Netto, P.E.

    1985-11-01

    The PIXE (particle-induced X-ray emission) and PESA (proton elastic scattering analysis) methods were used in conjunction with receptor models for source apportionment of remote aerosols in Brazil. PIXE, used to determine the concentrations of elements with Z ≥ 11, has a detection limit of about 1 ng/m³. The concentrations of carbon, nitrogen and oxygen in the fine fraction of Amazon Basin aerosols were measured by PESA. We sampled in Jureia (SP), Fernando de Noronha, Arembepe (BA), Firminopolis (GO), Itaberai (GO) and the Amazon Basin. For collecting the airborne particles we used cascade impactors, stacked filter units, and streaker samplers. Three receptor models were used: chemical mass balance, stepwise multiple regression analysis and principal factor analysis. The elemental and gravimetric concentrations were explained by the models within the experimental errors. Three sources of aerosol were quantitatively distinguished: marine aerosol, soil dust and aerosols related to forests. The emission of aerosols by vegetation is very clear at all the sampling sites. In the Amazon Basin and Jureia it is the major source, responsible for 60 to 80% of airborne concentrations. (Author)

  17. Source apportionment of airborne particulates through receptor modeling: Indian scenario

    Science.gov (United States)

    Banerjee, Tirthankar; Murari, Vishnu; Kumar, Manish; Raju, M. P.

    2015-10-01

    Airborne particulate chemistry is mostly governed by the associated sources, and apportionment of specific sources is essential to delineate explicit control strategies. The present submission initially deals with publications (1980s-2010s) of Indian origin which report regional heterogeneities of particulate concentrations with reference to the associated species. Such meta-analyses clearly indicate the presence of reservoirs of both primary and secondary aerosols in different geographical regions. Further, the identification of specific signatory molecules for individual source categories was also evaluated in terms of scientific merit and repeatability. Source signatures mostly resemble international profiles, while in selected cases they lack appropriateness. In India, source apportionment (SA) of airborne particulates was initiated as early as 1985 through factor analysis; however, principal component analysis (PCA) accounts for the major share of applications (34%), followed by enrichment factor (EF, 27%), chemical mass balance (CMB, 15%) and positive matrix factorization (PMF, 9%). Mainstream SA analyses identify earth crust and road dust resuspension (traced by Al, Ca, Fe, Na and Mg) as a principal source (6-73%), followed by vehicular emissions (traced by Fe, Cu, Pb, Cr, Ni, Mn, Ba and Zn; 5-65%), industrial emissions (traced by Co, Cr, Zn, V, Ni, Mn, Cd; 0-60%), fuel combustion (traced by K, NH4+, SO42-, As, Te, S, Mn; 4-42%), marine aerosols (traced by Na, Mg, K; 0-15%) and biomass/refuse burning (traced by Cd, V, K, Cr, As, TC, Na, K, NH4+, NO3-, OC; 1-42%). In most cases, the temporal variation of individual source contributions for a specific geographic region exhibits radical heterogeneity, possibly due to inconsistent assignment of individual tracers to specific sources, exacerbated by methodological weaknesses, inappropriate sample sizes, the implications of secondary aerosols and inadequate emission inventories. Conclusively, a number of challenging
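
    Of the methods listed, the enrichment factor is the simplest to sketch: EF(X) = (X/Al)_aerosol / (X/Al)_crust, with Al as the crustal reference element, and EF well above ~10 flags a non-crustal (anthropogenic) origin. The crustal abundances and aerosol concentrations below are rounded, literature-style values used purely for illustration.

```python
# Crustal enrichment factor with Al as the reference element.
# All numbers are illustrative, not measurements from the review.
crust = {"Al": 8.0e4, "Fe": 5.0e4, "Pb": 15.0, "Zn": 70.0}   # mg/kg in crust
aerosol = {"Al": 2.0, "Fe": 1.4, "Pb": 0.05, "Zn": 0.6}      # ug/m3 in PM

def enrichment_factor(x: str) -> float:
    """EF(X) = (X/Al in aerosol) / (X/Al in crust)."""
    return (aerosol[x] / aerosol["Al"]) / (crust[x] / crust["Al"])

for x in ("Fe", "Pb", "Zn"):
    ef = enrichment_factor(x)
    origin = "anthropogenic" if ef > 10 else "crustal"
    print(f"{x}: EF = {ef:8.1f} -> {origin}")
```

    In this toy data Fe comes out near unity (crustal) while Pb and Zn are strongly enriched, which matches the qualitative pattern such studies report for urban aerosols.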

  18. Identification of androgen receptor antagonists: In vitro investigation and classification methodology for flavonoid.

    Science.gov (United States)

    Wu, Yang; Doering, Jon A; Ma, Zhiyuan; Tang, Song; Liu, Hongling; Zhang, Xiaowei; Wang, Xiaoxiang; Yu, Hongxia

    2016-09-01

    A tremendous gap exists between the number of potential endocrine disrupting chemicals (EDCs) possibly present in the environment and the limitations of traditional regulatory testing. In this study, the anti-androgenic potencies of 21 flavonoids were analyzed in vitro, and another 32 flavonoids from the literature were selected as additional chemicals. Molecular dynamics simulations were employed to obtain four different separation approaches based on the different behaviors of ligands and receptors during the process of interaction. Specifically, the ligand-receptor complex, which highlighted the discriminating features of ligand escape or retention via a "mousetrap" mechanism, the hydrogen bonds formed during the simulations, ligand stability, and the stability of helix-12 of the receptor were investigated. Together, a methodology was generated with which 87.5% of flavonoids could be discriminated as active versus inactive antagonists, and over 90% of inactive antagonists could be filtered out before QSAR study. This methodology could be used as a "proof of concept" to identify inactive anti-androgenic flavonoids, and could also be beneficial for rapid risk assessment and regulation of multiple new chemicals for androgenicity. Copyright © 2016 Elsevier Ltd. All rights reserved.

  19. Receptor Model Source Apportionment of Nonmethane Hydrocarbons in Mexico City

    Directory of Open Access Journals (Sweden)

    V. Mugica

    2002-01-01

    With the purpose of estimating the source contributions of nonmethane hydrocarbons (NMHC) to the atmosphere at three different sites in the Mexico City Metropolitan Area, 92 ambient air samples were measured from February 23 to March 22 of 1997. Light- and heavy-duty vehicular profiles were determined to differentiate the NMHC contributions of diesel and gasoline to the atmosphere. Food cooking source profiles were also determined for chemical mass balance receptor model application. Initial source contribution estimates were carried out to determine an adequate combination of source profiles and fitting species. Ambient samples of NMHC were apportioned to motor vehicle exhaust, gasoline vapor, handling and distribution of liquefied petroleum gas (LP gas), asphalt operations, painting operations, landfills, and food cooking. Both gasoline and diesel motor vehicle exhaust were the major NMHC contributors for all sites and times, with a percentage of up to 75%. The average motor vehicle exhaust contributions increased during the day. In contrast, the LP gas contribution was higher during the morning than in the afternoon. Apportionment for the most abundant individual NMHC showed that the vehicular source is the major contributor to acetylene, ethylene, pentanes, n-hexane, toluene, and xylenes, while handling and distribution of LP gas was the major source contributor to propane and butanes. Comparison between CMB estimates of NMHC and the emission inventory showed good agreement for vehicles, handling and distribution of LP gas, and painting operations; nevertheless, emissions from diesel exhaust and asphalt operations showed differences, and the results suggest that these emissions could be underestimated.

  20. Accounting Methodology for Source Energy of Non-Combustible Renewable Electricity Generation

    Energy Technology Data Exchange (ETDEWEB)

    Donohoo-Vallett, Paul [US Department of Energy, Washington, DC (United States)

    2016-10-01

    As non-combustible sources of renewable power (wind, solar, hydro, and geothermal) do not consume fuel, the "source" (or "primary") energy from these sources cannot be accounted for in the same manner as for fossil fuel sources. The methodology chosen for these technologies is important, as it affects the perception of the relative size of renewable source energy versus fossil energy, affects estimates of source-based building energy use, and affects overall source-energy-based metrics such as energy productivity. This memo reviews the methodological choices, outlines the implications of each choice, summarizes responses to a request for information on this topic, and presents guiding principles for the U.S. Department of Energy (DOE) Office of Energy Efficiency and Renewable Energy (EERE) to use in determining where modifying the current renewable source energy accounting method used in EERE products and analyses would be appropriate to address the issues raised above.
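
    The accounting difference can be shown with one line of arithmetic per convention. Under a captured-energy convention, 1 MWh of wind generation counts only its own energy content (3,412,000 Btu); under a fossil-fuel-equivalency convention it counts the fuel a thermal plant would have burned to produce it. The ~33% average heat rate used below is an assumed illustrative value, not the memo's figure.

```python
# Two common conventions for the "source" energy of 1 MWh of wind generation.
site_mwh = 1.0
btu_per_mwh = 3_412_000   # direct unit conversion: 1 kWh = 3412 Btu

# Captured-energy convention: count the electricity's own energy content.
captured_energy = site_mwh * btu_per_mwh

# Fossil-fuel-equivalency convention: count the fuel a thermal plant with an
# assumed ~33% efficiency would burn to generate the same electricity.
fossil_equivalency = site_mwh * btu_per_mwh / 0.33

print(f"captured-energy convention:    {captured_energy:,.0f} Btu")
print(f"fossil-equivalency convention: {fossil_equivalency:,.0f} Btu")
```

    The roughly threefold gap between the two numbers is exactly the "perception of relative size" issue the memo discusses.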

  2. Comparison between two methodologies for uniformity correction of extensive reference sources

    International Nuclear Information System (INIS)

    Junior, Iremar Alves S.; Siqueira, Paulo de T.D.; Vivolo, Vitor; Potiens, Maria da Penha A.; Nascimento, Eduardo

    2016-01-01

    This article presents the procedures for obtaining uniformity correction factors for extensive reference sources, as proposed by two different methodologies. The first methodology is presented in NPL Good Practice Guide No. 14, which provides a numerical correction. The second one uses the radiation transport code MCNP5 to obtain the correction factor. Both methods retrieve very similar correction factor values, with a maximum deviation of 0.24%. (author)

  3. Methodology for safety and security of radioactive sources and materials. The Israeli approach

    International Nuclear Information System (INIS)

    Keren, M.

    1998-01-01

    About 10 radioactive incidents occurred in Israel during 1996-1997. Some involved theft or loss of radioactive equipment or sources, some happened because of misuse of radioactive equipment, and some had other causes. Part of them could have been avoided had a more methodological attitude to the subject existed. A new methodology for notification, registration and licensing is described. Hopefully this methodology will increase defense in depth and the safety and security of radioactive sources and materials. Information on the inventory of radioactive sources and materials is essential: where they are situated, what the supply rate is, and the full history from birth to grave. The persons involved are important: who the Radiation Safety Officers (RSO) are, and what their training and updating programs consist of. As much information as possible is gathered on the sites and places where those radioactive sources and materials are used. Procedures for the security of sources and materials are part of the site information, besides safety precautions. Users are obliged to report any changes and to ask for confirmation of those changes. The same applies when high-activity sources are moved across the country. (author)

  4. Source apportionment and sensitivity analysis: two methodologies with two different purposes

    Science.gov (United States)

    Clappier, Alain; Belis, Claudio A.; Pernigotti, Denise; Thunis, Philippe

    2017-11-01

    This work reviews the existing methodologies for source apportionment and sensitivity analysis to identify key differences and stress their implicit limitations. The emphasis is laid on the differences between source impacts (sensitivity analysis) and contributions (source apportionment) obtained by using four different methodologies: brute-force top-down, brute-force bottom-up, tagged species and the decoupled direct method (DDM). A simple theoretical example is used to compare these approaches, highlighting differences and potential implications for policy. When the relationships between concentrations and emissions are linear, impacts and contributions are equivalent concepts, and source apportionment and sensitivity analysis may be used interchangeably for both air quality planning purposes and quantifying source contributions. However, this study demonstrates that when the relationship between emissions and concentrations is nonlinear, sensitivity approaches are not suitable for retrieving source contributions and source apportionment methods are not appropriate for evaluating the impact of abatement strategies. A quantification of the potential nonlinearities should therefore be the first step prior to source apportionment or planning applications, to prevent any limitations in their use. When nonlinearity is mild, these limitations may, however, be acceptable in the context of the other uncertainties inherent to complex models. Moreover, when using sensitivity analysis for planning, it is important to note that, under nonlinear circumstances, the calculated impacts will only provide information for the exact conditions (e.g. emission reduction share) that are simulated.
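
    The nonlinearity argument can be made concrete with a toy concentration-emission relationship containing an interaction term (all coefficients invented): the brute-force impacts of zeroing each source separately do not sum to the total concentration, so they cannot be read as source contributions.

```python
# Toy nonlinear chemistry: concentration responds to two emission sources
# with an interaction term standing in for, e.g., oxidant-limited regimes.
def conc(e1: float, e2: float) -> float:
    return 2.0 * e1 + 1.0 * e2 + 0.8 * e1 * e2

e1, e2 = 1.0, 1.0
total = conc(e1, e2)

# Brute-force "impacts": the concentration drop when zeroing out each source.
impact1 = total - conc(0.0, e2)
impact2 = total - conc(e1, 0.0)

# With the interaction term the two impacts overlap, so their sum exceeds
# the total: sensitivity results are not source contributions here.
print(total, impact1 + impact2)   # 3.8 vs 4.6
```

    Dropping the interaction coefficient to zero makes the two numbers coincide, recovering the linear case in which impacts and contributions are equivalent.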

  5. Development of a methodology examining the behaviours of VOCs source apportionment with micro-meteorology analysis in an urban and industrial area.

    Science.gov (United States)

    Xiang, Yang; Delbarre, Hervé; Sauvage, Stéphane; Léonardis, Thierry; Fourmentin, Marc; Augustin, Patrick; Locoge, Nadine

    2012-03-01

    During summer 2009, online measurements of 25 volatile organic compounds (VOCs) from C6 to C10, as well as micro-meteorological parameters, were performed simultaneously in the industrial city of Dunkerque. With the data set obtained, we developed a methodology to examine how the contributions of different source categories depend on atmospheric turbulence, and the results allowed identification of emission modes. Eight factors were resolved using the Positive Matrix Factorization (PMF) model, three of which were associated with mixed sources. The observed behaviour of the contributions with turbulence led to attributing some factors to sources at ground level and others to sources in the upper part of the surface layer. The impact of vertical turbulence on pollutant dispersion is also affected by the distance between the sources and the receptor site. Copyright © 2011 Elsevier Ltd. All rights reserved.

  6. FCγ Chimeric Receptor-Engineered T Cells: Methodology, Advantages, Limitations, and Clinical Relevance

    Directory of Open Access Journals (Sweden)

    Giuseppe Sconocchia

    2017-04-01

    Full Text Available For many years, disappointing results were generated by investigations that utilized a variety of immunologic strategies to enhance the ability of a patient's immune system to recognize and eliminate malignant cells. In recent years, however, immunotherapy has been used successfully for the treatment of hematologic and solid malignancies. The impressive clinical responses observed in many types of cancer have convinced even the most skeptical clinical oncologists that a patient's immune system can recognize and reject a tumor if appropriate strategies are implemented. The success of immunotherapy is due to the development of at least three therapeutic strategies: tumor-associated antigen (TAA)-specific monoclonal antibodies (mAbs), T cell checkpoint blockade, and TAA-specific chimeric antigen receptor (CAR) T cell-based immunotherapy. However, the full realization of the therapeutic potential of these approaches requires strategies to counteract and overcome some limitations, including off-target toxicity and mechanisms of cancer immune evasion, which hinder the successful clinical application of mAbs and CAR T cell-based immunotherapies. Thus, we and others have developed the Fc gamma chimeric receptor (Fcγ-CR)-based strategy. Like CARs, Fcγ-CRs are composed of an intracellular tail resulting from the fusion of a co-stimulatory molecule with the T cell receptor ζ chain. In contrast, the extracellular CAR single-chain variable fragment (scFv), which recognizes the targeted TAA, has been replaced with the extracellular portion of FcγRIIIA (CD16). Fcγ-CR T cells have a few intriguing features. First, given in combination with mAbs, Fcγ-CR T cells mediate anticancer activity in vitro and in vivo by an antibody-dependent cellular cytotoxicity mechanism. Second, CD16-CR T cells can target multiple cancer types provided that TAA-specific mAbs with the appropriate specificity are available

  7. A shape and mesh adaptive computational methodology for gamma ray dose from volumetric sources

    International Nuclear Information System (INIS)

    Mirza, N.M.; Ali, B.; Mirza, S.M.; Tufail, M.; Ahmad, N.

    1991-01-01

    Indoor external exposure of the population is dominated by gamma rays emitted from the walls and the floor of a room. A shape and mesh size adaptive flux calculational approach has been developed for a typical wall source. Parametric studies of the effect of mesh size on flux calculations have been carried out. The optimum value of the mesh size is found to depend strongly on the distance from the source, the permissible limits on uncertainty in flux predictions, and the computer central processing unit (CPU) time. To test the computations, a typical wall source was reduced to a point, a line and an infinite volume source of finite thickness, and the computed flux values were compared with values from the corresponding analytical expressions for these sources. Results indicate that under optimum conditions the errors remain less than 6% for fluxes calculated with this approach when compared with the analytical values for the point and line source approximations. When the wall is simulated as an infinite volume source of finite thickness, the errors in the computed-to-analytical flux ratios remain large for smaller wall dimensions; however, the errors fall below 10% when the wall dimensions exceed ten mean free paths for 3 MeV gamma rays. Specific dose rates from this methodology also remain within 15% of the values obtained by the Monte Carlo method. (author)
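
    As a hedged illustration of the mesh-size trade-off described above (not the authors' code), the following point-kernel sketch sums the uncollided flux from ring-shaped mesh cells of a disk source, neglecting attenuation and buildup, and compares it with the exact on-axis expression φ = (S_A/4) ln((R² + h²)/h²); the error shrinks as the mesh is refined, at the cost of more kernel evaluations.

```python
import math

def disk_flux_numeric(s_a, radius, h, n_rings):
    """Uncollided on-axis flux from a uniform disk source of surface
    strength s_a, summed over concentric ring mesh cells."""
    flux = 0.0
    dr = radius / n_rings
    for i in range(n_rings):
        rho = (i + 0.5) * dr               # ring mid-radius
        area = 2.0 * math.pi * rho * dr    # ring cell area
        r2 = rho * rho + h * h             # squared source-detector distance
        flux += s_a * area / (4.0 * math.pi * r2)
    return flux

def disk_flux_analytic(s_a, radius, h):
    # exact on-axis flux for the same idealized source
    return s_a / 4.0 * math.log((radius ** 2 + h ** 2) / h ** 2)

exact = disk_flux_analytic(1.0, 2.0, 1.0)
for n in (2, 8, 32):
    approx = disk_flux_numeric(1.0, 2.0, 1.0, n)
    print(n, abs(approx - exact) / exact)  # relative error falls with n
```

    The same convergence-versus-CPU-time trade-off drives the choice of optimum mesh size in the paper's wall-source calculations.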

  8. Implementation and training methodology of subcritical reactors neutronic calculations triggered by external neutron source and applications

    International Nuclear Information System (INIS)

    Carluccio, Thiago

    2011-01-01

    This work had as its goal to investigate calculational methodologies for subcritical source-driven reactors, such as the Accelerator Driven Subcritical Reactor (ADSR) and the Fusion Driven Subcritical Reactor (FDSR). Intense R and D has been done on these subcritical concepts, mainly due to the possibilities of transmuting Minor Actinides (MA) and Long Lived Fission Products (LLFP). In this work, particular emphasis has been given to: (1) complementing and improving the calculation methodology with neutronic transmutation and decay capabilities and implementing it computationally; (2) using this methodology in the Coordinated Research Project (CRP) of the International Atomic Energy Agency, Analytical and Experimental Benchmark Analysis of ADS, and in the Collaborative Work on Use of Low Enriched Uranium in ADS, especially in reproducing the experimental results of the Yalina Booster subcritical assembly and in the study of a subcritical core of the IPEN/MB-01 reactor; (3) comparing calculations with different nuclear data libraries of integral parameters, such as k_eff and k_src, of differential distributions, such as spectrum and flux, and of nuclide inventories; and (4) applying the developed methodology in a study that may help future choices about dedicated transmutation systems. The following tools have been used in this work: MCNP (Monte Carlo N-Particle transport code), MCB (an enhanced version of MCNP that allows burnup calculation) and NJOY, to process nuclear data from evaluated nuclear data files. (author)

  9. Multiple human schemas and the communication-information sources use: An application of Q-methodology

    Directory of Open Access Journals (Sweden)

    Mansour Shahvali

    2014-12-01

    Full Text Available This study was conducted with the aim of developing a communication and information model for greenhouse farmers in Yazd city using schema theory. Applying the Q methodology together with factor analysis, the different variables were loaded onto five schematic factors: the human philosophical nature and ideological, economic, social, and environmental-conservation beliefs. Modelling in AMOS further revealed that the philosophical, ideological, social, economic and environmental schemas directly influence the use of personal communication-information sources. Furthermore, the environmental-conservation schema affects the use of personal communication-information sources both directly and indirectly. More importantly, this study indicated the important role that indigenous sources play in constructing, evaluating and retrieving environmental knowledge among respondents. The research provides a suitable context for policymakers who seek to draw up more effective and appropriate communication and information strategies to address the needs of specific target groups.

  10. Small Works, Big Stories. Methodological approaches to photogrammetry through crowd-sourcing experiences

    Directory of Open Access Journals (Sweden)

    Seren Griffiths

    2015-12-01

    Full Text Available A recent digital public archaeology project (HeritageTogether) sought to build a series of 3D digital models using photogrammetry from crowd-sourced images. The project saw over 13,000 digital images donated, and resulted in models of some 78 sites, providing resources for researchers as well as condition surveys. The project demonstrated that digital public archaeology does not stop at the 'trowel's edge', and that collaborative post-excavation analysis and the generation of research processes are as important as time in the field. We emphasise in this contribution that our methodologies, as much as our research outputs, can be fruitfully co-produced in public archaeology projects.

  11. Sensitivity analysis of source driven subcritical systems by the HGPT methodology

    International Nuclear Information System (INIS)

    Gandini, A.

    1997-01-01

    The heuristically based generalized perturbation theory (HGPT) methodology has been used extensively in recent decades for analysis studies in the nuclear reactor field. Its use leads to fundamental reciprocity relationships from which perturbation, or sensitivity, expressions can be derived, to first and higher order, in terms of simple integration operations on quantities calculated at unperturbed system conditions. Its application to subcritical, source-driven systems, now considered with increasing interest in many laboratories for their potential use as nuclear waste burners and/or safer energy producers, is discussed here, with particular emphasis on problems implying an intensive system control variable. (author)

  12. Methodology and main results of seismic source characterization for the PEGASOS Project, Switzerland

    International Nuclear Information System (INIS)

    Coppersmith, K. J.; Youngs, R. R.; Sprecher, Ch.

    2009-01-01

    Under the direction of the National Cooperative for the Disposal of Radioactive Waste (NAGRA), a probabilistic seismic hazard analysis was conducted for the Swiss nuclear power plant sites. The study has become known under the name 'PEGASOS Project'. This is the first of a group of papers in this volume that describes the seismic source characterization methodology and the main results of the project. A formal expert elicitation process was used, including dissemination of a comprehensive database, multiple workshops for identification and discussion of alternative models and interpretations, elicitation interviews, feedback to provide the experts with the implications of their preliminary assessments, and full documentation of the assessments. A number of innovative approaches to the seismic source characterization methodology were developed by four expert groups and implemented in the study. The identification of epistemic uncertainties and treatment using logic trees were important elements of the assessments. Relative to the assessment of the seismotectonic framework, the four expert teams identified similar main seismotectonic elements: the Rhine Graben, the Jura / Molasse regions, Helvetic and crystalline subdivisions of the Alps, and the southern Germany region. In defining seismic sources, the expert teams used a variety of approaches. These range from large regional source zones having spatially-smoothed seismicity to smaller local zones, to account for spatial variations in observed seismicity. All of the teams discussed the issue of identification of feature-specific seismic sources (i.e. individual mapped faults) as well as the potential reactivation of the boundary faults of the Permo-Carboniferous grabens. Other important seismic source definition elements are the specification of earthquake rupture dimensions and the earthquake depth distribution. 
Maximum earthquake magnitudes were assessed for each seismic source using approaches that consider the

  13. Receptor modeling for source apportionment of polycyclic aromatic hydrocarbons in urban atmosphere.

    Science.gov (United States)

    Singh, Kunwar P; Malik, Amrita; Kumar, Ranjan; Saxena, Puneet; Sinha, Sarita

    2008-01-01

    This study reports source apportionment of polycyclic aromatic hydrocarbons (PAHs) in particulate depositions on vegetation foliage near a highway in the urban environment of Lucknow city (India), using the principal component analysis/absolute principal component scores (PCA/APCS) receptor modeling approach. The multivariate method enables identification of the major PAH sources along with their quantitative contributions with respect to individual PAHs. The PCA identified three major sources of PAHs: combustion, vehicular emissions, and diesel-based activities. The PCA/APCS receptor modeling approach revealed that combustion sources (natural gas, wood, coal/coke, biomass) contributed 19-97% of the various PAHs, vehicular emissions 0-70%, diesel-based sources 0-81%, and other miscellaneous sources 0-20%. The contributions of the major pyrolytic and petrogenic sources to the total PAHs were 56% and 42%, respectively. Further, combustion-related sources contribute the major fraction of the carcinogenic PAHs in the study area. The high correlation (R2 > 0.75 for most PAHs) between the measured and predicted concentrations of PAHs suggests the applicability of the PCA/APCS receptor modeling approach for estimating source contributions to PAHs in particulates.
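
    A minimal sketch of the PCA/APCS workflow on synthetic data (an illustrative reconstruction, not the authors' implementation; source profiles and strengths are invented): PCA is run on standardized concentrations, the factor score of a hypothetical true-zero sample is subtracted to obtain absolute scores, and the total mass is regressed on the APCS to obtain per-source contributions.

```python
import numpy as np

rng = np.random.default_rng(0)
# two hidden sources mixing into five measured species
profiles = np.array([[5.0, 3.0, 1.0, 0.5, 0.1],
                     [0.2, 1.0, 2.0, 4.0, 6.0]])
S = rng.lognormal(size=(200, 2))                    # source strengths
X = S @ profiles + rng.normal(0.0, 0.05, (200, 5))  # "measured" data

# 1) PCA on standardized concentrations (via SVD)
mu, sd = X.mean(0), X.std(0)
Z = (X - mu) / sd
_, _, Vt = np.linalg.svd(Z, full_matrices=False)
W = Vt[:2].T                      # loadings of the two retained components
scores = Z @ W

# 2) absolute principal component scores: subtract the score of a
#    hypothetical zero-concentration sample
z0 = (np.zeros(5) - mu) / sd
apcs = scores - z0 @ W

# 3) multiple linear regression of total mass on the APCS
total = X.sum(1)
A = np.column_stack([np.ones(len(total)), apcs])
coef, *_ = np.linalg.lstsq(A, total, rcond=None)
pred = A @ coef
r2 = 1 - ((total - pred) ** 2).sum() / ((total - total.mean()) ** 2).sum()
print(r2)   # near 1 on this clean synthetic data
```

    The regression coefficients times the APCS give each source's contribution per sample, which is how the 19-97% style ranges above are obtained.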

  14. Inter-comparison of receptor models for PM source apportionment: Case study in an industrial area

    Science.gov (United States)

    Viana, M.; Pandolfi, M.; Minguillón, M. C.; Querol, X.; Alastuey, A.; Monfort, E.; Celades, I.

    2008-05-01

    Receptor modelling techniques are used to identify and quantify the contributions from emission sources to the levels of major and trace components of ambient particulate matter (PM). A wide variety of receptor models are currently available, and consequently the comparability between models should be evaluated if source apportionment data are to be used as input in health effects studies or mitigation plans. Three of the most widespread receptor models (principal component analysis, PCA; positive matrix factorization, PMF; chemical mass balance, CMB) were applied to a single PM10 data set (n=328 samples, 2002-2005) obtained from an industrial area in NE Spain dedicated to ceramic production. Sensitivity and temporal trend analyses (using the Mann-Kendall test) were applied. Results evidenced the good overall performance of the three models (r2>0.83 and regression slope >0.91 between modelled and measured PM10 mass), with good agreement regarding source identification and high correlations between input (CMB) and output (PCA, PMF) source profiles. Larger differences were obtained regarding the quantification of source contributions (up to a factor of 4 in some cases). The combined application of different types of receptor models would overcome the limitations of each model by constructing a more robust solution based on their strengths. The authors suggest the combined use of factor analysis techniques (PCA, PMF) to identify and interpret emission sources and to obtain a first quantification of their contributions to the PM mass, followed by the application of CMB. Further research is needed to ensure that source apportionment methods are robust enough for application to PM health effects assessments.
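
    Of the three models compared, CMB is the simplest to sketch: it treats ambient concentrations as a linear mixture of known source profiles and solves for the source contributions by least squares. The profile and contribution values below are invented for illustration, not taken from the study.

```python
import numpy as np

# species-by-source profile matrix F: mass fraction of each of four
# species emitted by each of two sources (illustrative values)
F = np.array([[0.60, 0.05],
              [0.30, 0.15],
              [0.05, 0.50],
              [0.05, 0.30]])

true_s = np.array([8.0, 3.0])   # "true" source contributions, ug/m3
c = F @ true_s                  # resulting ambient species concentrations

# CMB: solve c ≈ F s for the source contributions by least squares
s_hat, *_ = np.linalg.lstsq(F, c, rcond=None)
print(s_hat)   # recovers [8., 3.]
```

    Operational CMB codes such as the USEPA's additionally weight the fit by measurement uncertainties; this unweighted sketch shows only the core mass-balance idea.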

  15. Parsing pyrogenic polycyclic aromatic hydrocarbons: forensic chemistry, receptor models, and source control policy.

    Science.gov (United States)

    O'Reilly, Kirk T; Pietari, Jaana; Boehm, Paul D

    2014-04-01

    A realistic understanding of contaminant sources is required to set appropriate control policy. Forensic chemical methods can be powerful tools in source characterization and identification, but they require a multiple-lines-of-evidence approach. Atmospheric receptor models, such as the US Environmental Protection Agency (USEPA)'s chemical mass balance (CMB), are increasingly being used to evaluate sources of pyrogenic polycyclic aromatic hydrocarbons (PAHs) in sediments. This paper describes the assumptions underlying receptor models and discusses challenges in complying with these assumptions in practice. Given the variability within, and the similarity among, pyrogenic PAH source types, model outputs are sensitive to specific inputs, and parsing among some source types may not be possible. Although still useful for identifying potential sources, the technical specialist applying these methods must describe both the results and their inherent uncertainties in a way that is understandable to nontechnical policy makers. The authors present an example case study concerning an investigation of a class of parking-lot sealers as a significant source of PAHs in urban sediment. Principal component analysis is used to evaluate published CMB model inputs and outputs. Targeted analyses of 2 areas where bans have been implemented are included. The results do not support the claim that parking-lot sealers are a significant source of PAHs in urban sediments. © 2013 SETAC.

  16. Source-receptor relationships for atmospheric mercury in urban Detroit, Michigan

    Science.gov (United States)

    Lynam, Mary M.; Keeler, Gerald J.

    Speciated hourly mercury measurements were made in Detroit, Michigan during four sampling campaigns from 2000 to 2002. In addition, other chemical and meteorological parameters were measured concurrently. These data were analyzed using principal components analysis (PCA) in order to develop source-receptor relationships for mercury species in urban Detroit. Reactive gaseous mercury (RGM) was found to cluster on two main factors: a photochemistry factor and a coal combustion factor. Particulate-phase mercury, Hg(p), tended to cluster with RGM on the same factor. The photochemistry factor corroborates previous observations of the presence of RGM in highly oxidizing atmospheres and does not point to a specific source emission type; instead, it likely represents local emissions and regional transport of photochemically processed air masses. The coal combustion factor is indicative of emissions from coal-fired power plants near the receptor site. Elemental mercury was found on a factor for combustion from automobiles and points to the influence these emissions have on the receptor site, which was located close to two major interstate highways and the largest border crossing in the United States. This analysis reveals that the receptor site, located in an industrialized sector of the city of Detroit, experienced impacts from both stationary and point sources of mercury that are both local and regional in nature.

  17. Estimating the reliability of glycemic index values and potential sources of methodological and biological variability.

    Science.gov (United States)

    Matthan, Nirupa R; Ausman, Lynne M; Meng, Huicui; Tighiouart, Hocine; Lichtenstein, Alice H

    2016-10-01

    The utility of glycemic index (GI) values for chronic disease risk management remains controversial. Although absolute GI value determinations for individual foods have been shown to vary significantly in individuals with diabetes, there is a dearth of data on the reliability of GI value determinations and the potential sources of variability among healthy adults. We examined the intra- and inter-individual variability in glycemic response to a single food challenge and the methodological and biological factors that potentially mediate this response. The GI value for white bread was determined by using standardized methodology in 63 volunteers free from chronic disease and recruited to differ by sex, age (18-85 y), and body mass index [BMI (in kg/m2): 20-35]. Volunteers randomly underwent 3 sets of food challenges involving glucose (reference) and white bread (test food), both providing 50 g available carbohydrates. Serum glucose and insulin were monitored for 5 h postingestion, and GI values were calculated by using different area under the curve (AUC) methods. Biochemical variables were measured by using standard assays, and body composition by dual-energy X-ray absorptiometry. The mean ± SD GI value for white bread was 62 ± 15 when calculated by using the recommended method. Mean intra- and inter-individual CVs were 20% and 25%, respectively. Increasing the sample size, replication of the reference and test foods, and the length of blood sampling, as well as the AUC calculation method, did not improve the CVs. Among the biological factors assessed, the insulin index and glycated hemoglobin values explained 15% and 16% of the variability in the mean GI value for white bread, respectively. These data indicate that there is substantial variability in individual responses to GI value determinations, demonstrating that it is unlikely to be a good approach to guiding food choices. Additionally, even in healthy individuals, glycemic status significantly contributes to the variability in GI value
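
    The commonly recommended AUC method for GI determination is the incremental area above the fasting baseline, computed by the trapezoidal rule while ignoring any area below baseline. A sketch with invented glucose curves (not the study's data):

```python
def incremental_auc(times, glucose):
    """Incremental area under the curve above the fasting baseline,
    ignoring area below baseline (trapezoidal rule)."""
    base = glucose[0]
    auc = 0.0
    for i in range(1, len(times)):
        y0 = max(glucose[i - 1] - base, 0.0)
        y1 = max(glucose[i] - base, 0.0)
        auc += 0.5 * (y0 + y1) * (times[i] - times[i - 1])
    return auc

t = [0, 30, 60, 90, 120]             # minutes after ingestion
ref = [5.0, 8.0, 7.0, 6.0, 5.5]      # glucose after reference (glucose drink)
test = [5.0, 6.8, 6.2, 5.6, 5.3]     # glucose after test food

# GI = 100 * (test-food iAUC) / (reference iAUC)
gi = 100.0 * incremental_auc(t, test) / incremental_auc(t, ref)
print(round(gi))   # -> 60
```

    In practice the GI is the mean of such ratios over many subjects, which is exactly where the intra- and inter-individual CVs reported above enter.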

  18. Development of the methodology for application of revised source term to operating nuclear power plants in Korea

    International Nuclear Information System (INIS)

    Kang, M.S.; Kang, P.; Kang, C.S.; Moon, J.H.

    2004-01-01

    Considering the current trend of applying the revised source term proposed by NUREG-1465 to nuclear power plants in the U.S., it is expected that the revised source term will be applied to Korean operating nuclear power plants in the near future, even though the exact timing cannot be estimated. To meet future technical demands, it is necessary to prepare the technical framework, including the related regulatory requirements, in advance. In this research, therefore, the aim is to develop a methodology for applying the revised source term to operating nuclear power plants in Korea. Several principles were established in developing the application methodologies. First, it is not necessary to modify the existing regulations on source terms (i.e., no back-fitting of operating nuclear plants is required). Second, if a pertinent margin of safety is guaranteed, the revised source term suggested by NUREG-1465 may be suitable for full application. Finally, a part of the revised source term could be selected for application based on technical feasibility. As the results of this research, several methodologies for applying the revised source term to Korean operating nuclear power plants have been developed, including: 1) selective (or limited) application, which uses only some of the characteristics of the revised source term, such as the release timing of fission products and the chemical form of radio-iodine; and 2) full application, which uses all the characteristics of the revised source term. The developed methodologies were applied to Ulchin 9 and 4 units and their application feasibility was reviewed. The results of this research can be used either as a manual for establishing the plan and procedure for applying the revised source term to domestic nuclear plants, from the utility's viewpoint, or as a technical basis for revising the related regulations, from the regulatory body's viewpoint. The application of revised source term to operating nuclear

  19. Source-water susceptibility assessment in Texas—Approach and methodology

    Science.gov (United States)

    Ulery, Randy L.; Meyer, John E.; Andren, Robert W.; Newson, Jeremy K.

    2011-01-01

    Public water systems provide potable water for the public's use. The Safe Drinking Water Act amendments of 1996 required States to prepare a source-water susceptibility assessment (SWSA) for each public water system (PWS). States were required to determine the source of water for each PWS, the origin of any contaminant of concern (COC) monitored or to be monitored, and the susceptibility of the public water system to COC exposure, to protect public water supplies from contamination. In Texas, the Texas Commission on Environmental Quality (TCEQ) was responsible for preparing SWSAs for the more than 6,000 public water systems, representing more than 18,000 surface-water intakes or groundwater wells. The U.S. Geological Survey (USGS) worked in cooperation with TCEQ to develop the Source Water Assessment Program (SWAP) approach and methodology. Texas' SWAP meets all requirements of the Safe Drinking Water Act and ultimately provides the TCEQ with a comprehensive tool for protection of public water systems from contamination by up to 247 individual COCs. TCEQ staff identified both the list of contaminants to be assessed and contaminant threshold values (THR) to be applied. COCs were chosen because they were regulated contaminants, were expected to become regulated contaminants in the near future, or were unregulated but thought to represent long-term health concerns. THRs were based on maximum contaminant levels from U.S. Environmental Protection Agency (EPA)'s National Primary Drinking Water Regulations. For reporting purposes, COCs were grouped into seven contaminant groups: inorganic compounds, volatile organic compounds, synthetic organic compounds, radiochemicals, disinfection byproducts, microbial organisms, and physical properties. 
Expanding on the TCEQ's definition of susceptibility, subject-matter expert working groups formulated the SWSA approach based on assumptions that natural processes and human activities contribute COCs in quantities that vary in space

  20. Health beliefs and their sources in Korean and Japanese nurses: A Q-methodology pilot study.

    Science.gov (United States)

    Stone, Teresa E; Kang, Sook Jung; Cha, Chiyoung; Turale, Sue; Murakami, Kyoko; Shimizu, Akihiko

    2016-01-01

    Many health beliefs do not have supporting scientific evidence, and are influenced by culture, gender, religion, social circumstance and popular media. Nurses may also hold non-evidence-based beliefs that affect their own health behaviours and their practices. The aims were, using Q-methodology, to pilot Q-cards representing a concourse of health beliefs for Japanese and South Korean nurses, and to explain the content and sources of those health beliefs. The design was qualitative; the settings were two university campuses, one each in Japan and Korea. A convenience sample of 30 was obtained: 14 clinical nurses and 16 academic nurses. Literature reviews and expert informants were used to develop two sets of 65 Q-cards listing culturally appropriate health beliefs in Japan and Korea. These beliefs were examined in four structured groups and five individual interviews in Japan, and five groups and two individual interviews in Korea. Our study revealed six categories of sources of health beliefs that provide rich insights into how participants accessed, processed and transmitted health information. Participants were more certain about knowledge from their specialty area, such as that from medical or nursing resources, but derived and distributed many general health beliefs from personal experience, family and the mass media. They did not always pass on accurate information to students or those in their care, and their beliefs were often not based on scientific evidence. The findings highlight the dangers of clinical and academic nurses relying on the health-belief advice of others and passing it on to patients, students or others without mindfully examining the basis of their beliefs against scientific evidence. Copyright © 2015 Elsevier Ltd. All rights reserved.

  1. Methodology and Data Sources for Assessing Extreme Charging Events within the Earth's Magnetosphere

    Science.gov (United States)

    Parker, L. N.; Minow, J. I.; Talaat, E. R.

    2016-12-01

    Spacecraft surface and internal charging is a potential threat to space technologies because electrostatic discharges on, or within, charged spacecraft materials can result in a number of adverse impacts to spacecraft systems. The Space Weather Action Plan (SWAP) ionizing radiation benchmark team recognized that spacecraft charging will need to be considered to complete the ionizing radiation benchmarks in order to evaluate the threat of charging to critical space infrastructure operating within the near-Earth ionizing radiation environments. However, the team chose to defer work on the lower energy charging environments and focus the initial benchmark efforts on the higher energy galactic cosmic ray, solar energetic particle, and trapped radiation belt particle environments of concern for radiation dose and single event effects in humans and hardware. Therefore, an initial set of 1 in 100 year spacecraft charging environment benchmarks remains to be defined to meet the SWAP goals. This presentation will discuss the available data sources and a methodology to assess the 1 in 100 year extreme space weather events that drive surface and internal charging threats to spacecraft. Environments to be considered are the hot plasmas in the outer magnetosphere during geomagnetic storms, relativistic electrons in the outer radiation belt, and energetic auroral electrons in low Earth orbit at high latitudes.

  2. Microbial methodological artifacts in [3H]glutamate receptor binding assays

    International Nuclear Information System (INIS)

    Yoneda, Y.; Ogita, K.

    1989-01-01

    Incubation of radiolabeled L-glutamic acid, a putative central excitatory neurotransmitter, in 50 mM Tris-acetate buffer (pH 7.4) at 30 degrees C in the absence of brain synaptic membranes resulted in a significant adsorption of radioactivity to the glass fiber filters routinely employed to trap bound ligand in receptor binding assays. The adsorption was not only eliminated by the inclusion of L-isomers of structurally related amino acids, but also inhibited by that of most presumed agonists and antagonists of the brain glutamate receptors. This displaceable adsorption was a temperature-dependent, nonreversible, and saturable phenomenon. Scatchard analysis of these data revealed that the adsorption consisted of a single component with an apparent dissociation constant of 73 nM. The displaceable adsorption was significantly attenuated by concurrent incubation with papain, pronase E, and phospholipase C. A significant amount of the radioactivity was detected in the pass-through fraction of the Dowex column following application of the reaction mixture incubated with purified [3H]glutamate at 30 degrees C for 60 min in the absence of added membranous proteins. Complete abolition of the displaceable adsorption resulted from the use of incubation buffer boiled at 100 degrees C, as well as buffer filtered through a nitrocellulose membrane filter with a pore size of 0.45 micron immediately before use. These results suggest that the displaceable adsorption may be attributable to a radioactive metabolite of [3H]glutamate produced by microorganisms contaminating the Tris-acetate buffer. This might in part account for some of the controversial results of receptor binding studies on acidic amino acids.
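
    The Scatchard analysis mentioned above linearizes single-site binding: since B/F = (Bmax − B)/Kd, a plot of bound/free against bound is a straight line with slope −1/Kd. A sketch with hypothetical data generated from the reported Kd of 73 nM (the Bmax and concentration series are invented):

```python
# single-site binding parameters (Kd from the abstract; Bmax invented)
kd, bmax = 73.0, 500.0
free = [10.0, 25.0, 50.0, 100.0, 200.0, 400.0]   # free ligand, nM

# simulate bound ligand from the one-site isotherm B = Bmax*F/(Kd+F)
bound = [bmax * f / (kd + f) for f in free]
ratio = [b / f for b, f in zip(bound, free)]     # B/F for the Scatchard plot

# least-squares slope of B/F versus B
n = len(bound)
mx = sum(bound) / n
my = sum(ratio) / n
slope = sum((x - mx) * (y - my) for x, y in zip(bound, ratio)) / \
        sum((x - mx) ** 2 for x in bound)

print(-1.0 / slope)   # recovers Kd = 73 nM
```

    A single straight line on the Scatchard plot, as here, is what justifies the abstract's conclusion of a single adsorption component.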

  3. Water Quality Assessment of River Soan (Pakistan) and Source Apportionment of Pollution Sources Through Receptor Modeling.

    Science.gov (United States)

    Nazeer, Summya; Ali, Zeshan; Malik, Riffat Naseem

    2016-07-01

    The present study was designed to determine spatiotemporal patterns in the water quality of River Soan using multivariate statistics. A total of 26 sites were surveyed along River Soan and its associated tributaries during the pre- and post-monsoon seasons in 2008. Hierarchical agglomerative cluster analysis (HACA) classified the sampling sites into three groups according to their degree of pollution, ranging from least to high degradation of water quality. Discriminant function analysis (DFA) revealed that alkalinity, orthophosphates, nitrates, ammonia, salinity, and Cd were the variables that significantly discriminate among the three groups identified by HACA. Temporal trends identified through DFA revealed that COD, DO, pH, Cu, Cd, and Cr could account for the major seasonal variations in water quality. PCA/FA identified six factors as potential sources of pollution of River Soan. Absolute principal component scores with the multiple regression method (APCS-MLR) further explained the percent contribution from each source. Heavy metals were largely added through industrial activities (28 %) and sewage waste (28 %); nutrients through agricultural runoff (35 %) and sewage waste (28 %); organic pollution through sewage waste (27 %) and urban runoff (17 %); and macroelements through urban runoff (39 %) and mineralization and sewage waste (30 %). The present study showed that anthropogenic activities are the major source of variations in River Soan. In order to address these water quality issues, the implementation of effective waste management measures is needed.
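
    HACA, the first step of the workflow above, can be sketched with Ward-linkage clustering on hypothetical standardized site summaries (the site data below are simulated, not from the study): sites with similar water-quality signatures merge into the same branch of the dendrogram, which is then cut into a chosen number of groups.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

# simulated standardized water-quality summaries for 9 sites:
# three "clean", three "moderately polluted", three "heavily polluted"
rng = np.random.default_rng(1)
clean = rng.normal(0.0, 0.3, (3, 4))
moderate = rng.normal(2.0, 0.3, (3, 4))
polluted = rng.normal(5.0, 0.3, (3, 4))
sites = np.vstack([clean, moderate, polluted])

# Ward linkage on Euclidean distances, cut into three groups
tree = linkage(sites, method="ward")
groups = fcluster(tree, t=3, criterion="maxclust")
print(groups)   # sites 1-3, 4-6 and 7-9 fall into three separate clusters
```

    Cutting the tree at three clusters mirrors the study's classification of sites into least, moderate and high degradation groups.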

  4. Receptor modeling studies for the characterization of PM10 pollution sources in Belgrade

    Directory of Open Access Journals (Sweden)

    Mijić Zoran

    2012-01-01

    The objective of this study is to determine the major sources and potential source regions of PM10 over Belgrade, Serbia. PM10 samples were collected from July 2003 to December 2006 in a highly urbanized area of Belgrade, and concentrations of Al, V, Cr, Mn, Fe, Ni, Cu, Zn, Cd and Pb were analyzed by atomic absorption spectrometry. Analysis of seasonal variations of PM10 mass and some element concentrations showed relatively higher concentrations in winter, which underlines the importance of local emission sources. The Unmix model was used for source apportionment, and four main source profiles (fossil fuel combustion; traffic exhaust/regional transport from industrial centers; traffic-related particles/site-specific sources; and mineral/crustal matter) were identified. Among the resolved factors, fossil fuel combustion was the highest contributor (34%), followed by traffic/regional industry (26%). Conditional probability function (CPF) results identified possible directions of local sources. The potential source contribution function (PSCF) and concentration weighted trajectory (CWT) receptor models were used to identify the spatial source distribution and the contribution of regionally transported aerosols. [Project of the Ministry of Science of the Republic of Serbia, No. III43007 and No. III41011]
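
The conditional probability function used here divides, for each wind sector, the number of samples exceeding a high-concentration threshold by the total number of samples arriving from that sector. A self-contained Python sketch on synthetic data; the 30-degree sector width, the upper-quartile criterion, and the planted eastern source are all illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic hourly data: wind direction (degrees) and PM10 concentration,
# with a hypothetical local source to the east (~90 deg) raising levels.
n = 2000
wd = rng.uniform(0, 360, n)
conc = rng.lognormal(3.0, 0.4, n)
conc[(wd > 60) & (wd < 120)] *= 2.0    # source impact from the eastern sector

threshold = np.percentile(conc, 75)    # upper quartile as "high" criterion
sector_edges = np.arange(0, 361, 30)   # twelve 30-degree sectors

cpf = []
for lo, hi in zip(sector_edges[:-1], sector_edges[1:]):
    in_sector = (wd >= lo) & (wd < hi)
    m = np.sum(in_sector & (conc > threshold))   # high-conc hours from sector
    cpf.append(m / max(np.sum(in_sector), 1))    # fraction exceeding threshold

best = int(np.argmax(cpf))
print(f"Sector with highest CPF: {sector_edges[best]}-{sector_edges[best + 1]} deg")
```

Sectors pointing at the planted source show CPF values far above the 25% baseline expected from the quartile threshold alone.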

  5. The development of a methodology to assess population doses from multiple sources and exposure pathways of radioactivity

    International Nuclear Information System (INIS)

    Hancox, J.; Stansby, S.; Thorne, M.

    2002-01-01

    The Environment Agency (EA) has new duties in accordance with the Basic Safety Standards Directive, under which it is required to ensure that doses received by individuals from exposure to anthropogenic sources of radioactivity are within defined limits. In order to assess compliance with these requirements, the EA needs to assess the doses to members of the most highly exposed population groups ('critical' groups) from all relevant potential sources of anthropogenic radioactivity and all relevant potential exposure pathways to such radioactivity. The EA has identified a need to develop a methodology for the retrospective assessment of effective doses from multiple sources of radioactive materials and the exposure pathways associated with those sources. Under contract to the EA, AEA Technology has undertaken the development of a suitable methodology as part of EA R&D Project P3-070. The methodology developed under this research project has been designed to support the EA in meeting its obligations under the Euratom Basic Safety Standards Directive and is consistent with UK and international approaches to radiation dosimetry and radiological protection. The development and trial application of the methodology are described in this report.

  6. Prevalence and methodologies for detection, characterization and subtyping of Listeria monocytogenes and L. ivanovii in foods and environmental sources

    Directory of Open Access Journals (Sweden)

    Jin-Qiang Chen

    2017-09-01

    Listeria monocytogenes, one of the most important foodborne pathogens, can cause listeriosis, a lethal disease for humans. L. ivanovii, which is closely related to L. monocytogenes, is also widely distributed in nature and infects mainly warm-blooded ruminants, causing economic loss. Thus, there is a high-priority need for methodologies for rapid, specific, cost-effective and accurate detection, characterization and subtyping of L. monocytogenes and L. ivanovii in foods and environmental sources. In this review, we (A) describe L. monocytogenes and L. ivanovii, the worldwide incidence of listeriosis, and the prevalence of various L. monocytogenes strains in food and environmental sources; (B) comprehensively review different types of traditional and newly developed methodologies, including culture-based, antigen/antibody-based, loop-mediated isothermal amplification, matrix-assisted laser desorption ionization-time of flight mass spectrometry, DNA microarray, and genomic sequencing, for detection and characterization of L. monocytogenes in foods and environmental sources; (C) comprehensively summarize different subtyping methodologies, including pulsed-field gel electrophoresis, multi-locus sequence typing, ribotyping, phage typing, and whole-genome sequencing, for subtyping of L. monocytogenes strains from food and environmental sources; and (D) describe the applications of these methodologies in detection and subtyping of L. monocytogenes in foods and food processing facilities.

  7. Methodology for benzodiazepine receptor binding assays at physiological temperature. Rapid change in equilibrium with falling temperature

    International Nuclear Information System (INIS)

    Dawson, R.M.

    1986-01-01

    Benzodiazepine receptors of rat cerebellum were assayed with [3H]flunitrazepam at 37 degrees C, and assays were terminated by filtration in a cold room according to one of three protocols: keeping each sample at 37 degrees C until ready for filtration, taking the batch of 30 samples into the cold room and filtering sequentially in the order 1-30, or taking the batch of 30 samples into the cold room and filtering sequentially in the order 30-1. The results for each protocol were substantially different from each other, indicating that rapid disruption of equilibrium occurred as the samples cooled in the cold room while waiting to be filtered. Positive or negative cooperativity of binding was apparent, and misleading effects of gamma-aminobutyric acid on the affinity of diazepam were observed, unless each sample was kept at 37 degrees C until just prior to filtration.

  8. On the use of Different Methodologies in Cognitive Neuropsychology: Drink Deep and from Several Sources

    Science.gov (United States)

    Nickels, Lyndsey; Howard, David; Best, Wendy

    2012-01-01

    Cognitive neuropsychology has championed the use of single-case research design. Recently, however, case series designs that employ multiple single cases have been increasingly utilized to address theoretical issues using data from neuropsychological populations. In this paper, we examine these methodologies, focusing on a number of points in particular. First we discuss the use of dissociations and associations, often thought of as a defining feature of cognitive neuropsychology, and argue that they are better viewed as part of a spectrum of methods that aim to explain and predict behaviour. We also raise issues regarding case series design in particular, arguing that selection of an appropriate sample, including controlling degree of homogeneity, is critical and constrains the theoretical claims that can be made on the basis of the data. We discuss the possible interpretation of “outliers” in a case series, suggesting that while they may reflect “noise” caused by variability in performance due to factors that are not of relevance to the theoretical claims, they may also reflect the presence of patterns that are critical to test, refine, and potentially falsify our theories. The role of case series in treatment research is also raised, in light of the fact that, despite their status as gold standard, randomized controlled trials cannot provide answers to many crucial theoretical and clinical questions. Finally, we stress the importance of converging evidence: We propose that it is conclusions informed by multiple sources of evidence that are likely to best inform theory and stand the test of time. PMID:22746689

  9. Platform development for merging various information sources for water management: methodological, technical and operational aspects

    Science.gov (United States)

    Galvao, Diogo

    2013-04-01

    As a result of various economic, social and environmental factors, we can all experience the increasing importance of water resources at a global scale. As a consequence, we can also notice the increasing need for methods and systems capable of efficiently managing and combining the rich, heterogeneous data that concern, directly or indirectly, these water resources, such as in-situ monitoring station data, Earth observation images and measurements, meteorological modeling forecasts and hydrological modeling. Under the scope of the MyWater project, we developed a water management system capable of satisfying just such needs, built on a flexible platform capable of accommodating future challenges, not only in terms of sources of data but also of the models applicable to extract information from them. From a methodological point of view, the MyWater platform obtains data from distinct sources and in distinct formats, be they satellite images or meteorological model forecasts, and transforms and combines them in ways that allow them to be fed to a variety of hydrological models (such as MOHID Land, SIMGRO, etc.), which themselves can also be combined, using approaches such as those advocated by the OpenMI standard, to extract information in an automated and time-efficient manner. Such an approach brings its own challenges, and further research was conducted under this project on the best ways to combine such data and on novel approaches to hydrological modeling (like the PriceXD model). From a technical point of view, the MyWater platform is structured according to a classical SOA architecture, with a flexible, object-oriented, modular backend service responsible for all model process management and data treatment, while the extracted information can be accessed through a variety of frontends, from a web portal and a desktop client down to mobile phone and tablet applications. From an operational point of view, a user can not only see

  10. Assessment of source-receptor relationships of aerosols: An integrated forward and backward modeling approach

    Science.gov (United States)

    Kulkarni, Sarika

    This dissertation presents a scientific framework that facilitates enhanced understanding of aerosol source-receptor (S/R) relationships and their impact on local, regional and global air quality by employing a complementary suite of modeling methods. The receptor-oriented Positive Matrix Factorization (PMF) technique is combined with the Potential Source Contribution Function (PSCF), a trajectory ensemble model, to characterize sources influencing the aerosols measured at Gosan, Korea during spring 2001. It is found that episodic dust events originating from desert regions in East Asia (EA), which mix with pollution along the transit path, have a significant and pervasive impact on the air quality of Gosan. The intercontinental and hemispheric transport of aerosols is analyzed by a series of emission perturbation simulations with the Sulfur Transport and dEposition Model (STEM), a regional-scale Chemical Transport Model (CTM), evaluated with observations from the 2008 NASA ARCTAS field campaign. This modeling study shows that pollution transport from regions outside North America (NA) contributed ~30% and ~20% to NA surface concentrations of sulfate and BC, respectively. This study also identifies aerosols transported from Europe, NA and EA as significant contributors to springtime Arctic sulfate and BC. Trajectory ensemble models are combined with source-region-tagged tracer model output to identify the source regions and possible instances of quasi-Lagrangian sampled air masses during the 2006 NASA INTEX-B field campaign. The impact of specific emission sectors from Asia during the INTEX-B period is studied with the STEM model, identifying the residential sector as a potential target for emission reduction to combat global warming. The output from the STEM model, constrained with satellite-derived aerosol optical depth and ground-based measurements of single scattering albedo via an optimal interpolation assimilation scheme, is combined with the PMF technique to
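
The PSCF used in this and several other records assigns to each grid cell the ratio m/n of back-trajectory endpoints associated with high measured concentrations (m) to all endpoints falling in that cell (n). A Python sketch on synthetic trajectories; the 1-degree grid, the planted source box near 40N/110E, the 75th-percentile criterion, and the minimum-count filter are all illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic back-trajectories: each arrival has one concentration and
# 24 upwind endpoints; a hypothetical source box near 40N, 110E makes
# trajectories that cross it arrive polluted.
n_traj = 500
conc = rng.lognormal(1.0, 0.5, n_traj)
lat = rng.uniform(30, 50, (n_traj, 24))
lon = rng.uniform(100, 130, (n_traj, 24))
hits = ((lat > 39) & (lat < 41) & (lon > 109) & (lon < 111)).any(axis=1)
conc[hits] *= 3.0

threshold = np.percentile(conc, 75)
n_grid = np.zeros((20, 30))    # all endpoints per 1-degree cell
m_grid = np.zeros((20, 30))    # endpoints from "polluted" arrivals
for k in range(n_traj):
    i = (lat[k] - 30).astype(int).clip(0, 19)
    j = (lon[k] - 100).astype(int).clip(0, 29)
    np.add.at(n_grid, (i, j), 1)
    if conc[k] > threshold:
        np.add.at(m_grid, (i, j), 1)

# Require a minimum endpoint count to suppress noisy cells
pscf = np.where(n_grid > 5, m_grid / np.maximum(n_grid, 1), 0.0)
i_max, j_max = np.unravel_index(np.argmax(pscf), pscf.shape)
print(f"Highest-PSCF cell: {30 + i_max}-{31 + i_max}N, {100 + j_max}-{101 + j_max}E")
```

Cells inside the planted source box receive endpoints only from polluted trajectories, so their m/n ratio stands well above the background.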

  11. Radiological risk assessment for the public under the loss of medium and large sources using bayesian methodology

    International Nuclear Information System (INIS)

    Kim, Joo Yeon; Jang, Han Ki; Lee, Jai Ki

    2005-01-01

    Bayesian methodology is appropriate for use in PRA because subjective knowledge as well as objective data can be applied in the assessment. In this study, radiological risk based on Bayesian methodology is assessed for the loss of a source in field radiography. The exposure scenario for the lost source presented by the U.S. NRC is reconstructed by considering the domestic situation, and Bayes' theorem is applied to updating the failure probabilities of safety functions. For the updated failure probabilities, it is shown that the 5% Bayes credible intervals using the Jeffreys prior distribution are lower than those using a vague prior distribution. It is noted that the Jeffreys prior distribution is appropriate in risk assessments for systems having very low failure probabilities. It is also shown that the mean of the expected annual dose to the public based on Bayesian methodology is higher than the dose based on classical methodology, because the means of the updated probabilities are higher than the classical probabilities. Databases for radiological risk assessment are sparse domestically. In summary, Bayesian methodology can be applied as a useful alternative for risk assessment, and this study will contribute to risk-informed regulation in the field of radiation safety.
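
The kind of prior comparison described here can be sketched with the Beta-Binomial conjugate update. The failure data below are hypothetical; the point is only the effect of the prior choice (Jeffreys Beta(1/2, 1/2) versus a vague Beta(1, 1)) on the posterior mean and its 5% credible bound:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical data: a safety function failed 1 time in 20 demands.
failures, demands = 1, 20

priors = {"Jeffreys": (0.5, 0.5), "vague": (1.0, 1.0)}
lower5, post_mean = {}, {}

for name, (a, b) in priors.items():
    # Conjugate update: Beta(a, b) prior -> Beta(a + f, b + n - f) posterior
    a_post, b_post = a + failures, b + demands - failures
    post_mean[name] = a_post / (a_post + b_post)
    draws = rng.beta(a_post, b_post, 200_000)   # Monte Carlo posterior draws
    lower5[name] = np.percentile(draws, 5)      # 5% credible bound
    print(f"{name}: posterior mean {post_mean[name]:.4f}, "
          f"5% bound {lower5[name]:.4f}")
```

This mirrors the abstract's observation: for rare failures, the Jeffreys 5% credible bound sits below the vague-prior one.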

  12. Methodology for Quantitative Analysis of Large Liquid Samples with Prompt Gamma Neutron Activation Analysis using Am-Be Source

    International Nuclear Information System (INIS)

    Idiri, Z.; Mazrou, H.; Beddek, S.; Amokrane, A.

    2009-01-01

    An optimized set-up for prompt gamma neutron activation analysis (PGNAA) with an Am-Be source is described and used for the analysis of large liquid samples. A methodology for quantitative analysis is proposed: it consists of normalizing the prompt gamma count rates with thermal neutron flux measurements carried out with a He-3 detector and with gamma attenuation factors calculated using MCNP-5. Both the relative and absolute methods are considered. This methodology is then applied to the determination of cadmium in industrial phosphoric acid. The same sample is also analyzed by the inductively coupled plasma (ICP) method. Our results are in good agreement with those obtained with the ICP method.
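
The relative method described here can be illustrated in a few lines: the prompt-gamma count rate is divided by the measured thermal-neutron flux and by the calculated gamma attenuation factor, then compared against a standard of known concentration measured the same way. All numbers below are hypothetical; only the normalization scheme follows the abstract:

```python
# Relative-method sketch for PGNAA quantification: count rates are
# normalized by thermal-neutron flux and gamma self-attenuation.

def cd_concentration(rate, flux, atten, rate_std, flux_std, atten_std, conc_std):
    """Concentration by comparison with a standard of known concentration."""
    norm = (rate / flux) / atten             # flux- and attenuation-corrected rate
    norm_std = (rate_std / flux_std) / atten_std
    return conc_std * norm / norm_std

# Hypothetical standard: 10 mg/L Cd; sample measured in the same geometry.
c = cd_concentration(rate=240.0, flux=1.0e4, atten=0.82,
                     rate_std=300.0, flux_std=1.2e4, atten_std=0.85,
                     conc_std=10.0)
print(f"Cd concentration = {c:.2f} mg/L")
```
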

  13. Revealing transboundary and local air pollutant sources affecting Metro Manila through receptor modeling studies

    International Nuclear Information System (INIS)

    Pabroa, Preciosa Corazon B.; Bautista VII, Angel T.; Santos, Flora L.; Racho, Joseph Michael D.

    2011-01-01

    Ambient fine particulate matter (PM2.5) levels at the Metro Manila air sampling stations of the Philippine Nuclear Research Institute were found to be above the WHO guideline value of 10 μg/m3, indicating, in general, very poor air quality in the area. The elemental components of the fine particulate matter were obtained using energy-dispersive x-ray fluorescence spectrometry. Positive matrix factorization, a receptor modelling tool, was used to identify and apportion air pollution sources. Locations of probable transboundary air pollutant sources were evaluated using HYSPLIT (Hybrid Single Particle Lagrangian Integrated Trajectory Model), while locations of probable local air pollutant sources were determined using the conditional probability function (CPF). Air pollutant sources can be either natural or anthropogenic. This study has shown natural air pollutant sources, such as the eruptions of Bulusan volcano in 2006 and Anatahan volcano in 2005, to have impacted the region. Fine soil was shown to have originated from China's Mu Us Desert in 2004. Smoke in the fine fraction in 2006 showed indications of coming from forest fires in Sumatra and Borneo. Fine particulate Pb in Valenzuela was shown to be coming from the surrounding area. Many more significant air pollution impacts can be evaluated by identifying probable air pollutant sources from elemental fingerprints and locating these sources with HYSPLIT and CPF. (author)

  14. Comparison of receptor models for source apportionment of volatile organic compounds in Beijing, China

    International Nuclear Information System (INIS)

    Song Yu; Dai Wei; Shao Min; Liu Ying; Lu Sihua; Kuster, William; Goldan, Paul

    2008-01-01

    Identifying the sources of volatile organic compounds (VOCs) is key to reducing ground-level ozone and secondary organic aerosols (SOAs). Several receptor models have been developed to apportion sources, but an intercomparison of these models had not been performed for VOCs in China. In the present study, we compared VOC sources based on chemical mass balance (CMB), UNMIX, and positive matrix factorization (PMF) models. Gasoline-related sources, petrochemical production, and liquefied petroleum gas (LPG) were identified by all three models as the major contributors, with UNMIX and PMF producing quite similar results. The contributions of gasoline-related sources and LPG estimated by the CMB model were higher, and petrochemical emissions were lower, than in the UNMIX and PMF results, possibly because the VOC profiles used in the CMB model were for fresh emissions whereas the profiles extracted from ambient measurements by the two factor-analysis models were 'aged'. - VOC sources were similar across the three models, with CMB showing a higher estimate for vehicle-related sources.

  15. Comparison of receptor models for source apportionment of volatile organic compounds in Beijing, China

    Energy Technology Data Exchange (ETDEWEB)

    Song Yu; Dai Wei [Department of Environmental Sciences, Peking University, Beijing 100871 (China); Shao Min [State Joint Key Laboratory of Environmental Simulation and Pollution Control, Peking University, Beijing 100871 (China)], E-mail: mshao@pku.edu.cn; Liu Ying; Lu Sihua [State Joint Key Laboratory of Environmental Simulation and Pollution Control, Peking University, Beijing 100871 (China); Kuster, William; Goldan, Paul [Chemical Sciences Division, NOAA Earth System Research Laboratory, Boulder, CO 80305 (United States)

    2008-11-15

    Identifying the sources of volatile organic compounds (VOCs) is key to reducing ground-level ozone and secondary organic aerosols (SOAs). Several receptor models have been developed to apportion sources, but an intercomparison of these models had not been performed for VOCs in China. In the present study, we compared VOC sources based on chemical mass balance (CMB), UNMIX, and positive matrix factorization (PMF) models. Gasoline-related sources, petrochemical production, and liquefied petroleum gas (LPG) were identified by all three models as the major contributors, with UNMIX and PMF producing quite similar results. The contributions of gasoline-related sources and LPG estimated by the CMB model were higher, and petrochemical emissions were lower, than in the UNMIX and PMF results, possibly because the VOC profiles used in the CMB model were for fresh emissions whereas the profiles extracted from ambient measurements by the two factor-analysis models were 'aged'. - VOC sources were similar across the three models, with CMB showing a higher estimate for vehicle-related sources.
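
Chemical mass balance treats the ambient concentration vector as a linear combination of fixed source profiles and solves for the source contributions, typically by effective-variance least squares. A bare-bones ordinary-least-squares sketch in Python; the three profiles, the species list, and the contribution values are invented for illustration and are not taken from the study:

```python
import numpy as np

rng = np.random.default_rng(4)

# Columns: hypothetical profiles (mass fractions of each VOC species) for
# gasoline-related sources, LPG, and petrochemical production.
# Rows: ethane, propane, i-butane, toluene, ethylene.
F = np.array([
    [0.05, 0.10, 0.30],
    [0.10, 0.45, 0.10],
    [0.08, 0.35, 0.05],
    [0.50, 0.05, 0.05],
    [0.27, 0.05, 0.50],
])
s_true = np.array([30.0, 15.0, 10.0])       # true contributions, ppbC
c = F @ s_true + rng.normal(0, 0.2, 5)      # "measured" ambient vector

# Ordinary least squares stands in for effective-variance weighting here.
s_est, *_ = np.linalg.lstsq(F, c, rcond=None)
for name, v in zip(["gasoline", "LPG", "petrochemical"], s_est):
    print(f"{name}: {v:.1f} ppbC")
```

In a real CMB application the profiles come from source tests, and each species is weighted by its measurement and profile uncertainties.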

  16. Development of the methodology for estimation of dose from a source

    International Nuclear Information System (INIS)

    Golebaone, E.M.

    2012-04-01

    The geometry of a source plays an important role when determining which method to apply in order to accurately estimate the dose from the source. If the wrong source geometry is used, the dose received may be underestimated or overestimated, which may lead to wrong decisions in dealing with the exposure situation. In this project, a moisture density gauge was used to represent a point source in order to demonstrate the key parameters to be used when estimating the dose from a point source. The parameters to be considered are the activity of the source, the ambient dose rate, the gamma constant for the radionuclide, and the transport index on the package of the source. The distance from the source and the time spent in the radiation field must be known in order to calculate the dose. (author)
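
For a point source, the parameters listed here combine in the familiar inverse-square relation: dose rate = Γ × A / d², multiplied by exposure time. A short Python sketch; the gamma constant below is an approximate literature value for Cs-137, and the scenario numbers are hypothetical:

```python
# Point-source external dose sketch: dose rate falls off as the inverse
# square of distance. Gamma constant is approximate; scenario is invented.
GAMMA_CS137 = 0.0927   # uSv*m^2/(MBq*h), approximate value for Cs-137

def dose_uSv(activity_MBq, distance_m, time_h, gamma=GAMMA_CS137):
    """Unshielded external dose from a point source."""
    dose_rate = gamma * activity_MBq / distance_m ** 2   # uSv/h
    return dose_rate * time_h

# Hypothetical scenario: 400 MBq source handled at 2 m for 0.5 h.
d = dose_uSv(400.0, 2.0, 0.5)
print(f"Dose = {d:.1f} uSv")
```

Doubling the distance quarters the dose, which is the basis of the distance rules of thumb used in source recovery.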

  17. Atmospheric Aerosol Source-Receptor Relationships: The Role of Coal-Fired Power Plants

    Energy Technology Data Exchange (ETDEWEB)

    Allen L. Robinson; Spyros N. Pandis; Cliff I. Davidson

    2005-12-01

    This report describes the technical progress made on the Pittsburgh Air Quality Study (PAQS) during the period March 2005 through August 2005. Significant progress was made this project period on the source characterization, source apportionment, and deterministic modeling activities. This report highlights new data on road dust, vegetative detritus and motor vehicle emissions. For example, the results show significant differences in the composition of urban and rural road dust. A comparison of the organic composition of the fine particulate matter in the tunnel with ambient samples provides clear evidence of the significant contribution of vehicle emissions to ambient PM. The source profiles developed from this work are being used in the source-receptor modeling activities. The report presents results on the spatial distribution of PMF factors. The results can be grouped into three categories: regional sources, local sources, or potentially both regional and local sources. Examples of the regional sources are the sulfate and selenium PMF factors, which most likely represent coal-fired power plants. Examples of local sources are the specialty steel and lead factors. There is reasonable correspondence between these apportionments and data from the EPA TRI and AIRS emission inventories. Detailed comparisons between PMCAMx predictions and the STN and IMPROVE measurements in the Eastern US are presented. Comparisons were made for the major aerosol components and PM2.5 mass in July 2001, October 2001, January 2002, and April 2002. The results are encouraging, with average fractional biases for most species less than 0.25. The improvement in model performance during the last two years was mainly due to the comparison of model predictions with the continuous measurements at the Pittsburgh Supersite. Major improvements have included the descriptions of ammonia emissions (CMU inventory), nighttime nitrate chemistry, EC emissions and their diurnal

  18. Receptor model-based source apportionment of particulate pollution in Hyderabad, India.

    Science.gov (United States)

    Guttikunda, Sarath K; Kopakka, Ramani V; Dasari, Prasad; Gertler, Alan W

    2013-07-01

    Air quality in Hyderabad, India, often exceeds the national ambient air quality standards, especially for particulate matter (PM), which, in 2010, averaged 82.2 ± 24.6, 96.2 ± 12.1, and 64.3 ± 21.2 μg/m3 of PM10 at commercial, industrial, and residential monitoring stations, respectively, exceeding the national ambient standard of 60 μg/m3. In 2005, following an ordinance passed by the Supreme Court of India, a source apportionment study was conducted to quantify source contributions to PM pollution in Hyderabad, using the chemical mass balance (version 8.2) receptor model for 180 ambient samples collected at three stations for the PM10 and PM2.5 size fractions over three seasons. The receptor modeling results indicated that PM10 pollution is dominated by direct vehicular exhaust and road dust (more than 60%). PM2.5, which has a higher propensity to enter the human respiratory tract, has mixed sources of vehicle exhaust, industrial coal combustion, garbage burning, and secondary PM. These findings demonstrate that, in order to improve air quality in the city, emissions from all known sources need to be controlled, with particular focus on low-hanging fruit like road dust and waste burning, while technological and institutional advancements in the transport and industrial sectors are bound to enhance efficiencies. The Andhra Pradesh Pollution Control Board utilized these results to prepare an air pollution control action plan for the city.

  19. Estrogen-related receptor gamma disruption of source water and drinking water treatment processes extracts.

    Science.gov (United States)

    Li, Na; Jiang, Weiwei; Rao, Kaifeng; Ma, Mei; Wang, Zijian; Kumaran, Satyanarayanan Senthik

    2011-01-01

    Environmental chemicals in drinking water can impact human health through nuclear receptors. Estrogen-related receptors (ERRs), in particular, are vulnerable to endocrine-disrupting effects. To date, however, the ERR-disrupting potency of drinking water has not been reported. We used an ERRgamma two-hybrid yeast assay to screen for ERRgamma-disrupting activities in a drinking water treatment plant (DWTP) located in north China and in source water from a reservoir, focusing on agonistic, antagonistic, and inverse agonistic activity toward 4-hydroxytamoxifen (4-OHT). Water treatment processes in the DWTP consisted of pre-chlorination, coagulation, coal and sand filtration, activated carbon filtration, and secondary chlorination. Samples were extracted by solid phase extraction. Results showed that ERRgamma antagonistic activities were found in all sample extracts, but agonistic and inverse agonistic activity toward 4-OHT was not found. When calibrated with the toxic equivalent of 4-OHT, antagonistic effluent effects ranged from 3.4 to 33.1 μg/L. Among the treatment processes, secondary chlorination was effective in removing ERRgamma antagonists, but the coagulation process led to significantly increased ERRgamma antagonistic activity. Overall, the drinking water treatment processes removed 73.5% of ERRgamma antagonists. To our knowledge, the occurrence of ERRgamma-disrupting activities in source and drinking water in vitro had not been reported previously. It is vital, therefore, to increase our understanding of ERRgamma-disrupting activities in drinking water.

  20. Information sources in biomedical science and medical journalism: methodological approaches and assessment.

    Science.gov (United States)

    Miranda, Giovanna F; Vercellesi, Luisa; Bruno, Flavia

    2004-09-01

    Throughout the world the public is showing increasing interest in medical and scientific subjects, and journalists largely spread this information, with an important impact on knowledge and health. Clearly, therefore, the relationship between the journalist and his sources is delicate: freedom and independence of information depend on the independence and truthfulness of the sources. The new "precision journalism" holds that scientific methods should be applied to journalism, so authoritative sources are a common need for journalists and scientists. We therefore compared the classifications and methods of assessing sources in biomedical science and medical journalism, to try to extrapolate scientific methods of evaluation to journalism. In journalism and science, the terms used to classify sources of information show some similarities, but their meanings differ. In science, primary and secondary classes of information, for instance, refer to the level of processing, but in journalism to the official nature of the source itself. Scientists and journalists must both always consult as many sources as possible and check their authoritativeness, reliability, completeness, up-to-dateness and balance. In journalism, however, there are some important differences and limits: too many sources can sometimes diminish the quality of the information. The sources serve as a first filter between the event and the journalist, who is not providing the reader with the fact, but with its projection. Journalists have time constraints and lack objective criteria for searching, the specific background knowledge, and the expertise to fully assess sources. To assist in understanding the wealth of sources of information in journalism, we have prepared a checklist of items and questions. There are at least four fundamental points that a good journalist, like any scientist, should know: how to find the latest information (the sources), how to assess it (the quality and

  1. New methodology for source location and activity determination in preparation of repairing or decommissioning activities

    International Nuclear Information System (INIS)

    Toubon, H.; Boudergui, K.; Pin, P.; Nohl, B.; Lefevre, S.; Chiron, M.

    2006-01-01

    The operations of dismantling nuclear installations are often difficult due to the lack of knowledge about the position, identity and radiological characteristics of the contamination. To avoid manual mapping and simple sampling with radiochemical analysis, which take time and incur doses, new tools are now used: - CARTOGAM, to detect the position of the activity and the relative dose rates of the different hot spots; - NaI, CZT or Ge gamma spectrometers, to characterize the major radionuclides; - a model with the ISOCS gamma attenuation code or the MERCURAD-PASCALYS gamma attenuation and dose rate evaluation code. CARTOGAM and MERCURAD were developed in collaboration with CEA and COGEMA. All these tools are used to build a complete methodology that provides solutions adapted to nuclear facilities. This methodology helps to prepare for and execute decontamination and dismantling activities. After describing the methodology, examples are given of its use in the preparation of repairs at an EDF NPP site and in dismantling operations at a CEA site. These examples give concrete insights into the methodology's significance and the productivity gains it offers. (authors)

  2. A modified receptor model for source apportionment of heavy metal pollution in soil.

    Science.gov (United States)

    Huang, Ying; Deng, Meihua; Wu, Shaofu; Japenga, Jan; Li, Tingqiang; Yang, Xiaoe; He, Zhenli

    2018-07-15

    Source apportionment is a crucial step toward reducing heavy metal pollution in soil. Existing methods are generally based on receptor models. However, overestimation or underestimation occurs when they are applied to heavy metal source apportionment in soil. Therefore, a modified model (PCA-MLRD) was developed, based on principal component analysis (PCA) and multiple linear regression with distance (MLRD). This model was applied in a case study conducted in a peri-urban area in southeast China where soils were contaminated by arsenic (As), cadmium (Cd), mercury (Hg) and lead (Pb). Compared with existing models, PCA-MLRD is able to identify specific sources and quantify the extent of influence of each emission. The zinc (Zn)-Pb mine was identified as the most important anthropogenic emission, affecting approximately half of the area for Pb and As accumulation and approximately one third for Cd. Overall, the extent of influence of the anthropogenic emissions decreased in the order mine (3 km) > dyeing mill (2 km) ≈ industrial hub (2 km) > fluorescent factory (1.5 km) > road (0.5 km). Although the algorithm still needs to be improved, the PCA-MLRD model has the potential to become a useful tool for heavy metal source apportionment in soil. Copyright © 2018 Elsevier B.V. All rights reserved.

  3. Source Apportionment of Atmospheric Particles by Electron Probe X-Ray Microanalysis and Receptor Models.

    Science.gov (United States)

    van Borm, Werner August

    abundance and particle composition. Alternatively, the bulk analysis of filters (total, fine and coarse mode) using Particle-Induced X-Ray Emission (PIXE) and the application of a receptor modeling approach provided complementary information on a macroscopic level. A computer program was developed incorporating an absolute-factor-analysis-based receptor modeling procedure. Source profiles and contributions are described by elemental concentrations, and an atmospheric mass balance is put forward. The latter method was applied in a two-year study of the Antwerp urban aerosol and to the Swiss aerosol, revealing a number of previously known and unknown sources. Both methods were successfully combined to increase the source resolution.

  4. Source term and radiological consequence evaluation for nuclear accidents using a 'hand type' methodology

    International Nuclear Information System (INIS)

    Margeanu, Sorin; Angelescu, Tatiana

    2005-01-01

    In recent decades, hand calculations have been replaced by computerized solutions, which are much more accurate, but preparing an input to run a code can be time consuming and laborious. For this reason, a place for hand calculations based on nomograms still exists in some areas. An example is the emergency response to an accidental release of radioactive contaminants, when the health of persons close to the accident site might be at risk. In this case, results from computerized accident consequence assessment models may be delayed due to equipment malfunction or the time required to develop minimal input files and perform the calculations (typically more than five minutes). A simple nomogram (developed using computerized dispersion model calculations) can provide dispersion and dose estimates within a minute. The paper presents the methodology used for these 'hand type' calculations and the nomograms, figures and tables used to evaluate the dose to an individual close to the release point. To illustrate the use of the methodology, a hypothetical severe accident scenario involving the 14-MW INR-TRIGA research reactor was considered. (authors)
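The nomograms themselves are not reproduced in the record, but the kind of dispersion-and-dose estimate they encode can be sketched with a ground-level Gaussian plume dilution factor. The sigma fits below are the Briggs open-country class-D forms; the dose-conversion factor and release figures used in any example run are hypothetical, not values from the paper:

```python
import math

def chi_over_q(x, u=2.0):
    """Ground-level centreline dilution factor chi/Q (s/m^3) for a ground
    release at downwind distance x (m) with wind speed u (m/s), neutral
    (class D) stability, using Briggs open-country sigma fits."""
    sigma_y = 0.08 * x / math.sqrt(1.0 + 0.0001 * x)
    sigma_z = 0.06 * x / math.sqrt(1.0 + 0.0015 * x)
    return 1.0 / (math.pi * sigma_y * sigma_z * u)

def quick_dose(release_bq, dcf, x, u=2.0):
    """Rough dose (Sv) at distance x: release (Bq) times chi/Q times a
    dose-conversion factor dcf (Sv per Bq s/m^3) -- the quantity a
    nomogram would let a responder read off in seconds."""
    return release_bq * chi_over_q(x, u) * dcf
```

Because the result is a product of independent factors, each axis of such a nomogram can carry one factor, which is what makes the one-minute hand estimate possible.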

  5. Advanced Reactor PSA Methodologies for System Reliability Analysis and Source Term Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Grabaskas, D.; Brunett, A.; Passerini, S.; Grelle, A.; Bucknor, M.

    2017-06-26

    Beginning in 2015, a project was initiated to update and modernize the probabilistic safety assessment (PSA) of the GE-Hitachi PRISM sodium fast reactor. The project is a collaboration between GE-Hitachi and Argonne National Laboratory (Argonne), funded in part by the U.S. Department of Energy. Specifically, the role of Argonne is to assess the reliability of passive safety systems, complete a mechanistic source term calculation, and provide component reliability estimates. The assessment of passive system reliability focused on the performance of the Reactor Vessel Auxiliary Cooling System (RVACS) and the inherent reactivity feedback mechanisms of the metal fuel core. The mechanistic source term assessment attempted to provide a sequence-specific source term evaluation to quantify offsite consequences. Lastly, the reliability assessment focused on components specific to the sodium fast reactor, including electromagnetic pumps, intermediate heat exchangers, the steam generator, and sodium valves and piping.

  6. Innovative methodology for intercomparison of radionuclide calibrators using short half-life in situ prepared radioactive sources

    International Nuclear Information System (INIS)

    Oliveira, P. A.; Santos, J. A. M.

    2014-01-01

    Purpose: An original radionuclide calibrator method for activity determination is presented. The method could be used for intercomparison surveys for short half-life radioactive sources used in Nuclear Medicine, such as 99mTc or most positron emission tomography radiopharmaceuticals. Methods: By evaluating the resulting net optical density (netOD) using a standardized scanning method of irradiated Gafchromic XRQA2 film, a comparison of the netOD measurement with a previously determined calibration curve can be made and the difference between the tested radionuclide calibrator and a reference radionuclide calibrator can be calculated. To estimate the total expected measurement uncertainties, a careful analysis of the methodology, for the case of 99mTc, was performed: reproducibility determination, scanning conditions, and possible fadeout effects. Since every factor of the activity measurement procedure can influence the final result, the method also evaluates correct syringe positioning inside the radionuclide calibrator. Results: As an alternative to using a calibrated source sent to the surveyed site, which requires a relatively long half-life of the nuclide, or sending a portable calibrated radionuclide calibrator, the proposed method uses a source prepared in situ. An indirect activity determination is achieved by the irradiation of a radiochromic film using 99mTc under strictly controlled conditions, and cumulated activity calculation from the initial activity and total irradiation time. The irradiated Gafchromic film and the irradiator, without the source, can then be sent to a National Metrology Institute for evaluation of the results. Conclusions: The methodology described in this paper was shown to have good potential for accurate (3%) radionuclide calibrator intercomparison studies for 99mTc between Nuclear Medicine centers without source transfer, and can easily be adapted to other short half-life radionuclides.
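The core arithmetic of the film-based comparison — computing a net optical density from scanner readings and inverting a calibration curve to an activity — can be sketched as follows. The power-law calibration form and its constants are assumptions for illustration, not the paper's fitted curve:

```python
import math

def net_od(pv_unexposed, pv_exposed):
    """Net optical density from scanner pixel values; the film darkens
    with dose, so the exposed pixel value is lower."""
    return math.log10(pv_unexposed / pv_exposed)

def activity_from_netod(nod, a, b):
    """Invert an assumed power-law calibration netOD = a * act**b to
    recover the cumulated activity from a measured netOD."""
    return (nod / a) ** (1.0 / b)
```

The intercomparison then reduces to checking whether the activity recovered from the film agrees, within the stated 3%, with the reading of the calibrator under test.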

  7. Prioritization of renewable energy sources for Turkey by using a hybrid MCDM methodology

    International Nuclear Information System (INIS)

    Kabak, Mehmet; Dağdeviren, Metin

    2014-01-01

    Highlights: • The paper proposes a hybrid model to prioritize RE sources for Turkey. • The hybrid model based on BOCR and ANP is proposed under linguistic values. • Strategic criteria are economy, security, wellbeing, technology and global effects. • Nineteen criteria are used to evaluate hydro, geothermal, solar, wind and biomass. - Abstract: Developing countries such as Turkey, with their fast growing population and economy, are facing an increasing demand for energy. Turkey does not possess a sufficient quantity of domestic oil and natural gas resources to support this growing demand. On the other hand, the country does have abundant reserves of renewable energy that can be a major component in providing part of the overall energy supply. The country plans to explore its renewable energy (RE) sources and increase the renewable energy share in the near future. With this in mind, this paper proposes a hybrid model based on BOCR (Benefits, Opportunities, Costs and Risks) and ANP (Analytic Network Process) to determine Turkey's energy status and prioritize alternative RE sources. BOCR analysis provides a strategic analysis and detailed overview of the country's energy issues. ANP is a practical multi-criteria decision making (MCDM) method and offers the advantages of decision making models based on tangible and intangible factors. Nineteen criteria are used to evaluate five alternative RE sources (hydro, geothermal, solar, wind and biomass). The results show that the most important strategic criterion is economy; the other criteria are security, human wellbeing, technology and global effects, with weights of 0.485, 0.235, 0.130, 0.097 and 0.053, respectively. In conclusion, the authors propose hydro power as the optimal RE source for the country.
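Once the ANP pairwise comparisons have produced criterion weights, the final prioritization reduces to a weighted aggregation of alternative scores. The sketch below reuses the strategic-criterion weights quoted in the abstract; the per-alternative scores are invented for illustration and are not the paper's ANP values:

```python
def rank_alternatives(weights, scores):
    """Weighted-sum MCDM ranking. scores[alt][criterion] in [0, 1]."""
    totals = {alt: sum(weights[c] * s[c] for c in weights)
              for alt, s in scores.items()}
    return sorted(totals, key=totals.get, reverse=True), totals

weights = {"economy": 0.485, "security": 0.235, "wellbeing": 0.130,
           "technology": 0.097, "global": 0.053}
scores = {  # hypothetical normalized performance values
    "hydro": {"economy": 0.9, "security": 0.8, "wellbeing": 0.6,
              "technology": 0.8, "global": 0.6},
    "wind":  {"economy": 0.6, "security": 0.6, "wellbeing": 0.7,
              "technology": 0.7, "global": 0.8},
    "solar": {"economy": 0.5, "security": 0.6, "wellbeing": 0.8,
              "technology": 0.6, "global": 0.8},
}
ranking, totals = rank_alternatives(weights, scores)
```

ANP proper also models dependencies among criteria via a supermatrix; the weighted sum is only the last aggregation step.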

  8. Quality at the source (QATS) system design under six sigma methodology

    Energy Technology Data Exchange (ETDEWEB)

    Aguirre, F; Ballasteros, I; Maricalva, J [Empresa Nacional del Uranio, S.A. (ENUSA), Nuclear Fuel Manufacturing Plant, Juzbado, Salamanca (Spain)

    2000-07-01

    One of the main objectives in the manufacturing of fuel assemblies is to fulfill customer expectations with a product that assures its reliability during its stay in the NPP. By means of the QATS system design under the 6-Sigma methodology, all customer requirements are included in the product specifications and drawings. Product characteristics and process variables are classified and process capability is evaluated. This information makes it possible to identify CTQ (Critical to Quality) product characteristics and process variables, and to define a quality system (QATS) based on process and on-line characteristics control handled by the manufacturing workers. In the end, this system ensures continuous product quality improvement and a strong commitment to the customer requirements. (author)

  9. Quality at the source (QATS) system design under six sigma methodology

    International Nuclear Information System (INIS)

    Aguirre, F.; Ballasteros, I.; Maricalva, J.

    2000-01-01

    One of the main objectives in the manufacturing of fuel assemblies is to fulfill customer expectations with a product that assures its reliability during its stay in the NPP. By means of the QATS system design under the 6-Sigma methodology, all customer requirements are included in the product specifications and drawings. Product characteristics and process variables are classified and process capability is evaluated. This information makes it possible to identify CTQ (Critical to Quality) product characteristics and process variables, and to define a quality system (QATS) based on process and on-line characteristics control handled by the manufacturing workers. In the end, this system ensures continuous product quality improvement and a strong commitment to the customer requirements. (author)
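Evaluating process capability, as the QATS design requires before classifying CTQ characteristics, typically comes down to the Cp/Cpk indices computed from measured samples against the specification limits. A minimal sketch with illustrative data (not ENUSA's measurements):

```python
import statistics

def cp_cpk(samples, lsl, usl):
    """Process capability indices against lower/upper specification limits.
    Cp ignores centering; Cpk penalizes an off-center process mean."""
    mu = statistics.fmean(samples)
    sigma = statistics.stdev(samples)
    cp = (usl - lsl) / (6.0 * sigma)
    cpk = min(usl - mu, mu - lsl) / (3.0 * sigma)
    return cp, cpk
```

A Six Sigma process corresponds to Cp of at least 2 with the mean held on target, which is why capability evaluation precedes the decision to control a characteristic on-line.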

  10. A Quantitative Methodology for Vetting Dark Network Intelligence Sources for Social Network Analysis

    Science.gov (United States)

    2012-06-01

    Figure V-7 Source Stress Contributions for the Example; Figure V-8 ROC Curve for the Example. ... resilience is the ability of the organization "to avoid disintegration when coming under stress" (Milward & Raab, 2006, p. 351). Despite numerous ... members of the network; examples include subordinates directed to meetings in place of their superiors and virtual participation via telecommuting.

  11. A Systematic Literature Review on the relationship between agile methods and the Open Source Software Development methodology

    OpenAIRE

    Gandomani, Taghi Javdani; Zulzalil, Hazura; Ghani, Abdul Azim Abdul; Sultan, Abu Bakar Md

    2013-01-01

    Agile software development (ASD) methods and open source software development (OSSD) methods are two different approaches that were introduced in the last decade, and both have their fervent advocates. Yet the relation and interface between ASD and OSSD appear to be a fertile area, and few rigorous studies have addressed the matter. The major goal of this study was to assess the relation and integration of ASD and OSSD. Analysis of the collected data shows that ASD and OSSD are able t...

  12. Source-receptor relationships between East Asian sulfur dioxide emissions and Northern Hemisphere sulfate concentrations

    Directory of Open Access Journals (Sweden)

    J. Liu

    2008-07-01

    We analyze the effect of varying East Asian (EA) sulfur emissions on sulfate concentrations in the Northern Hemisphere, using a global coupled oxidant-aerosol model (MOZART-2). We conduct a base and five sensitivity simulations, in which sulfur emissions from each continent are tagged, to establish the source-receptor (S-R) relationship between EA sulfur emissions and sulfate concentrations over source and downwind regions. We find that from west to east across the North Pacific, EA sulfate contributes approximately 80%–20% of sulfate at the surface, but at least 50% at 500 hPa. Surface sulfate concentrations are dominated by local anthropogenic sources. Of the sulfate produced from sources other than local anthropogenic emissions (defined here as "background" sulfate), EA sources account for approximately 30%–50% (over the Western US) and 10%–20% (over the Eastern US). The surface concentrations of sulfate from EA sources over the Western US are highest in MAM (up to 0.15 μg/m3) and lowest in DJF (less than 0.06 μg/m3). Reducing EA SO2 emissions will significantly decrease the spatial extent of the EA sulfate influence (represented by the areas where at least 0.1 μg/m3 of sulfate originates from EA) over the North Pacific, both at the surface and at 500 hPa in all seasons, but the extent of influence is insensitive to emission increases, particularly in DJF and JJA. We find that EA sulfate concentrations over most downwind regions respond nearly linearly to changes in EA SO2 emissions, but sulfate concentrations over the EA source region increase more slowly than SO2 emissions, particularly at the surface and in winter, due to the limited availability of oxidants (in particular of H2O2, which oxidizes SO2 to sulfate in the aqueous phase). We find that similar estimates of the S-R relationship for trans-Pacific transport of EA sulfate would be
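The tagged-tracer bookkeeping behind such S-R estimates can be sketched as a sum of labeled contributions, where the near-linear downwind response means the EA term simply scales with the emission change. The numbers below are illustrative, not MOZART-2 output:

```python
def receptor_sulfate(contribs, ea_scale=1.0):
    """Sulfate at a receptor (ug/m3) as the sum of tagged source
    contributions, scaling only the East Asian term; linearity of the
    S-R relationship leaves the other tags unchanged."""
    c = dict(contribs)
    c["EA"] *= ea_scale
    return sum(c.values())
```

The abstract's caveat is exactly where this linear picture breaks down: over the EA source region itself, oxidant limitation makes sulfate grow more slowly than emissions.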

  13. MEAPA - Integrated methodology for alternative energy sources mapping in the State of Para - Brazil; MEAPA - Metodologias integradas para o mapeamento de energias alternativas no Estado do Para

    Energy Technology Data Exchange (ETDEWEB)

    Monteiro, Claudio; Lopes, J. Pecas; Va, Kowk P.; Herold, Helmut [Instituto de Engenharia de Sistemas e Computadores (INESC), Porto (Portugal). Power System Unit. E-mail: cmonteiro@inescn.pt; Rocha, Brigida; Pinheiro, Helten; Rocha, Olavo [Para Univ., Belem, PA (Brazil). Dept. de Engenharia Eletrica. E-mail: rocha@interconect.com.br; Silva, Isa O. [Para Univ., Belem, PA (Brazil). Dept. de Meteorologia; Moraes, Sinfronio [Para Univ., Belem, PA (Brazil). Dept. de Mecanica

    1999-07-01

    This paper describes the MEAPA project for the development of methodologies to support the integration of renewable energy sources on Marajo island, Para, Brazil. The methodologies used are described, including the construction of a geographical database; the mapping of renewable resources (wind, solar and biomass); transport costs; the cost of electric power produced by various systems; electrification scenarios; and the comparison of electrification solutions.

  14. Source term and risk calculations using Level 2+ PSA methodology

    International Nuclear Information System (INIS)

    Park, S. I.; Jea, M. S.; Jeon, K. D.

    2002-01-01

    The scope of Level 2+ PSA includes the assessment of the dose risk associated with exposure to the radioactive nuclides escaping from nuclear power plants during severe accidents. The establishment of a database for the exposure dose in Korean nuclear power plants may contribute to preparing accident management programs and periodic safety reviews. In this study, the ORIGEN, MELCOR and MACCS codes were employed to produce an integrated framework to assess the radiation source term risk. The framework was applied to a reference plant. Using IPE results, the dose rate for the reference plant was calculated quantitatively.

  15. Application of Response Surface Methodology to study the effect of different calcium sources in fish muscle-alginate restructured products

    Directory of Open Access Journals (Sweden)

    Helena María Moreno

    2011-03-01

    Sodium alginate needs the presence of calcium ions to gelify. For this reason, adding a calcium source to fish muscle mince containing sodium alginate makes gelation possible, resulting in a restructured fish product. Three different calcium sources were considered: calcium chloride (CC), calcium caseinate (CCa) and calcium lactate (CLa). Several physical properties were analyzed, including mechanical properties, colour and cooking loss. Response Surface Methodology (RSM) was used to determine the contribution of the different calcium sources to the restructured fish muscle. The calcium source that modifies the system the most is CC. A combination of CC and sodium alginate weakened the mechanical properties, as reflected in the negative linear contribution of sodium alginate. Moreover, CC by itself increased lightness and cooking loss. The mechanical properties of the restructured fish muscle were enhanced by using CCa and sodium alginate, as reflected in the negative linear contribution of sodium alginate. CCa also increased cooking loss. The role of CLa combined with sodium alginate was less pronounced in the system discussed here.
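The RSM step amounts to fitting a full second-order polynomial to each measured property as a function of the ingredient levels, then reading trends off the fitted surface. A generic least-squares sketch (the variable roles are assumptions; this is not the paper's fitted surface):

```python
import numpy as np

def fit_rsm(x1, x2, y):
    """Fit y = b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2 by
    ordinary least squares; x1 and x2 could be the calcium-source and
    alginate levels, y a texture or cooking-loss response."""
    A = np.column_stack([np.ones_like(x1), x1, x2,
                         x1 ** 2, x2 ** 2, x1 * x2])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef
```

The "negative linear contribution of sodium alginate" reported above corresponds to the sign of the fitted b2 term in such a model.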

  16. Source-receptor metrology and modeling of trace amounts of atmospheric pollutants

    International Nuclear Information System (INIS)

    Coddeville, P.

    2005-12-01

    This work deals with acid pollution and its long-distance transport, using the metrology of trace amounts of pollutants in rural environments and the identification of the emission sources at the origin of acid atmospheric fallout. Several French and foreign precipitation collectors were evaluated and field-tested. The measurement efficiency and limitations of four sampling systems for gaseous and particulate sulfur, ammonia and nitrous compounds were evaluated. The limits of the methods and the measurement uncertainties were determined and calculated. A second aspect concerns the development of receptor-oriented statistical models aimed at improving the search for emission sources in smaller areas defined by the cells of a geographical mesh. The construction of these models combines the pollution data of the sites with information about the trajectories of air masses. Results are given as probability or concentration fields revealing the areas potentially at the origin of pollutant emissions. Areas with strong pollutant emissions were detected at the Polish, Czech and German borders and identified as responsible for pollution events encountered in the Morvan region. Quantitative source-receptor relations were also established. The different atmospheric transport profiles, their related frequency and concentration, were also evaluated using a dynamical-clouds classification of air-mass back-trajectories. Finally, the first medium-term exploitation (14 years) of precipitation data from the measurement stations makes it possible to identify the different meteorological regimes of the French territory by establishing a relation with the chemical composition of rainfall. A west-east increase of rainfall acidity is observed over the French territory. The pluviometry of the north-east area being among the highest in France, it generates more important deposits of acidifying compounds.
The analysis
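Receptor-oriented models of the kind described here — combining receptor-site pollution data with air-mass back-trajectories on a geographical mesh — are commonly implemented as a Potential Source Contribution Function (PSCF). A minimal grid-cell version, simplified by omitting the low-count weighting a real analysis would apply:

```python
from collections import defaultdict

def pscf(trajectories, concentrations, threshold):
    """PSCF per grid cell: the fraction of trajectory endpoints in the
    cell that belong to 'polluted' arrivals (receptor value > threshold).

    trajectories: list of lists of (i, j) grid cells visited en route
    concentrations: receptor concentration for each arriving trajectory
    """
    polluted = defaultdict(int)   # m_ij: endpoints from polluted arrivals
    total = defaultdict(int)      # n_ij: all endpoints
    for cells, conc in zip(trajectories, concentrations):
        for cell in cells:
            total[cell] += 1
            if conc > threshold:
                polluted[cell] += 1
    return {cell: polluted[cell] / total[cell] for cell in total}
```

Cells with values near 1 are the ones a probability field would flag as likely source areas, such as the border regions mentioned above.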

  17. Review of current GPS methodologies for producing accurate time series and their error sources

    Science.gov (United States)

    He, Xiaoxing; Montillet, Jean-Philippe; Fernandes, Rui; Bos, Machiel; Yu, Kegen; Hua, Xianghong; Jiang, Weiping

    2017-05-01

    The Global Positioning System (GPS) is an important tool to observe and model geodynamic processes such as plate tectonics and post-glacial rebound. In the last three decades, GPS has seen tremendous advances in the precision of its measurements, allowing researchers to study geophysical signals through a careful analysis of daily time series of GPS receiver coordinates. However, GPS observations contain errors, and a time series can be described as the sum of a real signal and noise. The signal itself can in turn be divided into station displacements due to geophysical causes and those due to disturbing factors. Examples of the latter are errors in the realization and stability of the reference frame, corrections due to ionospheric and tropospheric delays, and GPS satellite orbit errors. There is an increasing demand for detecting millimeter to sub-millimeter ground displacement signals in order to further understand regional-scale geodetic phenomena, hence requiring further improvements in the sensitivity of GPS solutions. This paper provides a review spanning over 25 years of advances in processing strategies, error mitigation methods and noise modeling for the processing and analysis of GPS daily position time series. The processing of the observations is described step by step, mainly with three different strategies, in order to explain the weaknesses and strengths of the existing methodologies. In particular, we focus on the choice of the stochastic model of the GPS time series, which directly affects the estimation of the functional model including, for example, tectonic rates, seasonal signals and co-seismic offsets. Moreover, the geodetic community continues to develop computational methods to fully automate all phases of the analysis of GPS time series.
This idea is greatly motivated by the large number of GPS receivers installed around the world for diverse applications ranging from surveying small deformations of civil engineering structures (e
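The functional model discussed in the review — an offset, a linear rate and seasonal signals estimated from daily coordinates — can be sketched as an ordinary least-squares fit. This assumes white noise; a realistic analysis would use the power-law noise models the review emphasizes, which mainly inflate the rate uncertainty:

```python
import numpy as np

def fit_position_model(t, x):
    """OLS fit of a GPS coordinate time series x(t), t in years, to an
    offset, a linear rate, and annual + semi-annual sinusoids."""
    w = 2.0 * np.pi
    A = np.column_stack([np.ones_like(t), t,
                         np.sin(w * t), np.cos(w * t),
                         np.sin(2 * w * t), np.cos(2 * w * t)])
    coef, *_ = np.linalg.lstsq(A, x, rcond=None)
    return coef   # [offset, rate, s_annual, c_annual, s_semi, c_semi]
```

Co-seismic offsets would enter as additional step (Heaviside) columns at known epochs.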

  18. Methodologies for estimating air emissions from three non-traditional source categories: Oil spills, petroleum vessel loading and unloading, and cooling towers. Final report, October 1991-March 1993

    International Nuclear Information System (INIS)

    Ramadan, W.; Sleva, S.; Dufner, K.; Snow, S.; Kersteter, S.L.

    1993-04-01

    The report discusses part of EPA's program to identify and characterize emissions sources not currently accounted for by either the existing Aerometric Information Retrieval System (AIRS) or State Implementation Plan (SIP) area source methodologies, and to develop appropriate emissions estimation methodologies and emission factors for a group of these source categories. Based on the results of the identification and characterization portions of this research, three source categories were selected for methodology and emission factor development: oil spills, petroleum vessel loading and unloading, and cooling towers. The report describes the category selection process and presents emissions estimation methodologies and emission factor data for the selected source categories. The discussion of each category includes general background information, emissions-generating activities, pollutants emitted, sources of activity and pollutant data, emissions estimation methodologies, and data issues. The information used in these discussions was derived from various sources, including the available literature, industrial and trade association publications and contacts, experts on the category and activity, and knowledgeable federal and state personnel.

  19. Study and methodologies for fixing epoxy resin in radioactive sources used for brachytherapy

    Energy Technology Data Exchange (ETDEWEB)

    Rodrigues, Bruna T.; Rostelato, Maria E.C.M.; Souza, Carla D.; Tozetti, Cíntia A.; Zeituni, Carlos A.; Nogueira, Beatriz R.; Silva, José T.; Júnior, Dib K.; Fernandes, Vagner; Souza, Raquel V.; Abreu, Rodrigo T., E-mail: bteigarodrigues@gmail.com, E-mail: elisaros@ipen.br, E-mail: carladdsouza@yahoo.com.br [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil); Universidade de São Paulo (USP), SP (Brazil)

    2017-07-01

    The World Health Organization (WHO) estimates that the number of new cancer cases worldwide will reach 15 million by 2020. The disease is already the second leading cause of death worldwide, behind only cardiovascular disease. It is unquestionably a public health problem, especially among developing countries. Prostate cancer is the most common among men, at approximately 28.6%. The choice of treatment for prostate cancer should consider several factors, such as tumor size and extent, apparent aggressiveness (pathological characteristics), age, and health. Among the methods applied, brachytherapy has been used in the initial and intermediate stages of the disease and is a safe and effective treatment for localized prostate cancer. Brachytherapy is a form of radiotherapy in which radioactive seeds are placed in contact with or within the organ being treated. This technique allows a large dose of radiation to be released only on the target tumor, protecting healthy surrounding tissues. Sources may have different shapes and sizes, but the one used for prostate cancer is usually 4.5 mm in length and 0.8 mm in diameter. About 80 to 120 seeds can be used per patient. Iodine-125 is the radioisotope most used in brachytherapy of the prostate; it emits 35.49 keV X-rays in 100% of the decays, with an average energy of 29 keV. The treatment of prostate cancer with permanent implantation of Iodine-125 seeds has grown dramatically worldwide in recent years. Most patients can return to normal life within three days with little or no pain. (author)

  20. Study and methodologies for fixing epoxy resin in radioactive sources used for brachytherapy

    International Nuclear Information System (INIS)

    Rodrigues, Bruna T.; Rostelato, Maria E.C.M.; Souza, Carla D.; Tozetti, Cíntia A.; Zeituni, Carlos A.; Nogueira, Beatriz R.; Silva, José T.; Júnior, Dib K.; Fernandes, Vagner; Souza, Raquel V.; Abreu, Rodrigo T.

    2017-01-01

    The World Health Organization (WHO) estimates that the number of new cancer cases worldwide will reach 15 million by 2020. The disease is already the second leading cause of death worldwide, behind only cardiovascular disease. It is unquestionably a public health problem, especially among developing countries. Prostate cancer is the most common among men, at approximately 28.6%. The choice of treatment for prostate cancer should consider several factors, such as tumor size and extent, apparent aggressiveness (pathological characteristics), age, and health. Among the methods applied, brachytherapy has been used in the initial and intermediate stages of the disease and is a safe and effective treatment for localized prostate cancer. Brachytherapy is a form of radiotherapy in which radioactive seeds are placed in contact with or within the organ being treated. This technique allows a large dose of radiation to be released only on the target tumor, protecting healthy surrounding tissues. Sources may have different shapes and sizes, but the one used for prostate cancer is usually 4.5 mm in length and 0.8 mm in diameter. About 80 to 120 seeds can be used per patient. Iodine-125 is the radioisotope most used in brachytherapy of the prostate; it emits 35.49 keV X-rays in 100% of the decays, with an average energy of 29 keV. The treatment of prostate cancer with permanent implantation of Iodine-125 seeds has grown dramatically worldwide in recent years. Most patients can return to normal life within three days with little or no pain. (author)

  1. Methodology study for fixation of radioactive iodine in polymeric substrate for brachytherapy sources

    International Nuclear Information System (INIS)

    Rodrigues, Bruna T.; Rostelato, Maria Elisa C.M.; Souza, Carla D.; Tiezzi, Rodrigo; Souza, Daiane B. de; Benega, Marcos A.G.; Souza, Anderson S. de; Peleias Junior, Fernando S.; Zeituni, Calos A.; Fernandes, Vagner; Melo, Emerson Ronaldo de; Camargo, Anderson Rogerio de

    2015-01-01

    Cancer is now the second leading cause of death by disease in several countries, including Brazil. Prostate cancer is the most common among men. Brachytherapy is a modality of radiotherapy in which radioactive seeds are placed inside or in contact with the organ to be treated. The radioisotope most widely used in prostate brachytherapy is Iodine-125, which is fixed on a silver substrate that is subsequently placed inside a titanium capsule. A large dose of radiation is released only in the targeted tumor, protecting healthy surrounding tissues. The technique requires the application of 80-120 seeds per patient. Seed implantation is a low-impact, non-surgical procedure, and most patients can return to normal life within three days with little or no pain. This work proposes an alternative to the seeds already developed, in order to reduce the cost by obtaining better efficiency in fixing the radioactive iodine onto the epoxy resin. Methods were developed to perform the fixation of Iodine-125 onto polymeric substrates. The parameters analyzed were the immersion time, the type of reaction (static or dynamic), the concentration of the adsorption solution, the specific activity of the radioactive source, the need for a carrier, and the chemical form of the radioactive iodine. These experiments defined the most effective method to fix the iodine onto the polymeric material (epoxy resin), the iodine activity in the polymeric substrate, the variation of the activity distribution across a batch of polymeric cores, and the efficiency of the epoxy resin in sealing the seed. (author)

  2. Developing methodologies for source attribution. Glass phase separation in Trinitite using NF{sub 3}

    Energy Technology Data Exchange (ETDEWEB)

    Koeman, Elizabeth C.; Simonetti, Antonio [Notre Dame Univ., IN (United States). Dept. of Civil and Environmental Engineering and Earth Sciences; McNamara, Bruce K.; Smith, Frances N. [Pacific Northwest National Laboratory, Richland, WA (United States). Nuclear Chemistry and Engineering; Burns, Peter C. [Notre Dame Univ., IN (United States). Dept. of Civil and Environmental Engineering and Earth Sciences; Notre Dame Univ., IN (United States). Dept. of Chemistry and Biochemistry

    2017-08-01

    separation of silica from minerals (i.e. naturally occurring crystalline materials) and glasses (i.e. amorphous materials), leaving behind non-volatile fluorinated species and refractory phases. The results from our investigation clearly indicate that the NF{sub 3} treatment of nuclear materials is a technique that provides effective separation of bomb components from complex matrices (e.g. post-detonation samples), which will aid with rapid and accurate source attribution.

  3. Multi-model Estimates of Intercontinental Source-Receptor Relationships for Ozone Pollution

    Energy Technology Data Exchange (ETDEWEB)

    Fiore, A M; Dentener, F J; Wild, O; Cuvelier, C; Schultz, M G; Hess, P; Textor, C; Schulz, M; Doherty, R; Horowitz, L W; MacKenzie, I A; Sanderson, M G; Shindell, D T; Stevenson, D S; Szopa, S; Van Dingenen, R; Zeng, G; Atherton, C; Bergmann, D; Bey, I; Carmichael, G; Collins, W J; Duncan, B N; Faluvegi, G; Folberth, G; Gauss, M; Gong, S; Hauglustaine, D; Holloway, T; Isaksen, I A; Jacob, D J; Jonson, J E; Kaminski, J W; Keating, T J; Lupu, A; Marmer, E; Montanaro, V; Park, R; Pitari, G; Pringle, K J; Pyle, J A; Schroeder, S; Vivanco, M G; Wind, P; Wojcik, G; Wu, S; Zuber, A

    2008-10-16

    Understanding the surface O{sub 3} response over a 'receptor' region to emission changes over a foreign 'source' region is key to evaluating the potential gains from an international approach to abate ozone (O{sub 3}) pollution. We apply an ensemble of 21 global and hemispheric chemical transport models to estimate the spatial average surface O{sub 3} response over East Asia (EA), Europe (EU), North America (NA) and South Asia (SA) to 20% decreases in anthropogenic emissions of the O{sub 3} precursors, NO{sub x}, NMVOC, and CO (individually and combined), from each of these regions. We find that the ensemble mean surface O{sub 3} concentrations in the base case (year 2001) simulation match available observations throughout the year over EU but overestimate them by >10 ppb during summer and early fall over the eastern U.S. and Japan. The sum of the O{sub 3} responses to NO{sub x}, CO, and NMVOC decreases separately is approximately equal to that from a simultaneous reduction of all precursors. We define a continental-scale 'import sensitivity' as the ratio of the O{sub 3} response to the 20% reductions in foreign versus 'domestic' (i.e., over the source region itself) emissions. For example, the combined reduction of emissions from the 3 foreign regions produces an ensemble spatial mean decrease of 0.6 ppb over EU (0.4 ppb from NA), less than the 0.8 ppb from the reduction of EU emissions, leading to an import sensitivity ratio of 0.7. The ensemble mean surface O{sub 3} response to foreign emissions is largest in spring and late fall (0.7-0.9 ppb decrease in all regions from the combined precursor reductions in the 3 foreign regions), with import sensitivities ranging from 0.5 to 1.1 (responses to domestic emission reductions are 0.8-1.6 ppb). High O{sub 3} values are much more sensitive to domestic emissions than to foreign emissions, as indicated by lower import sensitivities of 0.2 to 0.3 during July in EA, EU, and NA.
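The import sensitivity defined in this study is simply the ratio of the receptor's O3 response to equal fractional foreign and domestic emission cuts. A one-line sketch, with hypothetical response values rather than the ensemble means:

```python
def import_sensitivity(d_o3_foreign, d_o3_domestic):
    """Ratio of the surface-O3 decrease from a 20% foreign emission
    reduction to that from the same 20% domestic reduction (ppb/ppb)."""
    return d_o3_foreign / d_o3_domestic
```

Values near 1 mean foreign abatement buys almost as much as domestic abatement at the receptor; the low July values quoted above mean peak-ozone episodes are governed mainly by domestic emissions.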

  4. Quantification of Greenhouse Gas Emission Rates from strong Point Sources by Airborne IPDA-Lidar Measurements: Methodology and Experimental Results

    Science.gov (United States)

    Ehret, G.; Amediek, A.; Wirth, M.; Fix, A.; Kiemle, C.; Quatrevalet, M.

    2016-12-01

    We report on a new method, and its first demonstration, for quantifying emission rates from strong greenhouse gas (GHG) point sources using airborne Integrated Path Differential Absorption (IPDA) lidar measurements. In order to build trust in the emission rates self-reported by countries, verification against independent monitoring systems is a prerequisite for checking the reported budgets. A significant fraction of the total anthropogenic emissions of CO2 and CH4 originates from localized strong point sources such as large energy production sites or landfills, neither of which is monitored with sufficient accuracy by the current observation system. There is a debate whether airborne remote sensing could fill this gap by inferring emission rates from budgeting or from Gaussian plume inversion approaches, whereby measurements of the GHG column abundance beneath the aircraft constrain inverse models. In contrast to passive sensors, the use of an active instrument like CHARM-F for such emission verification measurements is new. CHARM-F is a new airborne IPDA lidar devised for the German research aircraft HALO for the simultaneous measurement of the column-integrated dry-air mixing ratios of CO2 and CH4, commonly denoted XCO2 and XCH4, respectively. It has successfully been tested in a series of flights over Central Europe to assess its performance under various reflectivity conditions and in strongly varying topography such as the Alps. The analysis of a methane plume measured in the crosswind direction of a coal mine ventilation shaft revealed an instantaneous emission rate of 9.9 ± 1.7 kt CH4 yr-1. We discuss the methodology of our point source estimation approach and give an outlook on the CoMet field experiment scheduled in 2017, which will measure anthropogenic and natural GHG emissions with a combination of active and passive remote sensing instruments on research aircraft.

  5. Source apportionment of PM10 and PM2.5 in major urban Greek agglomerations using a hybrid source-receptor modeling process.

    Science.gov (United States)

    Argyropoulos, G; Samara, C; Diapouli, E; Eleftheriadis, K; Papaoikonomou, K; Kungolos, A

    2017-12-01

    A hybrid source-receptor modeling process was assembled to apportion and infer source locations of PM 10 and PM 2.5 in three heavily-impacted urban areas of Greece during the warm period of 2011 and the cold period of 2012. The assembled process involved application of an advanced computational procedure, the so-called Robotic Chemical Mass Balance (RCMB) model. Source locations were inferred using two well-established probability functions: (a) the Conditional Probability Function (CPF), to correlate the output of RCMB with local wind directional data, and (b) the Potential Source Contribution Function (PSCF), to correlate the output of RCMB with 72-h air-mass back-trajectories arriving at the receptor sites during sampling. Regarding CPF, a higher-level conditional probability function was also defined, from the common locus of CPF sectors derived for neighboring receptor sites. With respect to PSCF, a non-parametric bootstrapping method was applied to discriminate the statistically significant values. RCMB modeling showed that resuspended dust is actually one of the main barriers to attaining the European Union (EU) limit values in Mediterranean urban agglomerations, where the drier climate favors its build-up. The shift in the energy mix of Greece (caused by the economic recession) was also evident, since biomass burning was found to contribute more significantly at the sampling sites in the coldest climatic zone, particularly during the cold period. The CPF analysis showed that short-range transport of anthropogenic emissions from urban traffic to urban background sites was very likely to have occurred within all the examined urban agglomerations. The PSCF analysis confirmed that long-range transport of primary and/or secondary aerosols may indeed be possible, even from distances over 1000 km away from the study areas. Copyright © 2017 Elsevier B.V. All rights reserved.
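The CPF used here is conventionally computed per wind sector as m/n: the fraction of samples arriving from that sector whose modeled source contribution exceeds a high threshold. A minimal sketch (the sector width and threshold quantile are illustrative defaults, not values taken from the paper):

```python
import numpy as np

def cpf(wind_dir_deg, contributions, sector_width=45.0, threshold_quantile=0.75):
    """Conditional Probability Function: for each wind sector, the fraction of
    samples whose modeled source contribution exceeds a high threshold."""
    wind_dir_deg = np.asarray(wind_dir_deg, dtype=float)
    contributions = np.asarray(contributions, dtype=float)
    threshold = np.quantile(contributions, threshold_quantile)
    sectors = (wind_dir_deg % 360.0) // sector_width
    result = {}
    for s in np.unique(sectors):
        in_sector = sectors == s
        n = int(in_sector.sum())                               # all samples in sector
        m = int((contributions[in_sector] > threshold).sum())  # "high" samples
        result[s * sector_width] = m / n if n else 0.0         # key: sector start angle
    return result
```

Sectors whose CPF approaches 1 point toward the likely source direction for that factor.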

  6. Sediment source fingerprinting as an aid to catchment management: A review of the current state of knowledge and a methodological decision-tree for end-users

    Science.gov (United States)

    Collins, A.L.; Pulley, S.; Foster, I.D.L.; Gellis, Allen; Porto, P.; Horowitz, A.J.

    2017-01-01

    The growing awareness of the environmental significance of fine-grained sediment fluxes through catchment systems continues to underscore the need for reliable information on the principal sources of this material. Source estimates are difficult to obtain using traditional monitoring techniques, but sediment source fingerprinting or tracing procedures have emerged as a potentially valuable alternative. Despite the rapidly increasing number of studies reporting the use of sediment source fingerprinting, several key challenges and uncertainties continue to hamper consensus among the international scientific community on key components of the existing methodological procedures. Accordingly, this contribution reviews and presents recent developments for several key aspects of fingerprinting, namely: sediment source classification, catchment source and target sediment sampling, tracer selection, grain size issues, tracer conservatism, source apportionment modelling, and assessment of source predictions using artificial mixtures. Finally, a decision-tree representing the current state of knowledge is presented, to guide end-users in applying the fingerprinting approach.

  7. On the autarchic use of solely PIXE data in particulate matter source apportionment studies by receptor modeling

    Energy Technology Data Exchange (ETDEWEB)

    Lucarelli, F. [Department of Physics and Astronomy, University of Florence, Via G. Sansone 1, 50019 Sesto Fiorentino (Italy); National Institute of Nuclear Physics (INFN)-Florence, Via G. Sansone 1, 50019 Sesto Fiorentino (Italy); Nava, S., E-mail: nava@fi.infn.it [National Institute of Nuclear Physics (INFN)-Florence, Via G. Sansone 1, 50019 Sesto Fiorentino (Italy); Calzolai, G. [Department of Physics and Astronomy, University of Florence, Via G. Sansone 1, 50019 Sesto Fiorentino (Italy); Chiari, M. [National Institute of Nuclear Physics (INFN)-Florence, Via G. Sansone 1, 50019 Sesto Fiorentino (Italy); Giannoni, M.; Traversi, R.; Udisti, R. [Department of Chemistry, University of Florence, Via della Lastruccia 3, 50019 Sesto Fiorentino (Italy)

    2015-11-15

    Particle Induced X-ray Emission (PIXE) analysis of aerosol samples allows simultaneous detection of several elements, including important tracers of many particulate matter sources. This capability, together with the possibility of analyzing a high number of samples in very short times, makes PIXE a very effective tool for source apportionment studies by receptor modeling. However, important aerosol components, like nitrates, OC and EC, cannot be assessed by PIXE: this limitation may strongly compromise the results of a source apportionment study based on PIXE data alone. In this work, an experimental dataset characterised by an extended chemical speciation (elements, EC–OC, ions) is used to test the effect of reducing the input species in the application of one of the most widely used receptor models, namely Positive Matrix Factorization (PMF). The main effect of using only PIXE data is that the secondary nitrate source is not identified and the contribution of biomass burning is overestimated, probably due to the similar seasonal patterns of these two sources.
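The effect of truncating the input species can be imitated with any non-negative factorization; the sketch below uses scikit-learn's NMF as a simplified stand-in for PMF (PMF additionally weights residuals by measurement uncertainties, which plain NMF does not), with synthetic profiles and species of my own choosing:

```python
import numpy as np
from sklearn.decomposition import NMF  # non-negative factorization, used here
                                       # as a simplified stand-in for PMF

rng = np.random.default_rng(0)
# Two synthetic source profiles over 6 species; the last two (NO3-, OC)
# stand for species a PIXE-only dataset would lack. All values invented.
species = ["Fe", "Ca", "K", "Pb", "NO3-", "OC"]
profiles = np.array([
    [0.40, 0.30, 0.05, 0.05, 0.00, 0.20],   # crustal-like source
    [0.05, 0.05, 0.30, 0.10, 0.30, 0.20],   # combustion/secondary-like source
])
contribs = rng.uniform(0.5, 5.0, size=(200, 2))             # daily source strengths
X = contribs @ profiles + rng.uniform(0.0, 0.01, (200, 6))  # "measured" matrix

def fit_sources(X, k):
    model = NMF(n_components=k, init="nndsvda", max_iter=2000)
    G = model.fit_transform(X)   # sample-by-source contributions
    F = model.components_        # source-by-species profiles
    return G, F

G_full, F_full = fit_sources(X, 2)          # full chemical speciation
G_pixe, F_pixe = fit_sources(X[:, :4], 2)   # "PIXE-only": NO3- and OC dropped
# With NO3- and OC removed, a secondary-nitrate-like factor cannot be
# resolved; its mass folds into the remaining factors.
```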

  8. Natural killer cells as a promising tool to tackle cancer-A review of sources, methodologies, and potentials.

    Science.gov (United States)

    Preethy, Senthilkumar; Dedeepiya, Vidyasagar Devaprasad; Senthilkumar, Rajappa; Rajmohan, Mathaiyan; Karthick, Ramalingam; Terunuma, Hiroshi; Abraham, Samuel J K

    2017-07-04

    Immune cell-based therapies are emerging as a promising tool to tackle malignancies, both solid tumors and selected hematological tumors. A vast body of literature has documented their safety and the added survival benefit when such cell-based therapies are combined with existing treatment options. Numerous methodologies for processing and in vitro expansion of immune cells, such as dendritic cells, natural killer (NK) cells, NKT cells, αβ T cells, so-called activated T lymphocytes, γδ T cells, cytotoxic T lymphocytes, and lymphokine-activated killer cells, have been reported for use in cell-based therapies. Among this handful of significant immune cells, NK cells stand apart not only for their direct cytotoxic ability against cancer cells but also for their added advantages, which include the capability of (i) acting through both innate and adaptive immune mechanisms, (ii) also tackling viruses, giving benefits in conditions where viral infections culminate in cancer, and (iii) destroying cancer stem cells, thereby preventing resistance to chemotherapy and radiotherapy. This review thoroughly analyses the sources of such NK cells, methods for their expansion, and the future potential of in vitro expanded allogeneic NK cells with good cytotoxic ability as a drug for treating cancer and/or viral infection, and even as a prophylactic tool for prevention of cancer after initial remission.

  9. SPET imaging of central muscarinic receptors with (R,R)[123I]-I-QNB: methodological considerations

    International Nuclear Information System (INIS)

    Norbury, R.; Travis, M.J.; Erlandsson, K.; Waddington, W.; Owens, J.; Ell, P.J.; Murphy, D.G.

    2004-01-01

    Investigations of the effect of normal healthy ageing on the muscarinic system have shown conflicting results. In addition, in vivo determination of muscarinic receptor binding has been hampered by a lack of subtype-selective ligands and by differences in the methods used for quantification of receptor densities. Recent in vitro and in vivo work with the muscarinic antagonist (R,R)-I-QNB indicates that this ligand is selective for the m1 and m4 muscarinic receptor subtypes. Therefore, we used (R,R)-[123I]-I-QNB and single photon emission tomography to study brain m1 and m4 muscarinic receptors in 25 healthy female subjects (11 younger subjects, age range 26-32 years, and 14 older subjects, age range 57-82 years). Our aims were to ascertain the viability of tracer administration and imaging within the same day, and to evaluate whether normalization to whole brain, compared to normalization to cerebellum, could alter the clinical interpretation of results. Images were analyzed using the simplified reference tissue model and by two ratio methods: normalization to whole brain and normalization to cerebellum. Significant correlations were observed between kinetic analysis and normalization to cerebellum, but not normalization to whole brain. Both the kinetic analysis and normalization to cerebellum showed age-related reductions in muscarinic binding in frontal, orbitofrontal, and parietal regions. Normalization to whole brain, however, failed to detect age-related changes in any region. Here we show that, for this radiotracer, normalizing to a region of negligible specific binding (cerebellum) significantly improves sensitivity compared to global normalization.
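A toy numeric illustration of the normalization issue described above (all counts are invented): if the whole-brain signal itself declines with age roughly in proportion to regional binding, region/whole-brain ratios stay flat and mask the effect, while region/cerebellum ratios, anchored to a region with negligible specific binding, reveal it.

```python
# Illustrative regional counts only (all numbers invented): frontal specific
# binding falls with age, cerebellum does not, and the whole-brain signal
# falls roughly in proportion to the regional decline.
young = {"frontal": 10.0, "cerebellum": 2.0, "whole_brain": 6.0}
old   = {"frontal":  8.0, "cerebellum": 2.0, "whole_brain": 4.8}

ratio_cb_young = young["frontal"] / young["cerebellum"]   # 5.0
ratio_cb_old   = old["frontal"] / old["cerebellum"]       # 4.0 -> 20% drop seen
ratio_wb_young = young["frontal"] / young["whole_brain"]  # ~1.67
ratio_wb_old   = old["frontal"] / old["whole_brain"]      # ~1.67 -> drop masked
```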

  10. Provision of a simplified methodology for determining estradiol and progesterone receptors in human breast tumours. Internal and external quality control

    International Nuclear Information System (INIS)

    Farinate, Z.

    1990-10-01

    A simplified assay for the detection of progesterone receptors (PR) in human breast tissue is described. Tissue is stored at -20 deg. C rather than -70 deg. C, and a centrifugation speed of 20,000 rpm avoids the requirement for an ultracentrifuge. Cytosol preparations obtained from homogenized, oestradiol benzoate-primed Wistar rat uteri performed satisfactorily as positive controls, with a stability of two months in liquid nitrogen. The use of an iodinated tracer (progesterone 11-alpha-glucuronide [125I]iodotyramine) proved disappointing in the progesterone receptor assay, in contrast to [125I]oestradiol, which worked well in a previously developed oestrogen receptor assay. Hydroxylapatite was a better separating agent than dextran-coated charcoal in both assays and yielded better sensitivity, particularly when protein concentrations were low. Five breast cancer specimens assayed yielded, by Scatchard analysis, Kd values between 12 and 22x10-9 M, comparable to the positive controls. However, two of these had a binding site capacity of less than 5 fmol/mg cytosol, whereas in the three others and the positive controls values ranged from 47-196 fmol/mg cytosol. 28 refs, 6 figs, 14 tabs
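The Scatchard analysis mentioned above fits bound/free against bound: the slope gives -1/Kd and the x-intercept gives Bmax. A sketch with synthetic saturation data (the true Kd and Bmax below are invented, chosen to sit inside the reported ranges):

```python
import numpy as np

def scatchard_fit(bound, free):
    """Scatchard analysis: bound/free = Bmax/Kd - bound/Kd, so a straight-line
    fit of bound/free vs. bound gives slope = -1/Kd and x-intercept = Bmax."""
    bound = np.asarray(bound, dtype=float)
    ratio = bound / np.asarray(free, dtype=float)
    slope, intercept = np.polyfit(bound, ratio, 1)
    kd = -1.0 / slope
    bmax = -intercept / slope
    return kd, bmax

# Synthetic saturation data with Kd = 15e-9 M, Bmax = 100 fmol/mg (invented)
kd_true, bmax_true = 15e-9, 100.0
free = np.array([2.0, 5.0, 10.0, 20.0, 40.0, 80.0]) * 1e-9   # free ligand, M
bound = bmax_true * free / (kd_true + free)                  # binding isotherm
kd, bmax = scatchard_fit(bound, free)   # recovers ~15e-9 and ~100
```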

  11. The Freight Analysis Framework Version 4 (FAF4) - Building the FAF4 Regional Database: Data Sources and Estimation Methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Hwang, Ho-Ling [ORNL; Hargrove, Stephanie [ORNL; Chin, Shih-Miao [ORNL; Wilson, Daniel W [ORNL; Taylor, Rob D [ORNL; Davidson, Diane [ORNL

    2016-09-01

    The Freight Analysis Framework (FAF) integrates data from a variety of sources to create a comprehensive national picture of freight movements among states and major metropolitan areas by all modes of transportation. It provides a national picture of current freight flows to, from, and within the United States, assigns the flows to the transportation network, and projects freight flow patterns into the future. The FAF4 is the fourth database of its kind: FAF1 provided estimates of truck, rail, and water tonnage for calendar year 1998; FAF2 provided a more complete picture based on the 2002 Commodity Flow Survey (CFS); and FAF3 made further improvements building on the 2007 CFS. Since the first FAF effort, a number of changes in both data sources and products have taken place. The FAF4 flow matrix described in this report is used as the base-year data to forecast future freight activities, projecting shipment weights and values from year 2020 to 2045 in five-year intervals. It also provides the basis for annual estimates to the FAF4 flow matrix, aiming to provide users with the timeliest data. Furthermore, FAF4 truck freight is routed on the national highway network to produce the FAF4 network database and flow assignments for truck. This report details the data sources and methodologies applied to develop the base year 2012 FAF4 database. An overview of the FAF4 components is briefly discussed in Section 2. Effects on FAF4 from the changes in the 2012 CFS are highlighted in Section 3. Section 4 provides a general discussion of the process used in filling data gaps within the domestic CFS matrix, specifically the estimation of CFS suppressed/unpublished cells. Over a dozen CFS out-of-scope (OOS) components of FAF4 are then addressed in Section 5 through Section 11 of this report. This includes discussions of farm-based agricultural shipments in Section 5 and shipments from the fishery and logging sectors in Section 6. Shipments of municipal solid wastes and debris from construction

  12. receptores [receptors]

    Directory of Open Access Journals (Sweden)

    Salete Regina Daronco Benetti

    2006-01-01

    Full Text Available This is an ethnographic study whose objective was to interpret the system of knowledge and the meanings attributed to blood and blood transfusion by the donors and recipients of a blood bank. Information was collected through participant observation and ethnographic interviews, and domain, taxonomic and thematic analyses were performed. The cultural domains were: blood is life: source of life and precious nourishment; religious beliefs: symbolic sources of support; blood donation: a collaborative gesture that demands self-care, gratifies and brings happiness; blood donation: a symbolic source of insecurity; being ill is a condition for receiving a blood transfusion; blood transfusion: hope of life; popular beliefs: blood transfusion as a health risk; blood donors: blessed people; donating and receiving blood: meanings of happiness. Theme: "a precious liquid that originates, sustains and modifies life, and provokes fear and insecurity".

  13. Source contribution analysis of surface particulate polycyclic aromatic hydrocarbon concentrations in northeastern Asia by source–receptor relationships

    International Nuclear Information System (INIS)

    Inomata, Yayoi; Kajino, Mizuo; Sato, Keiichi; Ohara, Toshimasa; Kurokawa, Jun-ichi; Ueda, Hiromasa; Tang, Ning; Hayakawa, Kazuichi; Ohizumi, Tsuyoshi; Akimoto, Hajime

    2013-01-01

    We analyzed the source–receptor relationships for particulate polycyclic aromatic hydrocarbon (PAH) concentrations in northeastern Asia using an aerosol chemical transport model. The model successfully simulated the observed concentrations. In Beijing (China), benzo[a]pyrene (BaP) concentrations are due to emissions from its own domain. In Noto, Oki and Tsushima (Japan), transboundary transport from northern China (>40°N, 40–60%) and central China (30–40°N, 10–40%) largely influences BaP concentrations from winter to spring, whereas the relative contribution from central China is dominant (90%) in Hedo. In the summer, the contribution from Japanese domestic sources increases (40–80%) at the 4 sites. Contributions from Japan and Russia are additional sources of BaP over the northwestern Pacific Ocean in summer. The contribution rates from each domain differ among PAH species, depending on their particulate-phase oxidation rates. Reaction with O 3 on particulate surfaces may be an important component of the PAH oxidation processes. -- Highlights: •Source–receptor analysis was conducted to investigate PAHs in northeast Asia. •In winter, transboundary transport from China makes a large contribution at leeward sites. •The relative contribution from Korea, Japan, and eastern Russia increases in summer. •This seasonal variation is strongly controlled by meteorological conditions. •The transport distance differs among PAH species. -- Transboundary transport of PAHs in northeast Asia was investigated by source–receptor analysis

  14. Endocrine Disrupters in Human Blood and Breast Milk: Extraction Methodologies, Cellular Uptake and Effect on Key Nuclear Receptor Functions

    DEFF Research Database (Denmark)

    Hjelmborg, Philip Sebastian

    2010-01-01

    -products from incineration plants, plastic additives, technical industry products, pesticides from the farming industry and detergent degradation products. Many of these substances can interfere with the hormonal system in organisms. The common name for these compounds is endocrine disrupters (EDCs). Some EDCs...... are persistent to degradation and are also called persistent organic pollutants (POPs). Endocrine disrupters are compounds that can interfere with an organism’s hormone system by interacting with the hormone receptors. Many of an organism’s body functions are controlled by interactions between hormones...

  15. Source-receptor relationships for speciated atmospheric mercury at the remote Experimental Lakes Area, northwestern Ontario, Canada

    Directory of Open Access Journals (Sweden)

    I. Cheng

    2012-02-01

    Full Text Available Source-receptor relationships for speciated atmospheric mercury measured at the Experimental Lakes Area (ELA), northwestern Ontario, Canada, were investigated using various receptor-based approaches. The data used in this study include gaseous elemental mercury (GEM), mercury bound to fine airborne particles (<2.5 μm; PHg), reactive gaseous mercury (RGM), major inorganic ions, sulphur dioxide, nitric acid gas, ozone, and meteorological variables, all of which were measured between May 2005 and December 2006. The source origins identified were related to transport of industrial and combustion emissions (associated with elevated GEM), photochemical production of RGM (associated with elevated RGM), road-salt particles with absorption of gaseous Hg (associated with elevated PHg and RGM), crustal/soil emissions, and background pollution. Back trajectory modelling illustrated that a remote site like ELA is affected by distant Hg point sources in Canada and the United States. The sources identified from correlation analysis, principal components analysis and K-means cluster analysis were generally consistent. The discrepancies between the K-means and hierarchical cluster analyses were in the clusters related to transport of industrial/combustion emissions, photochemical production of RGM, and crustal/soil emissions. Although it was possible to assign the clusters to these source origins, the trajectory plots for the hierarchical clusters were similar to some of the trajectories belonging to several K-means clusters. This likely occurred because the variables indicative of transport of industrial/combustion emissions were elevated in at least two or more of the clusters, which means this Hg source was well represented in the data.
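The K-means step of such an analysis can be sketched with scikit-learn; the species list, concentration values and cluster count below are invented for illustration, not taken from the ELA dataset:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(2)
# Synthetic hourly records: [GEM, PHg, RGM, SO2, O3]; all values invented.
background = rng.normal([1.5, 2.0, 1.0, 0.5, 30.0], 0.2, size=(150, 5))
industrial = rng.normal([2.5, 8.0, 3.0, 3.0, 35.0], 0.4, size=(50, 5))
X = np.vstack([background, industrial])

Xs = StandardScaler().fit_transform(X)  # cluster on standardized species
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(Xs)
# Records in the "industrial/combustion transport" cluster share elevated
# GEM/PHg/RGM/SO2; back-trajectories would then be examined per cluster.
```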

  16. Life history theory and breast cancer risk: methodological and theoretical challenges: Response to "Is estrogen receptor negative breast cancer risk associated with a fast life history strategy?".

    Science.gov (United States)

    Aktipis, Athena

    2016-01-01

    In a meta-analysis published by my co-authors and me, we report differences in the life history risk factors for estrogen receptor negative (ER-) and estrogen receptor positive (ER+) breast cancers. Our meta-analysis did not find the association of ER- breast cancer risk with fast life history characteristics that Hidaka and Boddy suggest in their response to our article. There are a number of possible explanations for the differences between their conclusions and the conclusions we drew from our meta-analysis, including limitations of our meta-analysis and methodological challenges in measuring and categorizing estrogen receptor status. These challenges, along with the association of ER+ breast cancer with slow life history characteristics, may make it difficult to find a clear signal of an association between ER- breast cancer and fast life history characteristics, even if that relationship does exist. The contradictory results regarding breast cancer risk and life history characteristics illustrate a more general challenge in evolutionary medicine: different sub-theories in evolutionary biology often make contradictory predictions about disease risk. In this case, life history models predict that breast cancer risk should increase with faster life history characteristics, while the evolutionary mismatch hypothesis predicts that breast cancer risk should increase with delayed reproduction. Whether life history tradeoffs contribute to ER- breast cancer is still an open question, but current models and several lines of evidence suggest that it is a possibility. © The Author(s) 2016. Published by Oxford University Press on behalf of the Foundation for Evolution, Medicine, and Public Health.

  17. Historical evolution of sources identification by means of Receptor Modeling in the Metropolitan Area of São Paulo, Brazil.

    Science.gov (United States)

    Miranda, R. M.; Andrade, M. D. F.; Marien, Y., Sr.

    2017-12-01

    Atmospheric aerosol sources have been identified in São Paulo since the 1980s with the use of receptor models. The Metropolitan Area of São Paulo (MASP) is a megacity with a population of 21 million, corresponding to more than 11% of the total population of Brazil. The first results on the identification of particle sources were obtained with the application of Absolute Principal Component Analysis, Factor Analysis and Chemical Mass Balance. More recently, Positive Matrix Factorization has been used in combination with the other receptor models. With the improvement of analytical determination of the aerosol composition (more elements and better resolution), source identification has become more accurate. In spite of that, the main sources of fine particles remain the same: vehicular emission, secondary formation and biomass burning. The large amount of biofuels used in the MASP makes this region an important example of the atmospheric chemistry of fossil fuel and biofuel emissions. Its 7 million vehicles can run on gasohol, ethanol (95% ethanol + 5% gasoline) and biodiesel (mostly trucks and buses). We have considered black carbon as the tracer for diesel engines and biomass burning, with this last source associated not only with the burning of sugar cane plantations and forest fires, but also with wood and charcoal used in restaurant and domestic cooking and with residue burning. Vehicular emissions have consistently accounted for approximately 50% of the fine particle mass, and soil resuspension for about 8%. We present data obtained from experiments performed, not continuously and mainly in wintertime, from 1983 to 2014; it is a long period of data that is considered here. The previous results obtained with the application of PCA were compared to those obtained with PMF applied to the historical data collected at MASP, showing the evolution of the

  18. Fine particulates over South Asia: Review and meta-analysis of PM2.5 source apportionment through receptor model.

    Science.gov (United States)

    Singh, Nandita; Murari, Vishnu; Kumar, Manish; Barman, S C; Banerjee, Tirthankar

    2017-04-01

    Fine particulates (PM 2.5) constitute the dominant proportion of airborne particulates and have often been associated with human health disorders, changes in regional climate, the hydrological cycle and, more recently, food security. Intrinsic properties of particulates are a direct function of their sources. This motivates a comprehensive review of PM 2.5 sources over South Asia, which in turn may be valuable for developing emission control strategies. Particulate source apportionment (SA) through receptor models is one of the existing tools to quantify the contributions of particulate sources. A review of 51 SA studies was performed, of which 48 (94%) appeared within the span 2007-2016. More than half of the SA studies (55%) were concentrated on a few typical urban stations (Delhi, Dhaka, Mumbai, Agra and Lahore). Due to the lack of local particulate source profiles and emission inventories, positive matrix factorization and principal component analysis (62% of studies) were the primary choices, followed by chemical mass balance (CMB, 18%). Metallic species were most regularly used as source tracers, while use of organic molecular markers and gas-to-particle conversion was minimal. Across all SA sites, vehicular emissions (mean ± sd: 37 ± 20%) emerged as the most dominant PM 2.5 source, followed by industrial emissions (23 ± 16%), secondary aerosols (22 ± 12%) and natural sources (20 ± 15%). Vehicular emissions (39 ± 24%) were also identified as the dominant source for highly polluted sites (PM 2.5 >100 μgm -3, n = 15), while site-specific influences of industrial, secondary aerosol and natural sources, alone or in combination, were recognized. Source-specific trends varied considerably in terms of region and seasonality. Both natural and industrial sources were most influential over Pakistan and Afghanistan, while over the Indo-Gangetic plain vehicular, natural and industrial emissions appeared dominant. Influence of vehicular emission was

  19. Using an Explicit Emission Tagging Method in Global Modeling of Source-Receptor Relationships for Black Carbon in the Arctic: Variations, Sources and Transport Pathways

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Hailong; Rasch, Philip J.; Easter, Richard C.; Singh, Balwinder; Zhang, Rudong; Ma, Po-Lun; Qian, Yun; Ghan, Steven J.; Beagley, Nathaniel

    2014-11-27

    We introduce an explicit emission tagging technique in the Community Atmosphere Model to quantify source-region-resolved characteristics of black carbon (BC), focusing on the Arctic. Explicit tagging of BC source regions without perturbing the emissions makes it straightforward to establish source-receptor relationships and transport pathways, providing a physically consistent and computationally efficient approach to produce a detailed characterization of the fate of regional BC emissions and the potential for mitigation actions. Our analysis shows that the contributions of major source regions to the global BC burden are not proportional to their respective emissions, due to strongly region-dependent removal rates and lifetimes, while the contributions to BC direct radiative forcing show a near-linear dependence on the respective contributions to the burden. Distant sources contribute to BC in remote regions mostly in the mid- and upper troposphere, having much less impact on lower-level concentrations (and deposition) than on burden. Arctic BC concentrations, deposition and source contributions all have strong seasonal variations. Eastern Asia contributes the most to the wintertime Arctic burden. Northern Europe emissions are more important to both surface concentration and deposition in winter than in summer. The largest contribution to Arctic BC in the summer is from Northern Asia. Although local emissions contribute less than 10% to the annual mean BC burden and deposition within the Arctic, their per-emission efficiency is much higher than that of major non-Arctic sources. The interannual variability (1996-2005) due to meteorology is small in annual mean BC burden and radiative forcing but is significant in yearly seasonal means over the Arctic. When a slow aging treatment of BC is introduced, the increase of BC lifetime and burden is source-dependent. Global BC forcing-per-burden efficiency also increases, primarily due to changes in BC vertical distributions.
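The abstract's point that burden contributions are not proportional to emissions follows from region-dependent lifetimes: in a steady-state sketch, each tagged region's burden is roughly emission rate × lifetime. The numbers below are illustrative, not taken from the paper:

```python
# Steady-state sketch: burden_i ~ E_i * tau_i, with region-dependent
# lifetimes tau_i. All emission rates and lifetimes below are invented.
emissions = {"East Asia": 1.8, "Europe": 0.6, "Arctic local": 0.02}      # Tg/yr
lifetime_days = {"East Asia": 5.0, "Europe": 4.0, "Arctic local": 8.0}

burden = {r: emissions[r] * lifetime_days[r] / 365.0 for r in emissions}  # Tg
total_burden = sum(burden.values())
total_emis = sum(emissions.values())
share_emis = {r: emissions[r] / total_emis for r in emissions}
share_burden = {r: burden[r] / total_burden for r in burden}
# A region whose BC lives longer holds a larger share of the burden than of
# the emissions -- the higher "per-emission efficiency" noted for the Arctic.
```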

  20. Source apportionment of PM2.5 at the Lin'an regional background site in China with three receptor models

    Science.gov (United States)

    Deng, Junjun; Zhang, Yanru; Qiu, Yuqing; Zhang, Hongliang; Du, Wenjiao; Xu, Lingling; Hong, Youwei; Chen, Yanting; Chen, Jinsheng

    2018-04-01

    Source apportionment of fine particulate matter (PM2.5) was conducted at the Lin'an Regional Atmospheric Background Station (LA) in the Yangtze River Delta (YRD) region of China from July 2014 to April 2015 with three receptor models: principal component analysis combined with multiple linear regression (PCA-MLR), UNMIX and Positive Matrix Factorization (PMF). The model performance, source identification and source contributions of the three models were analyzed and inter-compared. Good correlations between the reconstructed and measured concentrations of PM2.5 and its major chemical species were obtained for all models. PMF resolved almost all of the PM2.5 mass, while PCA-MLR and UNMIX explained about 80%. Five, four and seven sources were identified by PCA-MLR, UNMIX and PMF, respectively. Combustion, a secondary source, a marine source, dust and industrial activities were identified by all three receptor models. The combustion and secondary sources were the major sources and together contributed over 60% of PM2.5. The PMF model performed better at separating the different combustion sources. These findings improve the understanding of PM2.5 sources in background regions.
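At its core, the PCA-MLR procedure regresses the PM2.5 mass on principal-component scores of the species matrix, and the fitted regression terms apportion the mass among the factors. A self-contained sketch on synthetic data (all dimensions and values are invented):

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
# Synthetic dataset (all values invented): 300 samples x 8 species driven by
# 3 latent sources, plus measurement noise.
true_contrib = rng.gamma(2.0, 2.0, size=(300, 3))      # source strengths
true_profiles = rng.uniform(0.0, 1.0, size=(3, 8))     # source profiles
species = true_contrib @ true_profiles + rng.normal(0.0, 0.05, (300, 8))
pm25 = true_contrib.sum(axis=1) + rng.normal(0.0, 0.1, 300)  # total mass

# Step 1: PCA on standardized species concentrations
z = (species - species.mean(axis=0)) / species.std(axis=0)
scores = PCA(n_components=3).fit_transform(z)

# Step 2: MLR of PM2.5 mass on the component scores; the fitted terms
# apportion the mass among the factors.
mlr = LinearRegression().fit(scores, pm25)
reconstructed = mlr.predict(scores)
r = np.corrcoef(reconstructed, pm25)[0, 1]  # reconstructed vs. measured mass
```

The correlation `r` is the kind of reconstructed-vs-measured agreement the abstract reports for all three models.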

  1. Identification of potential regional sources of atmospheric total gaseous mercury in Windsor, Ontario, Canada using hybrid receptor modeling

    Directory of Open Access Journals (Sweden)

    X. Xu

    2010-08-01

    Full Text Available Windsor (Ontario, Canada) experiences trans-boundary air pollution, as it is located on the border immediately downwind of industrialized regions of the United States of America. A study was conducted in 2007 to identify the potential regional sources of total gaseous mercury (TGM) and to investigate the effects of regional sources and other factors on the seasonal variability of TGM concentrations in Windsor.

    TGM concentration was measured at the University of Windsor campus using a Tekran® 2537A Hg vapour analyzer. An annual mean of 2.02±1.63 ng/m3 was observed in 2007. The average TGM concentration was high in the summer (2.48±2.68 ng/m3) and winter (2.17±2.01 ng/m3), compared to spring (1.88±0.78 ng/m3) and fall (1.76±0.58 ng/m3). The hybrid receptor model potential source contribution function (PSCF) was applied, incorporating 72-h backward trajectories and measurements of TGM in Windsor. The results of PSCF were analyzed in conjunction with the Hg emissions inventory of North America (by state/province) to identify regions affecting Windsor. In addition to annual modeling, seasonal PSCF modeling was also conducted. The potential source region was identified between 24–61° N and 51–143° W. Annual PSCF modeling identified major sources southwest of Windsor, stretching from Ohio to Texas. The emissions inventory also supported these findings, as Hg emissions were high in those regions. Results of seasonal PSCF modeling were analyzed to find the combined effects of regional sources, meteorological conditions and surface re-emissions on the seasonal variability of Hg concentrations. It was found that the summer and winter highs of atmospheric Hg can be attributed to areas of the USA where large numbers of coal-fired power plants are located. Weak atmospheric dispersion due to low winds and high re-emission from surfaces due to higher temperatures also contributed to high concentrations in
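
    A minimal sketch of the PSCF computation, using synthetic trajectories and concentrations rather than the Windsor data: each grid cell's value is m_ij/n_ij, the fraction of trajectory endpoints falling in cell (i, j) that are associated with concentrations above a chosen criterion (here the 75th percentile).

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in data: 500 arrivals, each with a 24-endpoint back-trajectory
# inside the study domain (24-61 N, 143-51 W) and a measured TGM value (ng/m3)
n_traj = 500
lats = rng.uniform(24.0, 61.0, (n_traj, 24))
lons = rng.uniform(-143.0, -51.0, (n_traj, 24))
tgm = rng.lognormal(mean=0.7, sigma=0.3, size=n_traj)

threshold = np.percentile(tgm, 75)          # criterion for "high" concentration
lat_edges = np.arange(24.0, 62.0, 1.0)      # 1-degree grid
lon_edges = np.arange(-143.0, -50.0, 1.0)

n_ij = np.zeros((len(lat_edges) - 1, len(lon_edges) - 1))  # all endpoints per cell
m_ij = np.zeros_like(n_ij)                                 # "high" endpoints per cell
for k in range(n_traj):
    h, _, _ = np.histogram2d(lats[k], lons[k], bins=[lat_edges, lon_edges])
    n_ij += h
    if tgm[k] > threshold:
        m_ij += h

# PSCF_ij = m_ij / n_ij, undefined where no endpoint ever fell
with np.errstate(invalid="ignore", divide="ignore"):
    pscf = np.where(n_ij > 0, m_ij / n_ij, np.nan)
```

Operational PSCF studies typically also down-weight cells with few endpoints; that refinement is omitted here for brevity.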

  2. Source-receptor probability of atmospheric long-distance dispersal of viruses to Israel from the eastern Mediterranean area.

    Science.gov (United States)

    Klausner, Z; Klement, E; Fattal, E

    2018-02-01

    Viruses that affect the health of humans and farm animals can spread over long distances via atmospheric mechanisms. The phenomenon of atmospheric long-distance dispersal (LDD) is associated with severe consequences because it may introduce pathogens into new areas. The introduction of new pathogens to Israel has been attributed to LDD events numerous times. This provided the motivation for the present study, which aimed to identify all the locations in the eastern Mediterranean that may serve as sources for pathogen incursion into Israel via LDD. This aim was achieved by calculating source-receptor relationship probability maps, which describe the probability that an infected vector or viral aerosol, once airborne, will have an atmospheric route that can transport it to a distant location. The resultant probability maps demonstrate a seasonal tendency for specific areas to serve as sources of pathogen LDD into Israel: for Cyprus, the season is summer; southern Turkey and the Greek islands of Crete, Karpathos and Rhodes are associated with spring and summer; lower Egypt and Jordan may serve as sources all year round, except during the summer months. The method used in this study can easily be applied to any other geographic region. The importance of this study is the ability to provide a climatologically valid and accurate risk-assessment tool to support long-term decisions regarding preparatory actions for future outbreaks, long before a specific outbreak occurs. © 2017 Blackwell Verlag GmbH.

  3. Associating Fast Radio Bursts with Extragalactic Radio Sources: General Methodology and a Search for a Counterpart to FRB 170107

    Science.gov (United States)

    Eftekhari, T.; Berger, E.; Williams, P. K. G.; Blanchard, P. K.

    2018-06-01

    The discovery of a repeating fast radio burst (FRB) has led to the first precise localization, an association with a dwarf galaxy, and the identification of a coincident persistent radio source. However, further localizations are required to determine the nature of FRBs, the sources powering them, and the possibility of multiple populations. Here we investigate the use of associated persistent radio sources to establish FRB counterparts, taking into account the localization area and the source flux density. Due to the lower areal number density of radio sources compared to faint optical sources, robust associations can be achieved for less precise localizations as compared to direct optical host galaxy associations. For generally larger localizations that preclude robust associations, the number of candidate hosts can be reduced based on the ratio of radio-to-optical brightness. We find that confident associations with sources having a flux density of ∼0.01–1 mJy, comparable to the luminosity of the persistent source associated with FRB 121102 over the redshift range z ≈ 0.1–1, require FRB localizations of ≲20″. We demonstrate that even in the absence of a robust association, constraints can be placed on the luminosity of an associated radio source as a function of localization and dispersion measure (DM). For DM ≈ 1000 pc cm^-3, an upper limit comparable to the luminosity of the FRB 121102 persistent source can be placed if the localization is ≲10″. We apply our analysis to the case of the ASKAP FRB 170107, using optical and radio observations of the localization region. We identify two candidate hosts based on a radio-to-optical brightness ratio of ≳100. We find that if one of these is indeed associated with FRB 170107, the resulting radio luminosity (10^29 to 4 × 10^30 erg s^-1 Hz^-1, as constrained from the DM value) is comparable to the luminosity of the FRB 121102 persistent source.
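
    The role of areal number density in association robustness can be sketched with a simple Poisson chance-coincidence estimate (the source-density value below is illustrative, not a number from the paper): the probability of an unrelated source falling inside the localization region grows with its area.

```python
import math

def chance_coincidence(radius_arcsec, density_per_deg2):
    """Poisson probability of >= 1 unrelated source inside a circular
    localization region of the given radius."""
    area_deg2 = math.pi * (radius_arcsec / 3600.0) ** 2
    return 1.0 - math.exp(-density_per_deg2 * area_deg2)

# Illustrative areal density of faint radio sources (sources per square degree);
# this number is an assumption, not a value from the paper.
sigma = 100.0
p5 = chance_coincidence(5.0, sigma)    # sub-percent
p20 = chance_coincidence(20.0, sigma)  # of order 1%
p60 = chance_coincidence(60.0, sigma)  # approaching 10%
```

Under this toy density, a ∼20″ localization keeps the chance-coincidence probability near the percent level, consistent in spirit with the ≲20″ requirement quoted in the abstract.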

  4. A study of calculation methodology and experimental measurements of the kinetic parameters for source driven subcritical systems

    International Nuclear Information System (INIS)

    Lee, Seung Min

    2009-01-01

    This work presents a theoretical study of reactor kinetics, focusing on the calculation methodology and the experimental measurement of the so-called kinetic parameters. A comparison between the methodology based on Dulla's formalism and the classical method is made, with the objective of exhibiting the dependence of the parameters on the subcriticality level and on the perturbation. Two different slab-type systems were considered, a thermal one and a fast one, both with homogeneous media. A one-group diffusion model was used for the fast reactor and a two-group diffusion model for the thermal system, considering in both cases only one precursor family. The solutions were obtained using the expansion method. Descriptions of the main experimental methods for measuring the kinetic parameters are also presented, in order to question the compatibility of these methods in the subcritical region. (author)

  5. Different cAMP sources are critically involved in G protein-coupled receptor CRHR1 signaling.

    Science.gov (United States)

    Inda, Carolina; Dos Santos Claro, Paula A; Bonfiglio, Juan J; Senin, Sergio A; Maccarrone, Giuseppina; Turck, Christoph W; Silberstein, Susana

    2016-07-18

    Corticotropin-releasing hormone receptor 1 (CRHR1) activates G protein-dependent and internalization-dependent signaling mechanisms. Here, we report that the cyclic AMP (cAMP) response of CRHR1 in physiologically relevant scenarios engages separate cAMP sources, involving the atypical soluble adenylyl cyclase (sAC) in addition to transmembrane adenylyl cyclases (tmACs). cAMP produced by tmACs and sAC is required for the acute phase of extracellular signal-regulated kinase 1/2 activation triggered by CRH-stimulated CRHR1, but only sAC activity is essential for the sustained internalization-dependent phase. Thus, different cAMP sources are involved in different signaling mechanisms. Examination of the cAMP response revealed that CRH-activated CRHR1 generates cAMP after endocytosis. Characterizing CRHR1 signaling uncovered a specific link between CRH-activated CRHR1, sAC, and endosome-based signaling. We provide evidence of sAC being involved in an endocytosis-dependent cAMP response, strengthening the emerging model of GPCR signaling in which the cAMP response does not occur exclusively at the plasma membrane and introducing the notion of sAC as an alternative source of cAMP. © 2016 Inda et al.

  6. Identification and elucidation of anthropogenic source contribution in PM10 pollutant: Insight gain from dispersion and receptor models.

    Science.gov (United States)

    Roy, Debananda; Singh, Gurdeep; Yadav, Pankaj

    2016-10-01

    A source apportionment study of PM10 (particulate matter) in a critically polluted area of the Jharia coalfield, India, has been carried out using a dispersion model, Principal Component Analysis (PCA) and Chemical Mass Balance (CMB) techniques. The dispersion model AERMOD was used to simplify the complexity of sources in the Jharia coalfield. PCA and CMB analysis indicates that monitoring stations near the mining area were mainly affected by emissions from open coal mining and its associated activities, such as coal transportation and the loading and unloading of coal. Mine fire emissions also contributed a considerable amount of particulate matter at the monitoring stations. Locations in the city area were mostly affected by vehicular, Liquefied Petroleum Gas (LPG) and Diesel Generator (DG) set emissions, and by residential and commercial activities. The experimental data sampling and analysis could aid in understanding how a dispersion-model technique, along with a receptor-model concept, can be strategically used for quantitative analysis of natural and anthropogenic sources of PM10. Copyright © 2016. Published by Elsevier B.V.
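
    The CMB step can be sketched as an ordinary least-squares problem (the profiles and contributions below are hypothetical, not the Jharia data): ambient species concentrations are modeled as a linear combination of source profiles, and the fitted coefficients are the source contributions.

```python
import numpy as np

# Hypothetical source profiles: mass fraction of each of 4 chemical species
# (rows) emitted by 3 candidate sources (columns). These numbers are invented.
profiles = np.array([
    [0.30, 0.05, 0.10],
    [0.02, 0.40, 0.05],
    [0.10, 0.10, 0.50],
    [0.25, 0.02, 0.08],
])

true = np.array([40.0, 15.0, 10.0])   # "true" source contributions (ug/m3)
ambient = profiles @ true             # synthetic receptor concentrations

# CMB solves ambient = profiles @ contributions for the contributions
contrib, residuals, rank, sv = np.linalg.lstsq(profiles, ambient, rcond=None)
```

Operational CMB software additionally weights species by measurement uncertainty (effective-variance least squares); plain least squares suffices to show the structure of the problem.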

  7. Application of radiocarbon analysis and receptor modeling to the source apportionment of PAHs (polycyclic aromatic hydrocarbons) in the atmosphere

    International Nuclear Information System (INIS)

    Sheffield, A.E.

    1988-01-01

    The radiocarbon tracer technique was used to demonstrate that polycyclic aromatic hydrocarbons (PAHs) can be used for quantitative receptor modeling of air pollution. Fine-particle samples were collected during December 1985 in Albuquerque, NM. Motor vehicles (fossil) and residential wood combustion (RWC, modern) were the major PAH sources. For each sample, the PAH fraction was solvent-extracted, isolated by liquid chromatography, and analyzed by GC-FID and GC-MS. The PAH fractions from sixteen samples were analyzed for 14C by Accelerator Mass Spectrometry. Radiocarbon data were used to calculate the relative RWC contribution (f_RWC) for samples analyzed for 14C. Normalized concentrations of a prospective motor vehicle tracer, benzo(ghi)perylene (BGP), had a strong negative correlation with f_RWC. Normalized BGP concentrations were used to apportion sources for samples not analyzed for 14C. Multiple Linear Regression (MLR) vs. ADCS and BGP was used to estimate source profiles for use in Target Factor Analysis (TFA). Profiles predicted by TFA were used in Chemical Mass Balances (CMBs). For non-volatile, stable PAHs, agreement between observed and predicted concentrations was excellent. The worst fits were observed for the most volatile PAHs and for coronene. The total RWC contributions predicted by the CMBs correlated well with the radiocarbon data.

  8. Evaluation of JRC source term methodology using MAAP5 as a fast-running crisis tool for a BWR4 Mark I reactor

    International Nuclear Information System (INIS)

    Vela-García, M.; Simola, K.

    2016-01-01

    JRC participated in the OECD/NEA FASTRUN benchmark, reviewing fast-running software tools that model fission product releases during accidents at nuclear power plants. The main goal of fast-running software tools is to foresee the accident progression so that mitigating actions can be taken and the population can be adequately protected. Within FASTRUN, JRC used the MAAP 4.0.8 code and developed a methodology to obtain the source term (as activity released per radioisotope) of PWR and BWR station blackout accident scenarios. The modifications made to the MAAP models were limited to a minimum number of important parameters, reflecting a crisis situation in which there is limited time to adapt a generic input deck. This paper presents further studies in which JRC analysed the FASTRUN BWR scenario using MAAP 5.0.2, which has the capability of calculating doses. A sensitivity study was performed with the MAAP 5.0.2 DOSE package deactivated, using the same methodology as in the case of MAAP 4.0.8 for the source term calculation. The results were close to the reference LTSBO SOARCA case, independently of the methodology used. One of the benefits of using the MAAP code is the short runtime of the simulations.

  9. A methodology for combining multiple commercial data sources to improve measurement of the food and alcohol environment: applications of geographical information systems

    Directory of Open Access Journals (Sweden)

    Dara D. Mendez

    2014-11-01

    Full Text Available Commercial data sources have been increasingly used to measure and locate community resources. We describe a methodology for combining and comparing the differences in commercial data of the food and alcohol environment. We used data from two commercial databases (InfoUSA and Dun&Bradstreet) for 2003 and 2009 to obtain information on food and alcohol establishments, and developed a matching process using computer algorithms and manual review, applying ArcGIS to geocode addresses, the standard industrial classification and North American industry classification taxonomy for type of establishment, and the establishment name. We constructed population- and area-based density measures (e.g. grocery stores) and assessed differences across data sources, using ArcGIS to map the densities. The matching process resulted in 8,705 and 7,078 unique establishments for 2003 and 2009, respectively. There were more establishments captured in the combined dataset than by relying on one data source alone, and the additional establishments captured ranged from 1,255 to 2,752 in 2009. The correlation of the density measures between the two data sources was highest for alcohol outlets (r = 0.75 and 0.79 for per capita and area, respectively) and lowest for grocery stores/supermarkets (r = 0.32 for both). This process of applying geographical information systems to combine multiple commercial data sources and develop measures of the food and alcohol environment captured more establishments than relying on one data source alone. This replicable methodology was found to be useful for understanding the food and alcohol environment when local or public data are limited.
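
    The name-matching component of such a process might be sketched as follows, a crude illustration using the standard-library difflib (the actual study combined algorithmic and manual review over names, addresses and industry codes; the similarity threshold here is an arbitrary choice):

```python
from difflib import SequenceMatcher

def likely_same(name_a, name_b, threshold=0.85):
    """Crude establishment-name similarity test for de-duplicating records
    across commercial data sources (threshold chosen for illustration only)."""
    ratio = SequenceMatcher(None, name_a.lower(), name_b.lower()).ratio()
    return ratio >= threshold

match = likely_same("Joe's Grocery Store", "Joes Grocery Store")    # near-duplicate
no_match = likely_same("Starbucks Coffee", "Walmart Supercenter")   # distinct
```

In practice, fuzzy name matching would be combined with the geocoded address and industry classification before declaring two records the same establishment.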

  10. A methodological framework for assessing agreement between cost-effectiveness outcomes estimated using alternative sources of data on treatment costs and effects for trial-based economic evaluations.

    Science.gov (United States)

    Achana, Felix; Petrou, Stavros; Khan, Kamran; Gaye, Amadou; Modi, Neena

    2018-01-01

    A new methodological framework for assessing agreement between cost-effectiveness endpoints generated using alternative sources of data on treatment costs and effects for trial-based economic evaluations is proposed. The framework can be used to validate cost-effectiveness endpoints generated from routine data sources when comparable data are available directly from trial case report forms or from another source. We illustrate application of the framework using data from a recent trial-based economic evaluation of the probiotic Bifidobacterium breve strain BBG administered to babies born at less than 31 weeks of gestation. Cost-effectiveness endpoints are compared using two sources of information: trial case report forms and data extracted from the National Neonatal Research Database (NNRD), a clinical database created through collaborative efforts of UK neonatal services. Focusing on mean incremental net benefits at £30,000 per episode of sepsis averted, the study revealed no evidence of discrepancy between the data sources (two-sided p values >0.4), low probability estimates of miscoverage (ranging from 0.039 to 0.060) and concordance correlation coefficients greater than 0.86. We conclude that the NNRD could potentially serve as a reliable source of data for future trial-based economic evaluations of neonatal interventions. We also discuss the potential implications of the increasing opportunity to utilize routinely available data for the conduct of trial-based economic evaluations.
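
    For reference, Lin's concordance correlation coefficient, one of the agreement measures quoted above, can be computed directly (the cost values below are invented; note that, unlike Pearson's r, a purely systematic offset between the two data sources lowers the CCC):

```python
import numpy as np

def concordance_ccc(x, y):
    """Lin's concordance correlation coefficient between two measurement sources."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    sxy = ((x - mx) * (y - my)).mean()
    return 2.0 * sxy / (vx + vy + (mx - my) ** 2)

costs = np.array([10.0, 12.0, 9.0, 15.0, 11.0])   # invented cost endpoints
ccc_same = concordance_ccc(costs, costs)          # perfect agreement
ccc_shift = concordance_ccc(costs, costs + 3.0)   # offset source: Pearson r stays 1,
                                                  # but CCC drops
```

This sensitivity to location shifts is exactly why CCC, rather than plain correlation, is used to validate one data source against another.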

  11. Statistical forensic methodology for oil spill source identification using two-tailed Student's t approach. Volume 1

    International Nuclear Information System (INIS)

    Yang, C.; Wang, Z.; Hollebone, B.; Brown, C.E.; Landriault, M.

    2007-01-01

    A thorough chemical characterization of oil must be conducted following an oil spill in order to determine the source of the oil, to distinguish the spilled oil from background hydrocarbons and to quantitatively evaluate the extent of the impact of the spill. Gas chromatography with flame ionization detection and mass spectrometry was used in conjunction with statistical data analysis to determine the source of a spill that occurred in 2004 in a harbor in the Netherlands. Three oil samples were collected from the harbor spill, where a thick layer of oil was found between a bunker boat and the quay next to the bunker centre. The 3 samples were sent to different laboratories for a round-robin test to defensibly correlate the spilled oil to the suspected source candidates. The source characterization and identification was validated by quantitative evaluation of 5 petroleum-characteristic alkylated PAH homologous series (naphthalene, phenanthrene, dibenzothiophene, fluorene and chrysene), pentacyclic biomarkers, bicyclic sesquiterpanes and diamondoid compounds. The use of biomarkers for identifying the source of spilled oils has increased in recent years due to their specificity and high resistance to biodegradation. There was no strong difference among the 3 oil samples according to radar plots of diagnostic ratios of PAHs, isoprenoids, biomarkers, bicyclic sesquiterpanes and diamondoids. The two-tailed unpaired Student's t-tests provided strong evidence for which ship was responsible for the oil spill incident. However, it was cautioned that although the two-tailed unpaired Student's t-tests, along with oil fingerprinting, successfully identified the spill source, the method has limitations. Experimental results showed that the spilled oil and the two source candidates were quite similar in both chemical fingerprints and concentration profiles for the determined target hydrocarbons. 17 refs., 4 tabs., 3 figs.
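
    A Welch (unequal-variance) version of the two-tailed unpaired t statistic applied to replicate diagnostic ratios can be sketched as follows (the ratio values are hypothetical; the study's exact test variant and data are not reproduced here):

```python
import numpy as np

def welch_t(a, b):
    """Two-tailed unpaired (Welch) t statistic and its approximate degrees
    of freedom for samples with possibly unequal variances."""
    a = np.asarray(a, dtype=float)
    b = np.asarray(b, dtype=float)
    va = a.var(ddof=1) / len(a)
    vb = b.var(ddof=1) / len(b)
    t = (a.mean() - b.mean()) / np.sqrt(va + vb)
    df = (va + vb) ** 2 / (va ** 2 / (len(a) - 1) + vb ** 2 / (len(b) - 1))
    return t, df

# Hypothetical replicate diagnostic ratios for the spill and one source candidate
spill = [1.02, 0.98, 1.01, 0.99, 1.03]
candidate = [1.00, 1.03, 0.97, 1.01, 0.99]
t, df = welch_t(spill, candidate)
# |t| well below the ~2.3 critical value at alpha = 0.05 for df ~ 8:
# the ratios are statistically indistinguishable, so the candidate is not excluded
```

A candidate whose diagnostic ratios yield a large |t| against the spill sample would be rejected as the source; indistinguishable ratios keep it in contention.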

  12. Evaluation of the methodology for dose calculation in microdosimetry with electrons sources using the MCNP5 Code

    International Nuclear Information System (INIS)

    Cintra, Felipe Belonsi de

    2010-01-01

    This study made a comparison between some of the major transport codes that employ the Monte Carlo stochastic approach in dosimetric calculations in nuclear medicine. We analyzed in detail the various physical and numerical models used by the MCNP5 code in relation to codes such as EGS and Penelope. The identification of its potential and limitations for solving microdosimetry problems was highlighted. The condensed-history methodology used by MCNP resulted in lower values in the energy deposition calculation. This reflects a known feature of condensed histories: they underestimate both the number of collisions along the trajectory of the electron and the number of secondary particles created. The use of transport codes such as MCNP and Penelope at micrometer scales received special attention in this work. Class I and class II codes were studied and their main resources were exploited in order to transport electrons, which are of particular importance in dosimetry. It is expected that the evaluation of the available methodologies mentioned here will contribute to a better understanding of the behavior of these codes, especially for this class of problems, common in microdosimetry. (author)

  13. Methodology and Toolset for Model Verification, Hardware/Software co-simulation, Performance Optimisation and Customisable Source-code generation

    DEFF Research Database (Denmark)

    Berger, Michael Stübert; Soler, José; Yu, Hao

    2013-01-01

    The MODUS project aims to provide a pragmatic and viable solution that will allow SMEs to substantially improve their positioning in the embedded-systems development market. The MODUS tool will provide a model verification and Hardware/Software co-simulation tool (TRIAL) and a performance...... optimisation and customisable source-code generation tool (TUNE). The concept is depicted in automated modelling and optimisation of embedded-systems development. The tool will enable model verification by guiding the selection of existing open-source model verification engines, based on the automated analysis...

  14. Aryl hydrocarbon receptor signaling modulates antiviral immune responses: ligand metabolism rather than chemical source is the stronger predictor of outcome.

    Science.gov (United States)

    Boule, Lisbeth A; Burke, Catherine G; Jin, Guang-Bi; Lawrence, B Paige

    2018-01-29

    The aryl hydrocarbon receptor (AHR) offers a compelling target to modulate the immune system. AHR agonists alter adaptive immune responses, but the consequences differ across studies. We report here the comparison of four agents representing different sources of AHR ligands in mice infected with influenza A virus (IAV): TCDD, prototype exogenous AHR agonist; PCB126, pollutant with documented human exposure; ITE, novel pharmaceutical; and FICZ, degradation product of tryptophan. All four compounds diminished virus-specific IgM levels and increased the proportion of regulatory T cells. TCDD, PCB126 and ITE, but not FICZ, reduced virus-specific IgG levels and CD8+ T cell responses. Similarly, ITE, PCB126, and TCDD reduced Th1 and Tfh cells, whereas FICZ increased their frequency. In Cyp1a1-deficient mice, all compounds, including FICZ, reduced the response to IAV. Conditional Ahr knockout mice revealed that all four compounds require AHR within hematopoietic cells. Thus, differences in the immune response to IAV likely reflect variances in quality, magnitude, and duration of AHR signaling. This indicates that binding affinity and metabolism may be stronger predictors of immune effects than a compound's source of origin, and that harnessing AHR will require finding a balance between dampening immune-mediated pathologies and maintaining sufficient host defenses against infection.

  15. ATMOSPHERIC AEROSOL SOURCE-RECEPTOR RELATIONSHIPS: THE ROLE OF COAL-FIRED POWER PLANTS; SEMIANNUAL

    International Nuclear Information System (INIS)

    Allen L. Robinson; Spyros N. Pandis; Cliff I. Davidson

    2002-01-01

    This report describes the technical progress made on the Pittsburgh Air Quality Study (PAQS) during the period of August 2001 through January 2002. The major activity during this project period was the continuation of the ambient monitoring effort. Work also progressed on organizing the upcoming source characterization effort, and development continued on several three-dimensional air quality models. The first PAQS data analysis workshop for the project was held at Carnegie Mellon in December 2001. Two new instruments were added to the site during this project period: a single particle mass spectrometer and an in situ VOC instrument. The single particle mass spectrometer has been deployed since the middle of September and has collected more than 150 days of data. The VOC instrument was only deployed during the intensive sampling period. Several instruments experienced operational issues during this project period. The overall data recovery rate for the project has been high.

  16. Methodology for the calculation of source terms related to irradiated fuel accumulated away from nuclear power plants

    International Nuclear Information System (INIS)

    Lima Filho, R.M.; Oliveira, L.F.S. de

    1984-01-01

    A general method for calculating the time evolution of source terms related to irradiated fuel is presented. Some applications are discussed, which indicate that the method can provide important information for the engineering design and safety analysis of a temporary storage facility for irradiated fuel elements. (Author) [pt
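
    One standard ingredient of such time-evolution calculations is the Bateman solution for decay chains; the report's actual method is not reproduced here, but a minimal two-member sketch with invented decay constants illustrates how a source term evolves as the parent decays and the daughter grows in:

```python
import math

def bateman_pair(n0, lam_a, lam_b, t):
    """Two-member Bateman solution: atoms of parent A and daughter B at time t,
    starting from n0 atoms of pure A (decay chain A -> B -> ...)."""
    na = n0 * math.exp(-lam_a * t)
    nb = n0 * lam_a / (lam_b - lam_a) * (math.exp(-lam_a * t) - math.exp(-lam_b * t))
    return na, nb

# Invented decay constants (1/yr), purely for illustration
na0, nb0 = bateman_pair(1.0e6, 0.10, 0.50, 0.0)   # initially pure parent
na5, nb5 = bateman_pair(1.0e6, 0.10, 0.50, 5.0)   # daughter has grown in
```

Activities follow as lambda times the inventory, and longer chains are handled by the general Bateman recursion or by matrix-exponential depletion codes.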

  17. Theoretical and methodological notes on visual and audiovisual sources in researches on Life Stories and Self-referential Memorials

    Directory of Open Access Journals (Sweden)

    Maria Helena Menna Barreto Abrahão

    2014-01-01

    Full Text Available The text explicates the reflection underlying the use of pictures, films and videofilms as sources in research on Life Stories and Self-referential Memorials in teacher education. After referring to the research in which we have used this support since 1988, we work with two complementary pairs of theoretical dimensions of narratives in visual and audiovisual sources and their use in such empirical research: subjectivity/truth and space/time. These dimensions are grounded in Barthes (1984) to propose an interpretative effort on these sources to understand the essence of photography according to the photographer and the essence of photography according to the photographed person, connected to the essence of photography according to the researcher. The Barthesian constructs studium and punctum are applied to reading the narratives of the filmic and photographic material, reaching the more radical expression of the Barthesian punctum: the real or representational death of the referent that serves the photos and movies. The discussion of these dimensions for the analysis of audiovisual sources is complemented with the support of several other authors.

  18. Statistical sources and methodology of elaboration for the regional, national and European energy indicators; Fonti statistiche e metodologie di elaborazione per gli indicatori energetici regionali, nazionali ed europei

    Energy Technology Data Exchange (ETDEWEB)

    Bianco, R; Perrella, G [ENEA, Centro Ricerche Casaccia, Rome (Italy). Dip. Energia

    1996-10-01

    This report presents the energy-economic and structural data system EIS (energy indicators system), describing the statistical sources and the different methodologies for data evaluation and indicator elaboration. This system, developed within the activity of the ERG-PROM Division of ENEA (the Italian Agency for New Technologies, Energy and the Environment), is aimed at analyzing energy use by quantity and quality in the different sectors, for each end use and for every source. The report defines all the procedures and units of measure used to process the data bank, so that the reader can correctly interpret the meaning of the variables and indicators and can carry out further processing.

  19. Methodology for and uses of a radiological source term assessment for potential impacts to stormwater and groundwater

    International Nuclear Information System (INIS)

    Teare, A.; Hansen, K.; DeWilde, J.; Yu, L.; Killey, D.

    2001-01-01

    A Radiological Source Term Assessment (RSTA) was conducted by Ontario Power Generation Inc. (OPG) at the Pickering Nuclear Generating Station (PNGS). Tritium had been identified in the groundwater at several locations under the station, and OPG initiated the RSTA as part of its ongoing efforts to improve operations and to identify potential sources of radionuclide impact to groundwater and stormwater at the station. The RSTA provides a systematic approach to collecting information and assessing environmental risk for radioactive contaminants based on a ranking system developed for the purpose. This paper provides an overview of the RSTA focusing on the investigative approach and how it was applied. This approach can find application at other generating stations. (author)

  20. Two methodologies for computing neutron sources from (α,n) and spontaneous fission reactions in vitrified waste

    International Nuclear Information System (INIS)

    Goldberg, H.J.; Morford, R.J.

    1993-06-01

    The disposal of high-level defense waste in a geologic repository necessitates conversion of the waste to a stable form. For this purpose, the Hanford Waste Vitrification Plant (HWVP) will be constructed. In this facility the waste will be converted into glassified cylinders of 6.3 × 10^5 cm^3, 59 cm in diameter and 230 cm in height, which will be placed in steel containers and buried. The waste packages must be adequately shielded to ensure the safety of personnel handling them. To calculate the necessary shielding, the radiation source term must be determined. Although the γ-ray source term does not present a problem, the neutron source term is a concern. Because the glass matrix is composed of light elements, the presence of any α-particle-emitting radionuclides in the waste will contribute to the neutron flux. This paper attempts to ascertain the neutron flux and spectrum from (α,n) reactions and to add it to the flux resulting from spontaneous fission

  1. A shift in emission time profiles of fossil fuel combustion due to energy transitions impacts source receptor matrices for air quality

    NARCIS (Netherlands)

    Hendriks, C.; Kuenen, J.; Kranenburg, R.; Scholz, Y.; Schaap, M.

    2015-01-01

    Effective air pollution and short-lived climate forcer mitigation strategies can only be designed when the effect of emission reductions on pollutant concentrations and health and ecosystem impacts are quantified. Within integrated assessment modeling source-receptor relationships (SRRs) based on

  2. Sources

    International Nuclear Information System (INIS)

    Duffy, L.P.

    1991-01-01

    This paper discusses the sources of radiation in the narrow perspective of radioactivity, and in the even narrower perspective of those sources that concern environmental management and restoration activities at DOE facilities, as well as a few related sources: sources of irritation, sources of inflammatory jingoism, and sources of information. First, the sources of irritation fall into three categories: no reliable scientific ombudsman to speak without bias and prejudice for the public good; technical jargon with unclear definitions within the radioactive nomenclature; and a scientific community that keeps a low profile with regard to public information. The next area of personal concern is the sources of inflammation. These include such things as: plutonium being described as the most dangerous substance known to man; the amount of plutonium required to make a bomb; talk of transuranic waste containing plutonium and its health effects; TMI-2 and Chernobyl being described as Siamese twins; inadequate information on low-level disposal sites and current regulatory requirements under 10 CFR 61; and enhanced engineered waste disposal not being presented to the public accurately. Finally, there are numerous sources of disinformation regarding low-level and high-level radiation, the elusive nature of the scientific community, the resources of the Federal and State health agencies to address comparative risk, and regulatory agencies speaking out without the support of the scientific community.

  3. Evaluation of severe accident risks: Methodology for the containment, source term, consequence, and risk integration analyses. Volume 1, Revision 1

    International Nuclear Information System (INIS)

    Gorham, E.D.; Breeding, R.J.; Brown, T.D.; Harper, F.T.; Helton, J.C.; Murfin, W.B.; Hora, S.C.

    1993-12-01

    NUREG-1150 examines the risk to the public from five nuclear power plants. The NUREG-1150 plant studies are Level III probabilistic risk assessments (PRAs) and, as such, they consist of four analysis components: accident frequency analysis, accident progression analysis, source term analysis, and consequence analysis. This volume summarizes the methods utilized in performing the last three components and the assembly of these analyses into an overall risk assessment. The NUREG-1150 analysis approach is based on the following ideas: (1) general and relatively fast-running models for the individual analysis components, (2) well-defined interfaces between the individual analysis components, (3) use of Monte Carlo techniques together with an efficient sampling procedure to propagate uncertainties, (4) use of expert panels to develop distributions for important phenomenological issues, and (5) automation of the overall analysis. Many features of the new analysis procedures were adopted to facilitate a comprehensive treatment of uncertainty in the complete risk analysis. Uncertainties in the accident frequency, accident progression and source term analyses were included in the overall uncertainty assessment. The uncertainties in the consequence analysis were not included in this assessment. A large effort was devoted to the development of procedures for obtaining expert opinion and the execution of these procedures to quantify parameters and phenomena for which there is large uncertainty and divergent opinions in the reactor safety community
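The sampling-based uncertainty propagation described above can be illustrated with a minimal sketch: uncertain inputs for one accident chain are sampled jointly and multiplied through to a risk distribution. All distributions and parameter values below are invented for illustration, not taken from NUREG-1150 (which used expert-elicited distributions and an efficient Latin-hypercube-style sampling scheme).

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Hypothetical uncertain inputs for one accident sequence (all distributions
# and parameters invented for illustration; not NUREG-1150 values):
accident_freq = rng.lognormal(mean=np.log(1e-5), sigma=1.0, size=n)  # per reactor-year
release_frac  = rng.beta(2, 8, size=n)                               # source term fraction
consequence   = rng.lognormal(mean=np.log(50.0), sigma=0.5, size=n)  # impact per unit release

# Risk is the product along the accident chain; sampling the inputs jointly
# propagates their uncertainties into a distribution for the risk measure.
risk = accident_freq * release_frac * consequence

print(f"mean risk      : {risk.mean():.3e}")
print(f"95th percentile: {np.percentile(risk, 95):.3e}")
```

The output distribution, rather than a single point estimate, is what allows the kind of comprehensive uncertainty statement the study aimed for.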

  4. Laboratory and field evaluations of a methodology for determining hexavalent-chromium emissions from stationary sources. Final report

    International Nuclear Information System (INIS)

    Carver, A.C.

    1991-10-01

The study was initiated to determine whether chromium emissions should be regulated under Section 112 of the Clean Air Act National Emission Standards for Hazardous Air Pollutants (NESHAP). To support stationary source regulations, it is important that (1) the sampling procedure not change the chromium valence state during sampling and (2) an analytical technique for measuring low concentrations of chromium be available. These goals are achieved with the current EPA 'Draft Method for Sampling and Analysis of Hexavalent Chromium at Stationary Sources.' The draft method utilizes a recirculating system to flush impinger reagent into the sampling nozzle during sample collection. Immediate contact of the stack gas with the impinger reagent 'fixes' the chromium valence state. Ion chromatography coupled with post-column derivatization and an ultraviolet-visible detector is used to analyze Cr(VI) in the parts-per-trillion range. Field tests were conducted at metal plating facilities, industrial cooling towers, municipal waste incinerators, sewage sludge incinerators, and hazardous waste incinerators. It was at the hazardous waste facility that the new method was shown to have acceptable precision and essentially no valence conversion in the sample train

  5. Source apportionment of ambient non-methane hydrocarbons in Hong Kong: application of a principal component analysis/absolute principal component scores (PCA/APCS) receptor model.

    Science.gov (United States)

    Guo, H; Wang, T; Louie, P K K

    2004-06-01

Receptor-oriented source apportionment models are often used to identify the sources of ambient air pollutants and to estimate source contributions to air pollutant concentrations. In this study, a PCA/APCS model was applied to data on non-methane hydrocarbons (NMHCs) measured from January to December 2001 at two sampling sites: the Tsuen Wan (TW) and Central & Western (CW) Toxic Air Pollutants Monitoring Stations in Hong Kong. This multivariate method enables the identification of major air pollution sources along with a quantitative apportionment of each source's contribution to the pollutant species. The PCA identified four major pollution sources at the TW site and five at the CW site. The extracted sources included vehicular internal engine combustion with unburned fuel emissions, solvent use (particularly paints), liquefied petroleum gas (LPG) or natural gas leakage, and industrial, commercial and domestic sources such as solvents, decoration, fuel combustion, chemical factories and power plants. The results of the APCS receptor model indicated that 39% and 48% of the total NMHC mass concentrations measured at CW and TW, respectively, originated from vehicle emissions; 32% and 36.4% were emitted from solvent use, and 11% and 19.4% were apportioned to LPG or natural gas leakage, respectively; 5.2% and 9% of the total NMHC mass concentrations were attributed to other industrial, commercial and domestic sources, respectively. It was also found that vehicle emissions and LPG or natural gas leakage were the main sources of C(3)-C(5) alkanes and C(3)-C(5) alkenes, while aromatics were predominantly released from paints. Comparison of source contributions to ambient NMHCs at the two sites indicated that the contribution of LPG or natural gas at the CW site was almost twice that at the TW site. High correlation coefficients (R(2) > 0.8) between the measured and predicted values suggested that the PCA/APCS model was applicable for estimation
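The APCS procedure used in this record follows a standard recipe: PCA on standardized concentrations, rescaling of the scores against a fictitious zero-concentration sample, and regression of each species on the resulting absolute scores. A minimal sketch on synthetic data (all numbers invented; not the Hong Kong data set) might look like:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
# Synthetic ambient NMHC matrix: 200 samples x 6 species (all values invented).
sources = rng.dirichlet(np.ones(3), size=200)            # hidden source strengths
profiles = rng.random((3, 6))                            # hidden source profiles
X = sources @ profiles + rng.normal(0, 0.01, (200, 6))   # "measured" concentrations

# 1) PCA on standardized concentrations.
mu, sd = X.mean(axis=0), X.std(axis=0)
Z = (X - mu) / sd
pca = PCA(n_components=3)
scores = pca.fit_transform(Z)

# 2) Absolute principal component scores: shift by the score of a
#    fictitious sample with zero concentration for every species.
z0 = (np.zeros(6) - mu) / sd
apcs = scores - pca.transform(z0[None, :])

# 3) Regress each species on the APCS; the fitted terms apportion the
#    measured concentrations among the retained components.
reg = LinearRegression().fit(apcs, X)
reconstructed = reg.predict(apcs)
r2 = 1 - ((X - reconstructed) ** 2).sum() / ((X - X.mean(axis=0)) ** 2).sum()
print(f"overall R^2 of APCS reconstruction: {r2:.3f}")
```

The regression coefficients times the absolute scores give per-sample source contributions, and the measured-versus-predicted R(2) quoted in the abstract is exactly this kind of reconstruction check.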

  6. Intercomparison of calibration procedures of high dose rate 192 Ir sources in Brazil and a proposal of a new methodology

    International Nuclear Information System (INIS)

    Marechal, M.H.; Almeida, C.E. de

    1998-01-01

The objective of this paper is to report the results of an intercomparison of the calibration procedures for 192 Ir sources presently in use in Brazil, and to propose a calibration procedure that derives the N K for a Farmer-type ionization chamber at the 192 Ir energy by interpolating between the 60 Co gamma-ray and 250 kV x-ray calibration factors. The intercomparison results all agreed to within ± 3.0%, except for one case where a 4.6% deviation was observed, later identified as a problem with the N K value for x-rays. The method proposed in the present work makes it possible to improve the metrological coherence among the calibration laboratories and their users, since the N K values could then be provided by any of the members of the SSDL network. (Author)
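The interpolation idea reported here can be sketched numerically. The sketch below assumes an equal-weight average of the two calibration factors, a common recipe for the mean 192 Ir photon energy (~397 keV), which lies between the two calibration qualities; the weighting actually used in the paper is not given in this abstract, and both factor values are invented.

```python
# Hypothetical air-kerma calibration factors for a Farmer-type chamber
# (both values invented for illustration; units Gy/C):
N_K_250kV = 4.810e7   # 250 kV x-ray quality
N_K_Co60  = 4.750e7   # Co-60 gamma quality

# Equal-weight interpolation for the mean Ir-192 photon energy (~397 keV),
# assumed here to lie midway between the two calibration qualities:
N_K_Ir192 = 0.5 * (N_K_250kV + N_K_Co60)
print(f"interpolated N_K for Ir-192: {N_K_Ir192:.3e} Gy/C")
```

With the invented inputs above this yields 4.780e+07 Gy/C; a laboratory would of course use its own traceable factors and the weighting prescribed by its protocol.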

  7. Contents, Sources and Methodology for the Study of Gender Relations in Ptolemaic and Roman Egypt (4th century B.C. - 4th century A.D.). Women's identities, power and socioeconomic situation through papyrus sources

    Directory of Open Access Journals (Sweden)

    Amaia Goñi Zabalegui

    2010-03-01

    Full Text Available The following work presents the contents and methodology of research that will culminate in a PhD thesis. The main aim is to study in depth the gender relations in Ptolemaic and Roman Egypt between the fourth century B.C. and the fourth century A.D., for which it is necessary to know the identities, power and socioeconomic situation of women in this historical framework. In this way, the work, whose foundations are set out in the article below, serves as an illustration of applying a gender perspective to the study of the ancient world. To achieve this purpose, papyrus sources written or commissioned by women will be used, as well as sources that make explicit reference to them. Specifically, the work will focus on personal letters that report on the existence and preoccupations of a certain group of women in Greco-Roman Egypt.

  8. Study and methodology development for quality control in the production process of iodine-125 radioactive sealed sources applied to brachytherapy

    International Nuclear Information System (INIS)

    Moura, Joao Augusto

    2009-01-01

    Today cancer is the second leading cause of death by disease in several countries, including Brazil. Excluding skin cancer, prostate cancer is the most frequent in the population. Prostate tumors can be treated in several ways, including brachytherapy, which consists of introducing sealed radioactive sources (iodine-125 seeds) inside the tumor. The target region of the treatment receives a high radiation dose, while neighboring healthy tissues receive a significantly reduced dose. The seed is made of a weld-sealed titanium capsule, 0.8 mm in external diameter and 4.5 mm in length, enclosing a 0.5 mm diameter silver wire with iodine-125 adsorbed onto it. After welding, the seeds have to be submitted to a leak test to prevent any release of radioactive material. The aims of this work were: (a) to study the different leak-test methods applied to radioactive seeds and recommended by ISO 997820, (b) to choose the appropriate method and (c) to determine the flowchart of the process to be used during seed production. The tests went beyond the standards by using ultrasound during immersion, with corresponding benefits for leak detection. The best results were obtained with immersion in distilled water at 20 °C for 24 hours and in distilled water at 70 °C for 30 minutes. These methods will be used during seed production. The process flowchart includes all the phases of the leak tests, in the sequence determined in the experiments. (author)

  9. An open-source framework for analyzing N-electron dynamics. II. Hybrid density functional theory/configuration interaction methodology.

    Science.gov (United States)

    Hermann, Gunter; Pohl, Vincent; Tremblay, Jean Christophe

    2017-10-30

    In this contribution, we extend our framework for analyzing and visualizing correlated many-electron dynamics to non-variational, highly scalable electronic structure methods. Specifically, an explicitly time-dependent electronic wave packet is written as a linear combination of N-electron wave functions at the configuration interaction singles (CIS) level, which are obtained from a reference time-dependent density functional theory (TDDFT) calculation. The procedure is implemented in the open-source Python program detCI@ORBKIT, which extends the capabilities of our recently published post-processing toolbox (Hermann et al., J. Comput. Chem. 2016, 37, 1511). From the output of standard quantum chemistry packages using atom-centered Gaussian-type basis functions, the framework exploits the multideterminantal structure of the hybrid TDDFT/CIS wave packet to compute fundamental one-electron quantities such as difference electronic densities, transient electronic flux densities, and transition dipole moments. The hybrid scheme is benchmarked against wave function data for the laser-driven state-selective excitation in LiH. It is shown that all features of the electron dynamics are in good quantitative agreement with the higher-level method, provided a judicious choice of functional is made. Broadband excitation of a medium-sized organic chromophore further demonstrates the scalability of the method. In addition, the time-dependent flux densities unravel the mechanistic details of the simulated charge migration process at a glance. © 2017 Wiley Periodicals, Inc.
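The core of such an expansion, propagating a wave packet as a phase-evolving linear combination of eigenstates and evaluating one-electron expectation values from the coefficients, can be sketched in a few lines. The model below is a 3-state system with invented energies, coefficients and dipole matrix elements, not data from detCI@ORBKIT or the LiH benchmark.

```python
import numpy as np

# Model 3-state system (atomic units); energies, coefficients and dipoles
# are invented for illustration.
E  = np.array([0.0, 0.12, 0.20])                # eigenstate energies (Hartree)
c0 = np.array([0.8, 0.6, 0.0], dtype=complex)   # initial expansion coefficients (normalized)
mu = np.array([[0.0, 1.2, 0.0],                 # (transition) dipole matrix (a.u.)
               [1.2, 0.0, 0.4],
               [0.0, 0.4, 0.0]])

def dipole(t):
    """Field-free propagation: each eigenstate only acquires a phase."""
    c = c0 * np.exp(-1j * E * t)
    return float(np.real(np.conj(c) @ mu @ c))

for t in np.linspace(0.0, 200.0, 5):
    print(f"t = {t:6.1f} a.u.   <mu> = {dipole(t):+.4f}")
```

The oscillation of the expectation value at the Bohr frequencies of the populated states is the same mechanism that drives the transient densities and flux densities the framework visualizes.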

  10. sources

    Directory of Open Access Journals (Sweden)

    Shu-Yin Chiang

    2002-01-01

    Full Text Available In this paper, we study simplified models of an ATM (Asynchronous Transfer Mode) multiplexer network with Bernoulli random traffic sources. Based on the model, the performance measures are analyzed under different output service schemes.

  11. Water quality assessment and apportionment of pollution sources using APCS-MLR and PMF receptor modeling techniques in three major rivers of South Florida.

    Science.gov (United States)

    Haji Gholizadeh, Mohammad; Melesse, Assefa M; Reddi, Lakshmi

    2016-10-01

    In this study, principal component analysis (PCA), factor analysis (FA), and the absolute principal component score-multiple linear regression (APCS-MLR) receptor modeling technique were used to assess the water quality and to identify and quantify the potential pollution sources affecting the water quality of three major rivers of South Florida. For this purpose, a 15-year (2000-2014) dataset of 12 water quality variables, covering 16 monitoring stations and approximately 35,000 observations, was used. The PCA/FA method identified five and four potential pollution sources in the wet and dry seasons, respectively, and the effective mechanisms, rules and causes were explained. The APCS-MLR method apportioned their contributions to each water quality variable. Results showed that point source discharges from anthropogenic activities, namely agricultural waste and domestic and industrial wastewater, were the major sources of river water contamination. The studied variables were categorized into three groups: nutrients (total Kjeldahl nitrogen, total phosphorus, total phosphate, and ammonia-N), water murkiness-related parameters (total suspended solids, turbidity, and chlorophyll-a), and salt ions (magnesium, chloride, and sodium), and the average contributions of the different potential pollution sources to these categories were considered separately. The data matrix was also subjected to the PMF receptor model using the EPA PMF-5.0 program, and the two-way model described was used for the PMF analyses. Comparison of the results of the PMF and APCS-MLR models showed significant differences in the estimated contribution of each potential pollution source, especially in the wet season. Eventually, it was concluded that the APCS-MLR receptor modeling approach appears to be more physically plausible for the current study. It is believed that the results of apportionment could be very useful to the local authorities for the control and
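EPA PMF minimizes an uncertainty-weighted objective function; its unweighted cousin, non-negative matrix factorization, conveys the same core idea of factoring a data matrix into non-negative source contributions and profiles. A minimal sketch on synthetic data (all values invented; not the South Florida dataset, and without PMF's per-observation uncertainty weights):

```python
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(2)
# Synthetic matrix: 300 samples x 8 water-quality variables generated from
# 4 non-negative sources (all values invented).
G_true = rng.random((300, 4))                    # source contributions
F_true = rng.random((4, 8))                      # source profiles
X = np.clip(G_true @ F_true + rng.normal(0, 0.01, (300, 8)), 0.0, None)

# Factor X ~= G @ F with both factors constrained to be non-negative,
# which is what keeps the recovered "sources" physically interpretable.
model = NMF(n_components=4, init="nndsvda", max_iter=1000, random_state=0)
G = model.fit_transform(X)   # estimated contributions
F = model.components_        # estimated profiles
recon_err = np.linalg.norm(X - G @ F) / np.linalg.norm(X)
print(f"relative reconstruction error: {recon_err:.4f}")
```

PMF additionally divides each residual by its measurement uncertainty before summing, which is why factor choice and error estimates can differ from APCS-MLR results, as this record observes.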

  12. Systematic reviews and meta-analyses on psoriasis: role of funding sources, conflict of interest and bibliometric indices as predictors of methodological quality.

    Science.gov (United States)

    Gómez-García, F; Ruano, J; Aguilar-Luque, M; Gay-Mimbrera, J; Maestre-Lopez, B; Sanz-Cabanillas, J L; Carmona-Fernández, P J; González-Padilla, M; Vélez García-Nieto, A; Isla-Tejera, B

    2017-06-01

    The quality of systematic reviews and meta-analyses on psoriasis, a chronic inflammatory skin disease that severely impairs quality of life and is associated with high costs, remains unknown. To assess the methodological quality of systematic reviews published on psoriasis. After a comprehensive search in MEDLINE, Embase and the Cochrane Database (PROSPERO: CDR42016041611), the quality of studies was assessed by two raters using the Assessment of Multiple Systematic Reviews (AMSTAR) tool. Article metadata and journal-related bibliometric indices were also obtained. Systematic reviews were classified as low (0-4), moderate (5-8) or high (9-11) quality. A prediction model for methodological quality was fitted using principal component and multivariate ordinal logistic regression analyses. We classified 220 studies as high (17·2%), moderate (55·0%) or low (27·8%) quality. Lower compliance rates were found for AMSTAR question (Q)5 (list of studies provided, 11·4%), Q10 (publication bias assessed, 27·7%), Q4 (status of publication included, 39·5%) and Q1 (a priori design provided, 40·9%). Factors such as meta-analysis inclusion [odds ratio (OR) 6·22; 95% confidence interval (CI) 2·78-14·86], funding by academic institutions (OR 2·90, 95% CI 1·11-7·89), Article Influence score (OR 2·14, 95% CI 1·05-6·67), 5-year impact factor (OR 1·34, 95% CI 1·02-1·40) and article page count (OR 1·08, 95% CI 1·02-1·15) significantly predicted higher quality. A high number of authors with a conflict of interest (OR 0·90, 95% CI 0·82-0·99) was significantly associated with lower quality. The methodological quality of systematic reviews published about psoriasis remains suboptimal. The type of funding sources and author conflicts may compromise study quality, increasing the risk of bias. © 2017 British Association of Dermatologists.

  13. A shift in emission time profiles of fossil fuel combustion due to energy transitions impacts source receptor matrices for air quality.

    Science.gov (United States)

    Hendriks, Carlijn; Kuenen, Jeroen; Kranenburg, Richard; Scholz, Yvonne; Schaap, Martijn

    2015-03-01

    Effective air pollution and short-lived climate forcer mitigation strategies can only be designed when the effect of emission reductions on pollutant concentrations and on health and ecosystem impacts is quantified. Within integrated assessment modeling, source-receptor relationships (SRRs) based on chemistry transport modeling are used to this end. Currently, these SRRs are made using invariant emission time profiles. The LOTOS-EUROS model equipped with a source attribution module was used to test this assumption for renewable energy scenarios. Renewable energy availability, and thereby fossil fuel backup, is strongly dependent on meteorological conditions. We used the spatially and temporally explicit energy model REMix to derive time profiles for backup power generation. These time profiles were used in LOTOS-EUROS to investigate the effect of emission timing on air pollutant concentrations and SRRs. It is found that the effectiveness of emission reduction in the power sector is significantly lower when accounting for the shift in the way emissions are distributed over the year and the correlation of emissions with synoptic situations. The source-receptor relationships also changed significantly. This effect was found for both primary and secondary pollutants. Our results indicate that emission timing deserves explicit attention when assessing the impacts of system changes on air quality and climate forcing from short-lived substances.
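Once computed, an SRR is typically applied as a simple linear map from emission changes to concentration (or exposure) changes; the point of this record is that the matrix entries themselves shift when emission timing changes. A toy illustration of the baseline linear use, with invented numbers:

```python
import numpy as np

# Illustrative 3-source-region x 2-receptor SRR matrix: concentration change
# at each receptor per unit emission reduction in each source region
# (all numbers invented; units arbitrary).
S = np.array([[0.30, 0.05],
              [0.10, 0.25],
              [0.02, 0.08]])

d_emis = np.array([10.0, 0.0, 5.0])   # emission reductions per source region
d_conc = d_emis @ S                   # predicted concentration change per receptor
print(d_conc)
```

Here the predicted changes are 3.1 and 0.9 concentration units at the two receptors; in the study, recomputing S with meteorology-dependent emission time profiles changes these entries, and hence the apparent effectiveness of the same reduction.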

  14. Source apportionment of soil heavy metals using robust absolute principal component scores-robust geographically weighted regression (RAPCS-RGWR) receptor model.

    Science.gov (United States)

    Qu, Mingkai; Wang, Yan; Huang, Biao; Zhao, Yongcun

    2018-06-01

    Traditional source apportionment models, such as absolute principal component scores-multiple linear regression (APCS-MLR), are usually susceptible to outliers, which may be widely present in regional geochemical datasets. Furthermore, these models are built on variable space instead of geographical space and thus cannot effectively capture the local spatial characteristics of each source's contributions. To overcome these limitations, a new receptor model, robust absolute principal component scores-robust geographically weighted regression (RAPCS-RGWR), was proposed based on the traditional APCS-MLR model. The new method was then applied to the source apportionment of soil metal elements in a region of Wuhan City, China, as a case study. Evaluations revealed that: (i) the RAPCS-RGWR model performed better than the APCS-MLR model in identifying the major sources of soil metal elements, and (ii) the source contributions estimated by the RAPCS-RGWR model were closer to the true soil metal concentrations than those estimated by the APCS-MLR model. The proposed RAPCS-RGWR model is thus a more effective source apportionment method than the non-robust, global APCS-MLR model in dealing with regional geochemical datasets. Copyright © 2018 Elsevier B.V. All rights reserved.
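The geographically weighted regression at the core of the proposed model fits a separate weighted least-squares regression at each location, with weights that decay with distance. A minimal non-robust sketch on synthetic data (all values invented; the paper's robust variant would replace the local least-squares step with a robust fit and the PCA step with a robust PCA):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 150
coords = rng.random((n, 2)) * 10.0               # sampling locations (km)
x = rng.random(n)                                # predictor (e.g. a source score)
beta_true = 1.0 + 0.3 * coords[:, 0]             # coefficient drifts west -> east
y = beta_true * x + rng.normal(0.0, 0.05, n)     # response (e.g. a metal level)

def local_slope(i, bandwidth=2.0):
    """Weighted least squares centred on sample i with a Gaussian kernel."""
    d = np.linalg.norm(coords - coords[i], axis=1)
    w = np.exp(-(d / bandwidth) ** 2)
    X = np.column_stack([np.ones(n), x])
    W = np.diag(w)
    beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
    return beta[1]   # local slope for the predictor

west = local_slope(int(np.argmin(coords[:, 0])))
east = local_slope(int(np.argmax(coords[:, 0])))
print(f"local slope west: {west:.2f}, east: {east:.2f}")
```

A single global regression would report one averaged slope; the local fits recover the west-to-east drift, which is exactly the spatial characteristic a global APCS-MLR cannot capture.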

  15. Scavenger Receptor Class B, Type I, a CD36 Related Protein in Macrobrachium nipponense: Characterization, RNA Interference, and Expression Analysis with Different Dietary Lipid Sources

    Directory of Open Access Journals (Sweden)

    Zhili Ding

    2016-01-01

    Full Text Available The scavenger receptor class B, type I (SR-BI) is a member of the CD36 superfamily, comprising transmembrane proteins involved in mammalian and fish lipid homeostasis regulation. We hypothesize that this receptor plays an important role in Macrobrachium nipponense lipid metabolism. However, little attention has been paid to SR-BI in commercial crustaceans. In the present study, we report a cDNA encoding the M. nipponense scavenger receptor class B, type I (designated MnSR-BI), obtained from a hepatopancreas cDNA library. The complete MnSR-BI coding sequence was 1545 bp, encoding a peptide of 514 amino acids. The MnSR-BI primary structure consists of a CD36 domain that contains two transmembrane regions at the N- and C-terminals of the protein. SR-BI mRNA expression was specifically detected in muscle, gill, ovum, intestine, hepatopancreas, stomach, and ovary tissues. Furthermore, its expression in the hepatopancreas was regulated by dietary lipid sources, with prawns fed soybean and linseed oils exhibiting higher expression levels. RNAi-based SR-BI silencing resulted in the suppression of its expression in the hepatopancreas and variation in the expression of lipid metabolism-related genes. This is the first report of SR-BI in freshwater prawns and provides the basis for further studies on SR-BI in crustaceans.

  16. Receptor modeling of C2–C7 hydrocarbon sources at an urban background site in Zurich, Switzerland: changes between 1993–1994 and 2005–2006

    Directory of Open Access Journals (Sweden)

    S. Reimann

    2008-05-01

    Full Text Available Hourly measurements of 13 volatile hydrocarbons (C2–C7) were performed at an urban background site in Zurich (Switzerland) in the years 1993–1994 and again in 2005–2006. For the separation of the volatile organic compounds by gas chromatography (GC), an identical chromatographic column was used in both campaigns. Changes in hydrocarbon profiles and source strengths were recovered by positive matrix factorization (PMF). Eight and six factors could be related to hydrocarbon sources in 1993–1994 and in 2005–2006, respectively. The modeled source profiles were verified against hydrocarbon profiles reported in the literature. The source strengths were validated by independent measurements, such as inorganic trace gases (NOx, CO, SO2), methane (CH4), oxidized hydrocarbons (OVOCs) and meteorological data (temperature, wind speed, etc.). Our analysis suggests that the contribution of most hydrocarbon sources (i.e. road traffic, solvent use and wood burning) decreased by a factor of about two to three between the early 1990s and 2005–2006. On the other hand, hydrocarbon losses from natural gas leakage remained at relatively constant levels (−20%). The estimated emission trends are in line with the results from different receptor-based approaches reported for other European cities. Their differences from national emission inventories are discussed.

  17. Using air quality modeling to study source-receptor relationships between nitrogen oxides emissions and ozone exposures over the United States.

    Science.gov (United States)

    Tong, Daniel Q; Muller, Nicholas Z; Kan, Haidong; Mendelsohn, Robert O

    2009-11-01

    Human exposure to ambient ozone (O(3)) has been linked to a variety of adverse health effects. The ozone level at a location is contributed by local production, regional transport, and background ozone. This study combines detailed emission inventory, air quality modeling, and census data to investigate the source-receptor relationships between nitrogen oxides (NO(x)) emissions and population exposure to ambient O(3) in 48 states over the continental United States. By removing NO(x) emissions from each state one at a time, we calculate the change in O(3) exposures by examining the difference between the base and the sensitivity simulations. Based on the 49 simulations, we construct state-level and census region-level source-receptor matrices describing the relationships among these states/regions. We find that, for 43 receptor states, cumulative NO(x) emissions from upwind states contribute more to O(3) exposures than the state's own emissions. In-state emissions are responsible for less than 15% of O(3) exposures in 90% of U.S. states. A state's NO(x) emissions can influence 2 to 40 downwind states by at least a 0.1 ppbv change in population-averaged O(3) exposure. The results suggest that the U.S. generally needs a regional strategy to effectively reduce O(3) exposures. But the current regional emission control program in the U.S. is a cap-and-trade program that assumes the marginal damage of every ton of NO(x) is equal. In this study, the average O(3) exposures caused by one ton of NO(x) emissions ranges from -2.0 to 2.3 ppm-people-hours depending on the state. The actual damage caused by one ton of NO(x) emissions varies considerably over space.

  18. Source-pathway-receptor investigation of the fate of trace elements derived from shotgun pellets discharged in terrestrial ecosystems managed for game shooting

    International Nuclear Information System (INIS)

    Sneddon, Jennifer; Clemente, Rafael; Riby, Philip; Lepp, Nicholas W.

    2009-01-01

    Spent shotgun pellets may contaminate terrestrial ecosystems. We examined the fate of elements originating from shotgun pellets in pasture and woodland ecosystems. Two source-receptor pathways were investigated: i) soil - soil pore water - plant, and ii) whole earthworm/worm gut contents and washed and unwashed small mammal hair. Concentrations of Pb and associated contaminants were higher in soils from shot areas than in controls. Arsenic and lead concentrations were positively correlated in soils, soil pore water and associated biota. Element concentrations in biota were below statutory levels at all locations. The bioavailability of lead to small mammals, based on concentrations in washed body hair, was low. Lead movement from soil water to higher trophic levels was minor compared to lead adsorbed onto body surfaces. Lead was concentrated in the earthworm gut and in some plants. The results indicate that managed game shooting presents minimal risk in terms of element transfer to soils and their associated biota. - Source-receptor pathway analysis of a managed game shooting site showed no environmental risk of trace element transfer.

  19. Source-pathway-receptor investigation of the fate of trace elements derived from shotgun pellets discharged in terrestrial ecosystems managed for game shooting

    Energy Technology Data Exchange (ETDEWEB)

    Sneddon, Jennifer [School of Natural Sciences and Psychology, Liverpool John Moores University, Byrom Street, Liverpool L3 3AF (United Kingdom); Clemente, Rafael, E-mail: rclemente@cebas.csic.e [School of Natural Sciences and Psychology, Liverpool John Moores University, Byrom Street, Liverpool L3 3AF (United Kingdom); Riby, Philip [School of Pharmacy and Chemistry, Liverpool John Moores University, Liverpool L3 3AF (United Kingdom); Lepp, Nicholas W., E-mail: n.w.lepp@ljmu.ac.u [School of Natural Sciences and Psychology, Liverpool John Moores University, Byrom Street, Liverpool L3 3AF (United Kingdom)

    2009-10-15

    Spent shotgun pellets may contaminate terrestrial ecosystems. We examined the fate of elements originating from shotgun pellets in pasture and woodland ecosystems. Two source-receptor pathways were investigated: i) soil - soil pore water - plant, and ii) whole earthworm/worm gut contents and washed and unwashed small mammal hair. Concentrations of Pb and associated contaminants were higher in soils from shot areas than in controls. Arsenic and lead concentrations were positively correlated in soils, soil pore water and associated biota. Element concentrations in biota were below statutory levels at all locations. The bioavailability of lead to small mammals, based on concentrations in washed body hair, was low. Lead movement from soil water to higher trophic levels was minor compared to lead adsorbed onto body surfaces. Lead was concentrated in the earthworm gut and in some plants. The results indicate that managed game shooting presents minimal risk in terms of element transfer to soils and their associated biota. - Source-receptor pathway analysis of a managed game shooting site showed no environmental risk of trace element transfer.

  20. Using a source-receptor approach to characterise VOC behaviour in a French urban area influenced by industrial emissions. Part II: source contribution assessment using the Chemical Mass Balance (CMB) model.

    Science.gov (United States)

    Badol, Caroline; Locoge, Nadine; Galloo, Jean-Claude

    2008-01-25

    In Part I of this study (Badol C, Locoge N, Leonardis T, Galloo JC. Using a source-receptor approach to characterise VOC behaviour in a French urban area influenced by industrial emissions, Part I: Study area description, data set acquisition and qualitative analysis of the data set. Sci Total Environ 2007; submitted as companion manuscript.), the study area, the acquisition of the one-year data set and the qualitative analysis of the data set were described. In Part II, a source profile has been established for each activity present in the study area: 6 profiles (urban heating, solvent use, natural gas leakage, biogenic emissions, gasoline evaporation and vehicle exhaust) were extracted from the literature to characterise urban sources, and 7 industrial profiles were established via canister sampling around industrial plants (hydrocarbon cracking, oil refinery, hydrocarbon storage, lubricant storage, lubricant refinery, surface treatment and metallurgy). The CMB model is briefly described and its implementation is discussed through the selection of source profiles and fitting species. The main results of CMB modelling for the Dunkerque area are presented. (1) The daily evolution of source contributions for the urban wind sector shows that the vehicle exhaust contribution varies between 40 and 55%, and its relative increase at traffic rush hours is hardly perceptible. (2) The relative contribution of vehicle exhaust varies from 55% in winter down to 30% in summer; this decrease is due to the increase in the relative contribution of the hydrocarbon storage source, which reaches up to 20% in summer. (3) The evolution of source contributions with wind direction confirmed that in urban wind sectors the contribution of vehicle exhaust dominates, at around 45-55%. For the other wind sectors, which include some industrial plants, the contribution of industrial sources is around 60% and can reach 80% for the sector 280-310 degrees, which corresponds to the most dense
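At its core, CMB is a least-squares fit of the ambient concentrations of the fitting species to a linear combination of the source profiles. A minimal sketch using ordinary non-negative least squares in place of CMB's effective-variance weighting (profiles, species and concentrations all invented):

```python
import numpy as np
from scipy.optimize import nnls

# Hypothetical source profiles: mass fraction of each fitting species per
# unit source contribution. Columns: vehicle exhaust, solvent use,
# hydrocarbon storage (all numbers invented for illustration).
profiles = np.array([
    [0.30, 0.02, 0.10],   # species 1
    [0.20, 0.40, 0.05],   # species 2
    [0.10, 0.05, 0.50],   # species 3
    [0.25, 0.10, 0.05],   # species 4
])

# Ambient receptor concentrations of the same species (here synthesized
# from known contributions so the fit can be checked):
ambient = profiles @ np.array([8.0, 3.0, 2.0])

# CMB solves ambient ~= profiles @ contributions; non-negative least
# squares keeps the estimated source contributions physical.
contributions, residual = nnls(profiles, ambient)
print(contributions)
```

On this consistent synthetic system the fit recovers the contributions 8, 3 and 2 almost exactly; with real data, profile selection and fitting-species selection, as discussed in the record, determine how well-conditioned this inversion is.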

  1. Chapter three: methodology of exposure modeling

    CSIR Research Space (South Africa)

    Moschandreas, DJ

    2002-12-01

    Full Text Available Exposure methodologies and models are reviewed. Three exposure/measurement methodologies are assessed. Estimation methods focus on source evaluation and attribution; sources include those outdoors and indoors, as well as in occupational and in-transit environments. Fate...

  2. Exploring 2D and 3D QSARs of benzimidazole derivatives as transient receptor potential melastatin 8 (TRPM8) antagonists using MLR and kNN-MFA methodology

    Directory of Open Access Journals (Sweden)

    Kamlendra Singh Bhadoriya

    2016-09-01

    Full Text Available TRPM8 is now best known as a cold- and menthol-activated channel implicated in thermosensation. TRPM8 is specifically expressed in a subset of pain- and temperature-sensing neurons and plays a major role in the sensation of cold and of cooling substances. TRPM8 is a potential new target for the treatment of painful conditions. Thus, TRPM8 antagonists represent a novel and potentially useful treatment strategy for various disease states such as urological disorders, asthma, COPD, prostate and colon cancers, and painful conditions related to cold, such as cold allodynia and cold hyperalgesia. Better tools, such as potent and specific TRPM8 antagonists, are needed, given the high unmet medical need for such progress. To achieve this objective, quantitative structure-activity relationship (QSAR) studies were carried out on a series of 25 benzimidazole-containing TRPM8 antagonists to investigate the structural requirements for their inhibitory activity against cTRPM8. The statistically most significant 2D-QSAR model, having a correlation coefficient r2 = 0.88 and a cross-validated squared correlation coefficient q2 = 0.64 with an external predictive ability of pred_r2 = 0.69, was developed by SW-MLR. Physico-chemical descriptors such as polarizabilityAHP, kappa2, XcompDipole, +vePotentialSurfaceArea and XKMostHydrophilic were found to show a significant correlation with biological activity in the benzimidazole derivatives. Molecular field analysis was used to construct the best 3D-QSAR model using the SW-kNN method, showing good correlative and predictive capabilities in terms of q2 = 0.81 and pred_r2 = 0.55. The developed kNN-MFA model highlighted the importance of the shape of the molecules, i.e., the steric and electrostatic descriptors at grid points S_774 and E_1024, for TRPM8 receptor binding. These models (2D and 3D) were found to yield reliable clues for further optimization of the benzimidazole derivatives in the data set. The information rendered by 2D- and 3D
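The internal validation statistics quoted in this record (fitted r2 and leave-one-out cross-validated q2) can be reproduced for any MLR-based QSAR model along the following lines. The descriptors and activities below are synthetic stand-ins, not the benzimidazole data set.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import LeaveOneOut, cross_val_predict

rng = np.random.default_rng(4)
# Synthetic training set: 25 "compounds" x 3 descriptors (loosely playing
# the role of e.g. polarizability, shape index, dipole term; all invented).
X = rng.normal(size=(25, 3))
y = 1.5 * X[:, 0] - 0.7 * X[:, 1] + 0.3 * X[:, 2] + rng.normal(0, 0.2, 25)

model = LinearRegression().fit(X, y)
r2 = model.score(X, y)   # fitted r^2 on the training set

# Leave-one-out cross-validated q^2, the standard internal validation in QSAR:
# each compound is predicted by a model trained on the other 24.
y_loo = cross_val_predict(LinearRegression(), X, y, cv=LeaveOneOut())
q2 = 1 - ((y - y_loo) ** 2).sum() / ((y - y.mean()) ** 2).sum()
print(f"r2 = {r2:.2f}, q2 = {q2:.2f}")
```

q2 is always at most r2 for a linear model, and a large gap between the two is the usual warning sign of overfitting; the external pred_r2 reported in the record is the same statistic evaluated on held-out compounds.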

  3. Response of dissolved trace metals to land use/land cover and their source apportionment using a receptor model in a subtropic river, China.

    Science.gov (United States)

    Li, Siyue; Zhang, Quanfa

    2011-06-15

    Water samples were collected for determination of dissolved trace metals at 56 sampling sites throughout the upper Han River, China. Multivariate statistical analyses including correlation analysis, stepwise multiple linear regression models, and principal component and factor analysis (PCA/FA) were employed to examine land use influences on trace metals, and a receptor model of factor analysis-multiple linear regression (FA-MLR) was used for source identification/apportionment of anthropogenic heavy metals in the surface water of the river. Our results revealed that land use was an important predictor of water metals in the snowmelt flow period, and that land use in the riparian zone was not a better predictor of metals than land use away from the river. Urbanization in a watershed and vegetation along river networks could better explain metal concentrations, whereas agriculture, regardless of its relative location, explained little of the metal variability in the upper Han River. FA-MLR analysis identified five source types of metals, with mining, fossil fuel combustion, and vehicle exhaust the dominant pollution sources in the surface waters. The results demonstrate the great impact of human activities on metal concentrations in this subtropical river of China. Copyright © 2011 Elsevier B.V. All rights reserved.
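The MLR step of the FA-MLR receptor model described above can be sketched as follows. Assumptions are labelled: the factor scores are taken as already extracted by PCA/FA, the two source names are hypothetical, and the data are synthetic, so this is a sketch of the apportionment arithmetic rather than the paper's analysis.

```python
# Sketch of the MLR step of FA-MLR source apportionment. Total metal
# concentration is regressed on factor scores; each coefficient times
# the mean factor score approximates that source's mean contribution.
# Factor scores, source labels, and concentrations are synthetic.

def solve(A, b):
    """Solve A x = b by Gauss-Jordan elimination (A square, well-posed)."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        piv = M[c][c]
        M[c] = [v / piv for v in M[c]]
        for r in range(n):
            if r != c:
                f = M[r][c]
                M[r] = [v - f * p for v, p in zip(M[r], M[c])]
    return [M[i][n] for i in range(n)]

def ols(X, y):
    """Ordinary least squares via the normal equations (X'X) b = X'y."""
    p = len(X[0])
    XtX = [[sum(r[a] * r[c] for r in X) for c in range(p)] for a in range(p)]
    Xty = [sum(r[a] * yi for r, yi in zip(X, y)) for a in range(p)]
    return solve(XtX, Xty)

# Columns: intercept, "mining" factor score, "combustion" factor score.
X = [[1.0, 0.2, 1.1], [1.0, 0.5, 0.4], [1.0, 1.3, 0.9],
     [1.0, 0.8, 1.5], [1.0, 1.9, 0.3], [1.0, 0.4, 2.0]]
# Synthetic total metal = 1 + 2*mining + 3*combustion (exact, no noise):
y = [r[0] * 1.0 + r[1] * 2.0 + r[2] * 3.0 for r in X]
b = ols(X, y)
# Mean contribution of the "mining" factor = coefficient * mean score:
contrib_mining = b[1] * sum(r[1] for r in X) / len(X)
```

With noiseless synthetic data the regression recovers the generating coefficients exactly; with real measurements the coefficients carry the apportionment uncertainty.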

  4. Design Methodology - Design Synthesis

    DEFF Research Database (Denmark)

    Andreasen, Mogens Myrup

    2003-01-01

    Design Methodology is part of our practice and our knowledge about designing, and it has been strongly supported by the establishment and work of a design research community. The aim of this article is to broaden the reader's view of designing and Design Methodology. This is done by sketching the development of Design Methodology through time and outlining some important approaches and methods. The development is mainly driven by changing industrial conditions, by the growth of IT support for designing, but also by the growth of insight into designing created by design researchers. ... Design Methodology shall be seen as our understanding of how to design; it is an early (emerging in the late 60ies) and original articulation of teachable and learnable methodics. The insight is based upon two sources: the nature of the designed artefacts and the nature of human designing. Today...

  5. SCI Hazard Report Methodology

    Science.gov (United States)

    Mitchell, Michael S.

    2010-01-01

    This slide presentation reviews the methodology for creating a Source Control Item (SCI) Hazard Report (HR). The SCI HR provides a system safety risk assessment for the following Ares I Upper Stage Production Contract (USPC) components: (1) Pyro Separation Systems, (2) Main Propulsion System, (3) Reaction and Roll Control Systems, (4) Thrust Vector Control System, and (5) Ullage Settling Motor System.

  6. On methodology

    DEFF Research Database (Denmark)

    Cheesman, Robin; Faraone, Roque

    2002-01-01

    This is an English version of the methodology chapter in the authors' book "El caso Berríos: Estudio sobre información errónea, desinformación y manipulación de la opinión pública".

  7. Estimation of parameters of finite seismic source model for selected event of West Bohemia year 2008 seismic swarm-methodology improvement and data extension

    Czech Academy of Sciences Publication Activity Database

    Kolář, Petr

    2015-01-01

    Roč. 19, č. 4 (2015), s. 935-947 ISSN 1383-4649 Institutional support: RVO:67985530 Keywords: West Bohemia 2008 seismic swarm * finite seismic source * stopping phases Subject RIV: DC - Seismology, Volcanology, Earth Structure Impact factor: 1.550, year: 2015

  8. A multi-criteria methodology for energy planning and developing renewable energy sources at a regional level: A case study Thassos, Greece

    International Nuclear Information System (INIS)

    Mourmouris, J.C.; Potolias, C.

    2013-01-01

    Rational energy planning under the pressure of environmental and economic problems is imperative. An evaluation framework is proposed to support energy planning for promoting the use of renewable energy sources. A multi-criteria decision analysis is adopted, detailing exploitation of renewable energy sources (including wind, solar, biomass, geothermal, and small hydro) for power and heat generation. The aim of this paper is the analysis and development of a multilevel decision-making structure, utilizing multiple criteria for energy planning and exploitation of Renewable Energy Sources at the regional level. The proposed evaluation framework focuses on the use of a multi-criteria approach as a tool for supporting energy planning in the area of concern, based on a pool of qualitative and quantitative evaluation criteria. The final aim of this study is to discover the optimal amount of each Renewable Energy Source that can be produced in the region and to contribute to an optimal energy mix. In this paper, a case study for the island of Thassos, Greece is analyzed. The results show that Renewable Energy Sources exploitation at a regional level can satisfy increasing power demands through environmentally friendly energy systems that combine wind power, biomass and PV systems. - Highlights: ► An evaluation framework is proposed in order to support energy planning. ► A multi-criteria decision analysis is adopted, detailing exploitation of RES for power and heat generation. ► The aim is to discover the optimal amount of each RES that can be produced in each region.

  9. Development of methodology for the synthesis of poly(lactic acid-co-glycolic acid) for use in the production of radioactive sources

    International Nuclear Information System (INIS)

    Peleias Junior, Fernando dos Santos; Zeituni, Carlos Alberto; Rostelato, Maria Elisa Chuery Martins; Souza, Carla Daruich de; Mattos, Fabio Rodrigues de; Moura, Eduardo Santana de; Moura, Joao Augusto; Benega, Marcos Antonio Gimenes; Feher, Anselmo; Costa, Osvaldo Luiz da; Rodrigues, Bruna Teiga; Fechine, Guilhermino Jose

    2015-01-01

    According to the World Health Organization, cancer is a leading cause of death worldwide. A radiotherapy method extensively used in prostate cancer is brachytherapy, where the area requiring treatment receives radioactive seeds. Iodine-125 seeds can be inserted loose or stranded in bioabsorbable polymers produced from poly(lactic-co-glycolic acid) (PLGA). We developed the synthesis methodology for PLGA, and the results obtained show that it was possible to determine the optimal reaction parameters (time and temperature) for PLGA in an 80/20 (lactide/glycolide) ratio. The yield was higher than 90% using a temperature of 110 °C and a reaction time of 72 hours; however, the molecular weight values obtained are very low compared to those obtained by other authors. New tests using previously synthesized dimers and a nitrogen atmosphere are being performed; these conditions could potentially increase the molar mass of PLGA. All techniques used confirmed the expected structure of the polymer. (author)

  10. Source-receptor relationships for PM2.5 during typical pollution episodes in the Pearl River Delta city cluster, China

    Science.gov (United States)

    Fan, Q.; Liu, Y.; Hong, Y.; Wang, X.; Chan, P.; Chen, X.; Lai, A.; Wang, M.; Chen, X.

    2017-12-01

    Located in the Southern China monsoon region, pollution days in the Pearl River Delta (PRD) were classified as "Western type", "Central type" or "Eastern type", with relative percentages of 67%, 24% and 9%, respectively. Using this classification system, three typical pollution events were selected for numerical simulation with the WRF-Chem model. A source-sensitivity method for anthropogenic emissions of PM2.5 and its precursors was applied to identify the source-receptor relationships for PM2.5 among the 9 cities of the PRD. For the "Western type" case, the PRD region was under the control of a high-pressure system with easterly prevailing winds. PM2.5 concentrations in the western PRD were higher than those in the east, with emissions from cities in the eastern PRD contributing more. Within the PRD's urban cluster, PM2.5 in Huizhou, Dongguan and Shenzhen was mainly derived from local emissions, whereas PM2.5 in the other cities was primarily derived from external transport. For the "Eastern type" case, the PRD was influenced by Typhoon Soulik with westerly prevailing winds; emissions from cities in the western PRD had the highest impacts on the overall PM2.5 concentration, and PM2.5 in Jiangmen and Foshan was primarily derived from local emissions. In the "Central type" case, the PRD region was under the control of a uniform pressure field with low wind speeds, and the PM2.5 concentrations of each city were primarily caused by local emissions. Overall, wind flows played a significant role in the transport and spatial distribution of PM2.5 across the PRD region. Local governments would therefore be wise to establish joint prevention and control measures to reduce regional atmospheric pollution, especially for "Western type" pollution.
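The source-sensitivity (zero-out) idea behind such source-receptor relationships can be illustrated with a toy linear model: the contribution of a source city at a receptor is the base-case concentration minus the concentration simulated with that city's emissions removed. The city names, transfer coefficients, and emission values below are hypothetical, and the linearity assumption holds only approximately for PM2.5 in a full chemistry-transport run like WRF-Chem.

```python
# Toy zero-out source-sensitivity calculation. T[receptor][source] is a
# hypothetical transfer coefficient (ug/m3 per unit emission) and E the
# hypothetical emission of each city; concentrations are assumed linear
# in emissions, which a real WRF-Chem simulation only approximates.

T = {"Guangzhou": {"Guangzhou": 0.60, "Foshan": 0.25, "Shenzhen": 0.05},
     "Shenzhen":  {"Guangzhou": 0.10, "Foshan": 0.08, "Shenzhen": 0.55}}
E = {"Guangzhou": 40.0, "Foshan": 30.0, "Shenzhen": 35.0}

def concentration(receptor, emissions):
    """PM2.5 at a receptor as a linear combination of source emissions."""
    return sum(T[receptor][s] * e for s, e in emissions.items())

def contribution(receptor, source):
    """Base-case concentration minus the zero-out run for one source."""
    base = concentration(receptor, E)
    zeroed = {s: (0.0 if s == source else e) for s, e in E.items()}
    return base - concentration(receptor, zeroed)

local = contribution("Shenzhen", "Shenzhen")      # local-emission share
transported = contribution("Shenzhen", "Guangzhou")  # external transport
```

Comparing `local` against the transported contributions at each receptor reproduces, in miniature, the local-versus-external classification the abstract reports per city.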

  11. A study on the uncertainty based on Meteorological fields on Source-receptor Relationships for Total Nitrate in the Northeast Asia

    Science.gov (United States)

    Sunwoo, Y.; Park, J.; Kim, S.; Ma, Y.; Chang, I.

    2010-12-01

    Northeast Asia hosts more than one third of the world's population, and its pollutant emissions tend to increase rapidly because of economic growth and rising energy-intensive consumption. The emission and transport characteristics of air pollutants, in particular, have become a national issue, in terms not only of local environmental impacts but also of long-range transboundary transport. Meteorologically, the region lies in the westerlies, so air pollutants emitted from China can be delivered to South Korea; considering meteorological factors is therefore important to understanding air pollution phenomena. In this study, we used MM5 (Fifth-Generation Mesoscale Model) and WRF (Weather Research and Forecasting Model) to produce the meteorological fields. We analyzed the features of the physics options in each model and the differences due to the characteristics of WRF and MM5, in order to analyze the uncertainty of source-receptor relationships for total nitrate in Northeast Asia arising from the meteorological fields. The two sets of meteorological fields were produced with the same domain, the same initial and boundary conditions, and the most similar physics options available. S-R relationships in terms of amount and fraction for total nitrate (sum of N from HNO3, nitrate and PAN) were calculated by EMEP method 3.

  12. Selection of earthquake resistant design criteria for nuclear power plants: Methodology and technical cases: Dislocation models of near-source earthquake ground motion: A review

    International Nuclear Information System (INIS)

    Luco, J.E.

    1987-05-01

    The solutions available for a number of dynamic dislocation fault models are examined in an attempt at establishing some of the expected characteristics of earthquake ground motion in the near-source region. In particular, solutions for two-dimensional anti-plane shear and plane-strain models as well as for three-dimensional fault models in full space, uniform half-space and layered half-space media are reviewed

  13. A methodological approach to a realistic evaluation of skin absorbed doses during manipulation of radioactive sources by means of GAMOS Monte Carlo simulations

    Science.gov (United States)

    Italiano, Antonio; Amato, Ernesto; Auditore, Lucrezia; Baldari, Sergio

    2018-05-01

    The accurate evaluation of the radiation burden associated with absorbed doses to the skin of the extremities during the manipulation of radioactive sources is a critical issue in operational radiological protection, deserving the most accurate calculation approaches available. Monte Carlo simulation of radiation transport and interaction is the gold standard for the calculation of dose distributions in complex geometries and in the presence of extended spectra of multi-radiation sources. We propose the use of Monte Carlo simulations in GAMOS in order to accurately estimate the dose to the extremities during manipulation of radioactive sources. We report the results of these simulations for the 90Y, 131I, 18F and 111In nuclides in water solutions enclosed in glass or plastic receptacles, such as vials or syringes. Skin equivalent doses at 70 μm depth and depth-dose profiles are reported for different configurations, highlighting the importance of adopting a realistic geometrical configuration in order to obtain accurate dosimetric estimates. Owing to the ease of implementation of GAMOS simulations, case-specific geometries and nuclides can be adopted, and results can be obtained in less than about ten minutes of computation time on a common workstation.

  14. Methodological guidelines

    International Nuclear Information System (INIS)

    Halsnaes, K.; Callaway, J.M.; Meyer, H.J.

    1999-01-01

    The guideline document establishes a general overview of the main components of climate change mitigation assessment. This includes an outline of key economic concepts, scenario structure, common assumptions, modelling tools and country study assumptions. The guidelines are supported by Handbook Reports that contain more detailed specifications of calculation standards, input assumptions and available tools. The major objective of the project has been to provide a methodology, an implementing framework and a reporting system which countries can follow in meeting their future reporting obligations under the FCCC and for GEF enabling activities. The project builds upon the methodology development and application in the UNEP National Abatement Costing Studies (UNEP, 1994a). The various elements provide countries with a road map for conducting climate change mitigation studies and submitting national reports as required by the FCCC. (au) 121 refs

  15. Methodological guidelines

    Energy Technology Data Exchange (ETDEWEB)

    Halsnaes, K.; Callaway, J.M.; Meyer, H.J.

    1999-04-01

    The guideline document establishes a general overview of the main components of climate change mitigation assessment. This includes an outline of key economic concepts, scenario structure, common assumptions, modelling tools and country study assumptions. The guidelines are supported by Handbook Reports that contain more detailed specifications of calculation standards, input assumptions and available tools. The major objective of the project has been to provide a methodology, an implementing framework and a reporting system which countries can follow in meeting their future reporting obligations under the FCCC and for GEF enabling activities. The project builds upon the methodology development and application in the UNEP National Abatement Costing Studies (UNEP, 1994a). The various elements provide countries with a road map for conducting climate change mitigation studies and submitting national reports as required by the FCCC. (au) 121 refs.

  16. A novel natural source Vicia faba L. membranes as colourant: development and optimisation of the extraction process using response surface methodology (RSM).

    Science.gov (United States)

    Bouatay, Feriel; Baaka, Noureddine; Shahid, Adeel; Mhenni, Mohamed Farouk

    2018-02-02

    In this research paper, an eco-friendly process for extracting dyes from Vicia faba L. membranes was developed. The influence of independent process factors, namely the weight of material, extraction time, temperature and sodium hydroxide concentration, on natural dye extraction from Vicia faba membranes was investigated. The optimisation of the extraction conditions and the evaluation of the effects of the different operating parameters were carried out using a Box-Behnken design under response surface methodology. The optimum conditions were found to be 66 °C, 90 min, 5 g and 0.1628 mol·L-1 for extraction temperature, time, mass of material and sodium hydroxide concentration, respectively. The efficiency of the extraction process under these optimum conditions was evaluated by measuring the total phenolic content (TPC), the total flavonoid content and the relative colour yield (K/S). Under these operating conditions, good fastness ratings were observed for the dyed fabrics.
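The response-surface idea behind a Box-Behnken optimisation can be sketched in a reduced one-factor form: fit a quadratic through design points and locate the stationary point analytically. The temperatures and colour-yield responses below are made up for illustration, not the paper's data, and a real Box-Behnken design fits a multi-factor quadratic rather than this single-factor curve.

```python
# One-factor sketch of the RSM step: fit y = a + b*x + c*x^2 exactly
# through three design points (Newton divided differences) and take the
# stationary point x* = -b / (2c). Data are synthetic.

def quadratic_through(p1, p2, p3):
    """Coefficients (a, b, c) of the quadratic through three points."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    d12 = (y2 - y1) / (x2 - x1)
    d23 = (y3 - y2) / (x3 - x2)
    c = (d23 - d12) / (x3 - x1)     # second divided difference
    b = d12 - c * (x1 + x2)
    a = y1 - b * x1 - c * x1 ** 2
    return a, b, c

# Hypothetical colour yield (K/S) at three extraction temperatures:
a, b, c = quadratic_through((50.0, 7.44), (66.0, 10.0), (80.0, 8.04))
t_opt = -b / (2.0 * c)   # stationary point; a maximum since c < 0
```

The negative curvature (c < 0) confirms the stationary point is a maximum, mirroring how the study's optimum of 66 °C sits inside the design range rather than at its edge.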

  17. Insulin receptors

    International Nuclear Information System (INIS)

    Kahn, C.R.; Harrison, L.C.

    1988-01-01

    This book contains the proceedings on insulin receptors. Part A: Methods for the study of structure and function. Topics covered include: Method for purification and labeling of insulin receptors, the insulin receptor kinase, and insulin receptors on special tissues

  18. The theoretical basis and clinical methodology for stereotactic interstitial brain tumor irradiation using iododeoxyuridine as a radiation sensitizer and samarium-145 as a brachytherapy source

    International Nuclear Information System (INIS)

    Goodman, J.H.; Gahbauer, R.A.; Kanellitsas, C.; Clendenon, N.R.; Laster, B.H.; Fairchild, R.G.

    1989-01-01

    High grade astrocytomas have proven resistant to all conventional therapy. A technique to produce radiation enhancement during interstitial brain tumor irradiation by using a radiation sensitizer (IdUrd) and by stimulation of Auger electron cascades through absorption of low energy photons in iodine (Photon activation) is described. Clinical studies using IdUrd, 192 Ir as a brachytherapy source, and external radiation have produced promising results. Substituting samarium-145 for 192 Ir in this protocol is expected to produce enhanced results. 15 refs

  19. Development of methodology for the synthesis of poly(lactic acid-co-glycolic acid) for use in the production of radioactive sources

    International Nuclear Information System (INIS)

    Peleias Junior, Fernando dos Santos

    2013-01-01

    According to the World Health Organization (WHO), cancer is a leading cause of death worldwide, and prostate cancer is the second most common cancer in men. A method of radiotherapy which has been extensively used is brachytherapy, where radioactive seeds are placed inside the area requiring treatment. Iodine-125 seeds can be placed loose or stranded in bioabsorbable polymers. Stranded seeds show some advantages, since they reduce the rate of seed migration, an event that could affect the dosimetry of the prostate and cause unnecessary damage to healthy tissues or organs. For Iodine-125 stranded seeds, polyglactin 910 (poly(lactic-co-glycolic acid), PLGA) with a coverage of polyglactin 370 (Vicryl®) is used. This dissertation proposes the study and development of a synthesis methodology for PLGA via ring-opening polymerization, as well as its characterization, with the objective of using the synthesized material to manufacture a material similar to RAPID Strand®. The results obtained show that it was possible to determine the optimal reaction parameters (time and temperature) for PLGA in an 80/20 (lactide/glycolide) ratio. Using a temperature of 110 °C and a reaction time of 24 h, a yield of 86% was obtained; increasing the reaction time to 72 hours, the yield was higher than 90%. The molecular mass values obtained from the samples are still very low compared to those obtained by other authors in the literature (about 20%). Failures in the sealing of vials, leaving them vulnerable to moisture and oxygen, or the lack of an efficient stirring system might explain these results; a suitable chemical reactor could solve the problem. Regarding polymer characterization, all techniques used not only confirmed the expected structure of the polymer, but also showed the higher proportion of lactide units compared to glycolide units. (author)

  20. Towards an optimal adaptation of exposure to NOAA assessment methodology in Multi-Source Industrial Scenarios (MSIS): the challenges and the decision-making process

    Science.gov (United States)

    López de Ipiña, JM; Vaquero, C.; Gutierrez-Cañas, C.

    2017-06-01

    A progressive increase is expected in industrial processes that manufacture intermediate products (iNEPs) and end products (eNEPs) incorporating ENMs to bring about improved properties. The assessment of occupational exposure to airborne NOAA will therefore migrate from the simple and well-controlled exposure scenarios of research laboratories and ENMs production plants using innovative production technologies to much more complex exposure scenarios located around eNEP manufacturing processes that, in many cases, will be modified conventional production processes. We discuss some of the typical challenging situations in assessing the risk of inhalation exposure to NOAA in Multi-Source Industrial Scenarios (MSIS), on the basis of the lessons learned when confronting those scenarios in the frame of several European and Spanish research projects.

  1. MIRD methodology

    International Nuclear Information System (INIS)

    Rojo, Ana M.; Gomez Parada, Ines

    2004-01-01

    The MIRD (Medical Internal Radiation Dose) system was established by the Society of Nuclear Medicine of the USA in 1960 to assist the medical community in estimating the dose to organs and tissues due to the incorporation of radioactive materials. Since then, the 'MIRD Dose Estimate Reports' (numbers 1 to 12) and 'Pamphlets', of great utility for dose calculations, have been published. The MIRD system was designed essentially for the calculation of doses received by patients during nuclear medicine diagnostic procedures. The MIRD methodology for absorbed dose calculations in different tissues is explained.
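The core MIRD relation is that the mean absorbed dose to a target region equals the cumulated activity in a source region times the S value, D(target ← source) = Ã × S. The sketch below works one instance of this arithmetic; the administered activity, effective half-life, and S value are illustrative numbers, not tabulated MIRD data.

```python
# Worked instance of the MIRD equation D = A-tilde * S. For complete
# physical/biological decay, the cumulated activity is
# A-tilde = A0 / lambda_eff with lambda_eff = ln(2) / T_eff.
import math

def cumulated_activity(A0_MBq, t_half_eff_h):
    """Cumulated activity in MBq*h assuming complete decay in the organ."""
    lam = math.log(2) / t_half_eff_h
    return A0_MBq / lam

def absorbed_dose(A0_MBq, t_half_eff_h, S_mGy_per_MBq_h):
    """Mean absorbed dose (mGy) = cumulated activity * S value."""
    return cumulated_activity(A0_MBq, t_half_eff_h) * S_mGy_per_MBq_h

# Example: 100 MBq administered, 6 h effective half-life, and a
# hypothetical S value of 1e-3 mGy per MBq*h:
dose = absorbed_dose(100.0, 6.0, 1e-3)
```

In practice the S values come from the MIRD pamphlet tabulations for specific source-target organ pairs and radionuclides; only the bookkeeping is shown here.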

  2. PSA methodology

    Energy Technology Data Exchange (ETDEWEB)

    Magne, L

    1997-12-31

    The purpose of this text is first to ask a certain number of questions on the methods related to PSAs. Notably, we explore the positioning of the French methodological approach - as applied in the EPS 1300{sup 1} and EPS 900{sup 2} PSAs - compared to other approaches (Part One). This leads to a more general question: what contents, for what PSA? This is why, in Part Two, we try to offer a framework for defining the criteria a PSA should satisfy to meet clearly identified needs. Finally, Part Three quickly summarizes the questions approached in the first two parts, as an introduction to the debate. 15 refs.

  3. PSA methodology

    International Nuclear Information System (INIS)

    Magne, L.

    1996-01-01

    The purpose of this text is first to ask a certain number of questions on the methods related to PSAs. Notably, we explore the positioning of the French methodological approach - as applied in the EPS 1300 1 and EPS 900 2 PSAs - compared to other approaches (Part One). This leads to a more general question: what contents, for what PSA? This is why, in Part Two, we try to offer a framework for defining the criteria a PSA should satisfy to meet clearly identified needs. Finally, Part Three quickly summarizes the questions approached in the first two parts, as an introduction to the debate. 15 refs

  4. Methodology of environmental risk assessment management

    Directory of Open Access Journals (Sweden)

    Saša T. Bakrač

    2012-04-01

    Full Text Available Successful protection of the environment is mostly based on high-quality assessment of potential and present risks. Environmental risk management is a complex process which includes identification, assessment and control of risk, namely taking measures to minimize the risk to an acceptable level. Environmental risk management methodology: in addition to these phases, appropriate measures that reduce the occurrence of risk should be implemented: normative and legal regulations (laws and regulations), appropriate organizational structures in society, and establishing quality monitoring of the environment. The emphasis is placed on the application of assessment methodologies (the three-model concept), as the most important aspect of successful management of environmental risk. Risk assessment methodology - European concept: the first concept of ecological risk assessment methodology is based on the so-called European model-concept. In order to better understand this methodology, two concepts - hazard and risk - are introduced. The European concept of environmental risk assessment has the following phases in its implementation: identification of hazard (danger), identification of consequences (if there is a hazard), estimation of the scale of consequences, estimation of consequence probability, and risk assessment (also called risk characterization). The European concept is often used to assess risk in the environment as a model for addressing the distribution of stressors along the source - path - receptor line. Risk assessment methodology - Canadian concept: the second concept of the methodology of environmental risk assessment is based on the so-called Canadian model-concept. The assessment of ecological risk includes risk arising from natural events (floods, extreme weather conditions, etc.), technological processes and products, and agents (chemical, biological, radiological, etc

  5. Renewable Energy Monitoring Protocol. Update 2010. Methodology for the calculation and recording of the amounts of energy produced from renewable sources in the Netherlands

    Energy Technology Data Exchange (ETDEWEB)

    Te Buck, S.; Van Keulen, B.; Bosselaar, L.; Gerlagh, T.; Skelton, T.

    2010-07-15

    This is the fifth, updated edition of the Dutch Renewable Energy Monitoring Protocol. The protocol, compiled on behalf of the Ministry of Economic Affairs, can be considered as a policy document that provides a uniform calculation method for determining the amount of energy produced in the Netherlands in a renewable manner. Because all governments and organisations use the calculation methods described in this protocol, this makes it possible to monitor developments in this field well and consistently. The introduction of this protocol outlines the history and describes its set-up, validity and relationship with other similar documents and agreements. The Dutch Renewable Energy Monitoring Protocol is compiled by NL Agency, and all relevant parties were given the chance to provide input. This has been incorporated as far as is possible. Statistics Netherlands (CBS) uses this protocol to calculate the amount of renewable energy produced in the Netherlands. These data are then used by the Ministry of Economic Affairs to gauge the realisation of policy objectives. In June 2009 the European Directive for energy from renewable sources was published with renewable energy targets for the Netherlands. This directive used a different calculation method - the gross energy end-use method - whilst the Dutch definition is based on the so-called substitution method. NL Agency was asked to add the calculation according to the gross end use method, although this is not clearly defined on a number of points. In describing the method, the unanswered questions become clear, as do, for example, the points the Netherlands should bring up in international discussions.

  6. Testing methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Bender, M.A.

    1990-01-01

    Several methodologies are available for screening human populations for exposure to ionizing radiation. Of these, aberration frequency determined in peripheral blood lymphocytes is the best developed. Individual exposures to large doses can easily be quantitated, and population exposures to occupational levels can be detected. However, determination of exposures to the very low doses anticipated from a low-level radioactive waste disposal site is more problematical. Aberrations occur spontaneously, without known cause. Exposure to radiation induces no new or novel types, but only increases their frequency. The limitations of chromosomal aberration dosimetry for detecting low-level radiation exposures lie mainly in the statistical ''signal to noise'' problem, the distribution of aberrations among cells and among individuals, and the possible induction of aberrations by other environmental, occupational or medical exposures. However, certain features of the human peripheral lymphocyte-chromosomal aberration system make it useful in screening for certain types of exposures. Future technical developments may make chromosomal aberration dosimetry more useful for low-level radiation exposures. Other methods, measuring gene mutations or even minute changes at the DNA level, while presently less well developed techniques, may eventually become even more practical and sensitive assays for human radiation exposure. 15 refs.

  7. Testing methodologies

    International Nuclear Information System (INIS)

    Bender, M.A.

    1990-01-01

    Several methodologies are available for screening human populations for exposure to ionizing radiation. Of these, aberration frequency determined in peripheral blood lymphocytes is the best developed. Individual exposures to large doses can easily be quantitated, and population exposures to occupational levels can be detected. However, determination of exposures to the very low doses anticipated from a low-level radioactive waste disposal site is more problematical. Aberrations occur spontaneously, without known cause. Exposure to radiation induces no new or novel types, but only increases their frequency. The limitations of chromosomal aberration dosimetry for detecting low-level radiation exposures lie mainly in the statistical ''signal to noise'' problem, the distribution of aberrations among cells and among individuals, and the possible induction of aberrations by other environmental, occupational or medical exposures. However, certain features of the human peripheral lymphocyte-chromosomal aberration system make it useful in screening for certain types of exposures. Future technical developments may make chromosomal aberration dosimetry more useful for low-level radiation exposures. Other methods, measuring gene mutations or even minute changes at the DNA level, while presently less well developed techniques, may eventually become even more practical and sensitive assays for human radiation exposure. 15 refs

  8. Differentiated human midbrain-derived neural progenitor cells express excitatory strychnine-sensitive glycine receptors containing α2β subunits.

    Directory of Open Access Journals (Sweden)

    Florian Wegner

    Full Text Available BACKGROUND: Human fetal midbrain-derived neural progenitor cells (NPCs) may deliver a tissue source for drug screening and regenerative cell therapy to treat Parkinson's disease. While glutamate and GABA(A) receptors play an important role in neurogenesis, the involvement of glycine receptors during human neurogenesis and dopaminergic differentiation, as well as their molecular and functional characteristics in NPCs, are largely unknown. METHODOLOGY/PRINCIPAL FINDINGS: Here we investigated NPCs with respect to their glycine receptor function and subunit expression using electrophysiology, calcium imaging, immunocytochemistry, and quantitative real-time PCR. Whole-cell recordings demonstrate the ability of NPCs to express functional strychnine-sensitive glycine receptors after differentiation for 3 weeks in vitro. Pharmacological and molecular analyses indicate a predominance of glycine receptor heteromers containing α2β subunits. Intracellular calcium measurements of differentiated NPCs suggest that glycine evokes depolarisations mediated by strychnine-sensitive glycine receptors and not by D-serine-sensitive excitatory glycine receptors. Culturing NPCs with additional glycine, the glycine-receptor antagonist strychnine, or the Na(+)-K(+)-Cl(-) co-transporter 1 (NKCC1) inhibitor bumetanide did not significantly influence cell proliferation and differentiation in vitro. CONCLUSIONS/SIGNIFICANCE: These data indicate that NPCs derived from human fetal midbrain tissue acquire essential glycine receptor properties during neuronal maturation. However, glycine receptors seem to have a limited functional impact on neurogenesis and dopaminergic differentiation of NPCs in vitro.

  9. Lipophorin Receptor: The Insect Lipoprotein Receptor

    Indian Academy of Sciences (India)

    IAS Admin

    Director of ... function of the Lp is to deliver lipids throughout the insect body for metabolism ... Lipid is used as a major energy source for development as well as other metabolic .... LpR4 receptor variant was expressed exclusively in the brain and.

  10. Somatostatin receptors

    DEFF Research Database (Denmark)

    Møller, Lars Neisig; Stidsen, Carsten Enggaard; Hartmann, Bolette

    2003-01-01

    functional units, receptors co-operate. The total receptor apparatus of individual cell types is composed of different-ligand receptors (e.g. SRIF and non-SRIF receptors) and co-expressed receptor subtypes (e.g. sst(2) and sst(5) receptors) in characteristic proportions. In other words, levels of individual......-peptides, receptor agonists and antagonists. Relatively long half lives, as compared to those of the endogenous ligands, have been paramount from the outset. Motivated by theoretical puzzles or the shortcomings of present-day diagnostics and therapy, investigators have also aimed to produce subtype...

  11. GENESIS OF METHODOLOGY OF MANAGEMENT BY DEVELOPMENT OF ORGANIZATIONS

    Directory of Open Access Journals (Sweden)

    Z.N. Varlamova

    2007-06-01

    Full Text Available This article investigates the genesis of the methodology of managing organizational development, understood as the set of methodological approaches and methods in use. The results of a comparative analysis of methodological approaches to the management of organizational development are presented. The traditional methodological approaches are complemented by strategic experimentation and case-study methodology. Approaches to forming a new methodology and technique for investigating the sources of an organization's competitive advantages are also considered.

  12. Methodology for the case studies

    NARCIS (Netherlands)

    Smits, M.J.W.; Woltjer, G.B.

    2017-01-01

    This document is about the methodology and selection of the case studies. It is meant as a guideline for the case studies, and together with the other reports in this work package can be a source of information for policy officers, interest groups and researchers evaluating or performing impact

  13. Gut-Sourced Vasoactive Intestinal Polypeptide Induced by the Activation of α7 Nicotinic Acetylcholine Receptor Substantially Contributes to the Anti-inflammatory Effect of Sinomenine in Collagen-Induced Arthritis

    Directory of Open Access Journals (Sweden)

    MengFan Yue

    2018-06-01

    Full Text Available Sinomenine has long been used for the treatment of rheumatoid arthritis in China. However, its anti-inflammatory mechanism is still debatable, because the in vitro minimal effective concentration (≥250 μM) is hardly reached in either synovium or serum after oral administration at a therapeutic dose. Recent findings suggest that the α7 nicotinic acetylcholine receptor (α7nAChR) might mediate the inhibitory effect of sinomenine on macrophage activation, which prompted us to explore the anti-arthritis mechanism of sinomenine by taking the neuroendocrine-inflammation axis into consideration. Here, we showed that orally administered sinomenine ameliorated the systemic inflammation of collagen-induced arthritis (CIA) rats, an effect that was significantly diminished by either vagotomy or antagonists of nicotinic acetylcholine receptors (especially the α7nAChR antagonist), but not by muscarinic receptor antagonists. Sinomenine might bind to α7nAChR through interacting with the residues Tyr184 and Tyr191 in the pocket. In addition, the generation of vasoactive intestinal polypeptide (VIP) from the gut of CIA rats and cultured neuron-like cells was selectively enhanced by sinomenine through activation of the α7nAChR-PI3K/Akt/mTOR pathway. The elevated levels of VIP in the serum and small intestine of rats were negatively correlated with the scores of joint destruction. The crucial role of VIP in the anti-arthritic effect of sinomenine was confirmed by using VIP hybrid, a non-specific antagonist of the VIP receptor. Taken together, intestine-sourced VIP mediates the anti-arthritic effect of sinomenine and is generated by the activation of α7nAChR.

  14. Neurotransmitter receptor imaging

    International Nuclear Information System (INIS)

    Cordes, M.; Hierholzer, J.; Nikolai-Beyer, K.

    1993-01-01

    The importance of neuroreceptor imaging in vivo using single photon emission tomography (SPECT) and positron emission tomography (PET) has increased enormously. The principal neurotransmitters, such as dopamine, GABA/benzodiazepine, acetylcholine, and serotonin, are presented with reference to anatomical, biochemical, and physiological features. The main radioligands for SPECT and PET are introduced, and the methodological characteristics of both PET and SPECT are presented. Finally, the results of neurotransmitter receptor imaging obtained so far are discussed. (orig.) [de

  15. Clinical trial methodology

    National Research Council Canada - National Science Library

    Peace, Karl E; Chen, Ding-Geng

    2011-01-01

    ... in the pharmaceutical industry, Clinical trial methodology emphasizes the importance of statistical thinking in clinical research and presents the methodology as a key component of clinical research...

  16. Sources of trace elements observed in the Arctic aerosol

    International Nuclear Information System (INIS)

    Hopke, P.K.; Cheng, M.D.; Landsberger, S.; Barrie, L.A.

    1991-01-01

    There have been many efforts to identify the sources of the airborne particles seen in the Arctic. In this study, the Potential Source Contribution Function (PSCF), a probability function based on air parcel trajectory data coupled with the contaminant concentrations measured in that air parcel, has been calculated for a series of week-long airborne particle samples collected at Alert, N.W.T. between 1983 and 1987. These samples have been analyzed by instrumental neutron activation. Using calculated three-level back trajectories and the extended Total Potential Source Contribution methodology, the patterns of total potential source contribution probabilities can be examined for each individual species, or, based on the results of a principal components analysis of the elemental data, for covarying species. Regions with high PSCF values have a higher probability of contributing pollutants to the measured concentrations at the receptor site. The implications of these results for more specific identification of the source regions of the various species are discussed
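
    The PSCF calculation described in this record can be sketched in a few lines: PSCF(i,j) = m(i,j)/n(i,j), where n counts all back-trajectory endpoints falling in grid cell (i,j) and m counts only endpoints belonging to samples whose concentration exceeds a chosen criterion. This is a generic illustration, not the Alert analysis; the grid size, endpoints and threshold are invented.

```python
# Minimal PSCF sketch with invented data (not the Alert measurements).
from collections import defaultdict

def pscf(trajectories, concentrations, threshold, cell=2.5):
    """trajectories: one list of (lat, lon) endpoints per sample;
    concentrations: one measured value per sample; cell: grid size in deg."""
    n = defaultdict(int)  # all endpoint counts per grid cell
    m = defaultdict(int)  # endpoint counts from "polluted" samples only
    for endpoints, conc in zip(trajectories, concentrations):
        for lat, lon in endpoints:
            key = (int(lat // cell), int(lon // cell))
            n[key] += 1
            if conc > threshold:
                m[key] += 1
    return {key: m[key] / n[key] for key in n}

# Two samples: a "clean" one (0.4) and a "polluted" one (2.1). Both pass
# over cell (20, -30); only the polluted one passes over cell (22, -28).
trajs = [[(51.0, -74.0), (52.0, -71.0)],
         [(51.5, -73.5), (56.0, -69.0)]]
concs = [0.4, 2.1]
result = pscf(trajs, concs, threshold=1.0)
```

    Cells crossed only by polluted trajectories get PSCF = 1, mixed cells get intermediate probabilities, and clean-only cells get 0.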

  17. RHIC Data Correlation Methodology

    International Nuclear Information System (INIS)

    Michnoff, R.; D'Ottavio, T.; Hoff, L.; MacKay, W.; Satogata, T.

    1999-01-01

    A requirement for RHIC data plotting software and physics analysis is the correlation of data from all accelerator data gathering systems. Data correlation provides the capability for a user to request a plot of multiple data channels vs. time, and to make meaningful time-correlated data comparisons. The task of data correlation for RHIC requires careful consideration because data acquisition triggers are generated from various asynchronous sources including events from the RHIC Event Link, events from the two Beam Sync Links, and other unrelated clocks. In order to correlate data from asynchronous acquisition systems a common time reference is required. The RHIC data correlation methodology will allow all RHIC data to be converted to a common wall clock time, while still preserving native acquisition trigger information. A data correlation task force team, composed of the authors of this paper, has been formed to develop data correlation design details and provide guidelines for software developers. The overall data correlation methodology will be presented in this paper
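
    The common-wall-clock idea can be illustrated with a toy resampling step (this is not the actual RHIC software; clock rates, epochs and values are invented): each channel's native trigger ticks are converted to wall-clock seconds, and one channel is linearly interpolated onto the other's timestamps for point-by-point, time-correlated comparison.

```python
# Toy time-correlation of two asynchronously triggered channels.

def to_wall_clock(ticks, tick_period_s, epoch_s):
    """Convert native trigger tick counts to wall-clock seconds."""
    return [epoch_s + t * tick_period_s for t in ticks]

def interpolate(times, values, t):
    """Linear interpolation of (times, values) at time t (times sorted)."""
    if t <= times[0]:
        return values[0]
    if t >= times[-1]:
        return values[-1]
    for i in range(1, len(times)):
        if t <= times[i]:
            f = (t - times[i - 1]) / (times[i] - times[i - 1])
            return values[i - 1] + f * (values[i] - values[i - 1])

# Channel A on a 10 Hz clock; channel B on an unrelated 7 Hz clock.
t_a = to_wall_clock(range(5), 0.1, epoch_s=1000.0)
b_ticks = range(4)
t_b = to_wall_clock(b_ticks, 1 / 7, epoch_s=1000.0)
v_b = [2.0 * t for t in b_ticks]              # a ramp on B's own clock
# B resampled at A's wall-clock times for correlated plotting:
b_on_a = [interpolate(t_b, v_b, t) for t in t_a]
```

    Native trigger information can be carried alongside the converted times, as the record notes, so nothing is lost by the conversion.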

  18. Relative Hazard Calculation Methodology

    International Nuclear Information System (INIS)

    DL Strenge; MK White; RD Stenner; WB Andrews

    1999-01-01

    The methodology presented in this document was developed to provide a means of calculating the RH ratios to use in developing useful graphic illustrations. The RH equation, as presented in this methodology, is primarily a collection of key factors relevant to understanding the hazards and risks associated with projected risk management activities. The RH equation has the potential for much broader application than generating risk profiles. For example, it can be used to compare one risk management activity with another, instead of just comparing it to a fixed baseline as was done for the risk profiles. If the appropriate source term data are available, it could be used in its non-ratio form to estimate absolute values of the associated hazards. These estimated values of hazard could then be examined to help understand which risk management activities are addressing the higher hazard conditions at a site. Graphics could be generated from these absolute hazard values to compare high-hazard conditions. If the RH equation is used in this manner, care must be taken to specifically define and qualify the estimated absolute hazard values (e.g., identify which factors were considered and which ones tended to drive the hazard estimation)
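
    The abstract does not give the RH equation itself, so the ratio-versus-baseline idea can only be sketched with invented factors: below, a hazard is modeled as a product of hypothetical terms (inventory, dispersibility, exposure potential), and the RH ratio compares a risk-management activity against a fixed baseline.

```python
# Hypothetical RH-ratio illustration; the real RH equation's factors
# are not given in this record, so these three are invented.

def hazard(inventory_ci, dispersibility, exposure_potential):
    """A stand-in hazard measure: product of illustrative factors."""
    return inventory_ci * dispersibility * exposure_potential

baseline = hazard(1000.0, 0.5, 0.2)          # current site condition
after_activity = hazard(1000.0, 0.05, 0.2)   # e.g. after stabilization
rh_ratio = after_activity / baseline         # < 1 means hazard reduced
```

    In its non-ratio form (the absolute `hazard` values), the same calculation supports the comparison of high-hazard conditions that the record describes.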

  19. (3H)leukotriene B4 binding to the guinea pig spleen membranes: a rich tissue source for a high affinity leukotriene B4 receptor site

    International Nuclear Information System (INIS)

    Cheng, J.B.; Kohi, F.; Townley, R.G.

    1986-01-01

    To select a tissue rich in the high-affinity leukotriene (LT)B4 receptor site, they compared binding of 1 nM (3H)LTB4 (180 Ci/mmol) to crude membrane preparations of guinea pig spleen, thymus, lung, uterus, bladder, brain, adrenal gland, small intestine, liver, kidney and heart. They found that the membrane preparations from spleen contained the highest binding activity per mg protein. They characterized the LTB4 binding to the spleen preparation in detail. LTB4 binding was rapid, reversible, stereoselective and saturable. The data from equilibrium experiments showed a linear Scatchard plot with a Kd of 1.6 nM and a binding site density of 259 fmol/mg protein. The rank order of agents competing for spleen (3H)LTB4 binding at 25 °C was: LTB4 (Ki = 2.8 nM) > 20-OH-LTB4 (23 nM) > LTA4 (48 nM) > LTA4 methyl ester (0.13 μM) > 20-COOH-LTB4 (> 6.6 μM) ≥ arachidonic acid (0.15 mM), similarly FPL-55,712 (0.11 mM). At 4 °C, LTB4 (2.3 nM) competed at least 10x more effectively than 20-OH-LTB4 (29 nM) and 20-COOH-LTB4 (> 6.6 μM). HPLC analysis indicated that incubation of 84 ng LTB4 with the spleen membrane at 25 °C did not result in the formation of 20-OH-LTB4 (3H)LTB4 receptor binding sites
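
    The Scatchard analysis behind the reported constants can be sketched as follows: for one-site binding, B/F = (Bmax - B)/Kd, so plotting B/F against B gives a line with slope -1/Kd and x-intercept Bmax. The saturation data below are synthetic, generated from the quoted constants (Kd = 1.6 nM, Bmax = 259 fmol/mg protein), not the original measurements.

```python
# Scatchard analysis on synthetic one-site binding data.
KD, BMAX = 1.6, 259.0  # nM, fmol/mg protein (values from the record)

def bound(free_nm):
    """Specific binding from the one-site law B = Bmax*F/(Kd + F)."""
    return BMAX * free_nm / (KD + free_nm)

free = [0.2, 0.5, 1.0, 2.0, 4.0, 8.0]          # free ligand, nM
b = [bound(f) for f in free]                    # bound, fmol/mg
bf = [bi / fi for bi, fi in zip(b, free)]       # Scatchard y-axis: B/F

# Least-squares line through (B, B/F): slope = -1/Kd, intercept = Bmax/Kd.
n = len(b)
mx, my = sum(b) / n, sum(bf) / n
slope = (sum((x - mx) * (y - my) for x, y in zip(b, bf))
         / sum((x - mx) ** 2 for x in b))
kd_est = -1.0 / slope
bmax_est = (my - slope * mx) * kd_est           # intercept * Kd = Bmax
```

    Because the synthetic points lie exactly on the one-site curve, the fit recovers Kd and Bmax; real data would scatter around the line.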

  20. Receptor assay

    Energy Technology Data Exchange (ETDEWEB)

    Kato, K; Ibayashi, H [Kyushu Univ., Fukuoka (Japan). Faculty of Medicine

    1975-05-01

    This paper summarizes the present status and problems of hormone receptor analysis, together with a few considerations on the clinical significance of receptor abnormalities. It is pointed out that, in the clinical field, quantitative and qualitative analysis of receptors will not remain confined to etiological discussion; rather, it is an epoch-making field of investigation that holds the possibility of artificially changing the sensitivity of the living body to drugs, and of developments connected directly with the treatment of various diseases.

  1. Methodology for building confidence measures

    Science.gov (United States)

    Bramson, Aaron L.

    2004-04-01

    This paper presents a generalized methodology for propagating known or estimated levels of individual source document truth reliability to determine the confidence level of a combined output. Initial document certainty levels are augmented by (i) combining the reliability measures of multiple sources, (ii) incorporating the truth reinforcement of related elements, and (iii) incorporating the importance of the individual elements for determining the probability of truth for the whole. The result is a measure of confidence in system output based on establishing links among the truth values of inputs. This methodology was developed for application to a multi-component situation awareness tool under development at the Air Force Research Laboratory in Rome, New York. Determining how improvements in data quality and the variety of documents collected affect the probability of a correct situational detection helps optimize the performance of the tool overall.
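
    The record does not state its combination rules, so the sketch below substitutes one common choice, the noisy-OR, for step (i) and a hypothetical importance weighting for step (iii); both the fusion rule and all numbers are assumptions, not the paper's method.

```python
# Noisy-OR fusion of source reliabilities (an assumed rule, for
# illustration only) plus importance-weighted aggregation.
from math import prod

def combine_sources(reliabilities):
    """Independent sources asserting the same element: 1 - prod(1 - p_i)."""
    return 1.0 - prod(1.0 - p for p in reliabilities)

def output_confidence(element_conf, importance):
    """Importance-weighted average of per-element confidences."""
    total = sum(importance)
    return sum(c * w for c, w in zip(element_conf, importance)) / total

# Element 1 corroborated by two documents (0.6, 0.7 reliability);
# element 2 rests on a single 0.5-reliability source.
c1 = combine_sources([0.6, 0.7])
c2 = combine_sources([0.5])
overall = output_confidence([c1, c2], importance=[3, 1])
```

    Corroboration raises the first element's confidence to 0.88, and the weighting lets the more important element dominate the system-level figure.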

  2. Country report: a methodology

    International Nuclear Information System (INIS)

    Colin, A.

    2013-01-01

    This paper describes a methodology that could be used to establish a country report. In the framework of nuclear non-proliferation appraisal and IAEA safeguards implementation, it is important to be able to assess the potential existence of undeclared nuclear materials and activities, as well as undeclared facilities, in the country under review. In our view, a country report should aim at providing detailed information on nuclear-related activities for each country examined taken 'as a whole', such as nuclear development, scientific and technical capabilities, etc. In order to study a specific country, we need to know whether there is already an operating civil nuclear programme or not. In the first case, we have to check carefully whether nuclear material could be diverted, whether declared facilities are misused, or whether undeclared facilities are operated and undeclared activities conducted with the aim of manufacturing a nuclear weapon. In the second case, we should pay attention to the development of a civil nuclear project. A country report is based on a wide span of information (most of it coming from open sources, but some from confidential or private ones). It is therefore important to carefully check the nature and the credibility of these sources through cross-checking. Eventually, it is necessary to merge information from different sources and apply an expertise filter. We have at our disposal many capable tools to help us assess, understand and evaluate the situation (cartography, imagery, bibliometry, etc.). These tools allow us to offer the best-supported conclusions possible. The paper is followed by the slides of the presentation. (author)

  3. Measurement of the α4β2* nicotinic acetylcholine receptor ligand 2-[18F]Fluoro-A-85380 and its metabolites in human blood during PET investigation: a methodological study

    International Nuclear Information System (INIS)

    Sorger, Dietlind; Becker, Georg A.; Patt, Marianne; Schildan, Andreas; Grossmann, Udo; Schliebs, Reinhard; Seese, Anita; Kendziorra, Kai; Kluge, Magnus; Brust, Peter; Mukhin, Alexey G.; Sabri, Osama

    2007-01-01

    2-[18F]fluoro-A-85380 (2-[18F]FA) is a new radioligand for noninvasive imaging of α4β2* nicotinic acetylcholine receptors (nAChRs) by positron emission tomography (PET) in human brain. In most cases, quantification of 2-[18F]FA receptor binding involves measurement of free nonmetabolized radioligand concentration in blood. This requires an efficient and reliable method to separate radioactive metabolites from the parent compound. In the present study, three analytical methods, thin layer chromatography (TLC), high-performance liquid chromatography (HPLC) and solid phase extraction (SPE), have been tested. Reversed-phase TLC of deproteinized aqueous samples of plasma provides good estimates of 2-[18F]FA and its metabolites. However, because of the decreased radioactivity in plasma samples, this method can be used in humans over the first 2 h after radioligand injection only. Reliable quantification of the parent radioligand and its main metabolites was obtained using reversed-phase HPLC, followed by counting of eluted fractions in a well gamma counter. Three main and five minor metabolites of 2-[18F]FA were detected in human blood using this method. On average, the unchanged 2-[18F]FA fraction in plasma of healthy volunteers measured at 14, 60, 120, 240 and 420 min after radioligand injection was 87.3±2.2%, 74.4±3%, 68.8±5%, 62.3±8% and 61.0±8%, respectively. In patients with neurodegenerative disorders, the values corresponding to the three last time points were significantly lower. The fraction of nonmetabolized 2-[18F]FA in plasma determined using SPE did not differ significantly from that obtained by HPLC (+gamma counting) (n=73, r=.95). Since SPE is less time-consuming than HPLC and provides comparable results, we conclude that SPE appears to be the most suitable method for measurement of the 2-[18F]FA parent fraction during PET investigations
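
    A typical use of such parent fractions is a metabolite correction of the plasma input function. The sketch below interpolates the quoted healthy-volunteer means between the measured time points and applies the result to an invented total plasma activity; the correction-by-multiplication step is a common convention, not a procedure stated in this record.

```python
# Metabolite correction sketch using the parent fractions quoted above.
times = [14, 60, 120, 240, 420]                  # min after injection
parent_frac = [0.873, 0.744, 0.688, 0.623, 0.610]

def frac_at(t):
    """Piecewise-linear interpolation of the parent fraction at time t."""
    if t <= times[0]:
        return parent_frac[0]
    if t >= times[-1]:
        return parent_frac[-1]
    for i in range(1, len(times)):
        if t <= times[i]:
            w = (t - times[i - 1]) / (times[i] - times[i - 1])
            return parent_frac[i - 1] + w * (parent_frac[i] - parent_frac[i - 1])

def corrected(total_activity, t):
    """Nonmetabolized (parent) activity = total activity * parent fraction."""
    return total_activity * frac_at(t)

parent_90 = corrected(12.0, 90)   # invented 12.0 kBq/mL total at 90 min
```
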

  4. Enhanced electricity system analysis for decision making - A reference book[Inter-agency joint project on data bases and methodologies for comparative assessment of different energy sources for electricity generation

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2000-06-01

    The objective of electricity system analysis in support of decision making is to provide comparative assessment results upon which relevant policy choices between alternative technology options and supply strategies can be based. This reference book offers analysts, planners and decision makers documented information on enhanced approaches to electricity system analysis, that can assist in achieving this objective. The book describes the main elements of comprehensive electricity system analysis and outlines an advanced integrated analysis and decision making framework for the electric power sector. Emphasis is placed on mechanisms for building consensus between interested and affected parties, and on aspects of planning that go beyond the traditional economic optimisation approach. The scope and contents of the book cover the topics to be addressed in decision making for the power sector and the process of integrating economic, social, health and environmental aspects in the comparative assessment of alternative options and strategies. The book describes and discusses overall frameworks, processes and state of the art methods and techniques available to analysts and planners for carrying out comparative assessment studies, in order to provide sound information to decision makers. This reference book is published as part of a series of technical reports and documents prepared in the framework of the inter-agency joint project (DECADES) on databases and methodologies for comparative assessment of different energy sources for electricity generation. The overall objective of the DECADES project is to enhance capabilities for incorporating economic, social, health and environmental issues in the comparative assessment of electricity generation options and strategies in the process of decision making for the power sector. 
The project, established in 1992, is carried out jointly by the European Commission (EC), the Economic and Social Commission for Asia and the Pacific

  5. Methodology Using MELCOR Code to Model Proposed Hazard Scenario

    Energy Technology Data Exchange (ETDEWEB)

    Gavin Hawkley

    2010-07-01

    This study demonstrates a methodology for using the MELCOR code to model a proposed hazard scenario within a building containing radioactive powder, and the subsequent evaluation of the leak path factor (LPF), the fraction of respirable material that escapes the facility into the outside environment, implicit in the scenario. The LPF evaluation analyzes the basis and applicability of an assumed standard multiplication of 0.5 × 0.5 (in which 0.5 represents the fraction of material assumed to leave one area and enter another) for calculating an LPF value. The outside release depends upon the ventilation/filtration system, both filtered and unfiltered, and upon other pathways from the building, such as doorways, both open and closed. This study shows how the multiple LPFs within the building can be evaluated in a combinatory process in which a total LPF is calculated, thus addressing the assumed multiplication and allowing for the designation and assessment of a respirable source term (ST) for later consequence analysis, in which the propagation of material released into the atmosphere can be modeled, the dose received by a receptor placed downwind can be estimated, and the distance adjusted to maintain such exposures as low as reasonably achievable (ALARA). This study also briefly addresses particle characteristics that affect atmospheric particle dispersion, and compares this dispersion with the LPF methodology.
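
    The combinatory evaluation of serial LPFs, and the role of the total LPF in a respirable source term, can be sketched as follows. The five-factor form ST = MAR × DR × ARF × RF × LPF follows common DOE-HDBK-3010 practice rather than this study's specific model, and every number below is illustrative.

```python
# Illustrative series LPF combination and five-factor source term.
from math import prod

def total_lpf(segment_lpfs):
    """Series pathway: the fractions surviving each segment multiply."""
    return prod(segment_lpfs)

def source_term(mar_g, dr, arf, rf, lpf):
    """Respirable source term (grams released to the environment):
    MAR = material at risk, DR = damage ratio, ARF = airborne release
    fraction, RF = respirable fraction, LPF = total leak path factor."""
    return mar_g * dr * arf * rf * lpf

assumed = total_lpf([0.5, 0.5])        # the assumed 0.5 x 0.5 multiplication
modeled = total_lpf([0.3, 0.12])       # e.g. code-evaluated segment LPFs
st = source_term(mar_g=1000.0, dr=1.0, arf=1e-3, rf=0.1, lpf=modeled)
```

    Comparing `assumed` (0.25) with a mechanistically evaluated `modeled` value is exactly the kind of check on the standard 0.5 × 0.5 assumption the study describes.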

  6. Radioreceptor assays: plasma membrane receptors and assays for polypeptide and glycoprotein hormones

    International Nuclear Information System (INIS)

    Schulster, D.

    1977-01-01

    Receptors for peptide, protein and glycoprotein hormones, and the catecholamines are located on the plasma membranes of their target cells. Preparations of the receptors may be used as specific, high-affinity binding agents for these hormones in assay methodology akin to that for radioimmunoassay. A particular advantage of the radioreceptor assay is that it has a specificity directed towards the biologically active region of the hormone, rather than to some immunologically active region that may have little (or no) involvement in the expression of hormonal activity. Methods for hormone receptor preparation vary greatly, and range from the use of intact cells (as the source of hormone receptor) to the use of purified or solubilized membrane receptors. Receptors isolated from plasma membranes have proved to be of variable stability, and may be damaged during preparation and/or storage. Moreover, since they are present in relatively low concentration in the cell, their preparation in sufficient quantity for use in a radioreceptor assay may present technical problems. In general, there is good correlation between radioreceptor assays and in-vitro bioassays; differences between results from radioreceptor assays and radioimmunoassays are similar to those noted between in-vitro bioassays and radioimmunoassays. The sensitivity of the method is such that normal plasma concentrations of various hormones have been assayed by this technique. (author)

  7. A methodology for string resolution

    International Nuclear Information System (INIS)

    Karonis, N.T.

    1992-11-01

    In this paper we present a methodology, not a tool. We present this methodology with the intent that it be adopted, on a case by case basis, by each of the existing tools in EPICS. In presenting this methodology, we describe each of its two components in detail and conclude with an example depicting how the methodology can be used across a pair of tools. The task of any control system is to provide access to the various components of the machine being controlled, for example, the Advanced Photon Source (APS). By access, we mean the ability to monitor the machine's status (reading) as well as the ability to explicitly change its status (writing). The Experimental Physics and Industrial Control System (EPICS) is a set of tools, designed to act in concert, that allows one to construct a control system. EPICS provides the ability to construct a control system that allows reading and writing access to the machine. It does this through the notion of databases. Each of the components of the APS that is accessed by the control system is represented in EPICS by a set of named database records. Once this abstraction is made, from physical device to named database records, the process of monitoring and changing the state of that device becomes the simple process of reading and writing information from and to its associated named records

  8. Scenario development methodologies

    International Nuclear Information System (INIS)

    Eng, T.; Hudson, J.; Stephansson, O.

    1994-11-01

    In the period 1981-1994, SKB has studied several methodologies to systematize and visualize all the features, events and processes (FEPs) that can influence a repository for radioactive waste in the future. All the work performed is based on the terminology and basic findings in the joint SKI/SKB work on scenario development presented in the SKB Technical Report 89-35. The methodologies studied are a) Event tree analysis, b) Influence diagrams and c) Rock Engineering Systems (RES) matrices. Each one of the methodologies is explained in this report as well as examples of applications. One chapter is devoted to a comparison between the two most promising methodologies, namely: Influence diagrams and the RES methodology. In conclusion a combination of parts of the Influence diagram and the RES methodology is likely to be a promising approach. 26 refs

  9. Reliability Centered Maintenance - Methodologies

    Science.gov (United States)

    Kammerer, Catherine C.

    2009-01-01

    Journal article about Reliability Centered Maintenance (RCM) methodologies used by United Space Alliance, LLC (USA) in support of the Space Shuttle Program at Kennedy Space Center. The USA Reliability Centered Maintenance program differs from traditional RCM programs because various methodologies are utilized to take advantage of their respective strengths for each application. Based on operational experience, USA has customized the traditional RCM methodology into a streamlined lean logic path and has implemented the use of statistical tools to drive the process. USA RCM has integrated many of the L6S tools into both RCM methodologies. The tools utilized in the Measure, Analyze, and Improve phases of a Lean Six Sigma project lend themselves to application in the RCM process. All USA RCM methodologies meet the requirements defined in SAE JA 1011, Evaluation Criteria for Reliability-Centered Maintenance (RCM) Processes. The proposed article explores these methodologies.

  10. Nondestructive assay methodologies in nuclear forensics analysis

    International Nuclear Information System (INIS)

    Tomar, B.S.

    2016-01-01

    In the present chapter, the nondestructive assay (NDA) methodologies used for analysis of nuclear materials as a part of nuclear forensic investigation have been described. These NDA methodologies are based on (i) measurement of passive gamma and neutrons emitted by the radioisotopes present in the nuclear materials, (ii) measurement of gamma rays and neutrons emitted after the active interrogation of the nuclear materials with a source of X-rays, gamma rays or neutrons

  11. Big data and virtual communities: methodological issues

    OpenAIRE

    Martínez Torres, María del Rocío; Toral, S. L.; Fornara, Nicoletta

    2014-01-01

    Virtual communities represent today an emergent phenomenon through which users get together to create ideas, to obtain help from one another, or just to casually engage in discussions. Their increasing popularity, as well as their utility as a source of business value and marketing strategies, justifies the necessity of defining some specific methodologies for analyzing them. The aim of this paper is to provide new insights into virtual communities from a methodological viewpoint, hi...

  12. Imaginative methodologies in the social sciences

    DEFF Research Database (Denmark)

    Imaginative Methodologies develops, expands and challenges conventional social scientific methodology and language by way of literary, poetic and other alternative sources of inspiration. Sociologists, social workers, anthropologists, criminologists and psychologists all try to rethink, provoke...... and reignite social scientific methodology. Imaginative Methodologies challenges the mainstream social science methodological orthodoxy closely guarding the boundaries between the social sciences and the arts and humanities, pointing out that authors and artists are often engaged in projects parallel to those...... of the social sciences and vice versa, and that artistic and cultural productions today do not constitute a specialist field, but are integral to our social reality. The book will be of interest to scholars and students in the social sciences and across the arts and humanities working with questions...

  13. Environmental risk assessment of water quality in harbor areas: a new methodology applied to European ports.

    Science.gov (United States)

    Gómez, Aina G; Ondiviela, Bárbara; Puente, Araceli; Juanes, José A

    2015-05-15

    This work presents a standard and unified procedure for assessment of environmental risks at the contaminant source level in port aquatic systems. Using this method, port managers and local authorities will be able to hierarchically classify environmental hazards and proceed with the most suitable management actions. This procedure combines rigorously selected parameters and indicators to estimate the environmental risk of each contaminant source based on its probability, consequences and vulnerability. The spatio-temporal variability of multiple stressors (agents) and receptors (endpoints) is taken into account to provide accurate estimations for application of precisely defined measures. The developed methodology is tested on a wide range of different scenarios via application in six European ports. The validation process confirms its usefulness, versatility and adaptability as a management tool for port water quality in Europe and worldwide. Copyright © 2015 Elsevier Ltd. All rights reserved.
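
    The hierarchical classification of contaminant sources can be illustrated with a toy scoring step. The published procedure's actual parameters, scales and aggregation are not reproduced here; the multiplicative probability-consequence-vulnerability combination, the source names and the scores below are all invented for illustration.

```python
# Toy risk ranking of contaminant sources in a port aquatic system.

def risk_score(probability, consequence, vulnerability):
    """A simple multiplicative aggregation of the three components,
    each scored on an assumed 0-1 scale."""
    return probability * consequence * vulnerability

sources = {
    "shipyard runoff":     risk_score(0.8, 0.9, 0.7),
    "fuel dock":           risk_score(0.3, 0.9, 0.5),
    "storm-water outfall": risk_score(0.6, 0.4, 0.6),
}
# Highest-risk sources first, for prioritizing management actions:
ranking = sorted(sources, key=sources.get, reverse=True)
```
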

  14. Multivariate Receptor Models for Spatially Correlated Multipollutant Data

    KAUST Repository

    Jun, Mikyoung; Park, Eun Sug

    2013-01-01

    The goal of multivariate receptor modeling is to estimate the profiles of major pollution sources and quantify their impacts based on ambient measurements of pollutants. Traditionally, multivariate receptor modeling has been applied to multiple air
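
    Although the snippet above is truncated, the factorization at the heart of multivariate receptor modeling can be sketched: the ambient data matrix X (samples × species) is approximated as the product of nonnegative source contributions G and source profiles F. The multiplicative NMF updates below are a generic choice for illustration, not necessarily the estimation method of the cited work, and the two "sources" are synthetic.

```python
# Generic nonnegative factorization X ~ G @ F on synthetic receptor data.
import numpy as np

rng = np.random.default_rng(0)
G_true = rng.uniform(0, 1, (50, 2))      # contributions of 2 sources
F_true = np.array([[0.7, 0.2, 0.1],      # invented profile of source 1
                   [0.1, 0.3, 0.6]])     # invented profile of source 2
X = G_true @ F_true                      # noise-free "ambient" data

# Lee-Seung multiplicative updates for the Frobenius objective.
G = rng.uniform(0.1, 1, (50, 2))
F = rng.uniform(0.1, 1, (2, 3))
eps = 1e-12                              # guards against division by zero
for _ in range(500):
    F *= (G.T @ X) / (G.T @ G @ F + eps)
    G *= (X @ F.T) / (G @ F @ F.T + eps)

rel_err = np.linalg.norm(X - G @ F) / np.linalg.norm(X)
```

    The updates keep G and F nonnegative by construction, which is what makes the factors interpretable as contributions and chemical profiles.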

  15. Radioisotope methodology course radioprotection aspects

    International Nuclear Information System (INIS)

    Bergoc, R.M.; Caro, R.A.; Menossi, C.A.

    1996-01-01

    The advancement of knowledge in molecular and cell biology, biochemistry, medicine and pharmacology that has taken place during the last 50 years, since the end of World War II, is really outstanding. It can be safely said that this is principally due to the application of radioisotope techniques. Research on metabolism, the biodistribution of pharmaceuticals, pharmacodynamics, etc., is mostly carried out by means of techniques employing radioactive materials. Radioisotopes and radiation are frequently used in medicine, both as diagnostic and therapeutic tools. Radioimmunoassay is today a routine method in endocrinology and in general clinical medicine. Receptor determination and characterization is a steadily growing methodology used in clinical biochemistry, pharmacology and medicine. The use of radiopharmaceuticals and radiation of different origins for therapeutic purposes should not be overlooked. For these reasons, the importance of teaching radioisotope methodology is steadily growing. This is principally the case for specialization at the post-graduate level, but in the pregraduate curriculum it is worthwhile to give some elementary theoretical and practical notions on this subject. These observations are justified by more than 30 years of teaching experience at both levels at the School of Pharmacy and Biochemistry of the University of Buenos Aires, Argentina. In 1960 we began to teach Physics III, an obligatory pregraduate course for biochemistry students, in which some elementary notions of radioactivity and measurement techniques were given. Successive modifications of the biochemistry pregraduate curriculum incorporated radiochemistry as an elective subject and, since 1978, radioisotope methodology as an obligatory subject for biochemistry students.
This subject is given at the radioisotope laboratory during the first semester of each year and its objective is to provide theoretical and practical knowledge to the biochemistry students, even

  16. Introduction to LCA Methodology

    DEFF Research Database (Denmark)

    Hauschild, Michael Z.

    2018-01-01

    In order to offer the reader an overview of the LCA methodology in the preparation of the more detailed description of its different phases, a brief introduction is given to the methodological framework according to the ISO 14040 standard and the main elements of each of its phases. Emphasis...

  17. Methodologies, languages and tools

    International Nuclear Information System (INIS)

    Amako, Katsuya

    1994-01-01

    This is a summary of the "Methodologies, Languages and Tools" session at the CHEP'94 conference. All the contributions on methodologies and languages are relevant to the object-oriented approach. Other topics presented relate to various software tools in the down-sized computing environment

  18. Archetype modeling methodology.

    Science.gov (United States)

    Moner, David; Maldonado, José Alberto; Robles, Montserrat

    2018-03-01

    Clinical Information Models (CIMs) expressed as archetypes play an essential role in the design and development of current Electronic Health Record (EHR) information structures. Although there exist many experiences about using archetypes in the literature, a comprehensive and formal methodology for archetype modeling does not exist. Having a modeling methodology is essential to develop quality archetypes, in order to guide the development of EHR systems and to allow the semantic interoperability of health data. In this work, an archetype modeling methodology is proposed. This paper describes its phases, the inputs and outputs of each phase, and the involved participants and tools. It also includes the description of the possible strategies to organize the modeling process. The proposed methodology is inspired by existing best practices of CIMs, software and ontology development. The methodology has been applied and evaluated in regional and national EHR projects. The application of the methodology provided useful feedback and improvements, and confirmed its advantages. The conclusion of this work is that having a formal methodology for archetype development facilitates the definition and adoption of interoperable archetypes, improves their quality, and facilitates their reuse among different information systems and EHR projects. Moreover, the proposed methodology can be also a reference for CIMs development using any other formalism. Copyright © 2018 Elsevier Inc. All rights reserved.

  19. Menopause and Methodological Doubt

    Science.gov (United States)

    Spence, Sheila

    2005-01-01

    Menopause and methodological doubt begins by making a tongue-in-cheek comparison between Descartes' methodological doubt and the self-doubt that can arise around menopause. A hermeneutic approach is taken in which Cartesian dualism and its implications for the way women are viewed in society are examined, both through the experiences of women…

  20. VEM: Virtual Enterprise Methodology

    DEFF Research Database (Denmark)

    Tølle, Martin; Vesterager, Johan

    2003-01-01

    This chapter presents a virtual enterprise methodology (VEM) that outlines activities to consider when setting up and managing virtual enterprises (VEs). As a methodology the VEM helps companies to ask the right questions when preparing for and setting up an enterprise network, which works...

  1. Data Centric Development Methodology

    Science.gov (United States)

    Khoury, Fadi E.

    2012-01-01

    Data centric applications, an important effort of software development in large organizations, have mostly adopted a software methodology, such as waterfall or the Rational Unified Process, as the framework for their development. These methodologies can work for structural, procedural, or object-oriented applications, but fail to capture…

  2. The Methodology of Magpies

    Science.gov (United States)

    Carter, Susan

    2014-01-01

    Arts/Humanities researchers frequently do not explain methodology overtly; instead, they "perform" it through their use of language, textual and historic cross-reference, and theory. Here, methodologies from literary studies are shown to add to Higher Education (HE) an exegetical and critically pluralist approach. This includes…

  3. Open-Source Colorimeter

    OpenAIRE

    Anzalone, Gerald C.; Glover, Alexandra G.; Pearce, Joshua M.

    2013-01-01

    The high cost of what have historically been sophisticated research-related sensors and tools has limited their adoption to a relatively small group of well-funded researchers. This paper provides a methodology for applying an open-source approach to design and development of a colorimeter. A 3-D printable, open-source colorimeter utilizing only open-source hardware and software solutions and readily available discrete components is discussed and its performance compared to a commercial porta...
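The measurement chain of any colorimeter, open-source or commercial, rests on the Beer-Lambert law: absorbance is the negative log of transmitted versus blank intensity, and concentration follows from a linear calibration. The sketch below is a generic illustration of that relationship, not code from the paper; the intensities and calibration slope are invented:

```python
import math

def absorbance(intensity_sample, intensity_blank):
    """Beer-Lambert absorbance: A = -log10(I / I0)."""
    return -math.log10(intensity_sample / intensity_blank)

def concentration(a, slope):
    """Invert a linear calibration A = slope * c (zero intercept assumed)."""
    return a / slope

# Hypothetical calibration: absorbance of 0.25 per mg/L of analyte.
slope = 0.25

# An unknown sample transmits 50% of the blank intensity.
a = absorbance(500, 1000)     # A = 0.301...
c = concentration(a, slope)
print(round(c, 2))            # ~1.2 mg/L under these assumed numbers
```

In a real instrument the calibration slope would be fitted from several standards rather than assumed.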

  4. Screening radon risks: A methodology for policymakers

    International Nuclear Information System (INIS)

    Eisinger, D.S.; Simmons, R.A.; Lammering, M.; Sotiros, R.

    1991-01-01

    This paper provides an easy-to-use screening methodology to estimate potential excess lifetime lung cancer risk resulting from indoor radon exposure. The methodology was developed under U.S. EPA Office of Policy, Planning, and Evaluation sponsorship of the agency's Integrated Environmental Management Projects (IEMP) and State/Regional Comparative Risk Projects. These projects help policymakers understand and use scientific data to develop environmental problem-solving strategies. This research presents the risk assessment methodology, discusses its basis, and identifies appropriate applications. The paper also identifies assumptions built into the methodology and qualitatively addresses methodological uncertainties, the direction in which these uncertainties could bias analyses, and their relative importance. The methodology draws from several sources, including risk assessment formulations developed by the U.S. EPA's Office of Radiation Programs, the EPA's Integrated Environmental Management Project (Denver), the International Commission on Radiological Protection, and the National Institute for Occupational Safety and Health. When constructed as a spreadsheet program, the methodology easily facilitates analyses and sensitivity studies (the paper includes several sensitivity study options). The methodology will be most helpful to those who need to make decisions concerning radon testing, public education, and exposure prevention and mitigation programs. 26 references
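The general shape of such a screen (concentration, then cumulative exposure in working-level months, then an excess-risk coefficient) can be sketched as below. The structure follows the standard radon dosimetry chain, but the coefficients are placeholders for illustration, not the EPA/ICRP/NIOSH values the paper draws on:

```python
# Back-of-the-envelope radon risk screen. Coefficients are illustrative
# placeholders, NOT authoritative risk-assessment values.

EQUILIBRIUM_FACTOR = 0.5   # assumed indoor radon-progeny equilibrium
HOURS_PER_WLM = 170.0      # occupational definition of a working-level month
RISK_PER_WLM = 3.5e-4      # placeholder lifetime risk coefficient per WLM

def excess_lifetime_risk(radon_pci_l, hours_per_year, years):
    # 1 working level (WL) corresponds to 100 pCi/L at full equilibrium.
    working_level = radon_pci_l * EQUILIBRIUM_FACTOR / 100.0
    wlm = working_level * hours_per_year * years / HOURS_PER_WLM
    return wlm * RISK_PER_WLM

# 4 pCi/L (the EPA action level), 70% occupancy, 70-year lifetime:
risk = excess_lifetime_risk(4.0, 0.7 * 8760, 70)
print(f"{risk:.3f}")   # ~0.018 under these assumed coefficients
```

Built as a spreadsheet or script like this, sensitivity studies amount to varying one coefficient at a time and re-running.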

  5. Energy Efficiency Indicators Methodology Booklet

    Energy Technology Data Exchange (ETDEWEB)

    Sathaye, Jayant; Price, Lynn; McNeil, Michael; de la rue du Can, Stephane

    2010-05-01

    This Methodology Booklet provides a comprehensive review of, and guiding principles for, constructing energy efficiency indicators, with illustrative examples of application to individual countries. It reviews work done by international agencies and national governments in constructing meaningful energy efficiency indicators that help policy makers assess changes in energy efficiency over time. Building on past OECD experience and best practices, and on the knowledge of these countries' institutions, relevant sources of information to construct an energy indicator database are identified. A framework based on levels of a hierarchy of indicators -- spanning from aggregate, macro-level to disaggregated end-use level metrics -- is presented to help shape the understanding of assessing energy efficiency. In each sector of activity (industry, commercial, residential, agriculture and transport), indicators are presented and recommendations to distinguish the different factors affecting energy use are highlighted. The methodology booklet specifically addresses issues that are relevant to developing indicators where activity is a major factor driving energy demand. A companion spreadsheet tool is available upon request.
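The indicator hierarchy described above can be illustrated with a toy computation: an aggregate intensity (energy per unit GDP) at the top, decomposed into sectoral intensities (energy per unit of sector-specific activity) below it. All figures here are made up:

```python
# Sketch of a two-level energy efficiency indicator hierarchy.
# Sector data: (energy in PJ, activity in the unit noted). Invented values.
sectors = {
    "industry":    (1200.0, 400.0),   # activity: value added, billion USD
    "residential": (600.0,  25.0),    # activity: million dwellings
    "transport":   (900.0,  300.0),   # activity: billion passenger-km
}

gdp = 1500.0  # billion USD, hypothetical

# Top of the hierarchy: aggregate energy intensity of the economy.
total_energy = sum(e for e, _ in sectors.values())
aggregate_intensity = total_energy / gdp   # PJ per billion USD

# One level down: sectoral intensities in sector-specific activity units.
sector_intensity = {name: e / a for name, (e, a) in sectors.items()}

print(aggregate_intensity)             # 1.8
print(sector_intensity["industry"])    # 3.0
```

The point of the hierarchy is that a change in the aggregate number can then be traced to activity, structure, or genuine efficiency effects at the lower levels.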

  6. GPS system simulation methodology

    Science.gov (United States)

    Ewing, Thomas F.

    1993-01-01

    The following topics are presented: background; Global Positioning System (GPS) methodology overview; the graphical user interface (GUI); current models; application to space nuclear power/propulsion; and interfacing requirements. The discussion is presented in vugraph form.

  7. Hazard classification methodology

    International Nuclear Information System (INIS)

    Brereton, S.J.

    1996-01-01

    This document outlines the hazard classification methodology used to determine the hazard classification of the NIF LTAB, OAB, and the support facilities on the basis of radionuclides and chemicals. The hazard classification determines the safety analysis requirements for a facility

  8. Nonlinear Image Denoising Methodologies

    National Research Council Canada - National Science Library

    Yufang, Bao

    2002-01-01

    In this thesis, we propose a theoretical as well as practical framework for combining geometric prior information with a statistical/probabilistic methodology in the investigation of a denoising problem...

  9. Clinical trial methodology

    National Research Council Canada - National Science Library

    Peace, Karl E; Chen, Ding-Geng

    2011-01-01

    "Now viewed as its own scientific discipline, clinical trial methodology encompasses the methods required for the protection of participants in a clinical trial and the methods necessary to provide...

  10. Methodology of sustainability accounting

    Directory of Open Access Journals (Sweden)

    O.H. Sokil

    2017-03-01

    Full Text Available Modern challenges to the theory and methodology of accounting are addressed through the formation and implementation of new concepts whose purpose is to meet users' needs for both standard and unique information. The development of a methodology for sustainability accounting is a key aspect of the management of an economic entity. The purpose of the article is to form the methodological bases of accounting for sustainable development and to determine its goals, objectives, object, subject, methods, functions and key aspects. The author analyzes the theoretical bases of the definition and considers the components of the traditional accounting methodology. A generalized structural diagram of the methodology for sustainable development accounting is offered in the article. The set of methods and principles of sustainable development accounting, covering both standard and non-standard provisions, is systematized. A new system of theoretical and methodological provisions of accounting for sustainable development is justified in the context of determining its purpose, objective, subject, object, methods, functions and key aspects.

  11. Comprehensive Binary Interaction Mapping of SH2 Domains via Fluorescence Polarization Reveals Novel Functional Diversification of ErbB Receptors

    Science.gov (United States)

    Ciaccio, Mark F.; Chuu, Chih-pin; Jones, Richard B.

    2012-01-01

    First-generation interaction maps of Src homology 2 (SH2) domains with receptor tyrosine kinase (RTK) phosphosites have previously been generated using protein microarray (PM) technologies. Here, we developed a large-scale fluorescence polarization (FP) methodology that was able to characterize interactions between SH2 domains and ErbB receptor phosphosites with higher fidelity and sensitivity than was previously achieved with PMs. We used the FP assay to query the interaction of synthetic phosphopeptides corresponding to 89 ErbB receptor intracellular tyrosine sites against 93 human SH2 domains and 2 phosphotyrosine binding (PTB) domains. From 358,944 polarization measurements, the affinities for 1,405 unique biological interactions were determined, 83% of which are novel. In contrast to data from previous reports, our analyses suggested that ErbB2 was not more promiscuous than the other ErbB receptors. Our results showed that each receptor displays unique preferences in the affinity and location of recruited SH2 domains that may contribute to differences in downstream signaling potential. ErbB1 was enriched versus the other receptors for recruitment of domains from RAS GEFs whereas ErbB2 was enriched for recruitment of domains from tyrosine and phosphatidyl inositol phosphatases. ErbB3, the kinase inactive ErbB receptor family member, was predictably enriched for recruitment of domains from phosphatidyl inositol kinases and surprisingly, was enriched for recruitment of domains from tyrosine kinases, cytoskeletal regulatory proteins, and RHO GEFs but depleted for recruitment of domains from phosphatidyl inositol phosphatases. Many novel interactions were also observed with phosphopeptides corresponding to ErbB receptor tyrosines not previously reported to be phosphorylated by mass spectrometry, suggesting the existence of many biologically relevant RTK sites that may be phosphorylated but below the detection threshold of standard mass spectrometry procedures. 
This
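A minimal sketch of how an affinity might be extracted from an FP titration, assuming the usual one-site saturation binding model. The concentrations, polarization limits, and the simple grid-search fit below are illustrative stand-ins, not the paper's actual fitting procedure:

```python
import numpy as np

def fp_signal(conc, fp_free, fp_bound, kd):
    """One-site saturation binding: FP rises from fp_free to fp_bound."""
    frac_bound = conc / (kd + conc)
    return fp_free + (fp_bound - fp_free) * frac_bound

# Hypothetical titration of an SH2 domain against a phosphopeptide (uM).
conc = np.array([0.01, 0.03, 0.1, 0.3, 1.0, 3.0, 10.0])
true_kd = 0.5
data = fp_signal(conc, 60.0, 220.0, true_kd)   # synthetic "measurements"

# Estimate Kd by scanning candidate values and minimising squared error
# (a stand-in for the nonlinear least-squares fit a real analysis uses).
candidates = np.logspace(-3, 2, 2000)
errors = [np.sum((fp_signal(conc, 60.0, 220.0, kd) - data) ** 2)
          for kd in candidates]
kd_hat = candidates[int(np.argmin(errors))]
print(round(kd_hat, 2))   # ~0.5
```

Repeating such a fit per domain-peptide pair is what turns the hundreds of thousands of polarization measurements into the affinity table described above.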

  12. Comprehensive binary interaction mapping of SH2 domains via fluorescence polarization reveals novel functional diversification of ErbB receptors.

    Directory of Open Access Journals (Sweden)

    Ronald J Hause

    Full Text Available First-generation interaction maps of Src homology 2 (SH2) domains with receptor tyrosine kinase (RTK) phosphosites have previously been generated using protein microarray (PM) technologies. Here, we developed a large-scale fluorescence polarization (FP) methodology that was able to characterize interactions between SH2 domains and ErbB receptor phosphosites with higher fidelity and sensitivity than was previously achieved with PMs. We used the FP assay to query the interaction of synthetic phosphopeptides corresponding to 89 ErbB receptor intracellular tyrosine sites against 93 human SH2 domains and 2 phosphotyrosine binding (PTB) domains. From 358,944 polarization measurements, the affinities for 1,405 unique biological interactions were determined, 83% of which are novel. In contrast to data from previous reports, our analyses suggested that ErbB2 was not more promiscuous than the other ErbB receptors. Our results showed that each receptor displays unique preferences in the affinity and location of recruited SH2 domains that may contribute to differences in downstream signaling potential. ErbB1 was enriched versus the other receptors for recruitment of domains from RAS GEFs whereas ErbB2 was enriched for recruitment of domains from tyrosine and phosphatidyl inositol phosphatases. ErbB3, the kinase-inactive ErbB receptor family member, was predictably enriched for recruitment of domains from phosphatidyl inositol kinases and, surprisingly, was enriched for recruitment of domains from tyrosine kinases, cytoskeletal regulatory proteins, and RHO GEFs but depleted for recruitment of domains from phosphatidyl inositol phosphatases. 
Many novel interactions were also observed with phosphopeptides corresponding to ErbB receptor tyrosines not previously reported to be phosphorylated by mass spectrometry, suggesting the existence of many biologically relevant RTK sites that may be phosphorylated but below the detection threshold of standard mass spectrometry

  13. Validation of methodologies for the analysis of lead and methyl tert-butyl ether in gasoline, using inductively coupled plasma atomic emission spectrometry and micellar liquid chromatography

    International Nuclear Information System (INIS)

    Redondo Escalante, M.

    1995-01-01

    This study established and optimized the experimental variables for lead quantification by the ICP-AES technique in aqueous media. A comparative study was made of several methods proposed in the literature for the extraction of lead in gasoline into aqueous media. It was determined that this procedure cannot be carried out using the hydrolysis reaction of tetraethyl lead. The optimum conditions were established for lead quantification in gasoline, using methyl isobutyl ketone and also ethanol as solvents. The conditions of the proposed methodologies were optimized, and the analytical performance variables were defined. It was demonstrated that it is possible to prepare lead standard solutions in organic media starting from inorganic salts of this metal. The techniques of gas chromatography and high-performance liquid chromatography were compared for the analysis of methyl tert-butyl ether (MTBE). It was demonstrated that MTBE can be quantified by the HPLC technique, in particular by 'micellar' liquid chromatography. (author)
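Both quantification routes described here ultimately rest on a linear calibration: fit instrument response against standards, then invert the line for unknowns. A generic sketch, with entirely invented intensities and concentrations (not data from the study):

```python
import numpy as np

# Hypothetical ICP-AES calibration for lead: emission intensity (counts)
# versus concentration of standards prepared from an inorganic lead salt.
std_conc = np.array([0.0, 1.0, 2.0, 5.0, 10.0])      # mg/L
intensity = np.array([2.0, 52.0, 102.0, 252.0, 502.0])  # made-up counts

# Fit a straight line I = m*c + b, then invert it for an unknown sample.
m, b = np.polyfit(std_conc, intensity, 1)

def conc_from_intensity(i):
    return (i - b) / m

print(round(conc_from_intensity(152.0), 2))   # 3.0 mg/L for these numbers
```

Method validation then reduces to checking linearity, detection limit, and recovery of spiked standards against this fitted line.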

  14. Metodologias para determinação da digestibilidade de dietas contendo fontes proteicas vegetal ou animal em cães Methodology for determination of digestibility of diets containing vegetable or animal protein sources in dogs

    Directory of Open Access Journals (Sweden)

    Carolina Pedro Zanatta

    2013-04-01

    Full Text Available The objective was to evaluate different methods of measuring digestibility in dogs fed two diets containing animal (poultry by-products, PBP) or vegetable (soybean meal, SBM) protein sources. The methods evaluated were total fecal collection (TFC) and the indicators acid insoluble ash (AIA), acid detergent fiber (ADF) and crude fiber (CF). Eight adult dogs were distributed in a crossover design with split plots (plot: protein source; subplot: digestibility method), with five days of adaptation followed by five days of total fecal collection. The diet containing SBM had a higher apparent digestibility coefficient (ADC) of crude protein, while the diet containing PBP showed higher ADC of the other nutrients and higher metabolizable energy (ME). The ADC and ME determined by TFC and by the indicators did not differ; thus, the ADC of diets in dogs can be determined by TFC or by the CF, ADF and AIA indicators, regardless of the source of dietary protein.
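The two approaches compared in the study rest on standard formulas: total collection divides nutrient disappearance by intake, while the indicator method uses the concentration ratio of an indigestible marker in diet and feces. A sketch with invented numbers (not the study's data):

```python
# Apparent digestibility coefficient (ADC) computed two ways, mirroring
# the comparison above: total fecal collection vs. an indigestible
# indicator such as acid insoluble ash. All values are illustrative.

def adc_total_collection(nutrient_in, nutrient_out):
    """ADC (%) from total nutrient intake and total fecal output, g/day."""
    return 100.0 * (nutrient_in - nutrient_out) / nutrient_in

def adc_indicator(marker_diet, marker_feces, nutr_diet, nutr_feces):
    """ADC (%) from marker and nutrient concentrations (% of dry matter)."""
    return 100.0 - 100.0 * (marker_diet / marker_feces) * (nutr_feces / nutr_diet)

print(adc_total_collection(30.0, 4.5))                 # 85.0
print(round(adc_indicator(1.0, 4.0, 25.0, 15.0), 1))   # 85.0
```

When the marker is truly indigestible and fully recovered, the two formulas agree, which is exactly the equivalence the study tested in dogs.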

  15. Expression of GABAergic receptors in mouse taste receptor cells.

    Directory of Open Access Journals (Sweden)

    Margaret R Starostik

    Full Text Available BACKGROUND: Multiple excitatory neurotransmitters have been identified in mammalian taste transduction, with few studies focused on inhibitory neurotransmitters. Since the synthetic enzyme glutamate decarboxylase (GAD) for gamma-aminobutyric acid (GABA) is expressed in a subset of mouse taste cells, we hypothesized that other components of the GABA signaling pathway are likely expressed in this system. GABA signaling is initiated by the activation of either ionotropic receptors (GABA(A) and GABA(C)) or metabotropic receptors (GABA(B)), while it is terminated by the re-uptake of GABA through transporters (GATs). METHODOLOGY/PRINCIPAL FINDINGS: Using reverse transcriptase-PCR (RT-PCR) analysis, we investigated the expression of different GABA signaling molecules in the mouse taste system. Taste receptor cells (TRCs) in the circumvallate papillae express multiple subunits of the GABA(A) and GABA(B) receptors as well as multiple GATs. Immunocytochemical analyses examined the distribution of the GABA machinery in the circumvallate papillae. Both GABA(A)- and GABA(B)-immunoreactivity were detected in the peripheral taste receptor cells. We also used transgenic mice that express green fluorescent protein (GFP) in either the Type II taste cells, which can respond to bitter, sweet or umami taste stimuli, or in the Type III GAD67-expressing taste cells. Thus, we were able to identify that GABAergic receptors are expressed in some Type II and Type III taste cells. Mouse GAT4 labeling was concentrated in the cells surrounding the taste buds, with a few positively labeled TRCs at the margins of the taste buds. CONCLUSIONS/SIGNIFICANCE: The presence of GABAergic receptors localized on Type II and Type III taste cells suggests that GABA is likely modulating evoked taste responses in the mouse taste bud.

  16. Regional Shelter Analysis Methodology

    Energy Technology Data Exchange (ETDEWEB)

    Dillon, Michael B. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Dennison, Deborah [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Kane, Jave [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Walker, Hoyt [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Miller, Paul [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-08-01

    The fallout from a nuclear explosion has the potential to injure or kill 100,000 or more people through exposure to external gamma (fallout) radiation. Existing buildings can reduce radiation exposure by placing material between fallout particles and exposed people. Lawrence Livermore National Laboratory was tasked with developing an operationally feasible methodology that could improve fallout casualty estimates. The methodology, called a Regional Shelter Analysis, combines the fallout protection that existing buildings provide civilian populations with the distribution of people in various locations. The Regional Shelter Analysis method allows the consideration of (a) multiple building types and locations within buildings, (b) country specific estimates, (c) population posture (e.g., unwarned vs. minimally warned), and (d) the time of day (e.g., night vs. day). The protection estimates can be combined with fallout predictions (or measurements) to (a) provide a more accurate assessment of exposure and injury and (b) evaluate the effectiveness of various casualty mitigation strategies. This report describes the Regional Shelter Analysis methodology, highlights key operational aspects (including demonstrating that the methodology is compatible with current tools), illustrates how to implement the methodology, and provides suggestions for future work.
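The core arithmetic of combining building protection with population distribution is a population-weighted average of dose transmission (the reciprocal of each protection factor). The categories and protection factors below are invented for illustration, not values from the LLNL analysis:

```python
# Population-weighted shelter protection sketch. The effective regional
# dose is the outdoor dose scaled by the mean of 1/PF over where people
# actually are. Fractions and protection factors (PF) are illustrative.

shelters = [
    # (fraction of population, protection factor)
    (0.40, 10.0),   # e.g. upper floors of large buildings
    (0.50, 3.0),    # e.g. single-family homes
    (0.10, 1.0),    # outdoors / unprotected
]

# Sanity check: the population fractions must account for everyone.
assert abs(sum(f for f, _ in shelters) - 1.0) < 1e-9

# Mean transmission: fraction of the outdoor dose the population receives.
transmission = sum(frac / pf for frac, pf in shelters)

outdoor_dose = 100.0   # arbitrary units
print(round(outdoor_dose * transmission, 1))   # 30.7
```

Re-running the sum with different fractions (night vs. day, warned vs. unwarned postures) is what makes the method useful for comparing mitigation strategies.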

  17. Non-invasive cardiac imaging. Spectrum, methodology, indication and interpretation

    International Nuclear Information System (INIS)

    Schaefers, Michael; Flachskampf, Frank; Sechtem, Udo; Achenbach, Stephan; Krause, Bernd J.; Schwaiger, Markus; Breithardt, Guenter

    2008-01-01

    The book contains 13 contributions organized in the following chapters: (1) methodology: echocardiography, NMR imaging, nuclear medicine, computed tomography; (2) clinical protocols: contraction, cardiac valve function, perfusion and perfusion reserve, viability, coronary imaging, transmitters/receptors/enzymes; (3) clinic: coronary heart disease, non-ischemic heart disease. The appendix contains two contributions on future developments and certification/standardization

  18. PM10 standards and nontraditional particulate source controls: Research perspective

    International Nuclear Information System (INIS)

    Watson, J.G.

    1992-01-01

    Knowledge of how to measure suspended particles, what their concentrations are, what they are composed of, and where they come from has increased substantially since 1975. At that time, much of the pioneering work in these areas was just being conducted and published. Size-classified measurements, low-level elemental analysis, inorganic ion analysis, and carbon determinations for aerosol samples were novel research developments. Receptor modeling was not considered to be a scientific discipline, let alone a useful tool for source apportionment. Presentations at earlier conferences went to great lengths to document and justify methodologies which are taken for granted at this conference. This paper goes on to discuss research findings in control of wood smoke, fugitive dusts, motor vehicle exhausts, and secondary aerosols. Research results in source apportionment are also discussed

  19. The policy trail methodology

    DEFF Research Database (Denmark)

    Holford, John; Larson, Anne; Melo, Susana

    In recent years, the “policy trail” has been proposed as a methodology appropriate to the shifting and fluid governance of lifelong learning in the late modern world (Holford et al. 2013; Cort 2014). The contemporary environment is marked by multi-level governance (global…). Cort develops the notion of the ‘policy trail’, arguing that it can overcome ‘methodological nationalism’ and link structure and agency in research on the ‘European educational space’. The ‘trail’ metaphor, she suggests, captures the intentionality and the erratic character of policy. The trail connects sites and brings about change, but – although policy may be intended to be linear, with specific outcomes – policy often has to bend, and sometimes meets insurmountable obstacles. This symposium outlines and develops the methodology, but also reports on research undertaken within a major FP7 project (LLLIght’in’Europe, 2012-15) which made use of it.

  20. Changing methodologies in TESOL

    CERN Document Server

    Spiro, Jane

    2013-01-01

    Covering core topics from vocabulary and grammar to teaching, writing speaking and listening, this textbook shows you how to link research to practice in TESOL methodology. It emphasises how current understandings have impacted on the language classroom worldwide and investigates the meaning of 'methods' and 'methodology' and the importance of these for the teacher: as well as the underlying assumptions and beliefs teachers bring to bear in their practice. By introducing you to language teaching approaches, you will explore the way these are influenced by developments in our understanding of l

  1. Creativity in phenomenological methodology

    DEFF Research Database (Denmark)

    Dreyer, Pia; Martinsen, Bente; Norlyk, Annelise

    2014-01-01

    Nursing research is often concerned with lived experiences in human life, using phenomenological and hermeneutic approaches. These empirical studies may use different creative expressions and art-forms to describe and enhance an embodied and personalised understanding of lived experiences. Drawing on the methodologies of van Manen, Dahlberg, Lindseth & Norberg, the aim of this paper is to argue that the increased focus on creativity and arts in research methodology is valuable to gain a deeper insight into lived experiences. We illustrate this point through examples from empirical nursing studies, and discuss how such creative approaches may support a respectful renewal of phenomenological research traditions in nursing research.

  2. Computer Network Operations Methodology

    Science.gov (United States)

    2004-03-01

    ...by means of their computer information systems. Disrupt: this type of attack focuses on disruption, as "attackers might surreptitiously reprogram enemy..." ...by reprogramming the computers that control distribution within the power grid. A disruption attack introduces disorder and inhibits the effective... between commanders. The use of methodologies is widespread and done subconsciously to assist individuals in decision making. The processes that...

  3. A Functional HAZOP Methodology

    DEFF Research Database (Denmark)

    Liin, Netta; Lind, Morten; Jensen, Niels

    2010-01-01

    A HAZOP methodology is presented where a functional plant model assists in a goal oriented decomposition of the plant purpose into the means of achieving the purpose. This approach leads to nodes with simple functions from which the selection of process and deviation variables follow directly...

  4. Complicating Methodological Transparency

    Science.gov (United States)

    Bridges-Rhoads, Sarah; Van Cleave, Jessica; Hughes, Hilary E.

    2016-01-01

    A historical indicator of the quality, validity, and rigor of qualitative research has been the documentation and disclosure of the behind-the-scenes work of the researcher. In this paper, we use what we call "methodological data" as a tool to complicate the possibility and desirability of such transparency. Specifically, we draw on our…

  5. Methodological Advances in Dea

    NARCIS (Netherlands)

    L. Cherchye (Laurens); G.T. Post (Thierry)

    2001-01-01

    textabstractWe survey the methodological advances in DEA over the last 25 years and discuss the necessary conditions for a sound empirical application. We hope this survey will contribute to the further dissemination of DEA, the knowledge of its relative strengths and weaknesses, and the tools

  6. NUSAM Methodology for Assessment.

    Energy Technology Data Exchange (ETDEWEB)

    Leach, Janice [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Snell, Mark K. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2016-07-01

    This document provides a methodology for the performance-based assessment of security systems designed for the protection of nuclear and radiological materials and the processes that produce and/or involve them. It is intended for use with both relatively simple installations and with highly regulated complex sites with demanding security requirements.

  7. MIRD methodology. Part 1

    International Nuclear Information System (INIS)

    Rojo, Ana M.

    2004-01-01

    This lecture develops the MIRD (Medical Internal Radiation Dose) methodology for the evaluation of the internal dose due to the administration of radiopharmaceuticals. In this first part, the basic concepts and the main equations are presented. The ICRP Dosimetric System is also explained. (author)

  8. Response Surface Methodology

    NARCIS (Netherlands)

    Kleijnen, Jack P.C.

    2014-01-01

    Abstract: This chapter first summarizes Response Surface Methodology (RSM), which started with Box and Wilson’s article in 1951 on RSM for real, non-simulated systems. RSM is a stepwise heuristic that uses first-order polynomials to approximate the response surface locally. An estimated polynomial
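The first-order RSM step can be made concrete: fit a plane to responses from a small factorial design, then move along the estimated gradient (steepest ascent). The design and response values below are synthetic, purely to show the mechanics:

```python
import numpy as np

# A 2^2 factorial design in coded units, plus a center point.
X = np.array([[-1, -1], [1, -1], [-1, 1], [1, 1], [0, 0]], dtype=float)
y = np.array([10.0, 14.0, 13.0, 17.0, 13.5])   # synthetic responses

# Fit the first-order polynomial y = b0 + b1*x1 + b2*x2 by least squares.
A = np.column_stack([np.ones(len(X)), X])
(b0, b1, b2), *_ = np.linalg.lstsq(A, y, rcond=None)

# The steepest-ascent direction is proportional to the coefficient vector.
direction = np.array([b1, b2]) / np.hypot(b1, b2)

print(round(b0, 2), round(b1, 2), round(b2, 2))   # 13.5 2.0 1.5
print(np.round(direction, 3))                      # [0.8 0.6]
```

In the full RSM heuristic, experiments proceed along this direction until the response stops improving, after which a second-order model is fitted near the optimum.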

  9. MIRD methodology. Part 2

    International Nuclear Information System (INIS)

    Gomez Parada, Ines

    2004-01-01

    This paper develops the MIRD (Medical Internal Radiation Dose) methodology for the evaluation of the internal dose due to the administration of radiopharmaceuticals. In this second part, different methods for the calculation of the accumulated activity are presented, together with the effective half life definition. Different forms of Retention Activity curves are also shown. (author)
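For the simplest retention model, a single exponential, the quantities described in these two lectures combine into a short calculation: the effective half-life merges physical and biological elimination, the accumulated activity integrates the retention curve, and the absorbed dose is accumulated activity times an S value. The S value and activities below are hypothetical, not tabulated MIRD constants:

```python
import math

def effective_half_life(t_phys, t_biol):
    """T_eff = T_p * T_b / (T_p + T_b)."""
    return t_phys * t_biol / (t_phys + t_biol)

def accumulated_activity(a0, t_eff):
    """For mono-exponential retention: A_tilde = A0 * T_eff / ln(2)."""
    return a0 * t_eff / math.log(2)   # units of a0 * time

def absorbed_dose(a_tilde, s_value):
    """MIRD schema: D = A_tilde * S."""
    return a_tilde * s_value

t_eff = effective_half_life(6.0, 12.0)          # hours -> 4.0 h
a_tilde = accumulated_activity(100.0, t_eff)    # MBq*h for A0 = 100 MBq
print(round(t_eff, 2))                          # 4.0
print(round(absorbed_dose(a_tilde, 1e-3), 2))   # 0.58, hypothetical S value
```

More complex retention curves are handled the same way, with the accumulated activity obtained by summing the integrals of each exponential component.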

  10. Overview of the ISAM safety assessment methodology

    International Nuclear Information System (INIS)

    Simeonov, G.

    2003-01-01

    The ISAM safety assessment methodology consists of the following key components: specification of the assessment context; description of the disposal system; development and justification of scenarios; formulation and implementation of models; running of computer codes; and analysis and presentation of results. Common issues run through two or more of these assessment components, including: use of methodological and computer tools, collation and use of data, the need to address various sources of uncertainty, and the building of confidence in the individual components as well as in the overall assessment. The importance of the iterative nature of the assessment should be recognised

  11. CIAU methodology and BEPU applications

    International Nuclear Information System (INIS)

    Petruzzi, A.; D'Auria, F.

    2009-01-01

    Best-estimate calculation results from complex thermal-hydraulic system codes (like Relap5, Cathare, Athlet, Trace, etc.) are affected by unavoidable approximations that are unpredictable without the use of computational tools that account for the various sources of uncertainty. Therefore the use of best-estimate codes within reactor technology, either for design or safety purposes, implies understanding and accepting the limitations and deficiencies of those codes. Uncertainties may have different origins, ranging from the approximation of the models, to the approximation of the numerical solution, to the lack of precision in the values adopted for boundary and initial conditions. The amount of uncertainty that affects a calculation may strongly depend upon the codes and the modeling techniques (i.e. the code users). A consistent and robust uncertainty methodology must be developed taking into consideration all the above aspects. The CIAU (Code with the capability of Internal Assessment of Uncertainty) and the UMAE (Uncertainty Methodology based on Accuracy Evaluation) methods have been developed by the University of Pisa (UNIPI) in the framework of long-lasting research activities begun in the 1980s and involving several researchers. CIAU is extensively discussed in the available technical literature, Refs. [1, 2, 3, 4, 5, 6, 7], and tens of additional relevant papers that provide comprehensive details about the method can be found in the bibliography lists of the above references. Therefore, the present paper supplies only 'spot' information about CIAU and focuses mostly on applications to some cases of industrial interest. In particular, the application of CIAU to the OECD BEMUSE (Best Estimate Methods Uncertainty and Sensitivity Evaluation, [8, 9]) project is discussed, together with a critical comparison with other uncertainty methods (in relation to items like: sources of uncertainties, selection of the input parameters and quantification of

  12. A performance assessment methodology for low-level waste facilities

    International Nuclear Information System (INIS)

    Kozak, M.W.; Chu, M.S.Y.; Mattingly, P.A.

    1990-07-01

    A performance assessment methodology has been developed for use by the US Nuclear Regulatory Commission in evaluating license applications for low-level waste disposal facilities. This report provides a summary of background reports on the development of the methodology and an overview of the models and codes selected for the methodology. The overview includes discussions of the philosophy and structure of the methodology and a sequential procedure for applying the methodology. Discussions are provided of models and associated assumptions that are appropriate for each phase of the methodology, the goals of each phase, data required to implement the models, significant sources of uncertainty associated with each phase, and the computer codes used to implement the appropriate models. In addition, a sample demonstration of the methodology is presented for a simple conceptual model. 64 refs., 12 figs., 15 tabs

  13. Receptor study of psychiatric disorders using PET

    International Nuclear Information System (INIS)

    Suhara, Tetsuya

    1992-01-01

    Recent receptor studies of psychiatric disorders using PET have focused on changes in the number of D2 dopamine receptors in the striatum of drug-naive schizophrenic patients. One study confirmed an increase in D2 receptors, while another study denied it. Although there were some differences in the approaches of the two groups, the reason for the discrepancy is not yet clear. Turning to psychiatric disorders other than schizophrenia, our recent study revealed a possible role of dopamine D1 receptors in bipolar mood disorders. However, some problems must be resolved for further receptor studies with PET. For example, our recent study shows that desipramine decreases the in vivo binding of dopamine D1 and D2 receptors, whereas there is no effect on dopamine D1 and D2 receptors in vitro. Additionally, significant methodological problems lie in the evaluation of non-specific binding and the effect of endogenous neurotransmitters. Moreover, difficulties in the diagnosis of psychiatric disorders and ethical problems in psychiatric research are critical factors in receptor studies with PET in psychiatric disorders. (author)

  14. Shifting renewable energy in transport into the next gear. Developing a methodology for taking into account all electricity, hydrogen and methane from renewable sources in the 10% transport target; Hernieuwbare energie in transport naar een hogere versnelling. Ontwikkeling van een methode dat rekening houdt met alle elektriciteit, waterstof en methaan uit hernieuwbare bronnen in de 10% transportdoelsteling

    Energy Technology Data Exchange (ETDEWEB)

    Kampman, B.; Leguijt, C.; Bennink, D. [CE Delft, Delft (Netherlands); Wentrup, K.; Dreblow, E.; Gruenig, M. [Ecologic Institute, Berlin (Germany); Schmidt, P.; Wurster, R.; Weindorf, W. [Ludwig-Boelkow-Systemtechnik, Muenchen-Ottobrunn (Germany)

    2012-01-15

    The European Union has set a 10% target of renewable energy use in the transport sector for 2020 in the Renewable Energy Directive (RED, 2009/28/EC). This directive also defines the associated calculation methodologies, for biofuels and renewable electricity used in transport. Regarding biofuels, only those biofuels can contribute that are actually used in the transport sector. The contribution of electricity from renewable sources is treated somewhat differently, as it is typically taken from the electricity grid, where the exact source of the energy used is not monitored: Member States should use the average share of renewable electricity production in their calculations. The RED required the European Commission to present, if appropriate, a proposal to consider the whole amount of the electricity from renewable sources used to power electric vehicles, as well as a methodology to include the contribution of hydrogen from renewable sources in the transport sector. At the same time, there is the question how biomethane injected into the natural gas grid should be counted towards the transport target if vehicles are filled from that same grid - a similar route to that of electricity use in transport. DG Energy of the Commission needs to be supported in the decision making process related to these three routes: renewable electricity, hydrogen and biomethane use in transport, where distribution is taking place via national grids. The result is a comprehensive report in which different methodological options are designed and assessed, and conclusions are drawn, both for the short to medium term (until 2020) and the longer term (post-2020). In the short term, where the contribution of these routes is still limited, a relatively simple approach will be sufficient, but more sophisticated monitoring methodologies may be needed in the future, depending on the way these routes develop.

  15. Soft Systems Methodology

    Science.gov (United States)

    Checkland, Peter; Poulter, John

    Soft systems methodology (SSM) is an approach for tackling problematical, messy situations of all kinds. It is an action-oriented process of inquiry into problematic situations in which users learn their way from finding out about the situation to taking action to improve it. The learning emerges via an organised process in which the situation is explored using a set of models of purposeful action (each built to encapsulate a single worldview) as intellectual devices, or tools, to inform and structure discussion about a situation and how it might be improved. This paper, written by the original developer Peter Checkland and practitioner John Poulter, gives a clear and concise account of the approach that covers SSM's specific techniques, the learning cycle process of the methodology and the craft skills which practitioners develop. This concise but theoretically robust account nevertheless includes the fundamental concepts and techniques, with the core tenets described through a wide range of settings.

  16. Transparent Guideline Methodology Needed

    DEFF Research Database (Denmark)

    Lidal, Ingeborg; Norén, Camilla; Mäkelä, Marjukka

    2013-01-01

    As part of learning at the Nordic Workshop of Evidence-based Medicine, we have read with interest the practice guidelines for central venous access, published in your Journal in 2012.1 We appraised the quality of this guideline using the checklist developed by The Evidence-Based Medicine Working Group.2 Similar criteria for guideline quality have been suggested elsewhere.3 Our conclusion was that this much needed guideline is currently unclear about several aspects of the methodology used in developing the recommendations. This means potential users cannot be certain that the recommendations are based on best currently available evidence. Our concerns are in two main categories: the rigor of development, including methodology of searching, evaluating, and combining the evidence; and editorial independence, including funding and possible conflicts of interest.

  17. Web survey methodology

    CERN Document Server

    Callegaro, Mario; Vehovar, Vasja

    2015-01-01

    Web Survey Methodology guides the reader through the past fifteen years of research in web survey methodology. It both provides practical guidance on the latest techniques for collecting valid and reliable data and offers a comprehensive overview of research issues. Core topics from preparation to questionnaire design, recruitment testing to analysis and survey software are all covered in a systematic and insightful way. The reader will be exposed to key concepts and key findings in the literature, covering measurement, non-response, adjustments, paradata, and cost issues. The book also discusses the hottest research topics in survey research today, such as internet panels, virtual interviewing, mobile surveys and the integration with passive measurements, e-social sciences, mixed modes and business intelligence. The book is intended for students, practitioners, and researchers in fields such as survey and market research, psychological research, official statistics and customer satisfaction research.

  18. Steganography: LSB Methodology

    Science.gov (United States)

    2012-08-02

    This progress report, 'Steganography: LSB Methodology', concerns steganography, which in computer science is the science of hiding information within other data. From a general perspective, the report builds on the paper by J. Fridrich, M. Goljan and R. Du, 'Reliable detection of LSB steganography in grayscale and color images' (in J. Dittmann, K. Nahrstedt, and P. Wohlmacher, editors, Proceedings of the ACM, Special ...).
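
    The LSB technique the report names can be illustrated with a toy sketch: overwrite the least significant bit of each 8-bit pixel value with one message bit. The pixel and message values below are invented; real implementations operate on image files and typically add keying and spreading.

```python
# Minimal LSB embed/extract on a list of 8-bit pixel values.
def embed(pixels, bits):
    # Clear each pixel's lowest bit, then OR in one message bit.
    return [(p & ~1) | b for p, b in zip(pixels, bits)]

def extract(pixels, n):
    # The hidden message is simply the lowest bit of each pixel.
    return [p & 1 for p in pixels[:n]]

cover = [120, 121, 122, 123, 124, 125, 126, 127]   # toy "image"
message = [1, 0, 1, 1, 0, 0, 1, 0]                 # bits to hide
stego = embed(cover, message)

assert extract(stego, len(message)) == message
# Each pixel changes by at most 1, which is visually imperceptible:
assert all(abs(c - s) <= 1 for c, s in zip(cover, stego))
```

    Detection methods such as the Fridrich-Goljan-Du test cited by the report exploit the statistical artifacts this bit-overwriting leaves in the pixel value distribution.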

  19. Soil Radiological Characterisation Methodology

    International Nuclear Information System (INIS)

    Attiogbe, Julien; Aubonnet, Emilie; De Maquille, Laurence; De Moura, Patrick; Desnoyers, Yvon; Dubot, Didier; Feret, Bruno; Fichet, Pascal; Granier, Guy; Iooss, Bertrand; Nokhamzon, Jean-Guy; Ollivier Dehaye, Catherine; Pillette-Cousin, Lucien; Savary, Alain

    2014-12-01

    This report presents the general methodology and best-practice approaches, combining proven existing techniques for sampling and characterisation, to assess the contamination of soils prior to remediation. It is based on feedback from projects conducted by the main French nuclear stakeholders involved in the field of remediation and dismantling (EDF, CEA, AREVA and IRSN). The application of this methodology will enable project managers to obtain the elements necessary for drawing up the files associated with remediation operations, as required by the regulatory authorities. It is applicable to each of the steps necessary for the piloting of remediation work-sites, depending on the objectives targeted (release into the public domain, re-use, etc.). The main part describes the applied statistical methodology, with exploratory analysis and variogram data, and the identification of singular points and their location. The results obtained permit the production of a map identifying the contaminated surface and subsurface areas. The report sets out the approach to radiological site characterisation, from the initial investigations based on historical and functional analysis through to checking that the remediation objectives have been met. An example application is given, drawn from feedback on the remediation of a contaminated site at the Fontenay-aux-Roses facility. The report is supplemented by a glossary of the main terms used in the field, taken from various publications and international standards. This technical report supports the ISO standard ISO/TC 85/SC 5 N 18557, 'Sampling and characterisation principles for soils, buildings and infrastructures contaminated by radionuclides for remediation purposes'. (authors) [fr]

  20. The LDL receptor.

    Science.gov (United States)

    Goldstein, Joseph L; Brown, Michael S

    2009-04-01

    In this article, the history of the LDL receptor is recounted by its codiscoverers. Their early work on the LDL receptor explained a genetic cause of heart attacks and led to new ways of thinking about cholesterol metabolism. The LDL receptor discovery also introduced three general concepts to cell biology: receptor-mediated endocytosis, receptor recycling, and feedback regulation of receptors. The latter concept provides the mechanism by which statins selectively lower plasma LDL, reducing heart attacks and prolonging life.

  1. Salivary agglutinin and lung scavenger receptor cysteine-rich glycoprotein 340 have broad anti-influenza activities and interactions with surfactant protein D that vary according to donor source and sialylation

    DEFF Research Database (Denmark)

    Hartshorn, Kevan L.; Ligtenberg, Antoon; White, Mitchell R.

    2006-01-01

    We previously found that scavenger receptor cysteine-rich gp-340 (glycoprotein-340), isolated from lung or saliva, directly inhibits human IAVs (influenza A viruses). We now show that salivary gp-340 has broad antiviral activity against human, equine and porcine IAV strains. Although lung...

  2. Methodology applied to develop the DHIE: applied methodology

    CSIR Research Space (South Africa)

    Herselman, Marlien

    2016-12-01

    Full Text Available This section will address the methodology that was applied to develop the South African Digital Health Innovation Ecosystem (DHIE). Each chapter under Section B represents a specific phase in the methodology....

  3. CAGE IIIA Distributed Simulation Design Methodology

    Science.gov (United States)

    2014-05-01

    ... Implementing Defence Experimentation (GUIDEx). The key challenges for this methodology are in understanding how to design it, define the ... operation, and be available in the other nation's simulations. The challenge for the CAGE campaign of experiments is to continue to build upon this ...

  4. Muscarinic Receptor Agonists and Antagonists

    Directory of Open Access Journals (Sweden)

    David R. Kelly

    2001-02-01

    Full Text Available A comprehensive review of pharmacological and medical aspects of the muscarinic class of acetylcholine agonists and antagonists is presented. The therapeutic benefits of achieving receptor subtype selectivity are outlined and applications in the treatment of Alzheimer's disease are discussed. A selection of chemical routes is described, illustrating contemporary methodology for the synthesis of chiral medicinal compounds (asymmetric synthesis, chiral pool, enzymes). Routes to bicyclic intra-annular amines and intramolecular Diels-Alder reactions are highlighted.

  5. Analyzing and Interpreting Historical Sources

    DEFF Research Database (Denmark)

    Kipping, Matthias; Wadhwani, Dan; Bucheli, Marcelo

    2014-01-01

    This chapter outlines a methodology for the interpretation of historical sources, helping to realize their full potential for the study of organization, while overcoming their challenges in terms of distortions created by time, changes in context, and selective production or preservation. ... The chapter contributes to the creation of a language for describing the use of historical sources in management research.

  6. Case Study Research Methodology

    Directory of Open Access Journals (Sweden)

    Mark Widdowson

    2011-01-01

    Full Text Available Commenting on the lack of case studies published in modern psychotherapy publications, the author reviews the strengths of case study methodology and responds to common criticisms, before providing a summary of types of case studies including clinical, experimental and naturalistic. Suggestions are included for developing systematic case studies and brief descriptions are given of a range of research resources relating to outcome and process measures. Examples of a pragmatic case study design and a hermeneutic single-case efficacy design are given and the paper concludes with some ethical considerations and an exhortation to the TA community to engage more widely in case study research.

  7. Sample size methodology

    CERN Document Server

    Desu, M M

    2012-01-01

    One of the most important problems in designing an experiment or a survey is sample size determination and this book presents the currently available methodology. It includes both random sampling from standard probability distributions and from finite populations. Also discussed is sample size determination for estimating parameters in a Bayesian setting by considering the posterior distribution of the parameter and specifying the necessary requirements. The determination of the sample size is considered for ranking and selection problems as well as for the design of clinical trials. Appropria
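
    One standard calculation such a text covers, the sample size needed to estimate a population mean to within a margin of error E at a given confidence level, can be sketched as follows; the formula n = (z·σ/E)², rounded up, is the textbook one, and the σ and margin values below are purely illustrative.

```python
import math

def sample_size_mean(sigma, margin, z=1.96):
    """Sample size for estimating a mean with known sigma.

    sigma:  population standard deviation (assumed known)
    margin: desired half-width of the confidence interval
    z:      standard normal quantile (1.96 for 95% confidence)
    """
    return math.ceil((z * sigma / margin) ** 2)

# Illustrative values: sigma = 15, want the mean within +/-2 at 95%.
print(sample_size_mean(sigma=15.0, margin=2.0))
```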

  8. Geotechnical site assessment methodology

    International Nuclear Information System (INIS)

    Tunbridge, L.W.; Richards, L.R.

    1985-09-01

    The reports comprising this volume concern the research conducted on geotechnical site assessment methodology at the Carwynnen test mine in granites in Cornwall, with particular reference to the effect of structures imposed by discontinuities on the engineering behaviour of rock masses. The topics covered are: in-situ stress measurements using (a) the hydraulic fracturing method, or (b) the US Bureau of Mines deformation probe; scanline discontinuity survey - coding form and instructions, and data; applicability of geostatistical estimation methods to scalar rock properties; comments on in-situ stress at the Carwynnen test mine and the state of stress in the British Isles. (U.K.)

  9. Microphysics evolution and methodology

    International Nuclear Information System (INIS)

    Dionisio, J.S.

    1990-01-01

    A few general features of the evolution of microphysics and their relationship with microphysics methodology are briefly surveyed. Several pluri-disciplinary and interdisciplinary aspects of microphysics research are also discussed in the present scientific context. The need for an equilibrium between individual tendencies and the collective constraints required by team work, already formulated thirty years ago by Frederic Joliot, is particularly stressed in the present conjuncture of nuclear research, which favours very large team projects and discourages individual initiatives. The increasing importance of the science of science (due to its multiple social, economical and ecological aspects) and the stronger competition between national and international tendencies of scientific (and technical) cooperation are also discussed. (author)

  10. MIRD methodology; Metodologia MIRD

    Energy Technology Data Exchange (ETDEWEB)

    Rojo, Ana M [Autoridad Regulatoria Nuclear, Buenos Aires (Argentina); Gomez Parada, Ines [Sociedad Argentina de Radioproteccion, Buenos Aires (Argentina)

    2004-07-01

    The MIRD (Medical Internal Radiation Dose) system was established by the Society of Nuclear Medicine of the USA in 1960 to assist the medical community in the estimation of the dose to organs and tissues due to the incorporation of radioactive materials. Since then, the 'MIRD Dose Estimate Reports' (Nos. 1 to 12) and 'Pamphlets', of great utility for dose calculations, have been published. The MIRD system was planned essentially for the calculation of doses received by patients during nuclear medicine diagnostic procedures. The MIRD methodology for the calculation of absorbed doses in different tissues is explained.
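
    The core MIRD relation, in which the absorbed dose to a target organ is the sum over source organs of cumulated activity times an S value, D(r_T) = Σ_S Ã(r_S)·S(r_T ← r_S), can be illustrated with a minimal sketch. The cumulated activities and S values below are invented placeholders, not values from the MIRD pamphlets.

```python
# Hypothetical two-source example of the MIRD sum for one target organ.
cumulated_activity = {        # A-tilde, MBq*s (assumed values)
    "liver":   3.1e5,
    "kidneys": 8.0e4,
}
s_value_to_target = {         # S(target <- source), mGy/(MBq*s) (assumed)
    "liver":   2.0e-6,
    "kidneys": 5.0e-7,
}

# D(target) = sum over sources of A-tilde(source) * S(target <- source)
dose = sum(cumulated_activity[src] * s_value_to_target[src]
           for src in cumulated_activity)
print(round(dose, 3))   # absorbed dose to the target organ, mGy
```

    In practice the Ã values come from biokinetic data (Part 2 of this lecture series) and the S values from published MIRD tabulations for a given radionuclide and phantom.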

  11. Beam optimization: improving methodology

    International Nuclear Information System (INIS)

    Quinteiro, Guillermo F.

    2004-01-01

    Different optimization techniques commonly used in biology and food technology allow a systematic and complete analysis of response functions. In spite of the great interest in medical and nuclear physics in the problem of optimizing mixed beams, little attention has been given to sophisticated mathematical tools. Indeed, many techniques are perfectly suited to the typical problem of beam optimization. This article is intended as a guide to the use of two methods, namely Response Surface Methodology and Simplex, that are expected to speed up the optimization process and, at the same time, give more insight into the relationships among the dependent variables controlling the response.

  12. Literacy research methodologies

    CERN Document Server

    Duke, Nell K

    2012-01-01

    The definitive reference on literacy research methods, this book serves as a key resource for researchers and as a text in graduate-level courses. Distinguished scholars clearly describe established and emerging methodologies, discuss the types of questions and claims for which each is best suited, identify standards of quality, and present exemplary studies that illustrate the approaches at their best. The book demonstrates how each mode of inquiry can yield unique insights into literacy learning and teaching and how the methods can work together to move the field forward.   New to This Editi

  13. Methodology of site studies

    International Nuclear Information System (INIS)

    Caries, J.C.; Hugon, J.; Grauby, A.

    1980-01-01

    This methodology consists of an essentially dynamic, predictive and follow-up analysis of the impact of discharges on all the environmental compartments, whether natural or not, that play a part in the protection of man and his environment. It applies at two levels, to wit: the choice of site, or the detailed study of the site selected. Two examples of its application are developed, namely: at the choice-of-site level in the case of marine sites, and at the detailed-study level of the chosen site in that of a riverside site [fr]

  14. Alternative pricing methodologies

    International Nuclear Information System (INIS)

    Anon.

    1991-01-01

    With the increased interest in competitive market forces and growing recognition of the deficiencies in current practices, FERC and others are exploring alternatives to embedded cost pricing. A number of these alternatives are discussed in this chapter. Marketplace pricing, discussed briefly here, is the subject of the next chapter. Obviously, the pricing formula may combine several of these methodologies. One utility of which the authors are aware is seeking a price equal to the sum of embedded costs, opportunity costs, line losses, value of service, FERC's percentage adder formula and a contract service charge

  15. Econometric Analysis of 2003 Data on the Post-Service Earnings of Military Retirees: Methodology Report

    National Research Council Canada - National Science Library

    Mackin, Patrick C; Darling, Kimberly L

    2004-01-01

    ...). This report details how the estimation datasets were constructed from these two data sources and describes the econometric methodology in detail, including the definition of alternative models...

  16. LOFT uncertainty-analysis methodology

    International Nuclear Information System (INIS)

    Lassahn, G.D.

    1983-01-01

    The methodology used for uncertainty analyses of measurements in the Loss-of-Fluid Test (LOFT) nuclear-reactor-safety research program is described and compared with other methodologies established for performing uncertainty analyses
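
    The LOFT-specific methodology is not reproduced in this record, but a common building block of measurement uncertainty analysis, combining independent elemental uncertainties in quadrature (root-sum-square), can be sketched as follows; the elemental values are illustrative.

```python
import math

def combine_rss(uncertainties):
    """Root-sum-square combination of independent uncertainty components:
    u_total = sqrt(sum of u_i squared)."""
    return math.sqrt(sum(u * u for u in uncertainties))

# Illustrative components, e.g. calibration and readout (% of reading):
elemental = [0.3, 0.4]
print(combine_rss(elemental))
```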

  18. Intelligent systems engineering methodology

    Science.gov (United States)

    Fouse, Scott

    1990-01-01

    An added challenge for the designers of large scale systems such as Space Station Freedom is the appropriate incorporation of intelligent system technology (artificial intelligence, expert systems, knowledge-based systems, etc.) into their requirements and design. This presentation will describe a view of systems engineering which successfully addresses several aspects of this complex problem: design of large scale systems, design with requirements that are so complex they only completely unfold during the development of a baseline system and even then continue to evolve throughout the system's life cycle, design that involves the incorporation of new technologies, and design and development that takes place with many players in a distributed manner yet can be easily integrated to meet a single view of the requirements. The first generation of this methodology was developed and evolved jointly by ISX and the Lockheed Aeronautical Systems Company over the past five years on the Defense Advanced Research Projects Agency/Air Force Pilot's Associate Program, one of the largest, most complex, and most successful intelligent systems constructed to date. As the methodology has evolved it has also been applied successfully to a number of other projects. Some of the lessons learned from this experience may be applicable to Freedom.

  19. SMART performance analysis methodology

    International Nuclear Information System (INIS)

    Lim, H. S.; Kim, H. C.; Lee, D. J.

    2001-04-01

    To ensure the required and desired operation over the plant lifetime, performance analysis for the SMART NSSS design is done by means of analysis methodologies specified for the performance-related design basis events (PRDBEs). A PRDBE is an occurrence (event) that shall be accommodated in the design of the plant and whose consequence would be no more severe than the normal service effects of the plant equipment. The performance analysis methodology, which systematizes the methods and procedures to analyze the PRDBEs, is as follows. Based on the operation modes suitable to the characteristics of the SMART NSSS, the corresponding PRDBEs and the allowable ranges of process parameters for these events are deduced. With the developed control logic for each operation mode, the system thermal-hydraulics are analyzed for the chosen PRDBEs using the system analysis code. In particular, because the system characteristics of SMART differ from those of existing commercial nuclear power plants, the operation modes, PRDBEs, control logic and analysis code should be consistent with the SMART design. This report presents the categories of PRDBEs chosen for each operation mode, the transitions among these, and the acceptance criteria for each PRDBE. It also includes the analysis methods and procedures for each PRDBE and the concept of the control logic for each operation mode. Therefore this report, in which the overall details for SMART performance analysis are specified based on the current SMART design, can be utilized as a guide for the detailed performance analysis.

  20. Insights into PRA methodologies

    International Nuclear Information System (INIS)

    Gallagher, D.; Lofgren, E.; Atefi, B.; Liner, R.; Blond, R.; Amico, P.

    1984-08-01

    Probabilistic Risk Assessments (PRAs) for six nuclear power plants were examined to gain insight into how the choice of analytical methods can affect the results of PRAs. The PRA scope considered was limited to internally initiated accident sequences through core melt. For twenty methodological topic areas, a baseline or minimal methodology was specified. The choice of methods for each topic in the six PRAs was characterized in terms of the incremental level of effort above the baseline. A higher level of effort generally reflects a higher level of detail or a higher degree of sophistication in the analytical approach to a particular topic area. The impact on results was measured in terms of how additional effort beyond the baseline level changed the relative importance and ordering of dominant accident sequences compared to what would have been observed had methods corresponding to the baseline level of effort been employed. This measure of impact is a more useful indicator of how methods affect perceptions of plant vulnerabilities than changes in core melt frequency would be. However, the change in core melt frequency was used as a secondary measure of impact for nine topics where availability of information permitted. Results are presented primarily in the form of effort-impact matrices for each of the twenty topic areas. A suggested effort-impact profile for future PRAs is presented.

  1. Scrum methodology in banking environment

    OpenAIRE

    Strihová, Barbora

    2015-01-01

    The bachelor thesis "Scrum methodology in banking environment" is focused on one of the agile methodologies, called Scrum, and on a description of the methodology as used in a banking environment. Its main goal is to introduce the Scrum methodology and outline, through a case study, a real project placed in a bank and focused on software development; to address problems of the project; to propose solutions to the addressed problems; and to identify anomalies of Scrum in software development constrained by the banking environmen...

  2. Experimental Economics: Some Methodological Notes

    OpenAIRE

    Fiore, Annamaria

    2009-01-01

    The aim of this work is to present, in a self-contained paper, some methodological aspects as they are received in the current experimental literature. The purpose has been to make a critical review of some very influential papers dealing with methodological issues. In other words, the idea is to have a single paper in which people first approaching experimental economics can find summarised (some of) the most important methodological issues. In particular, the focus is on some methodological prac...

  3. The visual in sport history: approaches, methodologies and sources

    OpenAIRE

    Huggins, Mike

    2015-01-01

    Historians of sport now increasingly accept that visual inquiry offers another dimension to social and cultural research into sport and its history. It is complex and its boundaries are rapidly evolving. This overview offers a justification for placing more emphasis on visual approaches and an introduction to the study and interpretation of visual culture in relation to the history of sport. It stresses the importance of adopting a critical approach and the need to be reflective about that cr...

  4. Chemical Contaminant and Decontaminant Test Methodology Source Document. Second Edition

    Science.gov (United States)

    2012-07-01

  5. Source Region Identification Using Kernel Smoothing

    Science.gov (United States)

    As described in this paper, Nonparametric Wind Regression is a source-to-receptor source apportionment model that can be used to identify and quantify the impact of possible source regions of pollutants as defined by wind direction sectors. It is described in detail with an exam...
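
    The wind-sector idea described above can be sketched as a Nadaraya-Watson kernel regression of measured concentration on wind direction. This is a generic illustration of the technique, not the model's actual implementation; the function name, the Gaussian kernel, and the 15-degree bandwidth are assumptions.

```python
import numpy as np

def kernel_smoothed_concentration(directions_deg, concentrations, grid_deg, bandwidth_deg=15.0):
    """Nadaraya-Watson kernel regression of concentration on wind direction.

    For each direction on the grid, observations are weighted by a Gaussian
    kernel on the angular difference (wrapped to [-180, 180) so that 359 deg
    and 1 deg count as neighbours), and a weighted mean concentration is taken.
    """
    directions = np.asarray(directions_deg, dtype=float)
    conc = np.asarray(concentrations, dtype=float)
    est = np.empty(len(grid_deg))
    for i, theta in enumerate(grid_deg):
        diff = (directions - theta + 180.0) % 360.0 - 180.0  # wrapped angular difference
        weights = np.exp(-0.5 * (diff / bandwidth_deg) ** 2)
        est[i] = np.sum(weights * conc) / np.sum(weights)
    return est
```

    Directions where the smoothed concentration peaks point toward candidate source sectors.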

  6. Serotonin 5-HT4 receptors and forebrain cholinergic system: receptor expression in identified cell populations.

    Science.gov (United States)

    Peñas-Cazorla, Raúl; Vilaró, M Teresa

    2015-11-01

    Activation of serotonin 5-HT4 receptors has pro-cognitive effects on memory performance. The proposed underlying neurochemical mechanism is the enhancement of acetylcholine release in frontal cortex and hippocampus elicited by 5-HT4 agonists. Although 5-HT4 receptors are present in brain areas related to cognition, e.g., hippocampus and cortex, the cellular localization of the receptors that might modulate acetylcholine release is unknown at present. We have analyzed, using dual label in situ hybridization, the cellular localization of 5-HT4 receptor mRNA in identified neuronal populations of the rat basal forebrain, which is the source of the cholinergic innervation to cortex and hippocampus. 5-HT4 receptor mRNA was visualized with isotopically labeled oligonucleotide probes, whereas cholinergic, glutamatergic, GABAergic and parvalbumin-synthesizing neurons were identified with digoxigenin-labeled oligonucleotide probes. 5-HT4 receptor mRNA was not detected in the basal forebrain cholinergic cell population. In contrast, basal forebrain GABAergic, parvalbumin synthesizing, and glutamatergic cells contained 5-HT4 receptor mRNA. Hippocampal and cortical glutamatergic neurons also express this receptor. These results indicate that 5-HT4 receptors are not synthesized by cholinergic cells, and thus would be absent from cholinergic terminals. In contrast, several non-cholinergic cell populations within the basal forebrain and its target hippocampal and cortical areas express these receptors and are thus likely to mediate the enhancement of acetylcholine release elicited by 5-HT4 agonists.

  7. Fine Particulate Pollution and Source Apportionment in the Urban Centers for Africa, Asia and Latin America

    Science.gov (United States)

    Guttikunda, S. K.; Johnson, T. M.; Procee, P.

    2004-12-01

    Fossil fuel combustion for domestic cooking and heating, power generation, industrial processes, and motor vehicles is the primary source of air pollution in developing-country cities. Over the past twenty years, major advances have been made in understanding the social and economic consequences of air pollution. In both industrialized and developing countries, it has been shown that air pollution from energy combustion has detrimental impacts on human health and the environment. Lack of information on the sectoral contributions to air pollution, especially fine particulates, is one of the typical constraints on an effective integrated urban air quality management program. Without such information, it is difficult, if not impossible, for decision makers to provide policy advice and make informed investment decisions related to air quality improvements in developing countries. This also raises the need for low-cost ways of determining the principal sources of fine PM for proper planning and decision making. The project objective is to develop and verify a methodology to assess and monitor the sources of PM, using a combination of ground-based monitoring and source apportionment techniques. This presentation will focus on four general tasks: (1) review of the science and current activities in the combined use of monitoring data and modeling for a better understanding of PM pollution; (2) review of recent advances in atmospheric source apportionment techniques (e.g., principal component analysis, organic markers, source-receptor modeling techniques); (3) development of a general methodology to use integrated top-down and bottom-up datasets; and (4) review of a series of current case studies from Africa, Asia and Latin America and the methodologies applied to assess air pollution and its sources.
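
    The principal component analysis mentioned among the source apportionment techniques can be sketched as follows. This is a generic illustration, assuming a samples-by-species matrix of ambient concentrations; it is not the project's actual tooling, and the function name is an assumption.

```python
import numpy as np

def pca_factors(X, n_factors=2):
    """PCA on a (samples x species) ambient-concentration matrix.

    Works on the correlation matrix so each species contributes equally;
    the loadings of the retained components are then inspected to associate
    each factor with a candidate source (traffic, solvent evaporation, ...).
    """
    corr = np.corrcoef(np.asarray(X, dtype=float), rowvar=False)
    eigval, eigvec = np.linalg.eigh(corr)   # eigh returns ascending order
    order = np.argsort(eigval)[::-1]        # re-sort descending
    eigval, eigvec = eigval[order], eigvec[:, order]
    explained = eigval / eigval.sum()
    loadings = eigvec[:, :n_factors] * np.sqrt(eigval[:n_factors])
    return explained[:n_factors], loadings
```

    A factor on which, say, vehicle-exhaust tracers load heavily would be interpreted as a traffic source.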

  8. Geotechnical site assessment methodology

    International Nuclear Information System (INIS)

    Tunbridge, L.W.; Richards, L.R.

    1985-09-01

    A final report summarizing the research conducted on geotechnical site assessment methodology at the Carwynnen test mine in Cornwall. The geological setting of the test site in the Cornubian granite batholith is described. The effect of structure imposed by discontinuities on the engineering behaviour of rock masses is discussed and the scanline survey method of obtaining data on discontinuities in the rock mass is described. The applicability of some methods of statistical analysis for discontinuity data is reviewed. The requirement for remote geophysical methods of characterizing the mass is discussed and experiments using seismic and ultrasonic velocity measurements are reported. Methods of determining the in-situ stresses are described and the final results of a programme of in-situ stress measurements using the overcoring and hydrofracture methods are reported. (author)

  9. UNCOMMON SENSORY METHODOLOGIES

    Directory of Open Access Journals (Sweden)

    Vladimír Vietoris

    2015-02-01

    Sensory science is a young but rapidly developing field of the food industry. Currently, great emphasis is placed on the development of rapid data-collection techniques; the difference between consumers and trained panels is becoming blurred, and the role of sensory methodologists is to prepare evaluation procedures by which a lay panel (consumers) can achieve results identical to those of a trained panel. There are several conventional methods of sensory evaluation of food (ISO standards), but more and more sensory laboratories are developing methodologies that are less strict in the selection of evaluators, whose mechanisms are easily understandable and whose results are easily interpretable. This paper deals with the mapping of marginal methods used in the sensory evaluation of food (new types of profiles, CATA, TDS, napping).

  10. Safety class methodology

    International Nuclear Information System (INIS)

    Donner, E.B.; Low, J.M.; Lux, C.R.

    1992-01-01

    DOE Order 6430.1A, General Design Criteria (GDC), requires that DOE facilities be evaluated with respect to ''safety class items.'' Although the GDC defines safety class items, it does not provide a methodology for selecting them. The methodology described in this paper was developed to assure that safety class items (SCIs) at the Savannah River Site (SRS) are selected in a consistent and technically defensible manner. Safety class items are those in the highest of four categories determined to be of special importance to nuclear safety and merit appropriately higher-quality design, fabrication, and industrial test standards and codes. The identification of safety class items is approached using a cascading strategy that begins at the ''safety function'' level (i.e., a cooling function, ventilation function, etc.) and proceeds down to the system, component, or structure level. Thus, the items that are required to support a safety function are SCIs. The basic steps in this procedure apply to the determination of SCIs both for new project activities and for operating facilities. The GDC lists six characteristics of SCIs to be considered as a starting point for safety item classification. They are as follows: 1. Those items whose failure would produce exposure consequences that would exceed the guidelines in Section 1300-1.4, ''Guidance on Limiting Exposure of the Public,'' at the site boundary or nearest point of public access. 2. Those items required to maintain operating parameters within the safety limits specified in the Operational Safety Requirements during normal operations and anticipated operational occurrences. 3. Those items required for nuclear criticality safety. 4. Those items required to monitor the release of radioactive material to the environment during and after a Design Basis Accident. 5. Those items required to achieve and maintain the facility in a safe shutdown condition. 6. Those items that control the Safety Class Items listed above.

  11. Situating methodology within qualitative research.

    Science.gov (United States)

    Kramer-Kile, Marnie L

    2012-01-01

    Qualitative nurse researchers are required to make deliberate and sometimes complex methodological decisions about their work. Methodology in qualitative research is a comprehensive approach in which theory (ideas) and method (doing) are brought into close alignment. It can be difficult, at times, to understand the concept of methodology. The purpose of this research column is to: (1) define qualitative methodology; (2) illuminate the relationship between epistemology, ontology and methodology; (3) explicate the connection between theory and method in qualitative research design; and (4) highlight relevant examples of methodological decisions made within cardiovascular nursing research. Although there is no "one set way" to do qualitative research, all qualitative researchers should account for the choices they make throughout the research process and articulate their methodological decision-making along the way.

  12. Waste Package Design Methodology Report

    Energy Technology Data Exchange (ETDEWEB)

    D.A. Brownson

    2001-09-28

    The objective of this report is to describe the analytical methods and processes used by the Waste Package Design Section to establish the integrity of the various waste package designs, the emplacement pallet, and the drip shield. The scope of this report shall be the methodology used in criticality, risk-informed, shielding, source term, structural, and thermal analyses. The basic features and appropriateness of the methods are illustrated, and the processes are defined whereby input values and assumptions flow through the application of those methods to obtain designs that ensure defense-in-depth as well as satisfy requirements on system performance. Such requirements include those imposed by federal regulation, from both the U.S. Department of Energy (DOE) and U.S. Nuclear Regulatory Commission (NRC), and those imposed by the Yucca Mountain Project to meet repository performance goals. The report is to be used, in part, to describe the waste package design methods and techniques to be used for producing input to the License Application Report.

  13. Waste Package Design Methodology Report

    International Nuclear Information System (INIS)

    D.A. Brownson

    2001-01-01

    The objective of this report is to describe the analytical methods and processes used by the Waste Package Design Section to establish the integrity of the various waste package designs, the emplacement pallet, and the drip shield. The scope of this report shall be the methodology used in criticality, risk-informed, shielding, source term, structural, and thermal analyses. The basic features and appropriateness of the methods are illustrated, and the processes are defined whereby input values and assumptions flow through the application of those methods to obtain designs that ensure defense-in-depth as well as satisfy requirements on system performance. Such requirements include those imposed by federal regulation, from both the U.S. Department of Energy (DOE) and U.S. Nuclear Regulatory Commission (NRC), and those imposed by the Yucca Mountain Project to meet repository performance goals. The report is to be used, in part, to describe the waste package design methods and techniques to be used for producing input to the License Application Report

  14. Risk management methodology for RBMN project

    International Nuclear Information System (INIS)

    Borssatto, Maria F.B.; Tello, Cledola C.O.; Uemura, George

    2013-01-01

    RBMN Project has been developed to design, construct and commission a national repository to dispose of the low- and intermediate-level radioactive wastes from the operation of nuclear power plants and other industries that use radioactive sources and materials. Risk is a characteristic of all projects. The risks arise from uncertainties due to assumptions associated with the project and the environment in which it is executed. Risk management is the method by which these uncertainties are systematically monitored to ensure that the objectives of the project will be achieved. Considering the peculiarities of the Project (its comprehensive scope, its multidisciplinary team, and its apparently controversial character owing to stakeholders', especially the community's, unfamiliarity with the subject), a specific methodology for risk management of this Project is being developed. This methodology will be critical for future generations who will be responsible for the final stages of the repository. It will provide greater guarantees to the processes already implemented and will maintain a specific list of risks and solutions for this Project, ensuring safety and security of the repository throughout its life cycle, which is planned to last at least three hundred years. This paper presents the tools and processes already defined, and management actions aimed at developing a culture of proactive risk management in order to minimize threats to this Project and promote actions that bring opportunities for its success. The methodology is based on solid research on the subject, considering methodologies already established and globally recognized as best practices for project management. (author)

  15. Regulation of endocrine and paracrine sources of insulin-like growth factors and growth hormone receptor during compensatory growth in hybrid striped bass (Morone chrysops x Morone saxatilis)

    DEFF Research Database (Denmark)

    Picha, Matthew E; Turano, Marc J; Tipsmark, Christian K

    2008-01-01

    relieved, renders a subsequent phase of CG. The catabolic state was characterized by depressed levels of hepatic Type I and II GH receptor (Ghr1, Ghr2) and insulin-like growth factor-I (Igf-I) mRNA, along with considerable decreases in plasma IGF-I. The state of catabolism also resulted in significant...... liver production, rather than as a fraction of total RNA, may be a more biologically appropriate method of quantifying hepatic gene expression when using real-time PCR....

  16. Long-range air transport of dioxin from North American sources to ecologically vulnerable receptors in Nunavut, Arctic Canada. Final report to the North American Commission for Environmental Cooperation

    Energy Technology Data Exchange (ETDEWEB)

    Commoner, B.; Woods Bartlett, P.; Eisl, H.; Couchot, K. [City University of New York, Queens College, Center for Biology of Natural Systems, New York, NY (United States)

    2000-07-01

    This study was commissioned by the North American Commission for Environmental Cooperation (NACEC). It was designed to model on a continental scale the rates of deposition of airborne dioxin (polychlorinated dibenzo-p-dioxins and polychlorinated dibenzofurans, PCDD/PCDF) in the Canadian Arctic territory of Nunavut and to identify the major contributing North American sources. The study was commissioned in response to findings showing twice the level of dioxin concentration in the milk of Inuit mothers than that observed in southern Quebec, despite the fact that there are no significant sources of dioxin in Nunavut or within 500 km of its boundaries. This high concentration is attributed to the indigenous diet, i.e., traditional foods such as caribou, fish and marine mammals, which in turn ingest dioxin from airborne sources through the terrestrial food chain, chiefly through lichen, mosses, shrubs and marine algae. Since these avenues of entry into the food chain cannot be protected from airborne pollutants, remedial action must be directed at the sources that emit dioxin. Results of the study show that of the total North American annual emission of airborne dioxin (4,713 grams toxicity equivalent quotient (TEQ)), Canadian sources account for 364 grams TEQ, United States sources for 2,937 grams TEQ, Mexican sources for 1,412 grams TEQ, and emissions from sources within Nunavut for a total of 0.12 grams TEQ. The North American national dioxin inventories include 44,091 sources, of which 5,343 are individual facilities such as trash-burning incinerators; the rest are area sources such as backyard trash-burning in the United States and Mexico, but only a handful of sources are responsible for the deposition in Nunavut. The overall conclusions of the study confirm that the atmospheric and ecological processes that carry dioxin from its numerous sources through terrestrial and marine food chains to human beings constitute a problem of continental dimensions. The challenge is to establish analytical methods and

  17. Update of Part 61 Impacts Analysis Methodology. Methodology report. Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    Oztunali, O.I.; Roles, G.W.

    1986-01-01

    Under contract to the US Nuclear Regulatory Commission, the Envirosphere Company has expanded and updated the impacts analysis methodology used during the development of the 10 CFR Part 61 rule to allow improved consideration of the costs and impacts of treatment and disposal of low-level waste that is close to or exceeds Class C concentrations. The modifications described in this report principally include: (1) an update of the low-level radioactive waste source term, (2) consideration of additional alternative disposal technologies, (3) expansion of the methodology used to calculate disposal costs, (4) consideration of an additional exposure pathway involving direct human contact with disposed waste due to a hypothetical drilling scenario, and (5) use of updated health physics analysis procedures (ICRP-30). Volume 1 of this report describes the calculational algorithms of the updated analysis methodology.

  18. Update of Part 61 Impacts Analysis Methodology. Methodology report. Volume 1

    International Nuclear Information System (INIS)

    Oztunali, O.I.; Roles, G.W.

    1986-01-01

    Under contract to the US Nuclear Regulatory Commission, the Envirosphere Company has expanded and updated the impacts analysis methodology used during the development of the 10 CFR Part 61 rule to allow improved consideration of the costs and impacts of treatment and disposal of low-level waste that is close to or exceeds Class C concentrations. The modifications described in this report principally include: (1) an update of the low-level radioactive waste source term, (2) consideration of additional alternative disposal technologies, (3) expansion of the methodology used to calculate disposal costs, (4) consideration of an additional exposure pathway involving direct human contact with disposed waste due to a hypothetical drilling scenario, and (5) use of updated health physics analysis procedures (ICRP-30). Volume 1 of this report describes the calculational algorithms of the updated analysis methodology

  19. Methodological Problems of Nanotechnoscience

    Science.gov (United States)

    Gorokhov, V. G.

    Recently, we have reported on the definitions of nanotechnology as a new type of NanoTechnoScience and on nanotheory as a cluster of different natural and engineering theories. Nanotechnology is not only a new type of scientific-engineering discipline; it also evolves in a "nonclassical" way. Nanoontology, or the nano scientific world view, serves as a methodological orientation for choosing the theoretical means and methods for solving scientific and engineering problems. This makes it possible to move from one explanation and scientific world view to another without difficulty. Thus, nanotechnology is both a field of scientific knowledge and a sphere of engineering activity; in other words, NanoTechnoScience is similar to Systems Engineering, understood as the analysis and design of large-scale, complex man/machine systems, but applied to micro- and nanosystems. Nanosystems engineering, like macrosystems engineering, includes not only systems design but also complex research. The design orientation influences the priorities of that complex research and the relation to knowledge: not only "knowledge about something", but also knowledge as a means of activity, since the control and restructuring of matter at the nanoscale is from the beginning a necessary element of nanoscience.

  20. Methodological themes and variations

    International Nuclear Information System (INIS)

    Tetlock, P.E.

    1989-01-01

    This paper reports on the tangible progress that has been made in clarifying the underlying processes that affect both the likelihood of war in general and of nuclear war in particular. It also illustrates how difficult it is to make progress in this area. Nonetheless, what has been achieved should not be minimized. We have learned a good deal on both the theoretical and the methodological fronts and, perhaps, most important, we have learned a good deal about the limits of our knowledge. Knowledge of our ignorance---especially in a policy domain where confident, even glib, causal assertions are so common---can be a major contribution in itself. The most important service the behavioral and social sciences can currently provide to the policy making community may well be to make thoughtful skepticism respectable: to sensitize those who make key decisions to the uncertainty surrounding our understanding of international conflict and to the numerous qualifications that now need to be attached to simple causal theories concerning the origins of war

  1. Engineering radioecology: Methodological considerations

    International Nuclear Information System (INIS)

    Nechaev, A.F.; Projaev, V.V.; Sobolev, I.A.; Dmitriev, S.A.

    1995-01-01

    The term ''radioecology'' has been widely recognized in scientific and technical societies. At the same time, this scientific school (radioecology) does not have a precise, generally acknowledged structure, a unified methodological basis, fixed subjects of investigation, etc. In other words, radioecology is a vast and important but rather amorphous conglomerate of various ideas, amalgamated mostly by their involvement in biospheric effects of ionizing radiation and by some conceptual stereotypes. This paradox was acceptable up to a certain time. However, with the termination of the Cold War and because of remarkable political changes in the world, it has become possible to convert the problem of environmental restoration from the scientific sphere into practical terms. Already the first steps clearly showed the imperfection of existing technologies and of managerial and regulatory schemes; a lack of qualified specialists and of relevant methods and techniques; uncertainties in the methodology of decision-making, etc. Thus, building up (or perhaps structuring) a special scientific and technological basis, which the authors call ''engineering radioecology'', seems to be an important task. In this paper the authors endeavor to substantiate this thesis and to suggest some preliminary ideas concerning the subject matter of engineering radioecology.

  2. Receptor-receptor interactions within receptor mosaics. Impact on neuropsychopharmacology.

    Science.gov (United States)

    Fuxe, K; Marcellino, D; Rivera, A; Diaz-Cabiale, Z; Filip, M; Gago, B; Roberts, D C S; Langel, U; Genedani, S; Ferraro, L; de la Calle, A; Narvaez, J; Tanganelli, S; Woods, A; Agnati, L F

    2008-08-01

    Future therapies for diseases associated with altered dopaminergic signaling, including Parkinson's disease, schizophrenia and drug addiction or drug dependence may substantially build on the existence of intramembrane receptor-receptor interactions within dopamine receptor containing receptor mosaics (RM; dimeric or high-order receptor oligomers) where it is believed that the dopamine D(2) receptor may operate as the 'hub receptor' within these complexes. The constitutive adenosine A(2A)/dopamine D(2) RM, located in the dorsal striato-pallidal GABA neurons, are of particular interest in view of the demonstrated antagonistic A(2A)/D(2) interaction within these heteromers; an interaction that led to the suggestion and later demonstration that A(2A) antagonists could be used as novel anti-Parkinsonian drugs. Based on the likely existence of A(2A)/D(2)/mGluR5 RM located both extrasynaptically on striato-pallidal GABA neurons and on cortico-striatal glutamate terminals, multiple receptor-receptor interactions within this RM involving synergism between A(2A)/mGluR5 to counteract D(2) signaling, has led to the proposal of using combined mGluR5 and A(2A) antagonists as a future anti-Parkinsonian treatment. Based on the same RM in the ventral striato-pallidal GABA pathways, novel strategies for the treatment of schizophrenia, building on the idea that A(2A) agonists and/or mGluR5 agonists will help reduce the increased dopaminergic signaling associated with this disease, have been suggested. Such treatment may ensure the proper glutamatergic drive from the mediodorsal thalamic nucleus to the prefrontal cortex, one which is believed to be reduced in schizophrenia due to a dominance of D(2)-like signaling in the ventral striatum. Recently, A(2A) receptors also have been shown to counteract the locomotor and sensitizing actions of cocaine and increases in A(2A) receptors have also been observed in the nucleus accumbens after extended cocaine self-administration, probably

  3. Source-Term and Building-Wake Consequence Modeling for the Godiva IV Reactor at Los Alamos National Laboratory

    International Nuclear Information System (INIS)

    Letellier, B.C.; McClure, P.; Restrepo, L.

    1999-01-01

    The objectives of this work were to evaluate the consequences of a postulated accident to onsite security personnel stationed near the facility during operations of the Godiva IV critical assembly and to identify controls needed to protect these personnel in case of an extreme criticality excursion equivalent to the design-basis accident (DBA). This paper presents the methodology and results of the source-term calculations, building ventilation rates, air concentrations, and consequence calculations that were performed using a multidisciplinary approach with several phenomenology models. Identification of controls needed to mitigate the consequences to near-field receptors is discussed

  4. NESHAP Dose-Release Factor Isopleths for Five Source-to-Receptor Distances from the Center of Site and H-Area for all Compass Sectors at SRS using CAP88-PC Version 4.0

    Energy Technology Data Exchange (ETDEWEB)

    Trimor, P. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2017-08-09

    The Environmental Protection Agency (EPA) requires the use of the computer model CAP88-PC to estimate the total effective doses (TED) for demonstrating compliance with 40 CFR 61, Subpart H (EPA 2006), the National Emission Standards for Hazardous Air Pollutants (NESHAP) regulations. As such, CAP88-PC Version 4.0 was used to calculate the receptor dose due to routine atmospheric releases at the Savannah River Site (SRS). For estimation, NESHAP dose-release factors (DRFs) have been supplied to Environmental Compliance and Area Closure Projects (EC&ACP) for many years. DRFs represent the dose to a maximally exposed receptor from 1 Ci of a specified radionuclide released into the atmosphere. They are periodically updated to include changes in the CAP88 version, input parameter values, site meteorology, and location of the maximally exposed individual (MEI). This report presents the DRFs for tritium oxide released at two onsite locations, center-of-site (COS) and H-Area, at 0 ft elevation, to MEIs located 1000, 3000, 6000, 9000, and 12000 meters from the release areas for 16 compass sectors. The analysis makes use of area-specific meteorological data (Viner 2014).
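
    The dose estimate itself is a simple product of release quantity and DRF. A minimal sketch of that arithmetic, using hypothetical DRF values; the real factors come from CAP88-PC runs with site meteorology, and the function name and numbers here are placeholders.

```python
def receptor_dose_mrem(release_ci, drf_mrem_per_ci):
    """Dose to the maximally exposed individual from a routine release:
    dose (mrem) = release (Ci) x dose-release factor (mrem/Ci)."""
    return release_ci * drf_mrem_per_ci

# Hypothetical DRFs for tritium oxide at several source-to-receptor
# distances -- placeholder values, NOT actual CAP88-PC output for SRS.
drf_by_distance_m = {1000: 1.2e-3, 3000: 3.5e-4, 6000: 1.4e-4}

# Dose to each MEI from a hypothetical 5000 Ci annual release
doses = {d: receptor_dose_mrem(5000.0, f) for d, f in drf_by_distance_m.items()}
```

    Because the DRF already folds in dispersion and pathway modeling, compliance screening reduces to this multiplication for each radionuclide and receptor location.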

  5. The fairness of the PPS reimbursement methodology.

    Science.gov (United States)

    Gianfrancesco, F D

    1990-01-01

    In FY 1984 the Medicare program implemented a new method of reimbursing hospitals for inpatient services, the Prospective Payment System (PPS). Under this system, hospitals are paid a predetermined amount per Medicare discharge, which varies according to certain patient and hospital characteristics. This article investigates the presence of systematic biases and other potential imperfections in the PPS reimbursement methodology as revealed by its effects on Medicare operating ratios. The study covers the first three years of the PPS (approximately 1984-1986) and is based on hospital data from the Medicare cost reports and other related sources. Regression techniques were applied to these data to determine how Medicare operating ratios were affected by specific aspects of the reimbursement methodology. Several possible imbalances were detected. The potential undercompensation relating to these can be harmful to certain classes of hospitals and to the Medicare populations that they serve. PMID:2109738

  6. Gamma ray auto absorption correction evaluation methodology

    International Nuclear Information System (INIS)

    Gugiu, Daniela; Roth, Csaba; Ghinescu, Alecse

    2010-01-01

    Neutron activation analysis (NAA) is a well-established nuclear technique, suited to investigating microstructural or elemental composition, and can be applied to studies of a large variety of samples. Work with large samples involves, besides the development of large irradiation devices with well-known neutron field characteristics, knowledge of perturbing phenomena and adequate evaluation of correction factors such as neutron self-shielding, extended-source correction, and gamma ray auto-absorption. The objective of the work presented in this paper is to validate an appropriate methodology for evaluating the gamma ray auto-absorption correction for large inhomogeneous samples. For this purpose a benchmark experiment has been defined: a simple gamma ray transmission experiment, easy to reproduce. The gamma ray attenuation in pottery samples has been measured and computed using the MCNP5 code. The results show a good agreement between the computed and measured values, proving that the proposed methodology is able to evaluate the correction factors. (authors)
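
    For the idealized case of a uniform slab sample, the auto-absorption correction follows from the Beer-Lambert law: the mean transmission for gammas emitted uniformly through thickness t is (1 - exp(-mu*t))/(mu*t). A sketch under that slab-geometry assumption (the paper's samples are inhomogeneous, which is why MCNP5 is used instead):

```python
import math

def attenuation_correction(mu_per_cm, thickness_cm):
    """Gamma-ray auto-absorption correction for a uniform slab sample.

    The mean transmission for gammas emitted uniformly along the path
    through a slab of linear attenuation coefficient mu and thickness t is
    (1 - exp(-mu*t)) / (mu*t); the factor applied to the measured count
    rate to recover the unattenuated rate is its inverse.
    """
    x = mu_per_cm * thickness_cm
    if x == 0.0:
        return 1.0  # no absorber, no correction
    mean_transmission = (1.0 - math.exp(-x)) / x
    return 1.0 / mean_transmission
```

    The correction grows with mu*t, so thick or dense samples need substantially larger factors than thin ones.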

  7. Dosimetric methodology of the ICRP

    International Nuclear Information System (INIS)

    Eckerman, K.F.

    1994-01-01

    Establishment of guidance for the protection of workers and members of the public from radiation exposures necessitates estimation of the radiation dose to tissues of the body at risk. The dosimetric methodology formulated by the International Commission on Radiological Protection (ICRP) is intended to be responsive to this need. While developed for radiation protection, elements of the methodology are often applied in addressing other radiation issues; e.g., risk assessment. This chapter provides an overview of the methodology, discusses its recent extension to age-dependent considerations, and illustrates specific aspects of the methodology through a number of numerical examples

  8. Transmission pricing: paradigms and methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Shirmohammadi, Dariush [Pacific Gas and Electric Co., San Francisco, CA (United States); Vieira Filho, Xisto; Gorenstin, Boris [Centro de Pesquisas de Energia Eletrica (CEPEL), Rio de Janeiro, RJ (Brazil); Pereira, Mario V.P. [Power System Research, Rio de Janeiro, RJ (Brazil)

    1994-12-31

    In this paper we describe the principles of several paradigms and methodologies for pricing transmission services. The paper outlines the main characteristics of these paradigms and methodologies, including the conditions under which each gives the best results. Due to their popularity, power-flow-based MW-mile and short-run marginal cost pricing methodologies are covered in some detail. We conclude the paper with examples of the application of these two pricing methodologies to transmission services in Brazil. (author) 25 refs., 2 tabs.
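The MW-mile idea mentioned above can be sketched numerically: each line's embedded cost (its $/MW-mile rate times its length) is allocated among transactions in proportion to the absolute MW flow each transaction induces on that line. This is a generic textbook formulation, not the specific Brazilian implementation from the paper, and all figures are invented.

```python
def mw_mile_charges(line_costs, usage_by_txn):
    """Allocate transmission line costs by the MW-mile method.

    line_costs[l]      -- annual cost of line l (rate * length folded in)
    usage_by_txn[t][l] -- MW flow on line l attributed to transaction t
                          (e.g. from a power-flow sensitivity study)

    Each line's cost is split among transactions in proportion to the
    absolute flow each one places on that line.
    """
    txns = list(usage_by_txn)
    n_lines = len(line_costs)
    totals = [sum(abs(usage_by_txn[t][l]) for t in txns) for l in range(n_lines)]
    charges = {t: 0.0 for t in txns}
    for t in txns:
        for l in range(n_lines):
            if totals[l] > 0:  # unused lines allocate nothing
                charges[t] += line_costs[l] * abs(usage_by_txn[t][l]) / totals[l]
    return charges
```

By construction the charges for lines that carry any flow sum exactly to those lines' costs, which is the full-cost-recovery property that makes MW-mile pricing popular.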

  9. Analytical methodology for nuclear safeguards

    International Nuclear Information System (INIS)

    Ramakumar, K.L.

    2011-01-01

    This paper briefly describes the available analytical methodologies and highlights some of the challenges and expectations from a nuclear material accounting and control (NUMAC) point of view.

  10. Microbiological Methodology in Astrobiology

    Science.gov (United States)

    Abyzov, S. S.; Gerasimenko, L. M.; Hoover, R. B.; Mitskevich, I. N.; Mulyukin, A. L.; Poglazova, M. N.; Rozanov, A. Y.

    2005-01-01

    Searching for life in astromaterials to be delivered from future missions to extraterrestrial bodies is undoubtedly related to studies of the properties and signatures of living microbial cells and microfossils on Earth. Antarctic glacier and terrestrial permafrost habitats, where microbial cells have preserved viability for millennia by entering an anabiotic state, are often regarded as model terrestrial analogs of Martian polar subsurface layers. For future findings of viable microorganisms in samples from extraterrestrial objects, it is important to use a combined methodology that includes classical microbiological methods (plating onto nutrient media), direct epifluorescence and electron microscopy examinations, determination of the elemental composition of cells, radiolabeling techniques, and PCR and FISH methods. It is of great importance to ensure the authenticity of any microorganisms found in the studied samples and to standardize the protocols used, in order to minimize the risk of external contamination. Although convincing evidence of extraterrestrial microbial life may well come from the discovery of living cells in astromaterials, biomorphs and microfossils must also be regarded as targets in the search for evidence of life, bearing in mind the scenario that living microorganisms were not preserved and underwent mineralization. Under laboratory conditions, the processes that accompany the fossilization of cyanobacteria have been reconstructed, and artificially produced cyanobacterial stromatolites resemble, in their morphological properties, those found in natural Earth habitats. Given the vital importance of distinguishing between biogenic and abiogenic signatures and between living and fossil microorganisms in analyzed samples, it is worthwhile to use some previously developed approaches based on electron microscopy examinations and analysis of the elemental composition of biomorphs in situ and comparison with the analogous data obtained for laboratory microbial cultures and

  11. Kaupapa Maori Methodology: Trusting the Methodology through Thick and Thin

    Science.gov (United States)

    Hiha, Anne Aroha

    2016-01-01

    Kaupapa Maori is thoroughly theorised in academia in Aotearoa, and those wishing to use it as their research methodology can find support in the writing of a number of Maori academics. What is not so well articulated is the experiential voice of those who have used Kaupapa Maori as a research methodology. My identity as a Maori woman…

  12. 2011 Radioactive Materials Usage Survey for Unmonitored Point Sources

    Energy Technology Data Exchange (ETDEWEB)

    Sturgeon, Richard W. [Los Alamos National Laboratory

    2012-06-27

    organized. The RMUS Interview Form with the attached RMUS Process Form(s) provides the radioactive materials survey data by technical area (TA) and building number. The survey data for each release point includes information such as: exhaust stack identification number, room number, radioactive material source type (i.e., potential source or future potential source of air emissions), radionuclide, usage (in curies) and usage basis, physical state (gas, liquid, particulate, solid, or custom), release fraction (from Appendix D to 40 CFR 61, Subpart H), and process descriptions. In addition, the interview form also calculates emissions (in curies), lists mrem/Ci factors, calculates PEDEs, and states the location of the critical receptor for that release point. [The critical receptor is the maximum exposed off-site member of the public, specific to each individual facility.] Each of these data fields is described in this section. The Tier classification of release points, which was first introduced with the 1999 usage survey, is also described in detail in this section. Section 4 includes a brief discussion of the dose estimate methodology, and includes a discussion of several release points of particular interest in the CY 2011 usage survey report. It also includes a table of the calculated PEDEs for each release point at its critical receptor. Section 5 describes ES's approach to Quality Assurance (QA) for the usage survey. Satisfactory completion of the survey requires that team members responsible for Rad-NESHAP (National Emissions Standard for Hazardous Air Pollutants) compliance accurately collect and process several types of information, including radioactive materials usage data, process information, and supporting information. They must also perform and document the QA reviews outlined in Section 5.2.6 (Process Verification and Peer Review) of ES-RN, 'Quality Assurance Project Plan for the Rad-NESHAP Compliance Project' to verify that all information is
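The dose estimate chain described above (usage × release fraction, then a site-specific mrem/Ci factor at the critical receptor) can be sketched as follows. The helper name and every numeric value are illustrative placeholders, not figures from the survey.

```python
def point_source_pede(usage_ci, release_fraction, mrem_per_ci):
    """Potential effective dose equivalent (PEDE, mrem) at the critical
    receptor for one unmonitored release point.

    emissions (Ci) = usage (Ci) * release fraction
                     (release fractions come from 40 CFR 61, Subpart H,
                     Appendix D, by physical state of the material)
    PEDE (mrem)    = emissions * pre-computed mrem/Ci factor for this
                     release point's critical receptor
    """
    emissions_ci = usage_ci * release_fraction
    return emissions_ci * mrem_per_ci

# Illustrative only: 10 Ci handled as a liquid (release fraction 1e-3)
# at a stack with a hypothetical 0.5 mrem/Ci factor.
dose_mrem = point_source_pede(10.0, 1e-3, 0.5)  # 0.005 mrem
```

In the survey workflow this arithmetic is what the interview form automates for each release point before the Tier classification is applied.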

  13. Acetylcholine receptor antibody

    Science.gov (United States)

    Acetylcholine receptor antibody is a protein found in the blood of ...

  14. Cooperative ethylene receptor signaling

    OpenAIRE

    Liu, Qian; Wen, Chi-Kuang

    2012-01-01

    The gaseous plant hormone ethylene is perceived by a family of five ethylene receptor members in the dicotyledonous model plant Arabidopsis. Genetic and biochemical studies suggest that the ethylene response is suppressed by ethylene receptor complexes, but the biochemical nature of the receptor signal is unknown. Without appropriate biochemical measures to trace the ethylene receptor signal and quantify the signal strength, the biological significance of the modulation of ethylene responses ...

  15. A methodology for social experimentation

    DEFF Research Database (Denmark)

    Ravn, Ib

    A methodology is outlined whereby one may improve the performance of a social system to the satisfaction of its stakeholders, that is, facilitate desirable social and organizational transformations.

  16. Workshops as a Research Methodology

    Science.gov (United States)

    Ørngreen, Rikke; Levinsen, Karin

    2017-01-01

    This paper contributes to knowledge on workshops as a research methodology, and specifically on how such workshops pertain to e-learning. A literature review illustrated that workshops are discussed according to three different perspectives: workshops as a means, workshops as practice, and workshops as a research methodology. Focusing primarily on…

  17. Methodological Pluralism and Narrative Inquiry

    Science.gov (United States)

    Michie, Michael

    2013-01-01

    This paper considers how the integral theory model of Nancy Davis and Laurie Callihan might be enacted using a different qualitative methodology, in this case the narrative methodology. The focus of narrative research is shown to be on "what meaning is being made" rather than "what is happening here" (quadrant 2 rather than…

  18. Building ASIPS the Mescal methodology

    CERN Document Server

    Gries, Matthias

    2006-01-01

    A number of system designers use ASIPs rather than ASICs to implement their system solutions. This book gives a comprehensive methodology for the design of these application-specific instruction processors (ASIPs). It includes demonstrations of applications of the methodology using the Tipi research framework.

  19. A methodology for software documentation

    OpenAIRE

    Torres Júnior, Roberto Dias; Ahlert, Hubert

    2000-01-01

    With the growing complexity of window-based software and the use of object-oriented techniques, software development is getting more complex than ever. On that basis, this article presents a methodology for software documentation and analyzes our experience of how this methodology can aid software maintenance.

  20. A multimedia exposure assessment methodology for evaluating the performance of the design of structures containing chemical and radioactive wastes

    International Nuclear Information System (INIS)

    Stephanatos, B.N.; Molholt, B.; Walter, K.P.; MacGregor, A.

    1991-01-01

    The objectives of this work are to develop a multimedia exposure assessment methodology for the evaluation of existing and future designs of structures containing chemical and radioactive wastes and to identify critical parameters for design optimization. The designs are evaluated in terms of their compliance with various federal and state regulatory requirements. Evaluation of the performance of a particular design is presented within the scope of a given exposure pathway. An exposure pathway has four key components: (1) a source and mechanism of chemical release; (2) a transport medium; (3) a point of exposure; and (4) a route of exposure. The first step in the analysis is the characterization of the waste source behavior. The rate and concentration of releases from the source are evaluated using appropriate mathematical models. The migration of radionuclides and chemicals is simulated through each environmental medium to the exposure point. The total exposure of the potential receptor is calculated, and an estimate of the health effects of the exposure is made. Simulation of the movement of radionuclides and chemical wastes from the source to the receptor point includes several processes. If the predicted human exposure to contaminants meets the performance criteria, the design has been validated; otherwise the structural design is improved to meet the performance criteria. A phased modeling approach is recommended at a particular mixed-waste site. A relatively simple model is initially used to pinpoint critical fate-and-transport processes and design parameters. The second phase of the modeling effort involves the use of more complex and resource-intensive fate-and-transport models. This final step in the modeling process provides more accurate estimates of contaminant concentrations at the point of exposure. Thus the human dose is more accurately predicted, providing better design validation.
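The four pathway components above chain multiplicatively in a screening-level estimate: a release term, a transport/dilution term, an intake term, and a dose coefficient. The sketch below is a generic illustration of that chain, not the authors' multimedia models, and every parameter value is a placeholder.

```python
def receptor_dose(release_rate, transport_factor, intake_rate, dose_coeff):
    """Screening-level dose estimate along one exposure pathway.

    concentration at exposure point = release_rate * transport_factor
    dose = concentration * intake_rate * dose_coefficient

    In a real assessment the transport factor comes from fate-and-transport
    models of each environmental medium; here it is a single lumped number.
    """
    concentration = release_rate * transport_factor
    return concentration * intake_rate * dose_coeff

# Hypothetical example: 1000 Bq/s released, 1e-6 (Bq/L)/(Bq/s) dilution
# to a drinking-water well, 730 L/yr intake, 2.8e-8 Sv/Bq ingestion
# dose coefficient.
annual_dose_sv = receptor_dose(1000.0, 1e-6, 730.0, 2.8e-8)
```

A phased approach, as the abstract recommends, would first run this kind of lumped screen and only then replace the transport factor with medium-by-medium model output.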

  1. Sociocultural Meanings of Nanotechnology: Research Methodologies

    Science.gov (United States)

    Bainbridge, William Sims

    2004-06-01

    This article identifies six social-science research methodologies that will be useful for charting the sociocultural meaning of nanotechnology: web-based questionnaires, vignette experiments, analysis of web linkages, recommender systems, quantitative content analysis, and qualitative textual analysis. Data from a range of sources are used to illustrate how the methods can delineate the intellectual content and institutional structure of the emerging nanotechnology culture. Such methods will make it possible in the future to test hypotheses such as the hypothesis that there are two competing definitions of nanotechnology (the technical-scientific and the science-fiction) that are influencing public perceptions by different routes and in different directions.

  2. An Application Of Receptor Modeling To Identify Airborne Particulate ...

    African Journals Online (AJOL)

    An Application Of Receptor Modeling To Identify Airborne Particulate Sources In Lagos, Nigeria. FS Olise, OK Owoade, HB Olaniyi. Abstract. There have been no clear demarcations between the industrial and residential areas of Lagos, with a focus on industry as the major source. There is a need to identify potential source types in ...

  3. AEGIS methodology and a perspective from AEGIS methodology demonstrations

    International Nuclear Information System (INIS)

    Dove, F.H.

    1981-03-01

    Objectives of AEGIS (Assessment of Effectiveness of Geologic Isolation Systems) are to develop the capabilities needed to assess the post-closure safety of waste isolation in geologic formations; demonstrate these capabilities on reference sites; apply the assessment methodology to assist the NWTS program in site selection and waste package and repository design; and perform repository site analyses for the licensing needs of NWTS. This paper summarizes the AEGIS methodology and the experience gained from the methodology demonstrations, and provides an overview of the following areas: estimation of the response of a repository to perturbing geologic and hydrologic events; estimation of the transport of radionuclides from a repository to man; and assessment of uncertainties.

  4. Source Water Protection Contaminant Sources

    Data.gov (United States)

    Iowa State University GIS Support and Research Facility — Simplified aggregation of potential contaminant sources used for Source Water Assessment and Protection. The data is derived from IDNR, IDALS, and US EPA program...

  5. Tritium-labelled leukotriene B4 binding to the guinea-pig spleen membrane preparation: a rich tissue source for a high-affinity leukotriene B4 receptor site

    International Nuclear Information System (INIS)

    Cheng, J.B.; Cheng, E.I.; Kohi, F.; Townley, R.G.

    1986-01-01

    Intact human granulocytes contain a leukotriene (LT) B4 receptor binding site, but the limited supply of these cells could adversely affect further progress in the study of this receptor. To select a tissue homogenate rich in this site, we characterized the binding of highly specific [³H]LTB4 to guinea-pig spleen membranes and determined the ability of LTB4 to compete with [³H]LTB4 for binding sites in the membranes of 10 non-spleen tissues. In the spleen membrane, MgCl2 and CaCl2 enhanced [³H]LTB4 binding, but NaCl and KCl decreased it. Spleen [³H]LTB4 binding was a function of protein concentration and was rapid, reversible, stereoselective and saturable. Kinetic analyses showed that the rate constants for association and dissociation at 25 °C were 0.47 nM⁻¹ min⁻¹ and 0.099 min⁻¹, respectively. A Scatchard plot of the equilibrium data yielded a straight line with a dissociation constant of 1.8 nM and a site density of 274 fmol/mg of protein. Moreover, the LTB4/[³H]LTB4 competition study performed at 4 or 25 °C revealed the inhibitory constant (Ki) of LTB4 to be in the nanomolar range. The rank order of agents competing for spleen [³H]LTB4 binding was: LTB4 (Ki = 2.8 nM) > 20-hydroxy-LTB4 (23 nM) > LTA4 (48 nM) > LTA4 methyl ester (0.13 μM) > 20-carboxy-LTB4 (>6.6 μM) ≥ arachidonic acid (0.15 mM) = FPL-55,712 and FPL-57,231 (0.1-0.2 mM). Competition studies further indicated that felodipine, a 1,4-dihydropyridine Ca²⁺ channel blocker, exhibited micromolar inhibition of spleen [³H]LTB4 binding.
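The Scatchard analysis reported above (Kd = 1.8 nM, Bmax = 274 fmol/mg) rests on the linearization bound/free = (Bmax − bound)/Kd, so a plot of bound/free against bound has slope −1/Kd and x-intercept Bmax. The sketch below reproduces the idea on synthetic one-site data with plain least squares; it is an illustration of the method, not the authors' fitting procedure.

```python
def scatchard(bound, free):
    """Estimate Kd and Bmax from equilibrium binding data.

    Scatchard transform: bound/free = (Bmax - bound)/Kd, i.e. a line in
    bound with slope -1/Kd and intercept Bmax/Kd. Ordinary least squares
    on (bound, bound/free) recovers both parameters.
    """
    y = [b / f for b, f in zip(bound, free)]
    n = len(bound)
    mx = sum(bound) / n
    my = sum(y) / n
    sxx = sum((b - mx) ** 2 for b in bound)
    sxy = sum((b - mx) * (yi - my) for b, yi in zip(bound, y))
    slope = sxy / sxx              # = -1/Kd
    intercept = my - slope * mx    # = Bmax/Kd
    kd = -1.0 / slope
    bmax = -intercept / slope
    return kd, bmax
```

On noise-free data generated from the one-site model bound = Bmax·F/(Kd + F), the transform is exactly linear and the regression returns the generating parameters; real data require weighting or direct nonlinear fitting.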

  6. Evolution of source term definition and analysis

    International Nuclear Information System (INIS)

    Lutz, R.J. Jr.

    2004-01-01

    The objective of this presentation was to provide an overview of the evolution of the accident fission product release analysis methodology and the results obtained, and an overview of the implementation of source term analysis in regulatory decisions.

  7. Deterministic extraction from weak random sources

    CERN Document Server

    Gabizon, Ariel

    2011-01-01

    In this research monograph, the author constructs deterministic extractors for several types of sources, using a methodology of recycling randomness which enables increasing the output length of deterministic extractors to near optimal length.

  8. Calibration of sources for alpha spectroscopy systems

    International Nuclear Information System (INIS)

    Freitas, I.S.M.; Goncalez, O.L.

    1992-01-01

    This paper describes the calibration methodology for measuring the total alpha activity of plane, thin sources with the silicon-detector alpha spectrometer in the Nuclear Measurements and Dosimetry Laboratory at IEAv/CTA. (author)

  9. Does Metformin Reduce Cancer Risks? Methodologic Considerations.

    Science.gov (United States)

    Golozar, Asieh; Liu, Shuiqing; Lin, Joeseph A; Peairs, Kimberly; Yeh, Hsin-Chieh

    2016-01-01

    The substantial burden of cancer and diabetes and the association between the two conditions has been a motivation for researchers to look for targeted strategies that can simultaneously affect both diseases and reduce their overlapping burden. In the absence of randomized clinical trials, researchers have taken advantage of the availability and richness of administrative databases and electronic medical records to investigate the effects of drugs on cancer risk among diabetic individuals. The majority of these studies suggest that metformin could potentially reduce cancer risk. However, the validity of this purported reduction in cancer risk is limited by several methodological flaws either in the study design or in the analysis. Whether metformin use decreases cancer risk relies heavily on the availability of valid data sources with complete information on confounders, accurate assessment of drug use, appropriate study design, and robust analytical techniques. The majority of the observational studies assessing the association between metformin and cancer risk suffer from methodological shortcomings and efforts to address these issues have been incomplete. Future investigations on the association between metformin and cancer risk should clearly address the methodological issues due to confounding by indication, prevalent user bias, and time-related biases. Although the proposed strategies do not guarantee a bias-free estimate for the association between metformin and cancer, they will reduce synthesis of and reporting of erroneous results.

  10. In situ aromatase expression in primary tumor is associated with estrogen receptor expression but is not predictive of response to endocrine therapy in advanced breast cancer

    DEFF Research Database (Denmark)

    Lykkesfeldt, Anne E; Henriksen, Katrine L; Rasmussen, Birgitte B

    2009-01-01

    BACKGROUND: New, third-generation aromatase inhibitors (AIs) have proven comparable or superior to the anti-estrogen tamoxifen for treatment of estrogen receptor (ER) and/or progesterone receptor (PR) positive breast cancer. AIs suppress total body and intratumoral estrogen levels. It is unclear whether in situ carcinoma cell aromatization is the primary source of estrogen production for tumor growth and whether aromatase expression is predictive of response to endocrine therapy. Due to methodological difficulties in the determination of the aromatase protein, COX-2, an enzyme involved… of advanced breast cancer. Semi-quantitative immunohistochemical (IHC) analysis was performed for ER, PR, COX-2 and aromatase using tissue microarrays (TMAs). Aromatase was also analyzed using whole sections (WS). Kappa analysis was applied to compare the association of protein expression levels. Univariate…

  11. Selection of skin dose calculation methodologies

    International Nuclear Information System (INIS)

    Farrell, W.E.

    1987-01-01

    This paper reports that good health physics practice dictates that a dose assessment be performed for any significant skin contamination incident. There are, however, several methodologies that could be used, and while there is probably no single methodology that is proper for all cases of skin contamination, some are clearly more appropriate than others. This can be demonstrated by examining two of the more distinctly different options available for estimating skin dose: the calculational methods. The methods compiled by Healy require separate beta and gamma calculations. The beta calculational method is that derived by Loevinger, while the gamma dose is calculated from the equation for the dose rate from an infinite plane source with an absorber between the source and the detector. Healy has provided these formulas in graphical form to facilitate rapid dose rate determinations at density thicknesses of 7 and 20 mg/cm². These density thicknesses equate to the regulatory definition of the sensitive layer of the skin and a more arbitrary value to account for beta absorption in contaminated clothing.
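The role the 7 and 20 mg/cm² density thicknesses play in such calculations can be illustrated with a deliberately simplified exponential-attenuation sketch. This is neither Loevinger's beta point-kernel nor Healy's plane-source integral; it only shows how a density thickness enters a screening estimate, and the coefficients are invented.

```python
import math

def attenuated_dose_rate(d0, mass_atten_cm2_per_mg, density_thickness_mg_cm2):
    """Crude screening estimate of dose rate behind an absorbing layer.

    d0                       -- unattenuated surface dose rate (e.g. mrad/h)
    mass_atten_cm2_per_mg    -- effective mass attenuation coefficient
    density_thickness_mg_cm2 -- absorber thickness, 7 mg/cm2 for the
                                sensitive skin layer, 20 mg/cm2 when
                                clothing absorption is included

    A simple exp(-mu_m * x) attenuation, standing in for the full beta
    and gamma formulas compiled by Healy.
    """
    return d0 * math.exp(-mass_atten_cm2_per_mg * density_thickness_mg_cm2)

# Hypothetical 100 mrad/h surface field evaluated at both regulatory depths
at_skin = attenuated_dose_rate(100.0, 0.01, 7.0)
through_clothing = attenuated_dose_rate(100.0, 0.01, 20.0)
```

The point of the comparison is qualitative: the deeper density thickness always yields the lower estimate, which is why the choice of methodology matters for a given contamination scenario.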

  12. Application of the Spanish methodological approach for biosphere assessment to a generic high-level waste disposal site

    International Nuclear Information System (INIS)

    Agueero, A.; Pinedo, P.; Simon, I.; Cancio, D.; Moraleda, M.; Trueba, C.; Perez-Sanchez, D.

    2008-01-01

    A methodological approach that includes conceptual developments, methodological aspects and software tools has been developed in the Spanish context, based on the BIOMASS 'Reference Biospheres Methodology'. Biosphere assessments have to be undertaken with the aim of demonstrating compliance with the principles and regulations established to limit the possible radiological impact of radioactive waste disposals on human health and on the environment, and to ensure that future generations will not be exposed to higher radiation levels than would be acceptable today. The biosphere in the context of high-level waste disposal is defined as the collection of radionuclide transfer pathways that may result in releases into the surface environment, transport within and between the biosphere receptors, exposure of humans and biota, and the doses/risks associated with such exposures. The assessments need to take into account the complexity of the biosphere, the nature of the radionuclides released and the long timescales considered. It is also necessary to make assumptions about the habits and lifestyle of the exposed population, human activities in the long term and possible modifications of the biosphere. A summary of the Spanish methodological approach for biosphere assessment is presented here, together with its application to a Spanish generic case study. A reference scenario has been developed, based on current conditions at a site located in central-west Spain, to indicate the potential impact on the actual population. In addition, environmental change has been considered qualitatively through the use of interaction matrices and transition diagrams. Unit source terms of ³⁶Cl, ⁷⁹Se, ⁹⁹Tc, ¹²⁹I, ¹³⁵Cs, ²²⁶Ra, ²³¹Pa, ²³⁸U, ²³⁷Np and ²³⁹Pu have been taken. Two exposure groups, infants and adults, have been chosen for dose calculations. Results are presented and their robustness is evaluated through the use of uncertainty and sensitivity analyses.

  13. Application of the Spanish methodological approach for biosphere assessment to a generic high-level waste disposal site.

    Science.gov (United States)

    Agüero, A; Pinedo, P; Simón, I; Cancio, D; Moraleda, M; Trueba, C; Pérez-Sánchez, D

    2008-09-15

    A methodological approach that includes conceptual developments, methodological aspects and software tools has been developed in the Spanish context, based on the BIOMASS "Reference Biospheres Methodology". Biosphere assessments have to be undertaken with the aim of demonstrating compliance with the principles and regulations established to limit the possible radiological impact of radioactive waste disposals on human health and on the environment, and to ensure that future generations will not be exposed to higher radiation levels than would be acceptable today. The biosphere in the context of high-level waste disposal is defined as the collection of radionuclide transfer pathways that may result in releases into the surface environment, transport within and between the biosphere receptors, exposure of humans and biota, and the doses/risks associated with such exposures. The assessments need to take into account the complexity of the biosphere, the nature of the radionuclides released and the long timescales considered. It is also necessary to make assumptions about the habits and lifestyle of the exposed population, human activities in the long term and possible modifications of the biosphere. A summary of the Spanish methodological approach for biosphere assessment is presented here, together with its application to a Spanish generic case study. A reference scenario has been developed, based on current conditions at a site located in central-west Spain, to indicate the potential impact on the actual population. In addition, environmental change has been considered qualitatively through the use of interaction matrices and transition diagrams. Unit source terms of (36)Cl, (79)Se, (99)Tc, (129)I, (135)Cs, (226)Ra, (231)Pa, (238)U, (237)Np and (239)Pu have been taken. Two exposure groups, infants and adults, have been chosen for dose calculations. Results are presented and their robustness is evaluated through the use of uncertainty and

  14. GABA receptor imaging

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jong Doo [Yonsei University College of Medicine, Seoul (Korea, Republic of)

    2007-04-15

    GABA is primarily an inhibitory neurotransmitter that is localized in inhibitory interneurons. GABA is released from presynaptic terminals and functions by binding to GABA receptors. There are two types of GABA receptors: the GABA{sub A}-receptor, which allows chloride to pass through a ligand-gated ion channel, and the GABA{sub B}-receptor, which uses G-proteins for signaling. The GABA{sub A}-receptor has a GABA binding site as well as a benzodiazepine binding site, which modulates GABA{sub A}-receptor function. Benzodiazepine GABA{sub A} receptor imaging can be accomplished by radiolabeling derivatives that bind to the benzodiazepine binding sites. There has been much research on flumazenil (FMZ) labeled with {sup 11}C ({sup 11}C-FMZ), a benzodiazepine derivative that is a selective, reversible antagonist of GABA{sub A} receptors. Recently, {sup 18}F-fluoroflumazenil (FFMZ) has been developed to overcome {sup 11}C's short half-life. {sup 18}F-FFMZ shows high selective affinity and good pharmacodynamics, and is a promising PET agent with better central benzodiazepine receptor imaging capabilities. In an epileptic focus, because the GABA/benzodiazepine receptor density is decreased, using {sup 11}C-FMZ PET instead of {sup 18}F-FDG PET delineates the focus better and may also help find lesions better than high-resolution MR. GABA{sub A} receptors are widely distributed in the cerebral cortex and can be used as a viable neuronal marker; therefore they can serve as a marker of neuronal cell viability in cerebral ischemia. Also, GABA receptors decrease in areas where neuronal plasticity develops; therefore, GABA imaging can be used to evaluate plasticity. Besides these usages, GABA receptors are related to psychological diseases, especially depression and schizophrenia, as well as cerebral palsy, a motor-related disorder, so further in-depth studies are needed in these areas.

  15. GABA receptor imaging

    International Nuclear Information System (INIS)

    Lee, Jong Doo

    2007-01-01

    GABA is primarily an inhibitory neurotransmitter that is localized in inhibitory interneurons. GABA is released from presynaptic terminals and functions by binding to GABA receptors. There are two types of GABA receptors: the GABA-A receptor, which allows chloride to pass through a ligand-gated ion channel, and the GABA-B receptor, which uses G-proteins for signaling. The GABA-A receptor has a GABA binding site as well as a benzodiazepine binding site, which modulates GABA-A receptor function. Benzodiazepine GABA-A receptor imaging can be accomplished by radiolabeling derivatives that bind to the benzodiazepine binding sites. There has been much research on flumazenil (FMZ) labeled with ¹¹C (¹¹C-FMZ), a benzodiazepine derivative that is a selective, reversible antagonist of GABA-A receptors. Recently, ¹⁸F-fluoroflumazenil (FFMZ) has been developed to overcome ¹¹C's short half-life. ¹⁸F-FFMZ shows high selective affinity and good pharmacodynamics, and is a promising PET agent with better central benzodiazepine receptor imaging capabilities. In an epileptic focus, because the GABA/benzodiazepine receptor density is decreased, using ¹¹C-FMZ PET instead of ¹⁸F-FDG PET delineates the focus better and may also help find lesions better than high-resolution MR. GABA-A receptors are widely distributed in the cerebral cortex and can be used as a viable neuronal marker; therefore they can serve as a marker of neuronal cell viability in cerebral ischemia. Also, GABA receptors decrease in areas where neuronal plasticity develops; therefore, GABA imaging can be used to evaluate plasticity. Besides these usages, GABA receptors are related to psychological diseases, especially depression and schizophrenia, as well as cerebral palsy, a motor-related disorder, so further in-depth studies are needed in these areas

  16. Positron sources

    International Nuclear Information System (INIS)

    Chehab, R.

    1994-01-01

    A tentative survey of positron sources is given. The physical processes on which positron generation is based are indicated and analyzed. Explanation of the general features of electromagnetic interactions and nuclear β+ decay makes it possible to predict the yield and emittance for a given optical matching system between the positron source and the accelerator. Some kinds of matching systems commonly used, mainly working with solenoidal fields, are studied and the acceptance volume calculated. Such knowledge is helpful in comparing different matching systems. Since, for large machines, a significant distance exists between the positron source and the experimental facility, positron emittance has to be preserved during beam transfer over large distances, and the methods used for that purpose are indicated. Comparison of existing positron sources leads to extrapolation to sources for future linear colliders. Some new ideas associated with these sources are also presented. (orig.)

  17. Methodological practicalities in analytical generalization

    DEFF Research Database (Denmark)

    Halkier, Bente

    2011-01-01

    In this article, I argue that the existing literature on qualitative methodologies tends to discuss analytical generalization at a relatively abstract and general theoretical level. It is, however, not particularly straightforward to "translate" such abstract epistemological principles into more operative methodological strategies for producing analytical generalizations in research practices. Thus, the aim of the article is to contribute to the discussions among qualitatively working researchers about generalizing by way of exemplifying some of the methodological practicalities in analytical generalization. Theoretically, the argumentation in the article is based on practice theory. The main part of the article describes three different examples of ways of generalizing on the basis of the same qualitative data material. There is a particular focus on describing the methodological strategies…

  18. Nanotoxicology materials, methodologies, and assessments

    CERN Document Server

    Durán, Nelson; Alves, Oswaldo L; Zucolotto, Valtencir

    2014-01-01

    This book begins with a detailed introduction to engineered nanostructures, followed by a section on methodologies used in research on cytotoxicity and genotoxicity, and concluding with evidence for the cyto- and genotoxicity of specific nanoparticles.

  19. Reflective Methodology: The Beginning Teacher

    Science.gov (United States)

    Templeton, Ronald K.; Siefert, Thomas E.

    1970-01-01

    Offers a variety of specific techniques which will help the beginning teacher to implement reflective methodology and create an inquiry-centered classroom atmosphere, at the same time meeting the many more pressing demands of first-year teaching. (JES)

  20. Sources management

    International Nuclear Information System (INIS)

    Mansoux, H.; Gourmelon; Scanff, P.; Fournet, F.; Murith, Ch.; Saint-Paul, N.; Colson, P.; Jouve, A.; Feron, F.; Haranger, D.; Mathieu, P.; Paycha, F.; Israel, S.; Auboiroux, B.; Chartier, P.

    2005-01-01

    Organized by the section of technical protection of the French society of radiation protection (S.F.R.P.), these two days had as their objective to review the evolution of the rules governing sources of ionising radiation (sealed and unsealed radioactive sources, electric generators). They addressed all the actors concerned by the implementation of the new regulatory system in the different sectors of activity (research, medicine and industry): authorities, manufacturers and suppliers of sources, holders and users, bodies involved in the approval of sources, and carriers. (N.C.)

  1. Methodologies used in Project Management

    OpenAIRE

    UNGUREANU, Adrian; UNGUREANU, Anca

    2014-01-01

    Undoubtedly, a methodology properly defined and strictly followed for project management provides a firm guarantee that the work will be done on time, on budget and according to specifications. A project management methodology, in simple terms, is a “must-have” to avoid failure and reduce risks, because it is one of the critical success factors, alongside the basic skills of the management team. It is the simple way to guide the team through the design and execution phases, processes and tasks throughout...

  2. Methodology for ranking restoration options

    DEFF Research Database (Denmark)

    Jensen, Per Hedemann

    1999-01-01

    ... techniques as a function of contamination and site characteristics. The project includes analyses of existing remediation methodologies and contaminated sites, and is structured in the following steps: characterisation of relevant contaminated sites; identification and characterisation of relevant restoration techniques; assessment of the radiological impact; development and application of a selection methodology for restoration options; formulation of generic conclusions and development of a manual. The project is intended to apply to situations in which sites with nuclear installations have been contaminated ...

  3. Glucocorticoid receptor modulators.

    Science.gov (United States)

    Meijer, Onno C; Koorneef, Lisa L; Kroon, Jan

    2018-06-01

    The glucocorticoid hormone cortisol acts throughout the body to support circadian processes and adaptation to stress. The glucocorticoid receptor is the target of cortisol and of synthetic glucocorticoids, which are used widely in the clinic. Both agonism and antagonism of the glucocorticoid receptor may be beneficial in disease, but given the wide expression of the receptor and involvement in various processes, beneficial effects are often accompanied by unwanted side effects. Selective glucocorticoid receptor modulators are ligands that induce a receptor conformation that allows activation of only a subset of downstream signaling pathways. Such molecules thereby combine agonistic and antagonistic properties. Here we discuss the mechanisms underlying selective receptor modulation and their promise in treating diseases in several organ systems where cortisol signaling plays a role. Copyright © 2018 Elsevier Masson SAS. All rights reserved.

  4. Internal sources dosimetry

    International Nuclear Information System (INIS)

    Savio, Eduardo

    1994-01-01

    The absorbed dose must be estimated when evaluating risk in the application of radiopharmaceuticals in Nuclear Medicine practice; the text covers internal dosimetry and internal and external sources. Calculation methodology: the Marinelli model and the MIRD system for absorbed dose calculation, based on the biological parameters of the radiopharmaceutical in the human body or individual, the energy of the radiations emitted by the administered radionuclide, and the fraction of the emitted energy that is absorbed by the target organ. Limitations of the MIRD calculation model. An explanation of the Marinelli method of dosimetry calculation, β dosimetry, γ dosimetry, effective dose, calculations in organs and tissues, examples. Bibliography.
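
    The MIRD-style calculation described above reduces, in its simplest form, to multiplying the cumulated activity by an absorbed-dose factor (S value). A minimal sketch; the structure is the standard MIRD form, but the half-life and S value below are invented for illustration, not taken from the text:

```python
import math

# Illustrative MIRD-style absorbed-dose estimate:
# dose = cumulated activity * S value.

def cumulated_activity(a0_mbq, half_life_h):
    """Cumulated activity (MBq*h) assuming instant uptake and physical
    decay only: integral of A0*exp(-lambda*t) over all t = A0/lambda."""
    lam = math.log(2) / half_life_h
    return a0_mbq / lam

def absorbed_dose(a0_mbq, half_life_h, s_value_mgy_per_mbq_h):
    """Absorbed dose (mGy) to a target region for a given S value."""
    return cumulated_activity(a0_mbq, half_life_h) * s_value_mgy_per_mbq_h

# 100 MBq administered, 6 h physical half-life, hypothetical S value:
print(round(absorbed_dose(100.0, 6.0, 1e-3), 2))
```

    In practice, biological clearance shortens the effective half-life and the S values come from tabulated phantom data; this sketch keeps only the physical-decay term.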

  5. Dengue virus receptor

    OpenAIRE

    Hidari, Kazuya I.P.J.; Suzuki, Takashi

    2011-01-01

    Dengue virus is an arthropod-borne virus transmitted by Aedes mosquitoes. Dengue virus causes fever and hemorrhagic disorders in humans and non-human primates. Direct interaction of the virus introduced by a mosquito bite with host receptor molecule(s) is crucial for virus propagation and the pathological progression of dengue diseases. Therefore, elucidation of the molecular mechanisms underlying the interaction between dengue virus and its receptor(s) in both humans and mosquitoes is essent...

  6. Sourcing Excellence

    DEFF Research Database (Denmark)

    Adeyemi, Oluseyi

    2011-01-01

    Sourcing Excellence is one of the key performance indicators (KPIs) in this world of ever changing sourcing strategies. Manufacturing companies need to access and diagnose the reliability and competencies of existing suppliers in order to coordinate and develop them. This would help in managing...

  7. Methodology for Calculating Latency of GPS Probe Data

    Energy Technology Data Exchange (ETDEWEB)

    Young, Stanley E [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Wang, Zhongxiang [University of Maryland; Hamedi, Masoud [University of Maryland

    2017-10-01

    Crowdsourced GPS probe data have been gaining popularity in recent years as a source of real-time traffic information, such as travel times on changeable-message signs and incident detection, for driver operations and transportation systems management and operations. Efforts have been made to evaluate the quality of such data from different perspectives. Although such crowdsourced data are already in widespread use in many states, particularly in the high-traffic areas on the Eastern seaboard, concerns about latency - the time between traffic being perturbed as a result of an incident and the disturbance being reflected in the outsourced data feed - have escalated in importance. Latency is critical for the accuracy of real-time operations, emergency response, and traveler information systems. This paper offers a methodology for measuring probe data latency relative to a selected reference source. Although Bluetooth reidentification data are used as the reference source, the methodology can be applied to any other ground-truth data source of choice. The core of the methodology is an algorithm for maximum pattern matching that works with three fitness objectives. To test the methodology, sample field reference data were collected on multiple freeway segments for a 2-week period by using portable Bluetooth sensors as ground truth. Equivalent GPS probe data were obtained from a private vendor, and their latency was evaluated. Latency at different times of the day, the impact of the road segmentation scheme on latency, and the sensitivity of the latency to both speed-slowdown and recovery-from-slowdown episodes are also discussed.
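
    The pattern-matching idea at the core of such a latency measurement can be illustrated with a toy version: shift the probe series against the reference series and keep the lag that fits best. The sum-of-squared-error objective below is an illustrative stand-in for the paper's three fitness objectives, and all data are synthetic:

```python
# Sketch: estimate probe-data latency as the time shift that best aligns
# the probe speed series with a ground-truth (e.g. Bluetooth) series.

def best_lag(reference, probe, max_lag):
    """Return the lag (in samples) at which the probe series best
    matches the reference, by minimizing sum-of-squared error."""
    best, best_err = 0, float("inf")
    for lag in range(0, max_lag + 1):
        pairs = zip(reference, probe[lag:])
        err = sum((r - p) ** 2 for r, p in pairs)
        if err < best_err:
            best, best_err = lag, err
    return best

# Synthetic example: the probe feed repeats the reference 3 samples late.
ref = [60, 58, 30, 25, 28, 55, 60, 61, 59, 60]
probe = [62, 61, 60] + ref  # 3-sample delay
print(best_lag(ref, probe, max_lag=5))
```

    With samples taken every minute, the returned lag converts directly to a latency in minutes; a real implementation would also have to handle missing samples and unequal sampling rates.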

  8. Theoretical and methodological approaches to economic competitiveness (Part I

    Directory of Open Access Journals (Sweden)

    Macari Vadim

    2013-01-01

    Full Text Available Based on a study of several representative bibliographical sources, the article attempts to examine and to order, from a logical and scientific point of view, some of the most common theoretical and methodological understandings of the essence, definition, phenomenon, types, characteristics and indices of economic competitiveness.

  9. Theoretical and methodological approaches to economic competitiveness (part II

    Directory of Open Access Journals (Sweden)

    Macari Vadim

    2013-02-01

    Full Text Available This article, on the basis of a study of many representative bibliographic sources, examines and tries to order, from a logical and scientific point of view, some of the most common theoretical and methodological treatments of the essence, definition, phenomenon, types, characteristics and indices of economic competitiveness.

  10. THEORETICAL AND METHODOLOGICAL APPROACHES TO ECONOMIC COMPETITIVENESS (Part II

    Directory of Open Access Journals (Sweden)

    Vadim MACARI

    2013-02-01

    Full Text Available This article, on the basis of a study of many representative bibliographic sources, examines and tries to order, from a logical and scientific point of view, some of the most common theoretical and methodological treatments of the essence, definition, phenomenon, types, characteristics and indices of economic competitiveness.

  11. THEORETICAL AND METHODOLOGICAL APPROACHES TO ECONOMIC COMPETITIVENESS (Part I

    Directory of Open Access Journals (Sweden)

    Vadim MACARI

    2013-01-01

    Full Text Available Based on a study of several representative bibliographical sources, the article attempts to examine and to order, from a logical and scientific point of view, some of the most common theoretical and methodological understandings of the essence, definition, phenomenon, types, characteristics and indices of economic competitiveness.

  12. Beta adrenergic receptors in human cavernous tissue

    Energy Technology Data Exchange (ETDEWEB)

    Dhabuwala, C.B.; Ramakrishna, C.V.; Anderson, G.F.

    1985-04-01

    Beta adrenergic receptor binding was performed with ¹²⁵I-iodocyanopindolol on human cavernous tissue membrane fractions from normal tissue and transsexual procedures obtained postoperatively, as well as from postmortem sources. Isotherm binding studies on normal fresh tissues indicated that the receptor density was 9.1 fmoles/mg with a KD of 23 pM. Tissue stored at room temperature for 4 to 6 hours, then at 4°C in saline solution for 19 to 20 hours before freezing, showed no significant changes in receptor density or affinity, and provided evidence for the stability of postmortem tissue obtained within the same time period. Beta receptor density of 2 cavernous preparations from transsexual procedures was not significantly different from normal control tissues, and showed that the high concentrations of estrogen received by these patients had no effect on beta adrenergic receptor density. Displacement of ¹²⁵I-iodocyanopindolol by 5 beta adrenergic agents demonstrated that l-propranolol had the greatest affinity, followed by ICI 118,551, zinterol, metoprolol and practolol. When the results of these displacement studies were subjected to Scatfit non-linear regression line analysis, a single binding site was described. Based on the relative potency of the selective beta adrenergic agents it appears that these receptors were of the beta 2 subtype.
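
    The single-site result from the Scatfit analysis corresponds to the standard one-site saturation binding isotherm, B = Bmax·L/(KD + L). A sketch using the Bmax and KD reported above; the function is the generic textbook form, not the authors' analysis software:

```python
# One-site receptor binding isotherm (law of mass action at equilibrium).
# Bmax = 9.1 fmol/mg and KD = 23 pM are the values reported in the abstract.

def specific_binding(ligand_pm, bmax_fmol_per_mg, kd_pm):
    """Specifically bound ligand (fmol/mg) at free ligand concentration
    ligand_pm, for a single non-cooperative binding site."""
    return bmax_fmol_per_mg * ligand_pm / (kd_pm + ligand_pm)

# At L = KD, exactly half the sites are occupied:
half = specific_binding(23.0, 9.1, 23.0)
print(round(half, 2))  # half of 9.1
```

    Plotting B/L against B for this model gives the straight Scatchard line with slope -1/KD, which is why a single-site fit appears linear in that coordinate system.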

  13. Positron sources

    International Nuclear Information System (INIS)

    Chehab, R.

    1989-01-01

    A tentative survey of positron sources is given. Physical processes on which positron generation is based are indicated and analyzed. Explanation of the general features of electromagnetic interactions and nuclear β+ decay makes it possible to predict the yield and emittance for a given optical matching system between the positron source and the accelerator. Some kinds of matching systems commonly used - mainly working with solenoidal fields - are studied and the acceptance volume calculated. Such knowledge is helpful in comparing different matching systems. Since for large machines, a significant distance exists between the positron source and the experimental facility, positron emittance has to be preserved during beam transfer over large distances and methods used for that purpose are indicated. Comparison of existing positron sources leads to extrapolation to sources for future linear colliders

  14. Effect of nutrient sources on bench scale vinegar production using response surface methodology Efeito das fontes de nutrientes sobre a produção de vinagre em escala de bancada, usando-se a metodologia de superfície de resposta

    Directory of Open Access Journals (Sweden)

    Joelma M. Ferreira

    2005-03-01

    Full Text Available The present work aims to evaluate on a bench scale, the effects of nitrogen and phosphorous nutrient source concentrations in vinegar production, a process that is used by small scale industries in the State of Paraiba. The response surface methodology has been utilized for modeling and optimization of the fermentation process. Initially a 2³ complete factorial design was used, where the effects of initial concentrations of ethyl alcohol, phosphorous and nitrogen sources were observed. After this analysis the concentration range of the nutrient variables were extended and a two level plus a star configuration factorial experimental design was performed. The experimental values are well represented by the linear and quadratic model equations obtained. The optimum concentration of ethanol was 4% in which the yield and the productivity of the acetic acid were maximized to the values of 70% and 0.87 g L-1 h-1 respectively, for a 24 hours fermentation period. The evaluation of the quadratic models showed that the yield of vinegar is maximized from 28.1 to 51.04% and the productivity from 0.69 to 1.29 g L-1 h-1 when the concentration of the nitrogen nutrient in the medium is increased from 0.2 to 2.3 g mL-1. Thus, at the optimized nitrogen nutrient concentration both the yield and the productivity of the vinegar are increased by 1.85 times.
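
    The quadratic response-surface models referred to above can be illustrated with a minimal one-factor sketch: fit y = b0 + b1·x + b2·x² through design points and locate the stationary point, as RSM does to find the optimum. The design points below are invented for illustration, not the study's measurements:

```python
# Minimal response-surface sketch for a single factor.

def quadratic_through(p0, p1, p2):
    """Exact quadratic coefficients (b0, b1, b2) through three points,
    via Newton divided differences."""
    (x0, y0), (x1, y1), (x2, y2) = p0, p1, p2
    b2 = ((y2 - y0) / (x2 - x0) - (y1 - y0) / (x1 - x0)) / (x2 - x1)
    b1 = (y1 - y0) / (x1 - x0) - b2 * (x0 + x1)
    b0 = y0 - b1 * x0 - b2 * x0 ** 2
    return b0, b1, b2

def stationary_point(b1, b2):
    """x value where dy/dx = b1 + 2*b2*x = 0 (the fitted optimum)."""
    return -b1 / (2 * b2)

# Hypothetical yields (%) at three ethanol levels (% v/v):
b0, b1, b2 = quadratic_through((2.0, 50.0), (4.0, 70.0), (6.0, 54.0))
print(round(stationary_point(b1, b2), 2))
```

    With several factors and replicated runs, the same idea generalizes to least-squares fitting of a full second-order polynomial, which is what the star-augmented factorial design in the study supports.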

  15. Managerial Methodology in Public Institutions

    Directory of Open Access Journals (Sweden)

    Ion VERBONCU

    2010-10-01

    Full Text Available One of the most important ways of making public institutions more efficient is by applying managerial methodology, embodied in the promotion of management tools, modern and sophisticated methodologies, as well as the design/redesign and maintenance of the management process and its components. Their implementation bears the imprint of the constructive and functional particularities of public institutions, decentralized and devolved, and, of course, of the managers' expertise in these organizations. Managerial methodology is addressed through three important instruments: diagnosis, management by objectives and the scoreboard. Its presence in the performance management process should be mandatory, given its favorable influence on managerial and economic performance and on the rigor of the managers' approach.

  16. Blanket safety by GEMSAFE methodology

    International Nuclear Information System (INIS)

    Sawada, Tetsuo; Saito, Masaki

    2001-01-01

    General Methodology of Safety Analysis and Evaluation for Fusion Energy Systems (GEMSAFE) has been applied to a number of fusion system designs, such as the R-tokamak, the Fusion Experimental Reactor (FER), and the International Thermonuclear Experimental Reactor (ITER) designs in both the Conceptual Design Activities (CDA) and Engineering Design Activities (EDA) stages. Though the major objective of GEMSAFE is to reasonably select design basis events (DBEs), it is also useful for elucidating related safety functions as well as the requirements to ensure safety. In this paper, we apply the methodology to fusion systems with future tritium breeding blankets and clarify which points of the system should be of concern from the point of view of ensuring safety. In this context, we have obtained five DBEs that are related to the blanket system. We have also clarified the safety functions required to prevent accident propagation initiated by those blanket-specific DBEs. The outline of the methodology is also reviewed. (author)

  17. The NLC Software Requirements Methodology

    Energy Technology Data Exchange (ETDEWEB)

    Shoaee, Hamid

    2002-08-20

    We describe the software requirements and development methodology developed for the NLC control system. Given the longevity of that project, and the likely geographical distribution of the collaborating engineers, the planned requirements management process is somewhat more formal than the norm in high energy physics projects. The short term goals of the requirements process are to accurately estimate costs, to decompose the problem, and to determine likely technologies. The long term goal is to enable a smooth transition from high level functional requirements to specific subsystem and component requirements for individual programmers, and to support distributed development. The methodology covers both ends of that life cycle. It covers both the analytical and documentary tools for software engineering, and project management support. This paper introduces the methodology, which is fully described in [1].

  18. Angiotensin type 2 receptors

    DEFF Research Database (Denmark)

    Sumners, Colin; de Kloet, Annette D; Krause, Eric G

    2015-01-01

    In most situations, the angiotensin AT2-receptor (AT2R) mediates physiological actions opposing those mediated by the AT1-receptor (AT1R), including a vasorelaxant effect. Nevertheless, experimental evidence vastly supports that systemic application of AT2R-agonists is blood pressure neutral...

  19. Geologic modeling in risk assessment methodology for radioactive waste management

    International Nuclear Information System (INIS)

    Logan, S.E.; Berbano, M.C.

    1977-01-01

    Under contract to the U.S. Environmental Protection Agency (EPA), the University of New Mexico is developing a computer-based assessment methodology for evaluating public health and environmental impacts from the disposal of radioactive waste in geologic formations. The methodology incorporates a release or fault tree model, an environmental model, and an economic model. The release model and its application to a model repository in bedded salt are described. Fault trees are constructed to provide the relationships between various geologic and man-caused events which are potential mechanisms for release of radioactive material beyond the immediate environs of the repository. The environmental model includes: 1) the transport to and accumulations at various receptors in the biosphere, 2) pathways from these environmental concentrations, and 3) radiation dose to man. Finally, economic results are used to compare and assess various disposal configurations as a basis for formulating ...
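
    The fault-tree release model can be sketched in miniature: independent basic-event probabilities combine multiplicatively through AND gates and as complements through OR gates. The event names and probabilities here are invented for illustration, not taken from the EPA study:

```python
# Toy fault-tree evaluation with independent basic events.

def and_gate(probs):
    """All inputs must occur: product of probabilities."""
    out = 1.0
    for p in probs:
        out *= p
    return out

def or_gate(probs):
    """Any input may occur: 1 - product of complements."""
    out = 1.0
    for p in probs:
        out *= (1.0 - p)
    return 1.0 - out

# Hypothetical top event: release occurs if (fault AND flooding)
# OR a drilling intrusion, with made-up annual probabilities.
p_release = or_gate([and_gate([1e-3, 1e-2]), 1e-5])
print(f"{p_release:.3e}")
```

    Real fault-tree codes additionally handle dependent events, common-cause failures, and minimal cut-set enumeration; the gate arithmetic above is only the independent-event core.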

  20. Methodology, theoretical framework and scholarly significance: An ...

    African Journals Online (AJOL)

    Methodology, theoretical framework and scholarly significance: An overview ... Keywords: Legal Research, Methodology, Theory, Pedagogy, Legal Training, Scholarship ...

  1. Glutamate receptor agonists

    DEFF Research Database (Denmark)

    Vogensen, Stine Byskov; Greenwood, Jeremy R; Bunch, Lennart

    2011-01-01

    The neurotransmitter (S)-glutamate [(S)-Glu] is responsible for most of the excitatory neurotransmission in the central nervous system. The effect of (S)-Glu is mediated by both ionotropic and metabotropic receptors. Glutamate receptor agonists are generally α-amino acids with one or more stereogenic centers due to strict requirements in the agonist binding pocket of the activated state of the receptor. By contrast, there are many examples of achiral competitive antagonists. The present review addresses how stereochemistry affects the activity of glutamate receptor ligands. The review focuses mainly on agonists and discusses stereochemical and conformational considerations as well as biostructural knowledge of the agonist binding pockets, which is useful in the design of glutamate receptor agonists. Examples are chosen to demonstrate how stereochemistry not only determines how the agonist ...

  2. AMPA receptor ligands

    DEFF Research Database (Denmark)

    Strømgaard, Kristian; Mellor, Ian

    2004-01-01

    Alpha-amino-3-hydroxy-5-methyl-4-isoxazole propionic acid (AMPA) receptors (AMPAR), a subtype of the ionotropic glutamate receptors (IGRs), mediate fast synaptic transmission in the central nervous system (CNS), and are involved in many neurological disorders, as well as being a key player in the formation of memory. Hence, ligands affecting AMPARs are highly important for the study of the structure and function of this receptor, and in this regard polyamine-based ligands, particularly polyamine toxins, are unique as they selectively block Ca2+-permeable AMPARs. Indeed, endogenous intracellular ...

  3. Methodological Guidelines for Advertising Research

    DEFF Research Database (Denmark)

    Rossiter, John R.; Percy, Larry

    2017-01-01

    In this article, highly experienced advertising academics and advertising research consultants John R. Rossiter and Larry Percy present and discuss what they believe to be the seven most important methodological guidelines that need to be implemented to improve the practice of advertising research. Their focus is on methodology, defined as first choosing a suitable theoretical framework to guide the research study and then identifying the advertising responses that need to be studied. Measurement of those responses is covered elsewhere in this special issue in the article by Bergkvist and Langner. Most ...

  4. Acoustic emission methodology and application

    CERN Document Server

    Nazarchuk, Zinoviy; Serhiyenko, Oleh

    2017-01-01

    This monograph analyses in detail the physical aspects of elastic wave radiation during deformation or fracture of materials. It presents the methodological bases for the practical use of acoustic emission devices, and describes the results of theoretical and experimental research on evaluating the crack growth resistance of materials and selecting the useful AE signals. The efficiency of this methodology is shown through the diagnostics of various-purpose industrial objects. The authors obtained the experimental results with the help of new methods and facilities.

  5. An LWR design decision Methodology

    International Nuclear Information System (INIS)

    Leahy, T.J.; Rees, D.C.; Young, J.

    1982-01-01

    While all parties involved in nuclear plant regulation endeavor to make decisions which optimize the considerations of plant safety and financial impacts, these decisions are generally made without the benefit of a systematic and rigorous approach to the questions confronting the decision makers. A Design Decision Methodology has been developed which provides such a systematic approach. By employing this methodology, which makes use of currently accepted probabilistic risk assessment techniques and cost estimation, informed decisions may be made against a background of comparisons between the relative levels of safety and costs associated with various design alternatives

  6. Qualitative methodology in developmental psychology

    DEFF Research Database (Denmark)

    Demuth, Carolin; Mey, Günter

    2015-01-01

    Qualitative methodology presently is gaining increasing recognition in developmental psychology. Although the founders of developmental psychology to a large extent already used qualitative procedures, the field was long dominated by a (post) positivistic quantitative paradigm. The increasing recognition ... in qualitative research offers a promising avenue to advance the field in this direction.

  7. Qualitative Methodology in Unfamiliar Cultures

    DEFF Research Database (Denmark)

    Svensson, Christian Franklin

    2014-01-01

    This case study discusses qualitative fieldwork in Malaysia. The trends in higher education led to investigating how and why young Indians and Chinese in Malaysia are using the university to pursue a life strategy. Given the importance of field context in designing and analysing research based on a qualitative methodology, conscious reflection on research design and objectivity is important when doing fieldwork. This case study discusses such reflections. Emphasis throughout is given to applied qualitative methodology and its contributions to the social sciences, in particular having to do ...

  8. Observational methodology in sport sciences

    Directory of Open Access Journals (Sweden)

    M. Teresa Anguera

    2013-11-01

    Full Text Available This paper reviews the conceptual framework, the key literature and the methods (observation tools, such as category systems and field formats, and coding software, etc. that should be followed when conducting research from the perspective of observational methodology. The observational designs used by the authors’ research group over the last twenty years are discussed, and the procedures for analysing data and assessing their quality are described. Mention is also made of the latest methodological trends in this field, such as the use of mixed methods.

  9. Reflections on Design Methodology Research

    DEFF Research Database (Denmark)

    2011-01-01

    We shall reflect on the results of Design Methodology research and their impact on design practice. In the past 50 years the number of researchers in the field has expanded enormously, as has the number of publications. During the same period design practice and its products have changed; what is designed and produced is also now far more complex and distributed, putting designers under ever increasing pressure. We shall address the question: Are the results of Design Methodology research appropriate, and are they delivering the expected results in design practice? In our attempt to answer this question we ...

  10. Open-source colorimeter.

    Science.gov (United States)

    Anzalone, Gerald C; Glover, Alexandra G; Pearce, Joshua M

    2013-04-19

    The high cost of what have historically been sophisticated research-related sensors and tools has limited their adoption to a relatively small group of well-funded researchers. This paper provides a methodology for applying an open-source approach to design and development of a colorimeter. A 3-D printable, open-source colorimeter utilizing only open-source hardware and software solutions and readily available discrete components is discussed and its performance compared to a commercial portable colorimeter. Performance is evaluated with commercial vials prepared for the closed reflux chemical oxygen demand (COD) method. This approach reduced the cost of reliable closed reflux COD by two orders of magnitude making it an economic alternative for the vast majority of potential users. The open-source colorimeter demonstrated good reproducibility and serves as a platform for further development and derivation of the design for other, similar purposes such as nephelometry. This approach promises unprecedented access to sophisticated instrumentation based on low-cost sensors by those most in need of it, under-developed and developing world laboratories.
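
    The core computation of such a colorimeter can be sketched as follows: a sensor reading is converted to absorbance via the Beer-Lambert relation, A = -log10(I/I0), and absorbance is mapped to COD through a linear calibration. The calibration constants below are hypothetical placeholders, not the published device's values:

```python
import math

# Colorimeter core computation sketch: absorbance from transmitted light,
# then a linear calibration from absorbance to concentration (COD).

def absorbance(intensity, blank_intensity):
    """Beer-Lambert absorbance: A = -log10(I / I0)."""
    return -math.log10(intensity / blank_intensity)

def cod_mg_per_l(intensity, blank_intensity, slope=1000.0, intercept=0.0):
    """COD estimate (mg/L) from a hypothetical linear calibration curve;
    slope and intercept would come from measuring known standards."""
    return slope * absorbance(intensity, blank_intensity) + intercept

# A sample transmitting 50% of the blank's light has A ~ 0.301:
print(round(absorbance(500.0, 1000.0), 3))
```

    A real instrument would additionally subtract dark-frame readings and average repeated measurements to suppress sensor noise before applying the calibration.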

  11. Open-Source Colorimeter

    Science.gov (United States)

    Anzalone, Gerald C.; Glover, Alexandra G.; Pearce, Joshua M.

    2013-01-01

    The high cost of what have historically been sophisticated research-related sensors and tools has limited their adoption to a relatively small group of well-funded researchers. This paper provides a methodology for applying an open-source approach to design and development of a colorimeter. A 3-D printable, open-source colorimeter utilizing only open-source hardware and software solutions and readily available discrete components is discussed and its performance compared to a commercial portable colorimeter. Performance is evaluated with commercial vials prepared for the closed reflux chemical oxygen demand (COD) method. This approach reduced the cost of reliable closed reflux COD by two orders of magnitude making it an economic alternative for the vast majority of potential users. The open-source colorimeter demonstrated good reproducibility and serves as a platform for further development and derivation of the design for other, similar purposes such as nephelometry. This approach promises unprecedented access to sophisticated instrumentation based on low-cost sensors by those most in need of it, under-developed and developing world laboratories. PMID:23604032

  12. A Performance Tuning Methodology with Compiler Support

    Directory of Open Access Journals (Sweden)

    Oscar Hernandez

    2008-01-01

    Full Text Available We have developed an environment, based upon robust, existing, open source software, for tuning applications written using MPI, OpenMP or both. The goal of this effort, which integrates the OpenUH compiler and several popular performance tools, is to increase user productivity by providing an automated, scalable performance measurement and optimization system. In this paper we describe our environment, show how these complementary tools can work together, and illustrate the synergies possible by exploiting their individual strengths and combined interactions. We also present a methodology for performance tuning that is enabled by this environment. One of the benefits of using compiler technology in this context is that it can direct the performance measurements to capture events at different levels of granularity and help assess their importance, which we have shown to significantly reduce the measurement overheads. The compiler can also help when attempting to understand the performance results: it can supply information on how a code was translated and whether optimizations were applied. Our methodology combines two performance views of the application to find bottlenecks. The first is a high level view that focuses on OpenMP/MPI performance problems such as synchronization cost and load imbalances; the second is a low level view that focuses on hardware counter analysis with derived metrics that assess the efficiency of the code. Our experiments have shown that our approach can significantly reduce overheads for both profiling and tracing to acceptable levels and limit the number of times the application needs to be run with selected hardware counters. In this paper, we demonstrate the workings of this methodology by illustrating its use with selected NAS Parallel Benchmarks and a cloud resolving code.

  13. MicroComputed Tomography: Methodology and Applications

    International Nuclear Information System (INIS)

    Stock, Stuart R.

    2009-01-01

Due to the availability of commercial laboratory systems and the emergence of user facilities at synchrotron radiation sources, studies of microcomputed tomography (microCT) have increased exponentially. MicroComputed Tomography: Methodology and Applications provides a complete introduction to the technology, describing how to use it effectively and understand its results. The first part of the book focuses on methodology, covering experimental methods, data analysis, and visualization approaches. The second part addresses various microCT applications, including porous solids, microstructural evolution, soft tissue studies, multimode studies, and indirect analyses. The author presents a sufficient amount of fundamental material so that those new to the field can develop a working understanding of how to design their own microCT studies. One of the first full-length references dedicated to microCT, this book provides an accessible introduction to the field, supplemented with application examples and color images.

  14. Methodologic frontiers in environmental epidemiology.

    OpenAIRE

    Rothman, K J

    1993-01-01

    Environmental epidemiology comprises the epidemiologic study of those environmental factors that are outside the immediate control of the individual. Exposures of interest to environmental epidemiologists include air pollution, water pollution, occupational exposure to physical and chemical agents, as well as psychosocial elements of environmental concern. The main methodologic problem in environmental epidemiology is exposure assessment, a problem that extends through all of epidemiologic re...

  15. IMSF: Infinite Methodology Set Framework

    Science.gov (United States)

    Ota, Martin; Jelínek, Ivan

Software development is usually an integration task in an enterprise environment; few software applications work autonomously now. It is usually a collaboration of heterogeneous and unstable teams. One serious problem is a lack of resources, a popular result being outsourcing, ‘body shopping’, and, indirectly, team and team-member fluctuation. Outsourced sub-deliveries easily become black boxes with no clear development method used, which has a negative impact on supportability. Such environments then often face the problems of quality assurance and enterprise know-how management. The methodology used is one of the key factors. Each methodology was created as a generalization of a number of solved projects, and each methodology is thus more or less connected with a set of task types. When the task type is not suitable, this causes problems that usually result in an undocumented ad hoc solution. This was the motivation behind formalizing a simple process for collaborative software engineering. The Infinite Methodology Set Framework (IMSF) defines the ICT business process of the adaptive use of methods for classified types of tasks. The article introduces IMSF and briefly comments on its meta-model.

  16. Unattended Monitoring System Design Methodology

    International Nuclear Information System (INIS)

    Drayer, D.D.; DeLand, S.M.; Harmon, C.D.; Matter, J.C.; Martinez, R.L.; Smith, J.D.

    1999-01-01

A methodology for designing Unattended Monitoring Systems starting at a systems level has been developed at Sandia National Laboratories. This proven methodology provides a template that describes the process for selecting and applying appropriate technologies to meet unattended system requirements, as well as providing a framework for the development of both training courses and workshops associated with unattended monitoring. The design and implementation of unattended monitoring systems is generally intended to respond to some form of policy-based requirements resulting from international agreements or domestic regulations. Once the monitoring requirements are established, a review of the associated process and its related facilities enables identification of strategic monitoring locations and development of a conceptual system design. The detailed design effort results in the definition of detection components as well as the supporting communications network and data management scheme. Data analysis then enables a coherent display of the knowledge generated during the monitoring effort. The resultant knowledge is then compared to the original system objectives to ensure that the design adequately addresses the fundamental principles stated in the policy agreements. Implementation of this design methodology will ensure that comprehensive unattended monitoring system designs provide appropriate answers to the critical questions imposed by specific agreements or regulations. This paper describes the main features of the methodology and discusses how it can be applied in real-world situations.

  17. Environmental Zoning: Some methodological implications

    NARCIS (Netherlands)

    Ike, Paul; Voogd, Henk

    1991-01-01

    The purpose of this article is to discuss some methodological problems of environmental zoning. The principle of environmental zoning will be elaborated. In addition an overview is given of a number of approaches that have been followed in practice to arrive at an integral judgement. Finally some

  18. Counting stem cells : methodological constraints

    NARCIS (Netherlands)

    Bystrykh, Leonid V.; Verovskaya, Evgenia; Zwart, Erik; Broekhuis, Mathilde; de Haan, Gerald

    The number of stem cells contributing to hematopoiesis has been a matter of debate. Many studies use retroviral tagging of stem cells to measure clonal contribution. Here we argue that methodological factors can impact such clonal analyses. Whereas early studies had low resolution, leading to

  19. Test reactor risk assessment methodology

    International Nuclear Information System (INIS)

    Jennings, R.H.; Rawlins, J.K.; Stewart, M.E.

    1976-04-01

A methodology has been developed for the identification of accident-initiating events and the fault modeling of systems, including common-mode identification, as these methods are applied in overall test reactor risk assessment. The methods are exemplified by a determination of the risks due to a loss of primary coolant flow in the Engineering Test Reactor.

  20. METHODOLOGICAL ELEMENTS OF SITUATIONAL ANALYSIS

    Directory of Open Access Journals (Sweden)

    Tetyana KOVALCHUK

    2016-07-01

Full Text Available The article deals with the investigation of the theoretical and methodological principles of situational analysis. The necessity of situational analysis under modern conditions is demonstrated, and the notion of “situational analysis” is defined. We conclude that situational analysis is a continuous, systematic study whose purpose is to identify the signs of a dangerous situation, to evaluate such signs comprehensively as influenced by a system of objective and subjective factors, to search for motivated, targeted actions that eliminate the adverse effects of the situation on the system now and in the future, and to develop the managerial actions needed to bring the system back to normal. A methodological approach to situational analysis is developed, its goal is substantiated, and the expediency of its diagnostic, evaluative and searching functions is demonstrated. The basic methodological elements of situational analysis are grounded. Substantiating these principal elements will enable the analyst to develop adaptive methods that take into account the peculiar features of a unique object (a situation that has emerged in a complex system), to diagnose that situation and subject it to systematic, in-depth analysis, to identify risks and opportunities, and to make timely management decisions as required by a particular period.

  1. 16 Offsetting deficit conceptualisations: methodological ...

    African Journals Online (AJOL)

    uses the concepts of literacy practices and knowledge recontextualisation to ... 1996, 2000) theory of 'knowledge recontextualisation' in the development of curricula .... cognitive, social and cultural abilities needed to fit in and thrive in the HE learning .... this argument, that a methodology and analytic framework that seeks to ...

  2. Safety at Work : Research Methodology

    NARCIS (Netherlands)

    Beurden, van K. (Karin); Boer, de J. (Johannes); Brinks, G. (Ger); Goering-Zaburnenko, T. (Tatiana); Houten, van Y. (Ynze); Teeuw, W. (Wouter)

    2012-01-01

In this document, we provide the methodological background for the Safety at Work project. This document combines several project deliverables as defined in the overall project plan: validation techniques and methods (D5.1.1), performance indicators for safety at work (D5.1.2), personal protection

  3. Methodology for characterizing modeling and discretization uncertainties in computational simulation

    Energy Technology Data Exchange (ETDEWEB)

    ALVIN,KENNETH F.; OBERKAMPF,WILLIAM L.; RUTHERFORD,BRIAN M.; DIEGERT,KATHLEEN V.

    2000-03-01

    This research effort focuses on methodology for quantifying the effects of model uncertainty and discretization error on computational modeling and simulation. The work is directed towards developing methodologies which treat model form assumptions within an overall framework for uncertainty quantification, for the purpose of developing estimates of total prediction uncertainty. The present effort consists of work in three areas: framework development for sources of uncertainty and error in the modeling and simulation process which impact model structure; model uncertainty assessment and propagation through Bayesian inference methods; and discretization error estimation within the context of non-deterministic analysis.

  4. Assessing potential health effects from municipal sludge incinerators: screening methodology

    Energy Technology Data Exchange (ETDEWEB)

    Fradkin, L.; Bruins, J.F.; Lutkenhoff, S.D.; Stara, J.F.; Lomnitz, E.; Rubin, A.

    1987-04-01

This paper describes a risk assessment methodology for the preliminary assessment of municipal sludge incineration. The methodology is a valuable tool in that it can be used to determine the hazard indices of chemical contaminants that might be present in sewage sludge used in incineration. The paper examines source characteristics (i.e., facility design), atmospheric dispersion of emissions, and the resulting human exposure and risk from sludge incinerators. Seven of the ten organics were screened for further investigation. An example of the calculations is presented for cadmium.

  5. Neutron source

    International Nuclear Information System (INIS)

    Cason, J.L. Jr.; Shaw, C.B.

    1975-01-01

    A neutron source which is particularly useful for neutron radiography consists of a vessel containing a moderating media of relatively low moderating ratio, a flux trap including a moderating media of relatively high moderating ratio at the center of the vessel, a shell of depleted uranium dioxide surrounding the moderating media of relatively high moderating ratio, a plurality of guide tubes each containing a movable source of neutrons surrounding the flux trap, a neutron shield surrounding one part of each guide tube, and at least one collimator extending from the flux trap to the exterior of the neutron source. The shell of depleted uranium dioxide has a window provided with depleted uranium dioxide shutters for each collimator. Reflectors are provided above and below the flux trap and on the guide tubes away from the flux trap

  6. Crowd Sourcing.

    Science.gov (United States)

    Baum, Neil

    2016-01-01

    The Internet has contributed new words and slang to our daily vernacular. A few terms, such as tweeting, texting, sexting, blogging, and googling, have become common in most vocabularies and in many languages, and are now included in the dictionary. A new buzzword making the rounds in industry is crowd sourcing, which involves outsourcing an activity, task, or problem by sending it to people or groups outside a business or a practice. Crowd sourcing allows doctors and practices to tap the wisdom of many instead of relying only on the few members of their close-knit group. This article defines "crowd sourcing," offers examples, and explains how to get started with this approach that can increase your ability to finish a task or solve problems that you don't have the time or expertise to accomplish.

  7. A Methodological Approach to Support Collaborative Media Creation in an E-Learning Higher Education Context

    Science.gov (United States)

    Ornellas, Adriana; Muñoz Carril, Pablo César

    2014-01-01

    This article outlines a methodological approach to the creation, production and dissemination of online collaborative audio-visual projects, using new social learning technologies and open-source video tools, which can be applied to any e-learning environment in higher education. The methodology was developed and used to design a course in the…

  8. Energy sources

    International Nuclear Information System (INIS)

    Vajda, Gy.

    1998-01-01

A comprehensive review of the available sources of energy in the world is presented. About 80 percent of primary energy utilization is based on fossil fuels, and their dominant role is not expected to change in the foreseeable future. Data are given on petroleum-, natural gas- and coal-based power production. The role and economic aspects of nuclear power are analyzed. A brief summary of renewable energy sources is presented. The future prospects of the world's energy resources are discussed, and the special position of Hungary regarding fossil, nuclear and renewable energy and the country's energy potential is evaluated. (R.P.)

  9. Postal auditing methodology used to find out the performance of high rate brachytherapy equipment

    International Nuclear Information System (INIS)

    Morales, J.A.; Campa, R.

    1998-01-01

This work describes results from a methodology implemented at the Secondary Laboratory for Dosimetric Calibration at the CPHR, used to check brachytherapy performance at high dose rates using cesium-137 or cobalt-60 sources.

  10. Serotonin Receptors in Hippocampus

    Science.gov (United States)

    Berumen, Laura Cristina; Rodríguez, Angelina; Miledi, Ricardo; García-Alcocer, Guadalupe

    2012-01-01

Serotonin is an ancient molecular signal and a recognized neurotransmitter distributed brainwide, with a particular presence in the hippocampus. Almost all serotonin receptor subtypes are expressed in the hippocampus, which implies an intricate modulating system, considering that they can be localized as autosynaptic, presynaptic, and postsynaptic receptors, can even be colocalized within the same cell, and are targets of homo- and heterodimerization. Neurons and glia, including immune cells, integrate a functional network that uses several serotonin receptors to regulate their roles in this particular part of the limbic system. PMID:22629209

  11. Environmental impact statement analysis: dose methodology

    International Nuclear Information System (INIS)

    Mueller, M.A.; Strenge, D.L.; Napier, B.A.

    1981-01-01

    Standardized sections and methodologies are being developed for use in environmental impact statements (EIS) for activities to be conducted on the Hanford Reservation. Five areas for standardization have been identified: routine operations dose methodologies, accident dose methodology, Hanford Site description, health effects methodology, and socioeconomic environment for Hanford waste management activities

  12. A Critique of Methodological Dualism in Education

    Science.gov (United States)

    Yang, Jeong A.; Yoo, Jae-Bong

    2018-01-01

    This paper aims to critically examine the paradigm of methodological dualism and explore whether methodologies in social science currently are appropriate for educational research. There are two primary methodologies for studying education: quantitative and qualitative methods. This is what we mean by "methodological dualism". Is…

  13. Feminist Methodologies and Engineering Education Research

    Science.gov (United States)

    Beddoes, Kacey

    2013-01-01

    This paper introduces feminist methodologies in the context of engineering education research. It builds upon other recent methodology articles in engineering education journals and presents feminist research methodologies as a concrete engineering education setting in which to explore the connections between epistemology, methodology and theory.…

  14. Ion source

    International Nuclear Information System (INIS)

    1977-01-01

The specifications of a set of point-shaped electrodes of non-corrodible material that can hold a film of liquid material of equal thickness are described. Contained in a jacket, this set forms an ion source. The electrode is made of tungsten with a glassy carbon layer for insulation and an outer layer of aluminium-oxide ceramic material.

  15. Radiation Safety and Orphan Sources

    International Nuclear Information System (INIS)

    Janzekovic, H.; Krizman, M.

    2006-01-01

The widespread use of radioactive and particularly of nuclear materials, which started in the last century, very quickly also demonstrated its negative sides. The external exposure and radiotoxicity of these materials could easily be exploited in a malevolent act. Because these materials cannot be detected without special equipment designed for that purpose, strict control over their use in all phases of the life cycle is required. An orphan source is a radioactive source which is not under regulatory control, either because it has never been under regulatory control or because it has been abandoned, lost, misplaced, stolen or transferred without proper authorization. In the last ten years a few international conferences were dedicated to the improvement of the safety and security of radioactive sources. Three main tasks are in focus: the maintenance of databases of events involving orphan sources and the publication of such events; the preparation of recommendations and guidelines for national regulatory bodies to prevent and detect events involving orphan sources, as well as to develop response strategies for radiological or nuclear emergencies; and appraisals of national strategies for the control of radioactive sources. Concerning Slovenia, the strengthening of control over orphan sources started after the adoption of new legislation in 2002. It was carried out through several tasks aimed both at preventing orphan sources and at identifying sources which could potentially become orphan sources. A comprehensive methodology was developed by the Slovenian nuclear safety administration (S.N.S.A.), based on international guidelines as well as on the study of national lessons-learned cases. The methodology was developed and used in close cooperation with all parties involved, namely other regulatory authorities, police, customs, agency for radioactive waste management (A.R.A.O.), technical support organisations (T.S.O.), users of source, authorised

  16. Radiation Safety and Orphan Sources

    Energy Technology Data Exchange (ETDEWEB)

    Janzekovic, H.; Krizman, M. [Slovenian Nuclear Safety Administration, Ljubljana (Slovenia)

    2006-07-01

The widespread use of radioactive and particularly of nuclear materials, which started in the last century, very quickly also demonstrated its negative sides. The external exposure and radiotoxicity of these materials could easily be exploited in a malevolent act. Because these materials cannot be detected without special equipment designed for that purpose, strict control over their use in all phases of the life cycle is required. An orphan source is a radioactive source which is not under regulatory control, either because it has never been under regulatory control or because it has been abandoned, lost, misplaced, stolen or transferred without proper authorization. In the last ten years a few international conferences were dedicated to the improvement of the safety and security of radioactive sources. Three main tasks are in focus: the maintenance of databases of events involving orphan sources and the publication of such events; the preparation of recommendations and guidelines for national regulatory bodies to prevent and detect events involving orphan sources, as well as to develop response strategies for radiological or nuclear emergencies; and appraisals of national strategies for the control of radioactive sources. Concerning Slovenia, the strengthening of control over orphan sources started after the adoption of new legislation in 2002. It was carried out through several tasks aimed both at preventing orphan sources and at identifying sources which could potentially become orphan sources. A comprehensive methodology was developed by the Slovenian nuclear safety administration (S.N.S.A.), based on international guidelines as well as on the study of national lessons-learned cases. The methodology was developed and used in close cooperation with all parties involved, namely other regulatory authorities, police, customs, agency for radioactive waste management (A.R.A.O.), technical support organisations (T.S.O.), users of source, authorised

  17. A Network Based Methodology to Reveal Patterns in Knowledge Transfer

    Directory of Open Access Journals (Sweden)

    Orlando López-Cruz

    2015-12-01

Full Text Available This paper motivates, presents and demonstrates in use a methodology based on complex network analysis to support research aimed at the identification of sources in the process of knowledge transfer at the interorganizational level. The importance of this methodology is that it states a unified model to reveal knowledge-sharing patterns and to compare results from multiple studies on data from different periods of time and different sectors of the economy. The methodology does not address the underlying statistical processes; for those, national statistics departments (NSDs) provide documents and tools at their websites. Instead, this proposal provides a guide to modelling inferences gathered from data processing, revealing links between the sources and recipients of the knowledge being transferred, where the recipient identifies the source as the main source for new knowledge creation. Some national statistics departments set as the objective of these surveys the characterization of innovation dynamics in firms and the analysis of the use of public support instruments, and from this characterization scholars conduct different studies. Measures of the dimensions of the network composed of manufacturing firms and other organizations form the basis for inquiring into the structure that emerges from taking ideas from other organizations to incept innovations. These two sets of actors form a two-mode network: each link connects two nodes, one acting as the source of the idea (an organization, or an event organized by an organization, that “provides” ideas to a group of firms) and the other acting as the destination. The resulting design satisfies the objective of being a methodological model for identifying the sources of knowledge transfer effectively used in innovation.
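The two-mode structure described in this record (firms linked to the organizations or events they draw ideas from) can be sketched in a few lines of plain Python. All firm and source names below are hypothetical, and the projection rule (link two firms whenever they share a knowledge source) is one common convention for one-mode projections, not necessarily the paper's exact model:

```python
from itertools import combinations

# Hypothetical two-mode data: each knowledge source (organization or event)
# maps to the set of firms that reported it as a source of ideas.
source_to_firms = {
    "trade_fair": {"firm_a", "firm_b", "firm_c"},
    "university_lab": {"firm_b", "firm_c"},
    "supplier_forum": {"firm_c", "firm_d"},
}

def project_onto_firms(two_mode):
    """One-mode projection: link two firms whenever they share a source,
    weighting the edge by the number of shared sources."""
    weights = {}
    for firms in two_mode.values():
        for pair in combinations(sorted(firms), 2):
            weights[pair] = weights.get(pair, 0) + 1
    return weights

edges = project_onto_firms(source_to_firms)
# firm_b and firm_c co-occur under two sources, so their edge has weight 2.
print(edges[("firm_b", "firm_c")])  # 2
```

The resulting weighted firm-to-firm network can then be examined with standard network measures (degree, components, centrality) to reveal which actors concentrate the flow of ideas.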

  18. Computer Simulation for Dispersion of Air Pollution Released from a Line Source According to Gaussian Model

    International Nuclear Information System (INIS)

    Emad, A.A.; El Shazly, S.M.; Kassem, Kh.O.

    2010-01-01

A line source model, developed in the Laboratory of Environmental Physics, Faculty of Science at Qena, Egypt, is proposed to describe the downwind dispersion of pollutants near roadways in different cities in Egypt. The model is based on the Gaussian plume methodology and is used to predict air pollutant concentrations near roadways. In this direction, simple software is presented in this paper, developed by the authors, which fully adopts a Graphical User Interface (GUI) technique for operation on various Windows-based microcomputers. The software interface and code have been designed in Microsoft Visual Basic 6.0 based on the Gaussian diffusion equation. This software is developed to predict concentrations of gaseous pollutants (e.g. CO, SO2, NO2 and particulates) at a user-specified receptor grid.
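The point-source Gaussian plume form that such line-source models build on can be sketched as follows. This is a minimal illustration, not the paper's software: the linear dispersion coefficients `a` and `b` are illustrative stand-ins for stability-class fits, and all numeric inputs are invented:

```python
import math

def gaussian_plume(q, u, x, y, z, h, a=0.08, b=0.06):
    """Ground-reflected Gaussian plume concentration (g/m^3) at (x, y, z)
    downwind of a point source of strength q (g/s) in wind speed u (m/s),
    released at effective height h (m). The dispersion parameters sigma_y
    and sigma_z grow with downwind distance x; the linear coefficients
    a and b are illustrative stand-ins for atmospheric-stability fits."""
    sigma_y = a * x
    sigma_z = b * x
    lateral = math.exp(-y**2 / (2 * sigma_y**2))
    vertical = (math.exp(-(z - h)**2 / (2 * sigma_z**2))
                + math.exp(-(z + h)**2 / (2 * sigma_z**2)))  # image source: ground reflection
    return q / (2 * math.pi * u * sigma_y * sigma_z) * lateral * vertical

# Concentration 200 m directly downwind, at ground level, of a 10 g/s source
# released at 2 m height in a 3 m/s wind:
print(gaussian_plume(q=10.0, u=3.0, x=200.0, y=0.0, z=0.0, h=2.0))
```

A line source is typically treated by integrating (or summing) such point-source contributions along the roadway.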

  19. Methodologies of Uncertainty Propagation Calculation

    International Nuclear Information System (INIS)

    Chojnacki, Eric

    2002-01-01

After recalling the theoretical principles and the practical difficulties of the methodologies of uncertainty propagation calculation, the author discussed how to propagate input uncertainties. He said there were two kinds of input uncertainty: variability (uncertainty due to heterogeneity) and lack of knowledge (uncertainty due to ignorance). It was therefore necessary to use two different propagation methods. He demonstrated this in a simple example, which he then generalised, treating the variability uncertainty with probability theory and the lack-of-knowledge uncertainty with fuzzy theory. He cautioned, however, against the systematic use of probability theory, which may lead to unjustifiably and illegitimately precise answers. Mr Chojnacki's conclusions were that the importance of distinguishing variability from lack of knowledge increases as the problem becomes more complex in terms of the number of parameters or time steps, and that it is necessary to develop uncertainty propagation methodologies combining probability theory and fuzzy theory.
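The two propagation methods distinguished above can be sketched together in a toy model. This is a hedged illustration: the interval sweep below is a crude one-level stand-in for fuzzy (alpha-cut) propagation, and the model, distributions and interval are invented for the example:

```python
import random
import statistics

def propagate(model, variable_sampler, knowledge_interval, n=10_000, seed=1):
    """Propagate two kinds of input uncertainty through model(v, k):
    - variability v: sampled probabilistically (Monte Carlo),
    - lack of knowledge k: swept over an interval (a crude one-level
      stand-in for the alpha-cuts used in fuzzy-set propagation).
    Returns the mean lower and upper bounds of the output."""
    rng = random.Random(seed)
    k_lo, k_hi = knowledge_interval
    lows, highs = [], []
    for _ in range(n):
        v = variable_sampler(rng)
        outputs = [model(v, k_lo), model(v, k_hi)]
        lows.append(min(outputs))
        highs.append(max(outputs))
    return statistics.mean(lows), statistics.mean(highs)

# Toy model: output = v * k, with v ~ Normal(1.0, 0.1) (variability)
# and k known only to lie in [0.8, 1.2] (lack of knowledge).
lo, hi = propagate(lambda v, k: v * k,
                   lambda rng: rng.gauss(1.0, 0.1),
                   (0.8, 1.2))
print(lo, hi)  # mean output bracketed near [0.8, 1.2]
```

The output is an interval rather than a single number, which reflects the author's caution: forcing the lack-of-knowledge input into a probability distribution would collapse this interval into one unjustifiably precise value.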

  20. Multicriteria methodology for decision aiding

    CERN Document Server

    Roy, Bernard

    1996-01-01

This is the first comprehensive book to present, in English, the multicriteria methodology for decision aiding. In the foreword, the distinctive features and main ideas of the European School of MCDA are outlined. The twelve chapters are essentially expository in nature, but scholarly in treatment. Some questions which are too often neglected in the literature on decision theory, such as how a decision is made, who the actors are, what a decision aiding model is, and how to define the set of alternatives, are discussed. Examples are used throughout the book to illustrate the various concepts. Ways to model the consequences of each alternative and to build criteria taking into account the inevitable imprecisions, uncertainties and indeterminations are described and illustrated. The three classical operational approaches of MCDA (synthesis in one criterion, including MAUT; synthesis by outranking relations; interactive local judgements) are studied. This methodology tries to be a theoretical or intellectual framework dire...

  1. AGR core safety assessment methodologies

    International Nuclear Information System (INIS)

    McLachlan, N.; Reed, J.; Metcalfe, M.P.

    1996-01-01

    To demonstrate the safety of its gas-cooled graphite-moderated AGR reactors, nuclear safety assessments of the cores are based upon a methodology which demonstrates no component failures, geometrical stability of the structure and material properties bounded by a database. All AGRs continue to meet these three criteria. However, predictions of future core behaviour indicate that the safety case methodology will eventually need to be modified to deal with new phenomena. A new approach to the safety assessment of the cores is currently under development, which can take account of these factors while at the same time providing the same level of protection for the cores. This approach will be based on the functionality of the core: unhindered movement of control rods, continued adequate cooling of the fuel and the core, continued ability to charge and discharge fuel. (author). 5 figs

  2. Methodological update in Medicina Intensiva.

    Science.gov (United States)

    García Garmendia, J L

    2018-04-01

Research in the critically ill is complicated by the heterogeneity of patients, the difficulty of achieving representative sample sizes, and the number of variables simultaneously involved. However, the quantity and quality of records are high, as is the relevance of the variables used, such as survival. Methodological tools have evolved, offering new perspectives and analysis models that allow relevant information to be extracted from the data that accompany the critically ill patient. The need for training in methodology and the interpretation of results is an important challenge for intensivists who wish to stay updated on the research developments and clinical advances in Intensive Medicine. Copyright © 2017 Elsevier España, S.L.U. y SEMNIM. All rights reserved.

  3. Mo(ve)ment-methodology

    DEFF Research Database (Denmark)

    Mørck, Line Lerche; Christian Celosse-Andersen, Martin

    2018-01-01

This paper describes the theoretical basis for and development of a moment-movement research methodology, based on an integration of critical psychological practice research and critical ethnographic social practice theory. Central theoretical conceptualizations, such as human agency, life conditions and identity formation, are discussed in relation to criminological theories of gang desistance. The paper illustrates how the mo(ve)ment methodology was applied in a study of comprehensive processes of identity (re)formation and gang exit processes. This study was conducted with Martin, a former... This is a moment which captures Martin’s complex and ambiguous feelings of conflictual concerns, frustration, anger, and a new feeling of insecurity in his masculinity, as well as engagement and a sense of deep meaningfulness as he becomes a more reflective academic. All these conflicting feelings also give

  4. Design methodology of Dutch banknotes

    Science.gov (United States)

    de Heij, Hans A. M.

    2000-04-01

Since the introduction of a design methodology for Dutch banknotes, the quality of Dutch paper currency has improved in more than one way. The methodology in question provides for (i) a design policy, which helps fix clear objectives; (ii) design management, to ensure smooth cooperation between the graphic designer, printer, papermaker and central bank; and (iii) a program of requirements, a banknote development guideline for all parties involved. This systematic approach enables an objective selection of design proposals, including security features. Furthermore, the project manager obtains regular feedback from the public by conducting market surveys. Each new design of a Netherlands Guilder banknote issued by the Nederlandsche Bank over the past 50 years has been an improvement on its predecessor in terms of value recognition, security and durability.

  5. Workshops as a Research Methodology

    DEFF Research Database (Denmark)

    Ørngreen, Rikke; Levinsen, Karin Tweddell

    2017-01-01

This paper contributes to knowledge on workshops as a research methodology, and specifically on how such workshops pertain to e-learning. A literature review illustrated that workshops are discussed according to three different perspectives: workshops as a means, workshops as practice, and workshops as a research methodology. Focusing primarily on the latter, this paper presents five studies on upper secondary and higher education teachers’ professional development and on teaching and learning through video conferencing. Through analysis and discussion of these studies’ findings, we argue that workshops provide a platform that can aid researchers in identifying and exploring relevant factors in a given domain by providing means for understanding complex work and knowledge processes that are supported by technology (for example, e-learning). The approach supports identifying factors

  6. Methodological challenges and lessons learned

    DEFF Research Database (Denmark)

    Nielsen, Poul Erik; Gustafsson, Jessica

    2017-01-01

Taking as point of departure three recently conducted empirical studies, the aim of this article is to theoretically and empirically discuss methodological challenges in studying the interrelations between media and social reality, and to critically reflect on the methodologies used in the studies. By deconstructing the studies, the article draws attention to the fact that different methods are able to grasp different elements of social reality. Moreover, by analysing the power relations at play, the article demonstrates that the interplay between interviewer and interviewee, and how both parties fit into present power structures, greatly influences the narratives that are co-produced during interviews. The article thus concludes that in order to fully understand complex phenomena it is not enough to use a mixture of methods; the makeup of the research team is also imperative, as a diverse team...

  7. Waste Package Component Design Methodology Report

    International Nuclear Information System (INIS)

    D.C. Mecham

    2004-01-01

This Executive Summary provides an overview of the methodology being used by the Yucca Mountain Project (YMP) to design waste packages and ancillary components. This summary information is intended for readers with general interest, but also provides technical readers a general framework surrounding the variety of technical details provided in the main body of the report. The purpose of this report is to document and ensure that appropriate design methods are used in the design of waste packages and ancillary components (the drip shields and emplacement pallets). The methodology includes identification of necessary design inputs, justification of design assumptions, and use of appropriate analysis methods and computational tools. This design work is subject to ''Quality Assurance Requirements and Description''. The document is primarily intended for internal use and technical guidance for a variety of design activities. It is recognized that a wide audience, including project management, the U.S. Department of Energy (DOE), the U.S. Nuclear Regulatory Commission, and others, is interested at various levels of detail in the design methods; the report therefore covers a wide range of topics at varying levels of detail. Due to the preliminary nature of the design, readers can expect to encounter varied levels of detail in the body of the report. It is expected that technical information used as input to design documents will be verified and taken from the latest versions of the reference sources given herein. This revision of the methodology report has evolved with changes in the waste package, drip shield, and emplacement pallet designs over many years and may be further revised as the design is finalized. Different components and analyses are at different stages of development. Some parts of the report are detailed, while other, less detailed parts are likely to undergo further refinement. The design methodology is intended to provide designs that satisfy the safety and operational

  8. Waste Package Component Design Methodology Report

    Energy Technology Data Exchange (ETDEWEB)

    D.C. Mecham

    2004-07-12

This Executive Summary provides an overview of the methodology being used by the Yucca Mountain Project (YMP) to design waste packages and ancillary components. This summary information is intended for readers with general interest, but also provides technical readers a general framework surrounding the variety of technical details provided in the main body of the report. The purpose of this report is to document and ensure that appropriate design methods are used in the design of waste packages and ancillary components (the drip shields and emplacement pallets). The methodology includes identification of necessary design inputs, justification of design assumptions, and use of appropriate analysis methods and computational tools. This design work is subject to ''Quality Assurance Requirements and Description''. The document is primarily intended for internal use and technical guidance for a variety of design activities. It is recognized that a wide audience, including project management, the U.S. Department of Energy (DOE), the U.S. Nuclear Regulatory Commission, and others, is interested at various levels of detail in the design methods; the report therefore covers a wide range of topics at varying levels of detail. Due to the preliminary nature of the design, readers can expect to encounter varied levels of detail in the body of the report. It is expected that technical information used as input to design documents will be verified and taken from the latest versions of the reference sources given herein. This revision of the methodology report has evolved with changes in the waste package, drip shield, and emplacement pallet designs over many years and may be further revised as the design is finalized. Different components and analyses are at different stages of development. Some parts of the report are detailed, while other, less detailed parts are likely to undergo further refinement. The design methodology is intended to provide designs that satisfy the safety

  9. Implementation impacts of PRL methodology

    International Nuclear Information System (INIS)

    Caudill, J.A.; Krupa, J.F.; Meadors, R.E.; Odum, J.V.; Rodrigues, G.C.

    1993-02-01

This report responds to a DOE-SR request to evaluate the impacts from implementation of the proposed Plutonium Recovery Limit (PRL) methodology. The PRL methodology is based on cost minimization for decisions to discard or recover plutonium contained in scrap, residues, and other plutonium bearing materials. Implementation of the PRL methodology may result in decisions to declare as waste certain plutonium bearing materials originally considered to be a recoverable plutonium product. Such decisions may have regulatory impacts, because any material declared to be waste would immediately be subject to provisions of the Resource Conservation and Recovery Act (RCRA). The decision to discard these materials will have impacts on waste storage, treatment, and disposal facilities. Current plans for the de-inventory of plutonium processing facilities have identified certain materials as candidates for discard based upon the economic considerations associated with extending the operating schedules for recovery of the contained plutonium versus potential waste disposal costs. This report evaluates the impacts of discarding those materials as proposed by the F Area De-Inventory Plan and compares the De-Inventory Plan assessments with conclusions from application of the PRL. The impact analysis was performed for those materials proposed as potential candidates for discard by the De-Inventory Plan. The De-Inventory Plan identified 433 items, containing approximately 1% of the current SRS Pu-239 inventory, as not appropriate for recovery as the site moves to complete the mission of F-Canyon and FB-Line. The materials were entered into storage awaiting recovery as product under the Department's previous Economic Discard Limit (EDL) methodology, which valued plutonium at its incremental cost of production in reactors. An application of Departmental PRLs to the subject 433 items revealed that approximately 40% of them would continue to be potentially recoverable as product plutonium
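The cost-minimization rule at the heart of the PRL methodology can be sketched as a simple per-item decision. This is a hypothetical illustration only: the function, item names and cost figures below are assumptions, not the Department's actual PRL formula.

```python
# Illustrative recovery-limit decision: recover an item's plutonium only
# when recovery is no more costly than discarding the item as waste.
# All names and numbers are invented for the sketch.

def decide(pu_grams: float, recovery_cost: float, disposal_cost: float) -> str:
    """Return 'recover' when recovery is the cheaper disposition for the item."""
    if pu_grams <= 0:
        return "discard"  # nothing recoverable
    return "recover" if recovery_cost <= disposal_cost else "discard"

items = [
    {"id": "A-001", "pu_grams": 120.0, "recovery_cost": 9_000.0, "disposal_cost": 15_000.0},
    {"id": "A-002", "pu_grams": 15.0, "recovery_cost": 8_000.0, "disposal_cost": 2_500.0},
]
for item in items:
    print(item["id"], decide(item["pu_grams"], item["recovery_cost"], item["disposal_cost"]))
```

Under such a rule, roughly the situation described above arises: some stored items remain economically recoverable while others become discard candidates.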

  10. ISE System Development Methodology Manual

    Energy Technology Data Exchange (ETDEWEB)

    Hayhoe, G.F.

    1992-02-17

The Information Systems Engineering (ISE) System Development Methodology Manual (SDM) is a framework of life cycle management guidelines that provide ISE personnel with direction, organization, consistency, and improved communication when developing and maintaining systems. These guidelines were designed to allow ISE to build and deliver Total Quality products, and to meet the goals and requirements of the US Department of Energy (DOE), Westinghouse Savannah River Company, and Westinghouse Electric Corporation.

  11. Environmental Testing Methodology in Biometrics

    OpenAIRE

    Fernández Saavedra, Belén; Sánchez Reíllo, Raúl; Alonso Moreno, Raúl; Miguel Hurtado, Óscar

    2010-01-01

8-page document + 5-slide presentation.-- Contributed to: 1st International Biometric Performance Conference (IBPC 2010, NIST, Gaithersburg, MD, US, Mar 1-5, 2010). Recently, biometrics has been used in many security systems, and these systems can be located in different environments. As many experts claim and previous works have demonstrated, environmental conditions influence biometric performance. Nevertheless, there is not yet a specific methodology for testing this influence...

  12. Soft systems methodology: other voices

    OpenAIRE

    Holwell, Sue

    2000-01-01

    This issue of Systemic Practice and Action Research, celebrating the work of Peter Checkland, in the particular nature and development of soft systems methodology (SSM), would not have happened unless the work was seen by others as being important. No significant contribution to thinking happens without a secondary literature developing. Not surprisingly, many commentaries have accompanied the ongoing development of SSM. Some of these are insightful, some full of errors, and some include both...

  13. Systems engineering agile design methodologies

    CERN Document Server

    Crowder, James A

    2013-01-01

This book examines the paradigm of the engineering design process. The authors discuss agile systems and engineering design. The book captures the entire design process (function bases), context, and requirements to effect real reuse. It provides a methodology for an engineering design process foundation for modern and future systems design. This book captures design patterns with context for actual Systems Engineering Design Reuse and contains a new paradigm in Design Knowledge Management.

  14. Methodological remarks on contraction theory

    DEFF Research Database (Denmark)

    Jouffroy, Jerome; Slotine, Jean-Jacques E.

Because contraction analysis stems from a differential and incremental framework, the nature and methodology of contraction-based proofs are significantly different from those of their Lyapunov-based counterparts. This paper specifically studies this issue, and illustrates it by revisiting some classical examples traditionally addressed using Lyapunov theory. Even in these cases, contraction tools can often yield significantly simplified analysis. The examples include adaptive control, robotics, and a proof of convergence of the deterministic Extended Kalman Filter.

  15. Artificial Intelligence Techniques and Methodology

    OpenAIRE

    Carbonell, Jaime G.; Sleeman, Derek

    1982-01-01

    Two closely related aspects of artificial intelligence that have received comparatively little attention in the recent literature are research methodology, and the analysis of computational techniques that span multiple application areas. We believe both issues to be increasingly significant as Artificial Intelligence matures into a science and spins off major application efforts. It is imperative to analyze the repertoire of AI methods with respect to past experience, utility in new domains,...

  16. Developing Foucault's Discourse Analytic Methodology

    Directory of Open Access Journals (Sweden)

    Rainer Diaz-Bone

    2006-01-01

Full Text Available A methodological position for a FOUCAULTian discourse analysis is presented. A sequence of analytical steps is introduced and an illustrating example is offered. It is emphasized that discourse analysis has to discover the system-level of discursive rules and the deeper structure of the discursive formation; otherwise the analysis will be unfinished. Michel FOUCAULT's work is theoretically grounded in French structuralism and (the so-called) post-structuralism. In this paper, post-structuralism is conceived not as a means of overcoming structuralism, but as a way of critically continuing the structural perspective. In this way, discursive structures can be related to discursive practices and the concept of structure can be disclosed (e.g. to inter-discourse or DERRIDA's concept of structurality). In this way, the structural methodology is continued and radicalized, but not given up. In this paper, FOUCAULT's theory is combined with the works of Michel PÊCHEUX and (especially for the sociology of knowledge and the sociology of culture) Pierre BOURDIEU. The practice of discourse analysis is theoretically grounded; it can be conceived as a reflexive coupling of deconstruction and reconstruction in the material to be analyzed. This methodology can therefore be characterized as a reconstructive qualitative methodology. At the end of the article, forms of discourse analysis are criticized that do not intend to recover the system level of discursive rules and do not intend to discover the deeper structure of the discursive formation (i.e. episteme, socio-episteme). These forms are merely commentaries on discourses (not analyses of them); they remain phenomenological and are therefore pre-structuralist. URN: urn:nbn:de:0114-fqs060168

  17. Software engineering methodologies and tools

    Science.gov (United States)

    Wilcox, Lawrence M.

    1993-01-01

Over the years many engineering disciplines have developed, including chemical, electronic, etc. Common to all engineering disciplines is the use of rigor, models, metrics, and predefined methodologies. Recently, a new engineering discipline has appeared on the scene, called software engineering. For over thirty years computer software has been developed, and the track record has not been good. Software development projects often miss schedules, are over budget, do not give the user what is wanted, and produce defects. One estimate is that there are one to three defects per 1000 lines of deployed code. More and more systems are requiring larger and more complex software for support. As this requirement grows, the software development problems grow exponentially. It is believed that software quality can be improved by applying engineering principles. Another compelling reason to bring the engineering disciplines to software development is productivity. It has been estimated that the productivity of producing software has only increased one to two percent a year in the last thirty years. Ironically, the computer and its software have contributed significantly to industry-wide productivity, but computer professionals have done a poor job of using the computer to do their job. Engineering disciplines and methodologies are now emerging, supported by software tools that address the problems of software development. This paper addresses some of the current software engineering methodologies as a backdrop for the general evaluation of computer assisted software engineering (CASE) tools from actual installation of and experimentation with some specific tools.

  18. Methodology for technical risk assessment

    International Nuclear Information System (INIS)

    Waganer, L.M.; Zuckerman, D.S.

    1983-01-01

A methodology has been developed for and applied to the assessment of the technical risks associated with an evolving technology. This methodology, originally developed for fusion by K. W. Billman and F. R. Scott at EPRI, has been applied to assess the technical risk of a fuel system for a fusion reactor. Technical risk is defined as the risk that a particular technology or component which is currently under development will not achieve a set of required technical specifications (i.e. probability of failure). The individual steps in the technical risk assessment are summarized. The first step in this methodology is to clearly and completely quantify the technical requirements for the particular system being examined. The next step is to identify and define subsystems and various options which appear capable of achieving the required technical performance. The subsystem options are then characterized regarding subsystem functions, interface requirements with the subsystems and systems, important components, developmental obstacles and technical limitations. Key technical subsystem performance parameters are identified which directly or indirectly relate to the system technical specifications. Past, existing and future technical performance data from subsystem experts are obtained by using a Bayesian interrogation technique. The input data are solicited in the form of probability functions. Thus the output performance of the system is expressed as probability functions.
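The final step described above, expressing technical risk as the probability that achievable performance falls short of the specification, can be sketched with a small Monte Carlo calculation. The triangular distribution stands in for the expert-elicited probability functions; all numbers and names are illustrative assumptions, not values from the EPRI methodology.

```python
# Monte Carlo sketch of "technical risk": the probability that a
# subsystem's achievable performance falls below its required
# specification, given an expert-elicited performance distribution.
import random

random.seed(0)

def technical_risk(requirement, low, mode, high, trials=100_000):
    """Estimate P(performance < requirement) under a triangular elicitation."""
    shortfalls = sum(
        1 for _ in range(trials)
        if random.triangular(low, high, mode) < requirement
    )
    return shortfalls / trials

# Hypothetical example: required (normalised) throughput is 0.9; experts
# judge the achievable range as 0.7..1.2 with a most likely value of 1.0.
risk = technical_risk(requirement=0.9, low=0.7, mode=1.0, high=1.2)
print(f"estimated technical risk: {risk:.3f}")
```

In the full methodology the subsystem distributions would be propagated to a system-level performance distribution rather than evaluated one at a time.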

  19. Methodology for astronaut reconditioning research.

    Science.gov (United States)

    Beard, David J; Cook, Jonathan A

    2017-01-01

    Space medicine offers some unique challenges, especially in terms of research methodology. A specific challenge for astronaut reconditioning involves identification of what aspects of terrestrial research methodology hold and which require modification. This paper reviews this area and presents appropriate solutions where possible. It is concluded that spaceflight rehabilitation research should remain question/problem driven and is broadly similar to the terrestrial equivalent on small populations, such as rare diseases and various sports. Astronauts and Medical Operations personnel should be involved at all levels to ensure feasibility of research protocols. There is room for creative and hybrid methodology but careful systematic observation is likely to be more achievable and fruitful than complex trial based comparisons. Multi-space agency collaboration will be critical to pool data from small groups of astronauts with the accepted use of standardised outcome measures across all agencies. Systematic reviews will be an essential component. Most limitations relate to the inherent small sample size available for human spaceflight research. Early adoption of a co-operative model for spaceflight rehabilitation research is therefore advised. Copyright © 2016 Elsevier Ltd. All rights reserved.

  20. Methodology for ranking restoration options

    International Nuclear Information System (INIS)

    Hedemann Jensen, Per

    1999-04-01

    The work described in this report has been performed as a part of the RESTRAT Project FI4P-CT95-0021a (PL 950128) co-funded by the Nuclear Fission Safety Programme of the European Commission. The RESTRAT project has the overall objective of developing generic methodologies for ranking restoration techniques as a function of contamination and site characteristics. The project includes analyses of existing remediation methodologies and contaminated sites, and is structured in the following steps: characterisation of relevant contaminated sites; identification and characterisation of relevant restoration techniques; assessment of the radiological impact; development and application of a selection methodology for restoration options; formulation of generic conclusions and development of a manual. The project is intended to apply to situations in which sites with nuclear installations have been contaminated with radioactive materials as a result of the operation of these installations. The areas considered for remedial measures include contaminated land areas, rivers and sediments in rivers, lakes, and sea areas. Five contaminated European sites have been studied. Various remedial measures have been envisaged with respect to the optimisation of the protection of the populations being exposed to the radionuclides at the sites. Cost-benefit analysis and multi-attribute utility analysis have been applied for optimisation. Health, economic and social attributes have been included and weighting factors for the different attributes have been determined by the use of scaling constants. (au)
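The multi-attribute utility step described above can be sketched as a weighted scoring of restoration options. The option names, attribute scores and weights below are illustrative placeholders, not values from the RESTRAT studies; in the project itself the weights were derived from scaling constants.

```python
# Minimal multi-attribute utility sketch for ranking restoration options.
# Scores are on a 0..1 scale (higher is better); weights sum to 1.

options = {
    "topsoil removal": {"health": 0.9, "economic": 0.3, "social": 0.5},
    "deep ploughing":  {"health": 0.6, "economic": 0.7, "social": 0.6},
    "no action":       {"health": 0.2, "economic": 1.0, "social": 0.4},
}
weights = {"health": 0.5, "economic": 0.3, "social": 0.2}

def utility(scores):
    """Additive utility: weighted sum over the attribute scores."""
    return sum(weights[a] * scores[a] for a in weights)

ranking = sorted(options, key=lambda name: utility(options[name]), reverse=True)
for name in ranking:
    print(f"{name}: {utility(options[name]):.2f}")
```

A cost-benefit analysis would instead monetise each attribute; the additive form above is the simplest multi-attribute variant.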

  1. AN INTEGRATED METHODOLOGY FOR CUSTOMER RELATIONSHIP MANAGEMENT CUSTOMIZATION

    Directory of Open Access Journals (Sweden)

    Ricardo Colomo Palacios

    2008-02-01

Full Text Available The importance and presence of technological solutions in organizations supporting CRM have been a vital business fact since the late nineties. Presently, the number of manufacturers in the market has dramatically decreased because of continuous takeovers and mergers, but the market has on the other hand gained momentum because of the sudden appearance of open-source and on-demand solutions. In this scope, a unified methodology centered on CRM solutions is of paramount importance, since methodology has traditionally been linked to either system integration or overall solution design. Based on the two de-facto complementary standards for the implementation and development of Information Systems, namely the ESA and Dyché CRM systems implementation methodologies, in this paper we provide a CRM business solution customization methodology which is independent of the integrator and tool-maker perspective.

  2. A methodology for radiological accidents analysis in industrial gamma radiography

    International Nuclear Information System (INIS)

    Silva, F.C.A. da.

    1990-01-01

A critical review of 34 published severe radiological accidents in industrial gamma radiography, which happened in 15 countries from 1960 to 1988, was performed. The most frequent causes, consequences and dose estimation methods were analysed, aiming to establish better procedures of radiation safety and accident analysis. The objective of this work is to elaborate a methodology for the analysis of radiological accidents in industrial gamma radiography. The suggested methodology will enable professionals to determine the true causes of the event and to estimate the dose with good certainty. The technical analytical tree, recommended by the International Atomic Energy Agency for radiation protection and nuclear safety programs, was adopted in the elaboration of the suggested methodology. The viability of using the Electron Gamma Shower 4 Computer Code System to calculate the absorbed dose in radiological accidents in industrial gamma radiography, mainly in ¹⁹²Ir radioactive source handling situations, was also studied. (author)

  3. Risk assessment methodology for Hanford high-level waste tanks

    International Nuclear Information System (INIS)

    Bott, T.F.; Mac Farlane, D.R.; Stack, D.W.; Kindinger, J.

    1992-01-01

A methodology is presented for applying Probabilistic Safety Assessment techniques to quantification of the health risks posed by the high-level waste (HLW) underground tanks at the Department of Energy's Hanford reservation. This methodology includes hazard screening, development of a list of potential accident initiators, system fault tree development and quantification, definition of source terms for various release categories, and estimation of health consequences from the releases. Both airborne and liquid pathway releases to the environment, arising from aerosol and spill/leak releases from the tanks, are included in the release categories. The proposed methodology is intended to be applied to a representative subset of the total of 177 tanks, thereby providing a baseline risk profile for the HLW tank farm that can be used for setting clean-up/remediation priorities. Some preliminary results are presented for Tank 101-SY
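The fault tree quantification step described above can be sketched with two gate functions, assuming independent basic events. The tree, event names and probabilities below are invented for illustration and are not taken from the Hanford model.

```python
# Minimal fault-tree quantification sketch: an AND gate multiplies
# basic-event probabilities; an OR gate combines them as
# 1 - prod(1 - p), assuming independent events.
from functools import reduce

def and_gate(*ps):
    return reduce(lambda a, b: a * b, ps, 1.0)

def or_gate(*ps):
    return 1.0 - reduce(lambda a, b: a * (1.0 - b), ps, 1.0)

# Hypothetical top event: "aerosol release from tank".
pump_fails = 1e-2
valve_fails = 5e-3
operator_error = 1e-3
mitigation_fails = and_gate(pump_fails, valve_fails)   # two redundant trains
release = or_gate(mitigation_fails, operator_error)
print(f"P(top event) = {release:.2e}")
```

Real PSA codes work from minimal cut sets and handle common-cause dependencies, which the independence assumption here ignores.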

  4. Testing methodology of embedded software in digital plant protection system

    International Nuclear Information System (INIS)

    Seong, Ah Young; Choi, Bong Joo; Lee, Na Young; Hwang, Il Soon

    2001-01-01

It is necessary to assure the reliability of software in order to digitalize the RPS (Reactor Protection System). Since RPS failure can cause fatal damage in accident cases, the system is classified as Safety Class 1E. We therefore propose an effective testing methodology to assure the reliability of the embedded software in the DPPS (Digital Plant Protection System). To test the embedded software in the DPPS effectively, our methodology consists of two steps. The first is a re-engineering step that extracts classes from the structural source program; the second is a level-of-testing step composed of unit testing, integration testing and system testing. At each testing level we test the embedded software with selected test cases after a test item identification step. Using this testing methodology, we can test the embedded software effectively while reducing cost and time
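The unit-testing level described above can be sketched against a hypothetical 2-out-of-3 trip-voting function of the kind found in digital plant protection logic. The function, thresholds and test cases are illustrative assumptions, not taken from an actual DPPS.

```python
# Unit-testing sketch: a hypothetical 2-out-of-3 channel voting function
# is tested in isolation with selected test cases, as in the unit-testing
# step of the methodology.
import unittest

def trip_2oo3(channels, setpoint):
    """Trip when at least 2 of the 3 channels exceed the setpoint."""
    return sum(1 for value in channels if value > setpoint) >= 2

class TripLogicUnitTest(unittest.TestCase):
    def test_no_trip_on_single_channel(self):
        # One failed channel alone must not trip the plant.
        self.assertFalse(trip_2oo3([101.0, 99.0, 98.0], setpoint=100.0))

    def test_trip_on_two_channels(self):
        # Two channels above the setpoint must trip.
        self.assertTrue(trip_2oo3([101.0, 102.0, 98.0], setpoint=100.0))

suite = unittest.defaultTestLoader.loadTestsFromTestCase(TripLogicUnitTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

Integration and system testing would then exercise the voting logic together with channel acquisition and actuation, rather than in isolation.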

  5. The Role of the Sweet Taste Receptor in Enteroendocrine Cells and Pancreatic β-Cells

    Directory of Open Access Journals (Sweden)

    Itaru Kojima

    2011-10-01

    Full Text Available The sweet taste receptor is expressed in taste cells located in taste buds of the tongue. This receptor senses sweet substances in the oral cavity, activates taste cells, and transmits the taste signals to adjacent neurons. The sweet taste receptor is a heterodimer of two G protein-coupled receptors, T1R2 and T1R3. Recent studies have shown that this receptor is also expressed in the extragustatory system, including the gastrointestinal tract, pancreatic β-cells, and glucose-responsive neurons in the brain. In the intestine, the sweet taste receptor regulates secretion of incretin hormones and glucose uptake from the lumen. In β-cells, activation of the sweet taste receptor leads to stimulation of insulin secretion. Collectively, the sweet taste receptor plays an important role in recognition and metabolism of energy sources in the body.

  6. Open source systems security certification

    CERN Document Server

    Damiani, Ernesto; El Ioini, Nabil

    2009-01-01

    Open Source Advances in Computer Applications book series provides timely technological and business information for: Enabling Open Source Systems (OSS) to become an integral part of systems and devices produced by technology companies; Inserting OSS in the critical path of complex network development and embedded products, including methodologies and tools for domain-specific OSS testing (lab code available), plus certification of security, dependability and safety properties for complex systems; Ensuring integrated systems, including OSS, meet performance and security requirements as well as achieving the necessary certifications, according to the overall strategy of OSS usage on the part of the adopter

  7. Orphan sources

    International Nuclear Information System (INIS)

    Pust, R.; Urbancik, L.

    2008-01-01

The presentation describes how the stable detection systems (hereinafter referred to as SDS) have contributed to revealing uncontrolled sources of ionizing radiation on the territory of the State Office for Nuclear Safety (SONS) Brno Regional Centre (RC Brno). It also describes the emergencies which were solved by workers from the Brno Regional Centre or in which they participated. The contribution is divided into the following chapters: A. SDS systems installed on the territory of SONS RC Brno; B. Selected unusual emergencies; C. Comments on individual emergencies; D. Aspects of SDS operation in terms of their users; E. Aspects of SDS operation and related activities in terms of radiation protection; F. Current state of orphan sources. (authors)

  8. Survey of Dynamic PSA Methodologies

    International Nuclear Information System (INIS)

    Lee, Hansul; Kim, Hyeonmin; Heo, Gyunyoung; Kim, Taewan

    2015-01-01

Event Tree (ET)/Fault Tree (FT) analysis is a significant methodology in Probabilistic Safety Assessment (PSA) for Nuclear Power Plants (NPPs). The ET/FT methodology has the advantage that users can easily learn and model with it, and it enables better communication between engineers engaged in the same field. However, conventional methodologies have difficulty coping with dynamic behavior (e.g. operation mode changes or sequence-dependent failure) and with integrated situations of mechanical failure and human error. Meanwhile, new possibilities are opening for improved PSA by virtue of the dramatic development of digital hardware, software, information technology, and data analysis. More specifically, the computing environment has greatly improved compared to the past, so we are able to conduct risk analysis with the large amount of data actually available. One method which can take the technological advantages aforementioned should be dynamic PSA, in which conventional ET/FT can have time- and condition-dependent behaviors in accident scenarios. In this paper, we investigated the various enabling techniques for dynamic PSA. Even though its history and academic achievements are great, it seems less interesting from the industrial and regulatory viewpoint. The authors expect this paper can contribute to a better understanding of dynamic PSA in terms of algorithm, practice, and applicability. In this paper, an overview of dynamic PSA was conducted. Most of the methodologies share similar concepts. Among them, DDET seems to be a backbone for most of the methodologies since it can be applied to large problems. The common characteristics sharing the concept of DDET are as follows: • Both deterministic and stochastic approaches • Improves the identification of PSA success criteria • Helps to limit detrimental effects of sequence binning (normally adopted in PSA) • Helps to avoid defining non-optimal success criteria that may distort the risk • Framework for comprehensively considering

  9. Survey of Dynamic PSA Methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Hansul; Kim, Hyeonmin; Heo, Gyunyoung [Kyung Hee University, Yongin (Korea, Republic of); Kim, Taewan [KEPCO International Nuclear Graduate School, Ulsan (Korea, Republic of)

    2015-05-15

Event Tree (ET)/Fault Tree (FT) analysis is a significant methodology in Probabilistic Safety Assessment (PSA) for Nuclear Power Plants (NPPs). The ET/FT methodology has the advantage that users can easily learn and model with it, and it enables better communication between engineers engaged in the same field. However, conventional methodologies have difficulty coping with dynamic behavior (e.g. operation mode changes or sequence-dependent failure) and with integrated situations of mechanical failure and human error. Meanwhile, new possibilities are opening for improved PSA by virtue of the dramatic development of digital hardware, software, information technology, and data analysis. More specifically, the computing environment has greatly improved compared to the past, so we are able to conduct risk analysis with the large amount of data actually available. One method which can take the technological advantages aforementioned should be dynamic PSA, in which conventional ET/FT can have time- and condition-dependent behaviors in accident scenarios. In this paper, we investigated the various enabling techniques for dynamic PSA. Even though its history and academic achievements are great, it seems less interesting from the industrial and regulatory viewpoint. The authors expect this paper can contribute to a better understanding of dynamic PSA in terms of algorithm, practice, and applicability. In this paper, an overview of dynamic PSA was conducted. Most of the methodologies share similar concepts. Among them, DDET seems to be a backbone for most of the methodologies since it can be applied to large problems. The common characteristics sharing the concept of DDET are as follows: • Both deterministic and stochastic approaches • Improves the identification of PSA success criteria • Helps to limit detrimental effects of sequence binning (normally adopted in PSA) • Helps to avoid defining non-optimal success criteria that may distort the risk • Framework for comprehensively considering
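The discrete dynamic event tree (DDET) idea mentioned above can be sketched as repeated binary branching at scheduled points, with each sequence carrying the product of its branch probabilities. The events and numbers are invented for illustration and are not from any actual plant model.

```python
# Toy discrete dynamic event tree: at each branching point the scenario
# splits on success/failure of a safety function, and each leaf sequence
# accumulates the product of its branch probabilities.

branch_points = [
    ("reactor trip", 0.999),
    ("auxiliary feedwater", 0.95),
    ("operator recovery", 0.8),
]

def expand(sequences, event, p_success):
    """Split every current sequence into a success branch and a failure branch."""
    out = []
    for path, prob in sequences:
        out.append((path + [(event, "ok")], prob * p_success))
        out.append((path + [(event, "fail")], prob * (1 - p_success)))
    return out

sequences = [([], 1.0)]
for event, p in branch_points:
    sequences = expand(sequences, event, p)

# The sequence with every safety function failed is the dominant-damage path.
worst = min(sequences, key=lambda s: sum(outcome == "ok" for _, outcome in s[0]))
print(len(sequences), f"{worst[1]:.2e}")
```

A real DDET tool branches at times driven by a plant simulator and prunes low-probability sequences; the fixed branching schedule here is the simplest stand-in.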

  10. Prioritization methodology for chemical replacement

    Science.gov (United States)

    Cruit, Wendy; Goldberg, Ben; Schutzenhofer, Scott

    1995-01-01

    Since United States federal legislation required ozone-depleting chemicals (Class 1 and 2) to be banned from production, the National Aeronautics and Space Administration (NASA) and industry have been required to find other chemicals and methods to replace these target chemicals. This project was initiated to develop a prioritization methodology suitable for assessing and ranking existing processes for replacement 'urgency.' The methodology was produced in the form of a workbook (NASA Technical Paper 3421). The final workbook contains two tools, one for evaluation and one for prioritization. The two tools are interconnected in that they were developed from one central theme: chemical replacement due to imposed laws and regulations. The workbook provides matrices, detailed explanations of how to use them, and a detailed methodology for prioritizing replacement technology. Its main objective is to provide a guideline to help direct the research for replacement technology. The approach to prioritization called for a system that would produce a numerical rating for the chemicals and processes being assessed. A Quality Function Deployment (QFD) technique was used to determine numerical values corresponding to the concerns raised and their respective importance to the process. The workbook defines the approach and the application of the QFD matrix. This technique (1) provides a standard database for technology that can be easily reviewed, and (2) provides a standard format for information when requesting resources for further research on chemical replacement technology. Originally, this workbook was to be used for Class 1 and Class 2 chemicals, but it was specifically designed to be flexible enough to be used for any chemical used in a process (if the chemical and/or process needs to be replaced). The methodology consists of comparison matrices (and the smaller comparison components) which allow replacement technology
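    The QFD-style numerical rating described above can be sketched as a weighted scoring matrix: each concern gets an importance weight, each chemical/process is scored against every concern, and the weighted sum gives its replacement-urgency rating. The concern names, weights, and scores below are invented for illustration and are not taken from NASA TP 3421:

```python
# Importance weight of each replacement concern (illustrative values).
concerns = {"regulatory deadline": 5, "worker safety": 4,
            "process criticality": 3, "replacement availability": 2}

# Scores on a 1 (low urgency) .. 9 (high urgency) scale per concern.
processes = {
    "CFC-113 precision cleaning": {"regulatory deadline": 9, "worker safety": 3,
                                   "process criticality": 9, "replacement availability": 4},
    "TCA vapor degreasing":       {"regulatory deadline": 9, "worker safety": 6,
                                   "process criticality": 5, "replacement availability": 7},
}

def priority(scores):
    """Weighted sum of concern scores: the QFD-style numerical rating."""
    return sum(concerns[c] * s for c, s in scores.items())

# Rank processes by descending replacement urgency.
ranking = sorted(processes, key=lambda p: priority(processes[p]), reverse=True)
```

The same matrix form extends directly to more concerns or processes; only the two dictionaries change.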

  11. Tritium sources

    International Nuclear Information System (INIS)

    Glodic, S.; Boreli, F.

    1993-01-01

    Tritium is the only radioactive isotope of hydrogen. It directly follows the metabolism of water and can be bound into genetic material, so it is very important to control levels of contamination. In order to define the state of contamination it is necessary to establish the 'zero level', i.e. the actual global inventory. The importance of tritium contamination monitoring increases with the development of fusion power installations. The different sources of tritium are analyzed and summarized in this paper. (author)

  12. Source rock

    Directory of Open Access Journals (Sweden)

    Abubakr F. Makky

    2014-03-01

    Full Text Available West Beni Suef Concession is located at the western part of Beni Suef Basin, a relatively under-explored basin lying about 150 km south of Cairo. The major goal of this study is to evaluate the source rock by using different techniques such as Rock-Eval pyrolysis, vitrinite reflectance (%Ro), and well log data of some Cretaceous sequences including the Abu Roash (E, F and G) members and the Kharita and Betty formations. The BasinMod 1D program is used in this study to construct the burial history and calculate the levels of thermal maturity of the Fayoum-1X well based on calibration of measured %Ro and Tmax against the calculated %Ro model. The Total Organic Carbon (TOC) content calculated from well log data, compared with the TOC measured by Rock-Eval pyrolysis in the Fayoum-1X well, is shown to match for the shale source rock but gives high values for the limestone source rock. For that reason, a new model is derived from well log data to calculate the TOC content accurately for the limestone source rock in the study area. The organic matter in the Abu Roash (F) member is fair to excellent and capable of generating a significant amount of hydrocarbons (oil prone) produced from mixed type I/II kerogen. The generation potential of kerogen in the Abu Roash (E and G) members and the Betty Formation ranges from poor to fair, generating hydrocarbons of oil and gas prone (mixed type II/III) kerogen. Finally, the kerogen (type III) of the Kharita Formation has poor to very good generation potential and mainly produces gas. Thermal maturation from the measured %Ro, the calculated %Ro model, Tmax and the production index (PI) indicates that the Abu Roash (F) member is at the onset of oil generation, whereas the Abu Roash (E and G) members and the Kharita and Betty formations have entered the peak of oil generation.

  13. Radioactive source

    International Nuclear Information System (INIS)

    Drabkina, L.E.; Mazurek, V.; Myascedov, D.N.; Prokhorov, P.; Kachalov, V.A.; Ziv, D.M.

    1976-01-01

    A radioactive layer in a radioactive source is sealed by the application of a sealing layer on the radioactive layer. The sealing layer can consist of a film of oxide of titanium, tin, zirconium, aluminum, or chromium. Preferably, the sealing layer is pure titanium dioxide. The radioactive layer is embedded in a finish enamel which, in turn, is on a priming enamel which surrounds a substrate

  14. Muon sources

    International Nuclear Information System (INIS)

    Parsa, Z.

    2001-01-01

    A full high energy muon collider may take considerable time to realize. However, intermediate steps in its direction are possible and could help facilitate the process. Employing an intense muon source to carry out forefront low energy research, such as the search for muon-number non-conservation, represents one interesting possibility. For example, the MECO proposal at BNL aims for 2 x 10^-17 sensitivity in its search for coherent muon-electron conversion in the field of a nucleus. To reach that goal requires the production, capture and stopping of muons at an unprecedented 10^11 μ/sec. If successful, such an effort would significantly advance the state of muon technology. More ambitious ideas for utilizing high intensity muon sources are also being explored. Building a muon storage ring for the purpose of providing intense high energy neutrino beams is particularly exciting. We present an overview of muon sources and an example of a muon storage ring based Neutrino Factory at BNL with various detector location possibilities

  15. The fractional scaling methodology (FSM) Part 1. methodology development

    International Nuclear Information System (INIS)

    Novak Zuber; Ivan Catton; Upendra S Rohatgi; Wolfgang Wulff

    2005-01-01

    Full text of publication follows: A quantitative methodology is developed, based on the concepts of hierarchy and synthesis, to integrate and organize information and data. The methodology uses scaling to synthesize experimental data and analytical results, and to provide quantitative criteria for evaluating the effects of the various design and operating parameters that influence processes in a complex system such as a nuclear power plant or a related test facility. Synthesis and scaling are performed on three hierarchical levels: the process, component and system levels. Scaling on the process level determines the effect of a selected process on a particular state variable during a selected scenario. At the component level this scaling determines the effects various processes have on a state variable, and it ranks the processes according to their importance by the magnitude of the fractional change they cause in that state variable. At the system level the scaling determines the governing processes and corresponding components, ranking these in order of importance according to their effect on the fractional change of system-wide state variables. Because the scaling reveals on all levels the fractional change of state variables, it is called the Fractional Scaling Methodology (FSM). FSM synthesizes process parameters and assigns to each thermohydraulic process a dimensionless effect metric Ω = ωt, the product of the specific rate of fractional change ω and the characteristic time t. The rate of fractional change ω is the ratio of the process transport rate to the content of a preserved quantity in a component. The effect metric Ω quantifies the contribution of the process to the fractional change of a state variable in a given component. Ordering of component effect metrics provides the hierarchy of processes in a component, then in all components and the system. FSM separates quantitatively dominant from minor processes and components and
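    The effect metric defined above, Ω = ωt with ω the transport rate divided by the conserved content of the component, can be sketched numerically. The process names, transport rates, contents, and characteristic time below are invented for illustration; only the formula follows the abstract:

```python
def effect_metric(transport_rate, content, t_char):
    """FSM effect metric: Ω = ω·t, with ω = transport rate / conserved content."""
    omega = transport_rate / content  # specific rate of fractional change [1/s]
    return omega * t_char             # dimensionless effect metric Ω

# (transport rate [kg/s], component mass content [kg]) -- illustrative only
processes = {
    "break flow":        (50.0, 2.0e4),
    "ECC injection":     (30.0, 2.0e4),
    "wall condensation": (5.0,  2.0e4),
}
t_char = 100.0  # characteristic time of the scenario phase [s]

# Rank processes by |Ω|: dominant processes first, as FSM prescribes.
ranked = sorted(((name, effect_metric(q, m, t_char))
                 for name, (q, m) in processes.items()),
                key=lambda kv: kv[1], reverse=True)
```

The ordering of Ω values reproduces the hierarchy step of the methodology: processes whose Ω is small relative to the largest one can be set aside as minor.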

  16. Basic and modern concepts on cholinergic receptor: A review

    Directory of Open Access Journals (Sweden)

    Prashant Tiwari

    2013-10-01

    Full Text Available The cholinergic system is an important branch of the autonomic nervous system which plays a key role in memory, digestion, control of heartbeat, blood pressure, movement and many other functions. This article serves as both a structural and functional source of information regarding cholinergic receptors and provides researchers a detailed understanding of the determinants governing the specificity of muscarinic and nicotinic receptors. The study helps to give overall information about the fundamentals of the cholinergic system, its receptors and ongoing research in this field.

  17. Autoradiographic localization of drug and neurotransmitter receptors in the brain

    International Nuclear Information System (INIS)

    Kuhar, M.J.

    1981-01-01

    By combining and adapting various methodologies, it is possible to develop radiohistochemical methods for the light microscopic localization of drug and neurotransmitter receptors in the brain. These methods are valuable complements to other histochemical methods for mapping neurotransmitters; they provide a unique view of neuroanatomy and they can be used to provide valuable new hypotheses about how drugs produce various effects. Interesting 'hot spots' of receptor localizations have been observed in some sensory and limbic areas of the brain. Because most available methods are light microscopic, the development of ultrastructural methods will be a necessary and important extension of this field. (Auth.)

  18. Imaging GABAc Receptors with Ligand-Conjugated Quantum Dots

    Directory of Open Access Journals (Sweden)

    Ian D. Tomlinson

    2007-01-01

    Full Text Available We report a methodology for labeling the GABAc receptor on the surface membrane of intact cells. This work builds upon our earlier work with serotonin-conjugated quantum dots and our studies with PEGylated quantum dots to reduce nonspecific binding. In the current approach, a PEGylated derivative of muscimol was synthesized and attached via an amide linkage to quantum dots coated in an amphiphilic polymer derivative of a modified polyacrylamide. These conjugates were used to image GABAc receptors heterologously expressed in Xenopus laevis oocytes.

  19. A methodology to generate statistically dependent wind speed scenarios

    Energy Technology Data Exchange (ETDEWEB)

    Morales, J.M.; Conejo, A.J. [Department of Electrical Engineering, Univ. Castilla - La Mancha, Campus Universitario s/n, 13071 Ciudad Real (Spain)]; Minguez, R. [Environmental Hydraulics Institute ''IH Cantabria'', Univ. Cantabria, Avenida de los Castros s/n, 39005 Santander (Spain)]

    2010-03-15

    Wind power - a renewable energy source increasingly attractive from an economic viewpoint - constitutes an electricity production alternative of growing relevance in current electric energy systems. However, wind power is an intermittent source that cannot be dispatched at the will of the producer. Modeling wind power production requires characterizing wind speed at the sites where the wind farms are located. The wind speed at a particular location can be described through a stochastic process that is spatially correlated with the stochastic processes describing wind speeds at other locations. This paper provides a methodology to characterize the stochastic processes pertaining to wind speed at different geographical locations via scenarios. Each of these scenarios embodies time dependencies and is spatially dependent on the scenarios describing other wind stochastic processes. The scenarios generated by the proposed methodology are intended to be used within stochastic programming decision models to make informed decisions pertaining to wind power production. The proposed methodology is accurate in reproducing historical wind speed series as well as computationally efficient. A comprehensive case study is used to illustrate the capabilities of the proposed methodology. Appropriate conclusions are finally drawn. (author)
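    A minimal sketch of one way to generate such statistically dependent scenarios (not the authors' actual algorithm): temporal dependence comes from an AR(1) process per site, spatial dependence from correlated Gaussian innovations shared across two sites. All parameter values are illustrative:

```python
import math
import random

def scenarios(n_scen=3, n_steps=24, phi=0.8, rho=0.6, mean=8.0, sd=2.5, seed=1):
    """Generate wind speed scenarios for two sites with temporal (AR(1),
    coefficient phi) and spatial (innovation correlation rho) dependence."""
    rng = random.Random(seed)
    out = []
    for _ in range(n_scen):
        x = y = 0.0  # standardized AR(1) states for site A and site B
        series_a, series_b = [], []
        for _ in range(n_steps):
            # Correlated standard-normal innovations (2x2 Cholesky by hand).
            za = rng.gauss(0.0, 1.0)
            zb = rho * za + math.sqrt(1.0 - rho * rho) * rng.gauss(0.0, 1.0)
            s = math.sqrt(1.0 - phi * phi)  # keeps the marginal variance at 1
            x = phi * x + s * za
            y = phi * y + s * zb
            # Map standardized values to wind speeds, truncated at zero.
            series_a.append(max(0.0, mean + sd * x))
            series_b.append(max(0.0, mean + sd * y))
        out.append((series_a, series_b))
    return out
```

Each scenario is then a pair of hourly series whose sample paths co-move across sites, which is the property stochastic programming models need from the scenario set.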

  20. A methodology to generate statistically dependent wind speed scenarios

    International Nuclear Information System (INIS)

    Morales, J.M.; Minguez, R.; Conejo, A.J.

    2010-01-01

    Wind power - a renewable energy source increasingly attractive from an economic viewpoint - constitutes an electricity production alternative of growing relevance in current electric energy systems. However, wind power is an intermittent source that cannot be dispatched at the will of the producer. Modeling wind power production requires characterizing wind speed at the sites where the wind farms are located. The wind speed at a particular location can be described through a stochastic process that is spatially correlated with the stochastic processes describing wind speeds at other locations. This paper provides a methodology to characterize the stochastic processes pertaining to wind speed at different geographical locations via scenarios. Each of these scenarios embodies time dependencies and is spatially dependent on the scenarios describing other wind stochastic processes. The scenarios generated by the proposed methodology are intended to be used within stochastic programming decision models to make informed decisions pertaining to wind power production. The proposed methodology is accurate in reproducing historical wind speed series as well as computationally efficient. A comprehensive case study is used to illustrate the capabilities of the proposed methodology. Appropriate conclusions are finally drawn.

  1. A receptor model for urban aerosols based on oblique factor analysis

    DEFF Research Database (Denmark)

    Keiding, Kristian; Sørensen, Morten S.; Pind, Niels

    1987-01-01

    A procedure is outlined for the construction of receptor models of urban aerosols, based on factor analysis. The advantage of the procedure is that the covariation of source impacts is included in the construction of the models. The results are compared with results obtained by other receptor-modelling procedures. It was found that procedures based on correlating sources were physically sound as well as in mutual agreement. Procedures based on non-correlating sources were found to generate physically obscure models.

  2. Source Apportionment and Risk Assessment of Emerging Contaminants: An Approach of Pharmaco-Signature in Water Systems

    Science.gov (United States)

    Jiang, Jheng Jie; Lee, Chon Lin; Fang, Meng Der; Boyd, Kenneth G.; Gibb, Stuart W.

    2015-01-01

    This paper presents a methodology based on multivariate data analysis for characterizing potential source contributions of emerging contaminants (ECs) detected in 26 river water samples across multi-scape regions during dry and wet seasons. Based on this methodology, we unveil an approach toward potential source contributions of ECs, a concept we refer to as the “Pharmaco-signature.” Exploratory analysis of data points has been carried out by unsupervised pattern recognition (hierarchical cluster analysis, HCA) and a receptor model (principal component analysis-multiple linear regression, PCA-MLR) in an attempt to demonstrate significant source contributions of ECs in different land-use zones. Robust cluster solutions grouped the database according to different EC profiles. PCA-MLR identified that 58.9% of the mean summed ECs were contributed by domestic impact, 9.7% by antibiotics application, and 31.4% by drug abuse. Diclofenac, ibuprofen, codeine, ampicillin, tetracycline, and erythromycin-H2O have significant pollution risk quotients (RQ>1), indicating potentially high risk to aquatic organisms in Taiwan. PMID:25874375
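    The PCA-MLR apportionment step used above can be sketched in miniature: regress the summed EC concentration on factor scores (here pretended to come from a prior PCA) and convert the fitted terms into mean percent source contributions. The factor names and all numbers below are invented for illustration, not the paper's data:

```python
def mlr2(f1, f2, y):
    """Two-regressor least squares with intercept, via the 2x2 normal
    equations on centered variables. Returns (intercept, b1, b2)."""
    n = len(y)
    m1, m2, my = sum(f1) / n, sum(f2) / n, sum(y) / n
    s11 = sum((a - m1) ** 2 for a in f1)
    s22 = sum((b - m2) ** 2 for b in f2)
    s12 = sum((a - m1) * (b - m2) for a, b in zip(f1, f2))
    s1y = sum((a - m1) * (c - my) for a, c in zip(f1, y))
    s2y = sum((b - m2) * (c - my) for b, c in zip(f2, y))
    det = s11 * s22 - s12 * s12
    b1 = (s22 * s1y - s12 * s2y) / det
    b2 = (s11 * s2y - s12 * s1y) / det
    return my - b1 * m1 - b2 * m2, b1, b2

# Invented per-sample factor scores for two hypothetical sources.
f_dom = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]   # "domestic" factor
f_ab  = [2.0, 1.0, 2.0, 1.0, 2.0, 1.0]   # "antibiotics" factor
y     = [2 * a + 3 * b for a, b in zip(f_dom, f_ab)]  # summed EC concentration

b0, b1, b2 = mlr2(f_dom, f_ab, y)
mean_y = sum(y) / len(y)
# Mean percent contribution of each source to the summed ECs.
contrib = {"domestic":    100 * b1 * (sum(f_dom) / len(f_dom)) / mean_y,
           "antibiotics": 100 * b2 * (sum(f_ab) / len(f_ab)) / mean_y}
```

Because the synthetic y is exactly linear in the scores, the regression recovers the generating coefficients and the percent contributions sum to 100, the same bookkeeping used to report figures like "58.9% domestic".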

  3. Personal receptor repertoires: olfaction as a model

    Directory of Open Access Journals (Sweden)

    Olender Tsviya

    2012-08-01

    Full Text Available Abstract
    Background: Information on nucleotide diversity along completely sequenced human genomes has increased tremendously over the last few years. This makes it possible to reassess the diversity status of distinct receptor proteins in different human individuals. To this end, we focused on the complete inventory of human olfactory receptor coding regions as a model for personal receptor repertoires.
    Results: By performing data-mining from public and private sources we scored genetic variations in 413 intact OR loci, for which one or more individuals had an intact open reading frame. Using 1000 Genomes Project haplotypes, we identified a total of 4069 full-length polypeptide variants encoded by these OR loci, an average of ~10 per locus, constituting a lower limit for the effective human OR repertoire. Each individual is found to harbor as many as 600 OR allelic variants, ~50% higher than the locus count. Because OR neuronal expression is allelically excluded, this has a direct effect on the smell perception diversity of the species. We further identified 244 OR segregating pseudogenes (SPGs), loci showing both intact and pseudogene forms in the population, twenty-six of which are annotatively “resurrected” from a pseudogene status in the reference genome. Using a custom SNP microarray we validated 150 SPGs in a cohort of 468 individuals, with every individual genome averaging 36 disrupted sequence variations, 15 in homozygote form. Finally, we generated a multi-source compendium of 63 OR loci harboring deletion Copy Number Variations (CNVs). Our combined data suggest that 271 of the 413 intact OR loci (66%) are affected by nonfunctional SNPs/indels and/or CNVs.
    Conclusions: These results portray a case of unusually high genetic diversity, and suggest that individual humans have a highly personalized inventory of functional olfactory receptors, a conclusion that might apply to other receptor multigene families.

  4. Development of seismic risk analysis methodologies at JAERI

    International Nuclear Information System (INIS)

    Tanaka, T.; Abe, K.; Ebisawa, K.; Oikawa, T.

    1988-01-01

    The usefulness of probabilistic safety assessment (PSA) is recognized worldwide for the balanced design and regulation of nuclear power plants. In Japan, the Japan Atomic Energy Research Institute (JAERI) has been engaged in developing the methodologies necessary for carrying out PSA. The research and development program was started in 1980; at that time the effort covered only internal-initiator PSA. In 1985 the program was expanded to include external event analysis. Although this expanded program is to cover various external initiators, the current effort is dedicated to seismic risk analysis. There are three levels of seismic PSA, as for internal-initiator PSA: Level 1: evaluation of core damage frequency; Level 2: evaluation of radioactive release frequency and source terms; and Level 3: evaluation of environmental consequences. In JAERI's program, only the methodologies for Level 1 seismic PSA are under development. The methodology development for seismic risk analysis is divided into two phases. The Phase I study is to establish a whole set of simple methodologies based on currently available data. In Phase II, a sensitivity study will be carried out to identify the parameters whose uncertainty may result in large uncertainty in seismic risk, and for such parameters the methodology will be upgraded. The Phase I study has now almost been completed. In this report, outlines of the study and some of its outcomes are described.

  5. [Methodologic inconsistency in anamnesis education at medical schools].

    Science.gov (United States)

    Zago, M A

    1989-01-01

    Some relevant points of the process of obtaining the medical anamnesis and physical examination, and of formulating diagnostic hypotheses, are analyzed. The main methodological features include: preponderance of qualitative data, absence of preselected hypotheses, direct involvement of the observer (physician) with the data source (patient), and selection of hypotheses and changes in the patient during the process. Thus, diagnostic investigation does not follow the paradigm of the quantitative scientific method, rooted in logical positivism, which dominates medical research and education.

  6. Methodological Aspects of Depreciation as an Economic Category

    OpenAIRE

    Sigidov, Yuriy I.; Rybyantseva, Maria S.; Adamenko, Alexandr A.; Yarushkina, Elena A.

    2016-01-01

    Depreciation is a complex economic category whose essence is manifested in its duality: it is both a cost element and a source of reproduction of fixed assets and intangible assets. Depreciation is linked to both the asset and liability sides of the balance sheet; it touches on aspects such as cost formation, taxation issues, and the reproductive process. That is why a methodological study of the essence of depreciation, with the allocation of its classification bases, principles and functions, seems u...

  7. Methodology of mycobacteria tuberculosis bacteria detection by Raman spectroscopy

    Science.gov (United States)

    Zyubin, A.; Lavrova, A.; Manicheva, O.; Dogonadze, M.; Tsibulnikova, A.; Samusev, I.

    2018-01-01

    We have developed a methodology for the study of deactivated strains of Mycobacterium tuberculosis. Strains of the Beijing genotype obtained from the pulmonary secretions of a patient (an XDR strain) and a reference strain (H37Rv) were investigated by Raman spectrometry with a He-Ne (632.8 nm) laser excitation source. As a result of the research, the optimal experimental parameters have been obtained for acquiring spectra of mycolic acids, which are part of the cell wall of mycobacteria.

  8. Recent Methods for Measuring Dopamine D3 receptor Occupancy In Vivo: Importance for Drug Development

    Directory of Open Access Journals (Sweden)

    Bernard eLe Foll

    2014-07-01

    Full Text Available There is considerable interest in developing highly selective dopamine D3 receptor ligands for a variety of mental health disorders. Dopamine D3 receptors have been implicated in Parkinson's disease, schizophrenia, anxiety, depression, and substance use disorders. The most concrete evidence suggests a role for the D3 receptor in drug-seeking behaviors. D3 receptors belong to the same D2-like receptor family as D2 receptors, and traditionally the functional roles of these two receptors have been difficult to differentiate. Over the past 10-15 years a number of compounds selective for D3 over D2 receptors have been developed. However, translating these findings into clinical research has been difficult, as many of these compounds cannot be used in humans. Therefore, the functional data involving the D3 receptor in drug addiction mostly come from preclinical studies. Recently, with the advent of [11C]-(+)-PHNO, it has become possible to image D3 receptors in the human brain with increased selectivity and sensitivity. This is a significant innovation over traditional methods such as [11C]-raclopride, which cannot differentiate between D2 and D3 receptors. The use of [11C]-(+)-PHNO will allow for further delineation of the role of D3 receptors. Here, we review recent evidence that the role of the D3 receptor is functionally important and distinct from that of the D2 receptor. We then introduce the utility of analyzing [11C]-(+)-PHNO binding by region of interest. This novel methodology can be used in preclinical and clinical approaches for the measurement of occupancy of both D3 and D2 receptors. Evidence that [11C]-(+)-PHNO can provide insights into the function of D3 receptors in addiction is also presented.
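    Occupancy measurements of the kind discussed above conventionally reduce to a textbook formula (this is the standard PET computation, not code from the paper): occupancy is the fractional reduction in non-displaceable binding potential (BP_ND) between baseline and post-drug scans.

```python
def occupancy(bp_baseline, bp_drug):
    """Fractional receptor occupancy from baseline and post-drug BP_ND:
    occupancy = (BP_ND_baseline - BP_ND_drug) / BP_ND_baseline."""
    return (bp_baseline - bp_drug) / bp_baseline

# e.g. a region with baseline BP_ND of 2.0 that falls to 0.5 after dosing
# corresponds to occupancy(2.0, 0.5) == 0.75, i.e. 75% occupancy.
```

Applied region by region with a D3-preferring tracer, this is what makes the region-of-interest analysis mentioned above able to report separate D3 and D2 occupancies.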

  9. GABA, its receptors, and GABAergic inhibition in mouse taste buds.

    Science.gov (United States)

    Dvoryanchikov, Gennady; Huang, Yijen A; Barro-Soria, Rene; Chaudhari, Nirupa; Roper, Stephen D

    2011-04-13

    Taste buds consist of at least three principal cell types that have different functions in processing gustatory signals: glial-like (type I) cells, receptor (type II) cells, and presynaptic (type III) cells. Using a combination of Ca2+ imaging, single-cell reverse transcriptase-PCR and immunostaining, we show that GABA is an inhibitory transmitter in mouse taste buds, acting on GABA(A) and GABA(B) receptors to suppress transmitter (ATP) secretion from receptor cells during taste stimulation. Specifically, receptor cells express GABA(A) receptor subunits β2, δ, and π, as well as GABA(B) receptors. In contrast, presynaptic cells express the GABA(A) β3 subunit and only occasionally GABA(B) receptors. In keeping with the distinct expression pattern of GABA receptors in presynaptic cells, we detected no GABAergic suppression of transmitter release from presynaptic cells. We suggest that GABA may serve function(s) in taste buds in addition to synaptic inhibition. Finally, we also defined the source of GABA in taste buds: GABA is synthesized by GAD65 in type I taste cells as well as by GAD67 in presynaptic (type III) taste cells and is stored in both of those cell types. We conclude that GABA is an inhibitory transmitter released during taste stimulation and possibly also during growth and differentiation of taste buds.

  10. A protein interaction atlas for the nuclear receptors: properties and quality of a hub-based dimerisation network

    Directory of Open Access Journals (Sweden)

    De Graaf David

    2007-07-01

    Full Text Available Abstract
    Background: The nuclear receptors are a large family of eukaryotic transcription factors that constitute major pharmacological targets. They exert their combinatorial control through homotypic heterodimerisation. Elucidation of this dimerisation network is vital in order to understand the complex dynamics and potential cross-talk involved.
    Results: Phylogeny, protein-protein interactions, protein-DNA interactions and gene expression data have been integrated to provide a comprehensive and up-to-date description of the topology and properties of the nuclear receptor interaction network in humans. We discriminate between DNA-binding and non-DNA-binding dimers, and provide a comprehensive interaction map that identifies potential cross-talk between the various pathways of nuclear receptors.
    Conclusion: We infer that the topology of this network is hub-based, and much more connected than previously thought. The hub-based topology of the network and the wide tissue expression pattern of NRs create a highly competitive environment for the common heterodimerising partners. Furthermore, a significant number of negative feedback loops is present, with the hub protein SHP [NR0B2] playing a major role. We also compare the evolution, topology and properties of the nuclear receptor network with the hub-based dimerisation network of the bHLH transcription factors in order to identify both unique themes and ubiquitous properties in gene regulation. In terms of methodology, we conclude that such a comprehensive picture can only be assembled by semi-automated text-mining, manual curation and integration of data from various sources.
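    The hub analysis described above can be illustrated with a toy version: represent the dimerisation network as an undirected edge list and rank receptors by degree. Only the hub roles of RXR and SHP are motivated by the text; the specific edge list below is invented for illustration, not the paper's curated network:

```python
from collections import defaultdict

# Toy dimerisation pairs (illustrative subset, not the curated dataset).
pairs = [("RXR", "PPARG"), ("RXR", "RARA"), ("RXR", "VDR"), ("RXR", "THRB"),
         ("SHP", "RXR"), ("SHP", "RARA"), ("ESR1", "ESR2")]

# Count the degree of each receptor in the undirected network.
degree = defaultdict(int)
for a, b in pairs:
    degree[a] += 1
    degree[b] += 1

# Receptors sorted by descending degree; the top entries are the hubs.
hubs = sorted(degree, key=degree.get, reverse=True)
```

Even this toy example shows the characteristic hub-based shape: one partner (RXR) accumulates most edges while the remaining receptors connect through it.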

  11. Methodologies for tracking learning paths

    DEFF Research Database (Denmark)

    Frølunde, Lisbeth; Gilje, Øystein; Lindstrand, Fredrik

    2009-01-01

    filmmakers: what furthers their interest and/or hinders it, and what learning patterns emerge. The aim of this article is to present and discuss issues regarding the methodology and methods of the study, such as developing a relationship with interviewees when conducting interviews online (using MSN). We suggest two considerations about using online interviews: how the interviewees value the given subject of conversation and their familiarity with being online. The benefit of online communication with the young filmmakers is its ease, because it is both practical and appropriates a meeting...

  12. Continuous culture apparatus and methodology

    International Nuclear Information System (INIS)

    Conway, H.L.

    1975-01-01

    At present, we are investigating the sorption of potentially toxic trace elements by phytoplankton under controlled laboratory conditions. Continuous culture techniques were used to study the mechanism of the sorption of the trace elements by unialgal diatom populations and the factors influencing this sorption. Continuous culture methodology has been used extensively to study bacterial kinetics. It is an excellent technique for obtaining a known physiological state of phytoplankton populations. An automated method for the synthesis of continuous culture medium for use in these experiments is described

  13. Renewable Energy Assessment Methodology for Japanese OCONUS Army Installations

    Energy Technology Data Exchange (ETDEWEB)

    Solana, Amy E.; Horner, Jacob A.; Russo, Bryan J.; Gorrissen, Willy J.; Kora, Angela R.; Weimar, Mark R.; Hand, James R.; Orrell, Alice C.; Williamson, Jennifer L.

    2010-08-30

    Since 2005, Pacific Northwest National Laboratory (PNNL) has been asked by Installation Management Command (IMCOM) to conduct strategic assessments at selected US Army installations of the potential use of renewable energy resources, including solar, wind, geothermal, biomass, waste, and ground source heat pumps (GSHPs). IMCOM has the same economic, security, and legal drivers to develop alternative, renewable energy resources overseas as it has for installations located in the US. The approach for continental US (CONUS) studies has been to use known, US-based renewable resource characterizations and information sources coupled with local, site-specific sources and interviews. However, the extent to which this sort of data might be available for outside the continental US (OCONUS) sites was unknown. An assessment at Camp Zama, Japan was completed as a trial to test the applicability of the CONUS methodology at OCONUS installations. It was found that, with some help from Camp Zama personnel in translating and locating a few Japanese sources, there was relatively little difficulty in finding sources that should provide a solid basis for conducting an assessment of comparable depth to those conducted for US installations. Project implementation will likely be more of a challenge, but the feasibility analysis will be able to use the same basic steps, with some adjusted inputs, as PNNL’s established renewable resource assessment methodology.

  14. ENVIRONMENTAL ASSESSMENT METHODOLOGY FOR THE NUCLEAR FUEL CYCLE

    Energy Technology Data Exchange (ETDEWEB)

    Brenchley, D. L.; Soldat, J. K.; McNeese, J. A.; Watson, E. C.

    1977-07-01

    This report describes the methodology for determining where environmental control technology is required for the nuclear fuel cycle. The methodology addresses routine emissions of chemical and radioactive effluents, and applies to mining, milling, conversion, enrichment, fuel fabrication, reactors (LWR and BWR) and fuel reprocessing. Chemical and radioactive effluents are evaluated independently. Radioactive effluents are evaluated on the basis of maximum exposed individual dose and population dose calculations for a 1-year emission period and a 50-year commitment. Sources of radionuclides for each facility are then listed according to their relative contribution to the total calculated dose. Effluent, ambient and toxicology standards are used to evaluate the effect of chemical effluents. First, each chemical and source configuration is determined. Sources are tagged if they exceed existing standards. The combined effect of all chemicals is assessed for each facility. If the additive effects are unacceptable, then additional control technology is recommended. Finally, sources and their chemicals at each facility are ranked according to their relative contribution to the ambient pollution level. This ranking identifies those sources most in need of environmental control.

  15. Ionotropic crustacean olfactory receptors.

    Directory of Open Access Journals (Sweden)

    Elizabeth A Corey

    Full Text Available The nature of the olfactory receptor in crustaceans, a major group of arthropods, has remained elusive. We report that spiny lobsters, Panulirus argus, express ionotropic receptors (IRs), the insect chemosensory variants of ionotropic glutamate receptors. Unlike insect IRs, which are expressed in a specific subset of olfactory cells, two lobster IR subunits are expressed in most, if not all, lobster olfactory receptor neurons (ORNs), as confirmed by antibody labeling and in situ hybridization. Ligand-specific ORN responses visualized by calcium imaging are consistent with a restricted expression pattern found for other potential subunits, suggesting that cell-specific expression of uncommon IR subunits determines the ligand sensitivity of individual cells. IRs are the only type of olfactory receptor that we have detected in spiny lobster olfactory tissue, suggesting that they likely mediate olfactory signaling. Given long-standing evidence for G protein-mediated signaling in activation of lobster ORNs, this finding raises the interesting possibility that IRs act in concert with second messenger-mediated signaling.

  16. Methodological Issues and Practices in Qualitative Research.

    Science.gov (United States)

    Bradley, Jana

    1993-01-01

    Discusses methodological issues concerning qualitative research and describes research practices that qualitative researchers use to address these methodological issues. Topics discussed include the researcher as interpreter, the emergent nature of qualitative research, understanding the experience of others, trustworthiness in qualitative…

  17. Audit Methodology for IT Governance

    Directory of Open Access Journals (Sweden)

    Mirela GHEORGHE

    2010-01-01

    Full Text Available The continuous development of new IT technologies has been followed by their rapid integration at the organization level. The management of organizations faces a new challenge: structural redefinition of the IT component in order to create added value and to minimize IT risks through efficient management of all IT resources of the organization. These changes have had a great impact on the governance of the IT component. The paper proposes an audit methodology for IT Governance at the organization level. From this point of view, the audit strategy developed is a risk-based strategy, enabling the IT auditor to assess from the best angle the efficiency and effectiveness of the IT Governance structure. The evaluation of the risks associated with IT Governance is a key process in planning the audit mission, as it allows the identification of the segments with increased risks. With no ambition for completeness, the proposed methodology provides the auditor with a useful tool for accomplishing his mission.

  18. Methodology of Credit Analysis Development

    Directory of Open Access Journals (Sweden)

    Slađana Neogradi

    2017-12-01

    Full Text Available The subject of the research presented in this paper is the definition of a methodology for the development of credit analysis in companies and its application in lending operations in the Republic of Serbia. With the developing credit market, there is a growing need for a well-developed risk and loss prevention system. The introduction presents the bank's analysis of the loan applicant, carried out in order to minimize and manage credit risk. In examining the subject matter, the processing of the credit application is described, as is the procedure for analyzing financial statements in order to gain insight into the borrower's creditworthiness. In the second part of the paper, the theoretical and methodological framework is presented as applied in a concrete company. In the third part, models are presented which banks should use to protect against exposure to risks, i.e. whose goal is to reduce losses on loan operations in our country, as well as to adjust to market conditions in an optimal way.

  19. Safeguarding the fuel cycle: Methodologies

    International Nuclear Information System (INIS)

    Gruemm, H.

    1984-01-01

    The effectiveness of IAEA safeguards is characterized by the extent to which they achieve their basic purpose - credible verification that no nuclear material is diverted from peaceful uses. This effectiveness depends inter alia, but significantly, on manpower in terms of the number and qualifications of inspectors. Staff increases will be required to improve effectiveness further, if this is requested by Member States, as well as to take into account new facilities expected to come under safeguards in the future. However, they are difficult to achieve due to financial constraints set by the IAEA budget. As a consequence, much has been done and is being undertaken to improve utilization of available manpower, including standardization of inspection procedures; improvement of management practices and training; rationalization of planning, reporting, and evaluation of inspection activities; and development of new equipment. This article focuses on certain aspects of the verification methodology presently used and asks: are any modifications of this methodology conceivable that would lead to economies of manpower without loss of effectiveness? It has been stated in this context that present safeguards approaches are "facility-oriented" and that the adoption of a "fuel cycle-oriented approach" might bring about the desired savings. Many studies have been devoted to this very interesting suggestion. Up to this moment, no definite answer is available and further studies will be necessary to come to a conclusion. In what follows, the essentials of the problem are explained and some possible paths to a solution are discussed.

  20. Methodology for combining dynamic responses

    International Nuclear Information System (INIS)

    Cudlin, R.; Hosford, S.; Mattu, R.; Wichman, K.

    1978-09-01

    The NRC has historically required that the structural/mechanical responses due to various accident loads and loads caused by natural phenomena (such as earthquakes) be combined when analyzing structures, systems, and components important to safety. Several approaches to account for the potential interaction of loads resulting from accidents and natural phenomena have been used. One approach, the so-called absolute or linear summation (ABS) method, linearly adds the peak structural responses due to the individual dynamic loads. In general, the ABS method has also reflected the staff's conservative preference for the combination of dynamic load responses. A second approach, referred to as SRSS, yields a combined response equal to the square root of the sum of the squares of the peak responses due to the individual dynamic loads. The lack of a physical relationship between some of the loads has raised questions as to the proper methodology to be used in the design of nuclear power plants. An NRR Working Group was constituted to examine load combination methodologies and to develop a recommendation concerning criteria or conditions for their application. Evaluations of and recommendations on the use of the ABS and SRSS methods are provided in the report.
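The two combination rules contrasted above reduce to simple formulas: ABS linearly adds the absolute peak responses, while SRSS takes the square root of the sum of their squares, so the ABS result always bounds the SRSS result. A minimal sketch, with purely illustrative peak-response values:

```python
import math

def abs_combination(peak_responses):
    """Absolute (linear) summation of peak dynamic responses."""
    return sum(abs(r) for r in peak_responses)

def srss_combination(peak_responses):
    """Square root of the sum of the squares of peak responses."""
    return math.sqrt(sum(r * r for r in peak_responses))

# Illustrative peak responses (e.g., kN) from an accident load
# and a seismic load acting on the same component.
peaks = [3.0, 4.0]
combined_abs = abs_combination(peaks)    # 7.0 (conservative bound)
combined_srss = srss_combination(peaks)  # 5.0
```

Because SRSS credits the low probability that unrelated peaks coincide in time, it always yields a combined response no larger than ABS for the same inputs.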

  1. Information technology security system engineering methodology

    Science.gov (United States)

    Childs, D.

    2003-01-01

    A methodology is described for system engineering security into large information technology systems under development. The methodology is an integration of a risk management process and a generic system development life cycle process. The methodology is to be used by Security System Engineers to effectively engineer and integrate information technology security into a target system as it progresses through the development life cycle. The methodology can also be used to re-engineer security into a legacy system.

  2. Predictive value of minimal residual disease in Philadelphia-chromosome-positive acute lymphoblastic leukemia treated with imatinib in the European intergroup study of post-induction treatment of Philadelphia-chromosome-positive acute lymphoblastic leukemia, based on immunoglobulin/T-cell receptor and BCR/ABL1 methodologies

    Science.gov (United States)

    Cazzaniga, Giovanni; De Lorenzo, Paola; Alten, Julia; Röttgers, Silja; Hancock, Jeremy; Saha, Vaskar; Castor, Anders; Madsen, Hans O.; Gandemer, Virginie; Cavé, Hélène; Leoni, Veronica; Köhler, Rolf; Ferrari, Giulia M.; Bleckmann, Kirsten; Pieters, Rob; van der Velden, Vincent; Stary, Jan; Zuna, Jan; Escherich, Gabriele; zur Stadt, Udo; Aricò, Maurizio; Conter, Valentino; Schrappe, Martin; Valsecchi, Maria Grazia; Biondi, Andrea

    2018-01-01

    The prognostic value of minimal residual disease (MRD) in Philadelphia-chromosome-positive (Ph+) childhood acute lymphoblastic leukemia (ALL) treated with tyrosine kinase inhibitors is not fully established. We detected MRD by real-time quantitative polymerase chain reaction (RQ-PCR) of rearranged immunoglobulin/T-cell receptor genes (IG/TR) and/or BCR/ABL1 fusion transcript to investigate its predictive value in patients receiving Berlin-Frankfurt-Münster (BFM) high-risk (HR) therapy and post-induction intermittent imatinib (the European intergroup study of post-induction treatment of Philadelphia-chromosome-positive acute lymphoblastic leukemia (EsPhALL) study). MRD was monitored after induction (time point (TP)1), consolidation Phase IB (TP2), HR Blocks, reinductions, and at the end of therapy. MRD negativity progressively increased over time, both by IG/TR and BCR/ABL1. Of 90 patients with IG/TR MRD at TP1, nine were negative and none relapsed, while 11 with MRD<5×10−4 and 70 with MRD≥5×10−4 had a comparable 5-year cumulative incidence of relapse of 36.4 (15.4) and 35.2 (5.9), respectively. Patients who achieved MRD negativity at TP2 had a low relapse risk (5-yr cumulative incidence of relapse (CIR)=14.3[9.8]), whereas those who attained MRD negativity at a later date showed higher CIR, comparable to patients with positive MRD at any level. BCR/ABL1 MRD negative patients at TP1 had a relapse risk similar to those who were IG/TR MRD negative (1/8 relapses). The overall concordance between the two methods is 69%, with significantly higher positivity by BCR/ABL1. In conclusion, MRD monitoring by both methods may be functional not only for measuring response but also for guiding biological studies aimed at investigating causes for discrepancies, although from our data IG/TR MRD monitoring appears to be more reliable. Early MRD negativity is highly predictive of favorable outcome. The earlier MRD negativity is achieved, the better the prognosis.

  3. Assays for calcitonin receptors

    International Nuclear Information System (INIS)

    Teitelbaum, A.P.; Nissenson, R.A.; Arnaud, C.D.

    1985-01-01

    The assays for calcitonin receptors described focus on their use in the study of the well-established target organs for calcitonin, bone and kidney. The radioligand used in virtually all calcitonin binding studies is 125I-labelled salmon calcitonin. The lack of methionine residues in this peptide permits the use of chloramine-T for the iodination reaction. Binding assays are described for intact bone, skeletal plasma membranes, renal plasma membranes, and primary kidney cell cultures of rats. Studies on calcitonin metabolism in laboratory animals and regulation of calcitonin receptors are reviewed.

  4. Reconciling Anti-essentialism and Quantitative Methodology

    DEFF Research Database (Denmark)

    Jensen, Mathias Fjællegaard

    2017-01-01

    Quantitative methodology has a contested role in feminist scholarship which remains almost exclusively qualitative. Considering Irigaray’s notion of mimicry, Spivak’s strategic essentialism, and Butler’s contingent foundations, the essentialising implications of quantitative methodology may prove...... the potential to reconcile anti-essentialism and quantitative methodology, and thus, to make peace in the quantitative/qualitative Paradigm Wars....

  5. 42 CFR 441.472 - Budget methodology.

    Science.gov (United States)

    2010-10-01

    ... 42 Public Health 4 2010-10-01 2010-10-01 false Budget methodology. 441.472 Section 441.472 Public... Self-Directed Personal Assistance Services Program § 441.472 Budget methodology. (a) The State shall set forth a budget methodology that ensures service authorization resides with the State and meets the...

  6. Application of agile methodologies in software development

    Directory of Open Access Journals (Sweden)

    Jovanović Aca D.

    2016-01-01

    Full Text Available The paper presents the potentials for the development of software using agile methodologies. Special consideration is devoted to the potentials and advantages of using the Scrum methodology in software development, and to the relationship between the implementation of agile methodologies and software development projects.

  7. Source apportionment of ambient PM2.5 in Santiago, Chile: 1999 and 2004 results.

    Science.gov (United States)

    Jorquera, Héctor; Barraza, Francisco

    2012-10-01

    A receptor model analysis has been applied to ambient PM2.5 measurements taken at Santiago, Chile (33.5°S, 70.7°W) in 2004 (117 samples) and in 1999 (95 samples) at a receptor site on the eastern side of the city. For both campaigns, six sources have been identified at Santiago; their contributions in 1999/2004 are: motor vehicles: 28 ± 2.5/31.2 ± 3.4%, wood burning: 24.8 ± 2.3/28.9 ± 3.3%, sulfates: 18.8 ± 1.7/16.2 ± 2.5%, marine aerosol: 13 ± 2.1/9.9 ± 1.5%, copper smelters: 11.5 ± 1.4/9.7 ± 3.3% and soil dust: 3.9 ± 1.5/4.0 ± 2.4%. Hence relative contributions are statistically the same, but the absolute contributions have been reduced because ambient PM2.5 decreased from 34.2 to 25.1 μg/m³ between 1999 and 2004 at Santiago. The similarity of results for both data sets - analyzed with different techniques at different laboratory facilities - shows that the analysis performed here is robust. Source identification was carried out by inspection of key species in source profiles, seasonality of source contributions, comparison with published source profiles, and by examining wind trajectories computed using the Hybrid Single-Particle Lagrangian Integrated Trajectory (HYSPLIT) model from the US National Oceanic and Atmospheric Administration (NOAA); for the wood burning sources, the MODIS burned area daily product was used to confirm wildfire events along the year. Using this combined methodology we have shown conclusively that: a) marine air masses do reach Santiago's basin in significant amounts, but combined with anthropogenic sources; b) all copper smelters surrounding Santiago - and perhaps coal-fired power plants as well - contribute to ambient PM2.5; c) wood burning is the second largest source, coming from residential wood burning in fall and winter and from regional wildfires in spring and summer. The results of the present analysis can be used to improve emission inventories, air quality forecasting systems and cost-benefit analyses at local
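The relative shares quoted above translate directly into absolute mass contributions once multiplied by the ambient PM2.5 concentration. A small sketch using the 2004 figures from the abstract:

```python
# Convert relative source contributions (%) to absolute PM2.5 mass (ug/m3),
# using the 2004 shares and ambient concentration quoted in the abstract.

ambient_pm25_2004 = 25.1  # ug/m3

shares_2004 = {
    "motor vehicles": 31.2,
    "wood burning": 28.9,
    "sulfates": 16.2,
    "marine aerosol": 9.9,
    "copper smelters": 9.7,
    "soil dust": 4.0,
}

absolute = {src: round(pct / 100.0 * ambient_pm25_2004, 2)
            for src, pct in shares_2004.items()}
# e.g., motor vehicles contribute about 7.83 ug/m3 in 2004;
# the shares sum to ~100%, so the absolute values sum to ~25.1 ug/m3.
```

The same conversion with the 1999 shares and 34.2 μg/m³ shows why the absolute contributions fell even though the relative split stayed statistically unchanged.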

  8. How does concurrent sourcing affect performance?

    DEFF Research Database (Denmark)

    Mols, Niels Peter

    2010-01-01

    Design/methodology/approach – Based on transaction cost, agency, neoclassical economic, knowledge-based, and resource-based theory, it is proposed to show how concurrent sourcing affects performance. Findings – The paper argues that concurrent sourcing improves performance when firms face a combination of volume uncertainty...... how concurrent sourcing affects performance of the market and the hierarchy...... be modelled. The propositions and discussion offer researchers a starting-point for further research. Practical implications – The propositions that are developed suggest that managers should consider using concurrent sourcing when they face problems caused by volume uncertainty, technological uncertainty......

  9. Angiotensin type 2 receptor (AT2R) and receptor Mas

    DEFF Research Database (Denmark)

    Villela, Daniel; Leonhardt, Julia; Patel, Neal

    2015-01-01

    The angiotensin type 2 receptor (AT2R) and the receptor Mas are components of the protective arms of the renin-angiotensin system (RAS), i.e. they both mediate tissue protective and regenerative actions. The spectrum of actions of these two receptors and their signalling mechanisms display striki...

  10. A methodological review of qualitative case study methodology in midwifery research.

    Science.gov (United States)

    Atchan, Marjorie; Davis, Deborah; Foureur, Maralyn

    2016-10-01

    To explore the use and application of case study research in midwifery. Case study research provides rich data for the analysis of complex issues and interventions in the healthcare disciplines; however, a gap in the midwifery research literature was identified. A methodological review of midwifery case study research using recognized templates, frameworks and reporting guidelines facilitated comprehensive analysis. An electronic database search using the date range January 2005-December 2014: Maternal and Infant Care, CINAHL Plus, Academic Search Complete, Web of Knowledge, SCOPUS, Medline, Health Collection (Informit), Cochrane Library, Health Source: Nursing/Academic Edition, Wiley online and ProQuest Central. Narrative evaluation was undertaken. Clearly worded questions reflected the problem and purpose. The application, strengths and limitations of case study methods were identified through a quality appraisal process. The review identified both case study research's applicability to midwifery and its low uptake, especially in clinical studies. Many papers included the necessary criteria to achieve rigour. The included measures of authenticity and methodology were varied. A high standard of authenticity was observed, suggesting authors considered these elements to be routine inclusions. Technical aspects were lacking in many papers, namely a lack of reflexivity and incomplete transparency of processes. This review raises the profile of case study research in midwifery. Midwives will be encouraged to explore if case study research is suitable for their investigation. The raised profile will demonstrate further applicability; encourage support and wider adoption in the midwifery setting. © 2016 John Wiley & Sons Ltd.

  11. Methodology for flammable gas evaluations

    Energy Technology Data Exchange (ETDEWEB)

    Hopkins, J.D., Westinghouse Hanford

    1996-06-12

    There are 177 radioactive waste storage tanks at the Hanford Site. The waste generates flammable gases. The waste releases gas continuously, but in some tanks the waste has shown a tendency to trap these flammable gases. When enough gas is trapped in a tank's waste matrix, it may be released in a way that renders part or all of the tank atmosphere flammable for a period of time. Tanks must be evaluated against previously defined criteria to determine whether they can present a flammable gas hazard. This document presents the methodology for evaluating tanks in two areas of concern in the tank headspace: the steady-state flammable-gas concentration resulting from continuous release, and the concentration resulting from an episodic gas release.
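For the first area of concern, a well-mixed headspace gives the familiar dilution relation: the steady-state gas fraction is the release rate over the total flow leaving the headspace. This is a generic sketch only, not the document's actual evaluation criteria; the rates and the 25%-of-lower-flammability-limit screening margin are illustrative assumptions:

```python
# Generic well-mixed-headspace sketch: steady-state flammable gas
# fraction from a continuous release under ventilation. The rates and
# the screening margin below are illustrative assumptions, not the
# criteria defined in the referenced document.

def steady_state_fraction(release_rate_m3_per_h, vent_rate_m3_per_h):
    """Steady-state volume fraction of flammable gas in the headspace,
    assuming perfect mixing of the release into the ventilation flow."""
    return release_rate_m3_per_h / (vent_rate_m3_per_h + release_rate_m3_per_h)

LFL_HYDROGEN = 0.04  # lower flammability limit of hydrogen in air (vol. fraction)

# Hypothetical rates: 0.05 m3/h gas release into 10 m3/h ventilation.
frac = steady_state_fraction(0.05, 10.0)
flammable_concern = frac >= 0.25 * LFL_HYDROGEN  # 25%-of-LFL screening margin
```

Episodic releases, the second area of concern, cannot be screened this way because the trapped gas is released faster than ventilation can dilute it.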

  12. Methodological Reflections: Inter- ethnic Research

    DEFF Research Database (Denmark)

    Singla, Rashmi

    2010-01-01

    This article reflects on the methodological and epistemological aspects of the ethical issues involved in encounters between researcher and research participants with ethnic minority background in contexts with diversity. Specific challenges involved in longitudinal research (10-15 years) are also considered. The issues related to the social relevance of the research, deriving from psycho-political validity and implying consideration of power dynamics in the personal, relational and collective domains, are included. The primary basis for these reflections is a follow-up study concerning young...... with both youth and the parental generation with ethnic minority background in Denmark. These reflections include implications and challenges related to the researcher's national, ethnic background and educational, professional position in encounters with diverse 'researched persons' such as youth......

  13. Simulation enabled safeguards assessment methodology

    International Nuclear Information System (INIS)

    Bean, Robert; Bjornard, Trond; Larson, Tom

    2007-01-01

    It is expected that nuclear energy will be a significant component of future supplies. New facilities, operating under a strengthened international nonproliferation regime will be needed. There is good reason to believe virtual engineering applied to the facility design, as well as to the safeguards system design will reduce total project cost and improve efficiency in the design cycle. Simulation Enabled Safeguards Assessment MEthodology has been developed as a software package to provide this capability for nuclear reprocessing facilities. The software architecture is specifically designed for distributed computing, collaborative design efforts, and modular construction to allow step improvements in functionality. Drag and drop wire-frame construction allows the user to select the desired components from a component warehouse, render the system for 3D visualization, and, linked to a set of physics libraries and/or computational codes, conduct process evaluations of the system they have designed. (authors)

  14. Simulation Enabled Safeguards Assessment Methodology

    International Nuclear Information System (INIS)

    Robert Bean; Trond Bjornard; Thomas Larson

    2007-01-01

    It is expected that nuclear energy will be a significant component of future supplies. New facilities, operating under a strengthened international nonproliferation regime will be needed. There is good reason to believe virtual engineering applied to the facility design, as well as to the safeguards system design will reduce total project cost and improve efficiency in the design cycle. Simulation Enabled Safeguards Assessment Methodology (SESAME) has been developed as a software package to provide this capability for nuclear reprocessing facilities. The software architecture is specifically designed for distributed computing, collaborative design efforts, and modular construction to allow step improvements in functionality. Drag and drop wireframe construction allows the user to select the desired components from a component warehouse, render the system for 3D visualization, and, linked to a set of physics libraries and/or computational codes, conduct process evaluations of the system they have designed

  15. Sustainable Innovation and Entrepreneurship Methodology

    DEFF Research Database (Denmark)

    Celik, Sine; Joore, Peter; Christodoulou, Panayiotis

    The objective of the InnoLabs project is to facilitate cross-sectoral, multidisciplinary solutions to complex social problems in various European settings. InnoLabs are university-driven physical and/or organizational spaces that function as student innovation laboratories and operate as a local or regional “co-creation platform for sustainable solutions” to promote structural innovation. In this manual, the Sustainable Innovation and Entrepreneurship Methodology will be described. The organisational guidelines mainly take their point of departure in how Aalborg University (AAU) in Denmark has organised this in daily practice. In line with the objectives of the InnoLabs project (output 05), partners in the project have reflected on, evaluated and concluded the project experiences, which are described in this report. The InnoLabs project was developed for the 2014 call of Erasmus+ funds KA2- Cooperation

  16. Methodologies for 2011 economic reports

    DEFF Research Database (Denmark)

    Nielsen, Rasmus

    STECF’s Expert Working Group 11-03 convened in Athens (28th March – 1st April, 2011) to discuss and seek agreement on the content, indicators, methodologies and format of the 2011 Annual Economic Reports (AER) on the EU fishing fleet, the fish processing and the aquaculture sectors. Proposals for improved contents and the overall structure were discussed. Templates for the national and EU overview chapters for the EU fish processing and aquaculture sectors were produced. Indicators for the EU fishing fleet and fish processing reports were reviewed; new indicators for the fish processing and the aquaculture sector reports were proposed. Topics of special interest were proposed for all three reports.

  17. Neural Networks Methodology and Applications

    CERN Document Server

    Dreyfus, Gérard

    2005-01-01

    Neural networks represent a powerful data processing technique that has reached maturity and broad application. When clearly understood and appropriately used, they are a mandatory component in the toolbox of any engineer who wants to make the best use of the available data in order to build models, make predictions, mine data, recognize shapes or signals, etc. Ranging from theoretical foundations to real-life applications, this book is intended to provide engineers and researchers with clear methodologies for taking advantage of neural networks in industrial, financial or banking applications, many instances of which are presented in the book. For the benefit of readers wishing to gain deeper knowledge of the topics, the book features appendices that provide theoretical details for greater insight, and algorithmic details for efficient programming and implementation. The chapters have been written by experts and seamlessly edited to present a coherent and comprehensive, yet not redundant, practically-oriented...

  18. Stakeholder analysis methodologies resource book

    Energy Technology Data Exchange (ETDEWEB)

    Babiuch, W.M.; Farhar, B.C.

    1994-03-01

    Stakeholder analysis allows analysts to identify how parties might be affected by government projects. This process involves identifying the likely impacts of a proposed action and the stakeholder groups affected by that action. Additionally, the process involves assessing how these groups might be affected and suggesting measures to mitigate any adverse effects. Evidence suggests that the efficiency and effectiveness of government actions can be increased and adverse social impacts mitigated when officials understand how a proposed action might affect stakeholders. This report discusses how to conduct useful stakeholder analyses for government officials making decisions on energy-efficiency and renewable-energy technologies and their commercialization. It discusses methodological issues that may affect the validity and reliability of findings, including sampling, generalizability, validity, "uncooperative" stakeholder groups, using social indicators, and the effect of government regulations. The Appendix contains resource directories and a list of specialists in stakeholder analysis and involvement.

  19. Inventory differences: An evaluation methodology

    International Nuclear Information System (INIS)

    Heinberg, C.L.; Roberts, N.J.

    1987-01-01

    This paper discusses an evaluation methodology which is used for inventory differences at the Los Alamos National Laboratory. It is recognized that there are various methods which can be, and are being, used to evaluate process inventory differences at DOE facilities. The purpose of this paper is to share our thoughts on the subject and our techniques with those who are responsible for the evaluation of inventory differences at their facility. One of the most dangerous aspects of any evaluation technique, especially one as complex as most inventory difference evaluations tend to be, is to fail to look at the tools being used as indicators. There is a tendency to look at the results of an evaluation by one technique as an absolute. At the Los Alamos National Laboratory, several tools are used and the final evaluation is based on a combination of the observed results of a many-faceted evaluation. The tools used and some examples are presented

  20. Methodology of formal software evaluation

    International Nuclear Information System (INIS)

    Tuszynski, J.

    1998-01-01

    Sydkraft AB, the major Swedish utility, owner of ca 6000 MW(e) installed in nuclear (NPP Barsebaeck and NPP Oskarshamn), fossil fuel and hydro power plants, is facing modernization of the control systems of the plants. Applicable standards require structured, formal methods for implementation of the control functions in modern, real-time software systems. This presentation introduces the implementation methodology as presently discussed within the Sydkraft organisation. The approach suggested is based upon a process of co-operation between three parties taking part in the implementation: the owner of the plant, the vendor, and the Quality Assurance (QA) organisation. QA will be based on tools for formal software validation and on systematic gathering by the owner of validated and proved-by-operation control modules for concern-wide utilisation. (author)

  1. Butterfly valve torque prediction methodology

    International Nuclear Information System (INIS)

    Eldiwany, B.H.; Sharma, V.; Kalsi, M.S.; Wolfe, K.

    1994-01-01

    As part of the Motor-Operated Valve (MOV) Performance Prediction Program, the Electric Power Research Institute has sponsored the development of methodologies for predicting thrust and torque requirements of gate, globe, and butterfly MOVs. This paper presents the methodology that will be used by utilities to calculate the dynamic torque requirements for butterfly valves. The total dynamic torque at any disc position is the sum of the hydrodynamic torque, the bearing torque (which is induced by the hydrodynamic force), and other small torque components (such as packing torque). The hydrodynamic torque on the valve disc, caused by the fluid flow through the valve, depends on the disc angle, flow velocity, upstream flow disturbances, disc shape, and the disc aspect ratio. The butterfly valve model provides sets of nondimensional flow and torque coefficients that can be used to predict flow rate and hydrodynamic torque throughout the disc stroke and to calculate the required actuation torque and the maximum transmitted torque throughout the opening and closing stroke. The scope of the model includes symmetric and nonsymmetric discs of different shapes and aspect ratios in compressible and incompressible fluid applications under both choked and nonchoked flow conditions. The model features were validated against test data from a comprehensive flow-loop and in situ test program. These tests were designed to systematically address the effect of the following parameters on the required torque: valve size, disc shape and disc aspect ratio, upstream elbow orientation and its proximity, and flow conditions. The applicability of the nondimensional coefficients to valves of different sizes was validated by performing tests on a 42-in. valve and a precisely scaled 6-in. model. The butterfly valve model torque predictions were found to bound test data from the flow-loop and in situ testing, as shown in the examples provided in this paper.
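The torque build-up described above can be illustrated with the standard nondimensional scaling T_hydro = Ct(θ)·ΔP·D³. The EPRI correlations themselves are not reproduced here; the coefficient and torque values below are purely illustrative:

```python
# Sketch of the total-dynamic-torque build-up for a butterfly valve.
# T_hydro = Ct * dP * D**3 is a standard nondimensional scaling; the
# actual EPRI correlations are not reproduced here, and all numeric
# values below (Ct, pressures, component torques) are hypothetical.

def total_dynamic_torque(ct, dp_pa, disc_dia_m, bearing_torque, packing_torque):
    """Total dynamic torque (N*m) = hydrodynamic + bearing + packing.
    In the actual methodology the bearing torque is itself induced by
    the hydrodynamic force; here it is simply passed in as an input."""
    hydrodynamic = ct * dp_pa * disc_dia_m ** 3
    return hydrodynamic + bearing_torque + packing_torque

# Hypothetical values: torque coefficient at one disc angle, 0.5 MPa
# differential pressure, 0.15 m disc, small bearing and packing torques.
t = total_dynamic_torque(ct=0.08, dp_pa=5.0e5, disc_dia_m=0.15,
                         bearing_torque=20.0, packing_torque=5.0)
```

Because Ct varies with disc angle, the methodology evaluates this sum throughout the opening and closing stroke to find the required actuation torque and the maximum transmitted torque.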

  2. System Anthropological Psychology: Methodological Foundations

    Directory of Open Access Journals (Sweden)

    Vitaliy Y. Klochko

    2012-01-01

    Full Text Available The article considers methodological foundations of the system anthropological psychology (SAP) as a scientific branch developed by a well-represented group of Siberian scientists. SAP is a theory based on the axiomatics of the cultural-historical psychology of L.S. Vygotsky and transspective analysis as a specially developed means to define the tendencies of science developing as a self-organizing system. Transspective analysis has revealed regularities in a constantly growing complexity of professional-psychological thinking along the course of emergence of scientific cognition. It has proved that the field of modern psychology is shaped by theories constructed with ideation of different grades of complexity. The concept "dynamics of the paradigm of science" is introduced; it allows transitions to be acknowledged from the ordinary-binary logic characteristic of classical science to a binary-ternary logic, adequate to non-classical science, and then to a ternary-multidimensional logic, which is now at the stage of emergence. The latter is employed in SAP construction. It involves the following basic methodological principles: the principle of directed (selective) interaction and the principle of the generative effect of selective interaction. The concept of "complementary interaction", applied in natural as well as humanitarian sciences, is reconsidered in the context of psychology. The conclusion is made that the principle of selectivity and directedness of interaction is relevant to the whole Universe, embracing all kinds of systems including the living ones. Different levels of matter organization, representing semantic structures of various complexity, use one and the same principle of meaning making through which the Universe ensures its sustainability as a self-developing phenomenon. This methodology provides an explanation for the nature and stages of emergence of the multidimensional life space of an individual, which comes as a foundation for generation of such features of

  3. TLX: An elusive receptor.

    Science.gov (United States)

    Benod, Cindy; Villagomez, Rosa; Webb, Paul

    2016-03-01

    TLX (tailless receptor) is a member of the nuclear receptor superfamily and belongs to a class of nuclear receptors for which no endogenous or synthetic ligands have yet been identified. TLX is a promising therapeutic target in neurological disorders and brain tumors. Thus, regulatory ligands for TLX need to be identified to complete the validation of TLX as a useful target and would serve as chemical probes to pursue the study of this receptor in disease models. It has recently been proved that TLX is druggable. However, to identify potent and specific TLX ligands with desirable biological activity, a deeper understanding of where ligands bind, how they alter TLX conformation and of the mechanism by which TLX mediates the transcription of its target genes is needed. While TLX is in the process of escaping from orphanhood, future ligand design needs to progress in parallel with improved understanding of (i) the binding cavity or surfaces to target with small molecules on the TLX ligand binding domain and (ii) the nature of the TLX coregulators in particular cell and disease contexts. Both of these topics are discussed in this review. Copyright © 2015 Elsevier Ltd. All rights reserved.

  4. Glutamate receptor ligands

    DEFF Research Database (Denmark)

    Guldbrandt, Mette; Johansen, Tommy N; Frydenvang, Karla Andrea

    2002-01-01

    Homologation and substitution on the carbon backbone of (S)-glutamic acid [(S)-Glu, 1], as well as absolute stereochemistry, are structural parameters of key importance for the pharmacological profile of (S)-Glu receptor ligands. We describe a series of methyl-substituted 2-aminoadipic acid (AA...

  5. Ginkgolides and glycine receptors

    DEFF Research Database (Denmark)

    Jaracz, Stanislav; Nakanishi, Koji; Jensen, Anders A.

    2004-01-01

    Ginkgolides from the Ginkgo biloba tree are diterpenes with a cage structure consisting of six five-membered rings and a unique tBu group. They exert a variety of biological properties. In addition to being antagonists of the platelet activating factor receptor (PAFR), it has recently been shown ...

  6. adrenergic receptor with preeclampsia

    African Journals Online (AJOL)

    User

    2011-05-09

    May 9, 2011 ... due to a post-receptor defect (Karadas et al., 2007). Several polymorphisms have ... the detection of the Arg16Gly polymorphism, overnight digestion at. 37°C with 10 U ..... DW, Wood AJ, Stein CM (2004). Beta2-adrenoceptor ...

  7. Metformin and insulin receptors

    International Nuclear Information System (INIS)

    Vigneri, R.; Gullo, D.; Pezzino, V.

    1984-01-01

    The authors evaluated the effect of metformin (N,N-dimethylbiguanide), a biguanide known to be less toxic than phenformin, on insulin binding to its receptors, both in vitro and in vivo. Specific ¹²⁵I-insulin binding to cultured IM-9 human lymphocytes and MCF-7 human breast cancer cells was determined after preincubation with metformin. Specific ¹²⁵I-insulin binding to circulating monocytes was also evaluated in six controls, eight obese subjects, and six obese type II diabetic patients before and after a short-term treatment with metformin. Plasma insulin levels and blood glucose were also measured on both occasions. Metformin significantly increased insulin binding in vitro to both IM-9 lymphocytes and MCF-7 cells; the maximum increment was 47.1% and 38.0%, respectively. Metformin treatment significantly increased insulin binding in vivo to monocytes of obese subjects and diabetic patients. Scatchard analysis indicated that the increased binding was mainly due to an increase in receptor capacity. Insulin binding to monocytes of normal controls was unchanged after metformin, as were insulin levels in all groups; blood glucose was significantly reduced after metformin only in diabetic patients. These data indicate that metformin increases insulin binding to its receptors in vitro and in vivo. The effect in vivo is observed in obese subjects and in obese type II diabetic patients, paralleling the clinical effectiveness of this antidiabetic agent, and is not due to receptor regulation by circulating insulin, since no variation in insulin levels was recorded.
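The Scatchard analysis mentioned above linearizes single-site binding data: plotting bound/free against bound gives a line with slope -1/Kd and an x-intercept equal to the receptor capacity (Bmax). A minimal sketch with made-up binding values, not the study's data:

```python
import numpy as np

# Hypothetical single-site binding curve: bound = Bmax * free / (Kd + free)
Kd, Bmax = 2.0, 100.0                         # illustrative values
free = np.array([0.5, 1.0, 2.0, 4.0, 8.0, 16.0])
bound = Bmax * free / (Kd + free)

# Scatchard transform: bound/free vs. bound is linear,
# slope = -1/Kd, x-intercept = Bmax (receptor capacity)
ratio = bound / free
slope, intercept = np.polyfit(bound, ratio, 1)

Kd_est = -1.0 / slope
Bmax_est = -intercept / slope
```

With noise-free data the fit recovers Kd and Bmax exactly; real binding data would scatter around the line, and an increase in receptor capacity (as reported here) shifts the x-intercept outward without changing the slope.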

  8. Multivariate Receptor Models for Spatially Correlated Multipollutant Data

    KAUST Repository

    Jun, Mikyoung

    2013-08-01

    The goal of multivariate receptor modeling is to estimate the profiles of major pollution sources and quantify their impacts based on ambient measurements of pollutants. Traditionally, multivariate receptor modeling has been applied to multiple air pollutant data measured at a single monitoring site or measurements of a single pollutant collected at multiple monitoring sites. Despite the growing availability of multipollutant data collected from multiple monitoring sites, there has not yet been any attempt to incorporate spatial dependence that may exist in such data into multivariate receptor modeling. We propose a spatial statistics extension of multivariate receptor models that enables us to incorporate spatial dependence into estimation of source composition profiles and contributions given the prespecified number of sources and the model identification conditions. The proposed method yields more precise estimates of source profiles by accounting for spatial dependence in the estimation. More importantly, it enables predictions of source contributions at unmonitored sites as well as when there are missing values at monitoring sites. The method is illustrated with simulated data and real multipollutant data collected from eight monitoring sites in Harris County, Texas. Supplementary materials for this article, including data and R code for implementing the methods, are available online on the journal web site. © 2013 Copyright Taylor and Francis Group, LLC.
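The factorization at the heart of multivariate receptor modeling expresses the data matrix X (samples x pollutants) as the product of source contributions G and source profiles F. A minimal non-spatial sketch using plain non-negative matrix factorization on synthetic data (the paper's spatial extension and identification conditions are not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: 2 sources, 5 pollutants, 200 ambient samples
F_true = np.array([[0.7, 0.2, 0.1, 0.0, 0.0],    # source profiles
                   [0.0, 0.1, 0.2, 0.3, 0.4]])
G_true = rng.uniform(0, 10, size=(200, 2))        # source contributions
X = G_true @ F_true

# Plain NMF via Lee-Seung multiplicative updates; the paper's method
# additionally models spatial dependence between monitoring sites.
k = 2
G = rng.uniform(0.1, 1.0, (X.shape[0], k))
F = rng.uniform(0.1, 1.0, (k, X.shape[1]))
for _ in range(500):
    F *= (G.T @ X) / (G.T @ G @ F + 1e-12)
    G *= (X @ F.T) / (G @ F @ F.T + 1e-12)

recon_err = np.linalg.norm(X - G @ F) / np.linalg.norm(X)
```

The multiplicative updates keep G and F non-negative by construction, which matches the physical interpretation of contributions and profiles.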

  9. Six methodological steps to build medical data warehouses for research.

    Science.gov (United States)

    Szirbik, N B; Pelletier, C; Chaussalet, T

    2006-09-01

    We propose a simple methodology for heterogeneous data collection and central repository-style database design in healthcare. Our method can be used with or without other software development frameworks, and we argue that its application can save a considerable amount of implementation effort. We also believe that the method can be used in other fields of research, especially those with a strong interdisciplinary nature. The idea emerged during a healthcare research project that involved, among other things, grouping information from heterogeneous and distributed information sources. We developed this methodology from the lessons learned while building a data repository containing information about elderly patient flows in the UK's long-term care (LTC) system. We explain thoroughly the aspects that influenced the building of the methodology. The methodology is defined by six steps, which can be aligned with various iterative development frameworks; we describe here its alignment with the RUP (Rational Unified Process) framework. The methodology emphasizes current trends, such as early identification of critical requirements, data modelling, close and timely interaction with users and stakeholders, ontology building, quality management, and exception handling. Of special interest is the ontological engineering aspect, which had the highest-impact effects after the project: it helped stakeholders perform better collaborative negotiations that brought better solutions for the overall system investigated. Insight into the problems faced by others helps lead negotiators to win-win situations. We consider that this should be the social result of any project that collects data for better decision making, leading ultimately to enhanced global outcomes.

  10. The Methodological Dynamism of Grounded Theory

    Directory of Open Access Journals (Sweden)

    Nicholas Ralph

    2015-11-01

    Full Text Available Variations in grounded theory (GT) interpretation are the subject of ongoing debate. Divergences of opinion, genres, approaches, methodologies, and methods exist, resulting in disagreement on what GT methodology is and how it comes to be. From the postpositivism of Glaser and Strauss, to the symbolic interactionist roots of Strauss and Corbin, through to the constructivism of Charmaz, the field of GT methodology is distinctive in the sense that those using it offer new ontological, epistemological, and methodological perspectives at specific moments in time. We explore the unusual dynamism attached to GT’s underpinnings. Our view is that through a process of symbolic interactionism, in which generations of researchers interact with their context, moments are formed and philosophical perspectives are interpreted in a manner congruent with GT’s essential methods. We call this methodological dynamism, a process characterized by contextual awareness and moment formation, contemporaneous translation, generational methodology, and methodological consumerism.

  11. Relative Hazard and Risk Measure Calculation Methodology

    International Nuclear Information System (INIS)

    Stenner, Robert D.; Strenge, Dennis L.; Elder, Matthew S.; Andrews, William B.; Walton, Terry L.

    2003-01-01

    The RHRM equations, as represented in the methodology and code presented in this report, are primarily a collection of key factors normally used in risk assessment that are relevant to understanding the hazards and risks associated with projected mitigation, cleanup, and risk management activities. The RHRM code has broad application potential. For example, it can be used to compare one mitigation, cleanup, or risk management activity with another, instead of comparing each only to the fixed baseline. If the appropriate source term data are available, it can be used in its non-ratio form to estimate absolute values of the associated controlling hazards and risks. These estimated values can then be examined to help understand which mitigation, cleanup, or risk management activities address the higher hazard conditions and risk reduction potential at a site. Graphics can be generated from these absolute controlling hazard and risk values to compare the high-hazard, high-risk-reduction-potential conditions graphically. If the RHRM code is used in this manner, care must be taken to specifically define and qualify the resultant absolute controlling hazard and risk values (e.g., identify which factors were considered and which ones tended to drive the hazard and risk estimates).
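The ratio form described above, comparing each activity's hazard measure to a fixed baseline, can be sketched in a few lines; all hazard values and option names below are hypothetical, not from the RHRM report:

```python
# Compare risk-management options by the ratio of a simple hazard
# measure to a fixed baseline (all numbers and names are invented).
baseline_hazard = 4.0e-4                     # hazard measure for the baseline case
options = {
    "cap_in_place": 2.5e-4,
    "excavate":     1.0e-4,
    "monitor_only": 3.8e-4,
}

# Relative hazard measure: < 1 means lower hazard than the baseline
relative_hazard = {name: h / baseline_hazard for name, h in options.items()}

# Rank options by hazard reduction potential (lowest ratio first)
ranked = sorted(relative_hazard, key=relative_hazard.get)
```

The same dictionary of absolute values is what the non-ratio form would report directly; the ratio merely normalizes it against the baseline for comparison.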

  12. Evaluation methodology for generator refurbishment decisions

    International Nuclear Information System (INIS)

    Moore, W.G.; Ulm, S.F.

    1991-01-01

    The Electrical Power Industry is undergoing tremendous change due to deregulation, aging equipment, environmental concerns, and investment/risk considerations. Public utility commissions, along with shareholders and end consumers, are closely monitoring utilities' decisions, especially in the area of costs, both Operation and Maintenance and Capital. Increasing emphasis, within the conventional utility environment, has been, and continues to be, placed on controlling expenditures. To be responsive to these industry and competitive pressures, utilities must make equipment refurbishment decisions. These decisions should be based on input from many sources, including the severity of the failure, cost of replacement versus refurbishment, risks and safety considerations, the expected remaining life of the unit, operational mode (base or peak), fuel type, initial costs, system capacity, available budgets, and financing options. Many times, however, refurbishment decisions are based not on a thorough understanding of the above, but on gut feel or emotional attachment to a particular option. This paper describes a general methodology for refurbishment decision making, applied specifically to generators. Also included is a case history of one utility's progression through this process.

  13. Econometric Methodology of Monopolization Process Evaluation

    Directory of Open Access Journals (Sweden)

    Dmitrijs Skoruks

    2014-06-01

    Full Text Available The research “Econometric Methodology of Monopolization Process Evaluation” gives a perspective description of the monopolization process' nature, occurrence sources, development procedure and internal conjuncture specifics, as well as providing an example of modern econometric method application within a unified framework of market competition analysis, for the purpose of conducting a quantitative competition evaluation on an industry level for practical use in both private and public sectors. The main question of the research is the definition and quantitative analysis of monopolization effects in modern-day globalized markets, while constructing an empirical model of the econometric analysis based on international historical experience of monopoly formation, with the goal of introducing a further development scheme for the use of both econometric and statistical instruments in line with the forecasting and business research needs of enterprises and the regulatory functions of the public sector. The research uses a wide variety of monopolization evaluation ratios and their econometric updates on the companies involved in the study in order to detect and quantitatively measure their market-monopolizing potential, based on acquired market positions, turnover shares and competition policies.
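One standard concentration ratio of the kind such monopolization studies employ is the Herfindahl-Hirschman index (HHI), the sum of squared market shares; the shares and the threshold below are illustrative conventions from antitrust practice, not figures from this research:

```python
# Herfindahl-Hirschman index (HHI): a standard market-concentration ratio.
def hhi(market_shares_percent):
    """Sum of squared market shares, shares given in percent (0-100)."""
    return sum(s ** 2 for s in market_shares_percent)

shares = [40, 30, 20, 10]          # hypothetical four-firm industry
index = hhi(shares)                # 1600 + 900 + 400 + 100 = 3000
concentrated = index > 2500        # a commonly used "highly concentrated" cutoff
```

A monopoly (one firm at 100%) scores 10,000, while many small firms drive the index toward zero, which is why such ratios serve as quantitative monopolization indicators.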

  14. Background compensation methodologies for contamination monitoring systems

    International Nuclear Information System (INIS)

    Raman, Anand; Chaudhury, Probal; Pradeepkumar, K.S.

    2014-01-01

    Radiation surveillance programs in nuclear facilities incorporate contamination monitoring as an important component. Contamination monitoring programs constitute monitoring for alpha and beta contamination of the physical entities associated with the working personnel, including hands, feet, clothing and shoes, as well as general surface areas in the working environment such as floors. All these measurements are fraught with the contribution of the ambient gamma background radiation field, which inhibits a proper and precise estimation of the contamination concentration being monitored. This paper investigates the efficacy of two methodologies that have been incorporated in two of the contamination monitoring systems developed in the Division. In the first system discussed, a high degree of gamma compensation has been achieved for a uniform exposure of the order of 50 nSv/h to 100 mSv/h. In the second system, the degree of gamma compensation achieved is equal to that dictated by the statistical nature of the uncertainties associated with subtracting the background from the source data. These two methods can be very effectively employed depending on the application requirement. A minimum detection level equivalent to 0.37 Bq/cm² has been achieved in both cases.
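The statistical limit on background subtraction that the second method approaches can be illustrated with plain Poisson counting statistics; the counts, times, and the 3-sigma decision rule below are invented conventions for illustration, not the parameters of these systems:

```python
import math

# Gamma-background subtraction in contamination counting:
# net rate = gross rate - background rate, with Poisson uncertainty.
def net_count_rate(gross_counts, t_gross, bkg_counts, t_bkg):
    gross_rate = gross_counts / t_gross
    bkg_rate = bkg_counts / t_bkg
    net = gross_rate - bkg_rate
    # Poisson variances of the two rates add in quadrature
    sigma = math.sqrt(gross_counts / t_gross**2 + bkg_counts / t_bkg**2)
    return net, sigma

net, sigma = net_count_rate(gross_counts=1500, t_gross=60,
                            bkg_counts=1200, t_bkg=60)
detected = net > 3 * sigma      # simple decision threshold
```

The sigma term is exactly the "statistical nature of the uncertainties" the abstract refers to: no compensation scheme can report a net activity more precisely than the counting statistics of the subtracted background allow.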

  15. Beyond Modeling: All-Atom Olfactory Receptor Model Simulations

    Directory of Open Access Journals (Sweden)

    Peter C Lai

    2012-05-01

    Full Text Available Olfactory receptors (ORs) are a type of GTP-binding protein-coupled receptor (GPCR). These receptors are responsible for mediating the sense of smell through their interaction with odor ligands. OR-odorant interactions mark the first step in the process that leads to olfaction. Computational studies on model OR structures can validate experimental functional studies as well as generate focused and novel hypotheses for further bench investigation by providing a view of these interactions at the molecular level. Here we have shown the specific advantages of simulating the dynamic environment that is associated with OR-odorant interactions. We present a rigorous methodology that ranges from the creation of a computationally-derived model of an olfactory receptor to simulating the interactions between an OR and an odorant molecule. Given the ubiquitous occurrence of GPCRs in the membranes of cells, we anticipate that our OR-developed methodology will serve as a model for the computational structural biology of all GPCRs.

  16. Assessment methodology for radioactive effluents

    International Nuclear Information System (INIS)

    Anon.

    1977-01-01

    The objective of this environmental assessment is to define and rank the needs for controlling radioactive effluents from nuclear fuel cycle facilities. The assessment is based on environmental standards and dose-to-man calculations. The study includes three calculations for each isotope from each facility: maximum individual dose for a 50-yr dose commitment from a 1-yr exposure according to the organ affected; population dose for a 50-yr dose commitment from a 1-yr exposure according to the organ affected; and annual dose rate for the maximally exposed individual. The relative contribution of a specific nuclide and source to the total dose provides a method of ranking the nuclides, which in turn identifies the sources that should receive the greatest control in the future. These results will be used in subsequent tasks to assess the environmental impact of the total nuclear fuel cycle.
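Ranking nuclides by their relative contribution to the total dose, as described above, is straightforward once per-nuclide doses are in hand; the nuclides and dose values here are made up for illustration, not the assessment's results:

```python
# Hypothetical per-nuclide dose contributions from one facility (mSv).
doses_mSv = {"H-3": 0.002, "Kr-85": 0.010, "I-129": 0.150, "C-14": 0.040}

# Relative contribution of each nuclide to the total dose
total = sum(doses_mSv.values())
contribution = {nuclide: d / total for nuclide, d in doses_mSv.items()}

# Rank nuclides: the top entries identify sources needing the most control
ranked = sorted(contribution, key=contribution.get, reverse=True)
```

The ranking, rather than the absolute doses, is what drives the prioritization of effluent controls in the assessment.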

  17. Olfactory Receptor Database: a sensory chemoreceptor resource

    OpenAIRE

    Skoufos, Emmanouil; Marenco, Luis; Nadkarni, Prakash M.; Miller, Perry L.; Shepherd, Gordon M.

    2000-01-01

    The Olfactory Receptor Database (ORDB) is a WWW-accessible database that has been expanded from an olfactory receptor resource to a chemoreceptor resource. It stores data on six classes of G-protein-coupled sensory chemoreceptors: (i) olfactory receptor-like proteins, (ii) vomeronasal receptors, (iii) insect olfactory receptors, (iv) worm chemoreceptors, (v) taste papilla receptors and (vi) fungal pheromone receptors. A complementary database of the ligands of these receptors (OdorDB) has bee...

  18. Dose enhancement in a room cobalt-60 source

    International Nuclear Information System (INIS)

    Simons, M.; Pease, R.L.; Fleetwood, D.M.; Schwank, J.R.; Krzesniak, M.

    1997-01-01

    A room Co-60 source was characterized using TLDs and pMOS RADFETs. Dose enhancement was measured using RADFETs with and without gold-flashed Kovar lids. A methodology was developed to predict dose enhancement vs. position and test configuration.

  19. The G protein-coupled receptors deorphanization landscape.

    Science.gov (United States)

    Laschet, Céline; Dupuis, Nadine; Hanson, Julien

    2018-07-01

    G protein-coupled receptors (GPCRs) are usually highlighted as being both the largest family of membrane proteins and the most productive source of drug targets. However, most of the GPCRs are understudied and hence cannot be used immediately for innovative therapeutic strategies. Besides, there are still around 100 orphan receptors, with no described endogenous ligand and no clearly defined function. The race to discover new ligands for these elusive receptors seems to be less intense than before. Here, we present an update of the various strategies employed to assign a function to these receptors and to discover new ligands. We focus on the recent advances in the identification of endogenous ligands with a detailed description of newly deorphanized receptors. Replication being a key parameter in these endeavors, we also discuss the latest controversies about problematic ligand-receptor pairings. In this context, we propose several recommendations in order to strengthen the reporting of new ligand-receptor pairs. Copyright © 2018 Elsevier Inc. All rights reserved.

  20. Guidelines for reporting evaluations based on observational methodology.

    Science.gov (United States)

    Portell, Mariona; Anguera, M Teresa; Chacón-Moscoso, Salvador; Sanduvete-Chaves, Susana

    2015-01-01

    Observational methodology is one of the most suitable research designs for evaluating fidelity of implementation, especially in complex interventions. However, the conduct and reporting of observational studies is hampered by the absence of specific guidelines, such as those that exist for other evaluation designs. This lack of specific guidance poses a threat to the quality and transparency of these studies and also constitutes a considerable publication hurdle. The aim of this study thus was to draw up a set of proposed guidelines for reporting evaluations based on observational methodology. The guidelines were developed by triangulating three sources of information: observational studies performed in different fields by experts in observational methodology, reporting guidelines for general studies and studies with similar designs to observational studies, and proposals from experts in observational methodology at scientific meetings. We produced a list of guidelines grouped into three domains: intervention and expected outcomes, methods, and results. The result is a useful, carefully crafted set of simple guidelines for conducting and reporting observational studies in the field of program evaluation.

  1. Seismic hazard analysis. A methodology for the Eastern United States

    Energy Technology Data Exchange (ETDEWEB)

    Bernreuter, D L

    1980-08-01

    This report presents a probabilistic approach for estimating the seismic hazard in the Central and Eastern United States. The probabilistic model (Uniform Hazard Methodology) systematically incorporates the subjective opinion of several experts in the evaluation of seismic hazard. Subjective input, assumptions and associated hazard are kept separate for each expert so as to allow review and preserve diversity of opinion. The report is organized into five sections: Introduction, Methodology Comparison, Subjective Input, Uniform Hazard Methodology (UHM), and Uniform Hazard Spectrum. Section 2, Methodology Comparison, briefly describes the present approach and compares it with other available procedures. The remainder of the report focuses on the UHM. Specifically, Section 3 describes the elicitation of subjective input; Section 4 gives details of the various mathematical models (earthquake source geometry, magnitude distribution, attenuation relationship) and how these models are combined to calculate seismic hazard. The last section, Uniform Hazard Spectrum, highlights the main features of typical results. Specific results and sensitivity analyses are not presented in this report. (author)
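The combination of source model, magnitude distribution, and attenuation relationship described in Section 4 can be sketched as a toy hazard integral; the rates and coefficients below are invented, a single fixed source-to-site distance is assumed, and aleatory scatter in the attenuation is ignored, so this is a cartoon of the calculation rather than the UHM itself:

```python
import math

# Toy probabilistic seismic hazard sketch: one source, truncated
# Gutenberg-Richter magnitudes, a made-up attenuation relation.
nu = 0.2                    # annual rate of events with M >= Mmin
b, Mmin, Mmax = 1.0, 5.0, 7.5
R_km = 30.0                 # fixed source-to-site distance

def gr_pdf(M):
    """Truncated Gutenberg-Richter magnitude density."""
    beta = b * math.log(10)
    norm = 1 - math.exp(-beta * (Mmax - Mmin))
    return beta * math.exp(-beta * (M - Mmin)) / norm

def median_pga_g(M, R):
    """Hypothetical attenuation relation (illustrative coefficients)."""
    return math.exp(-3.5 + 0.9 * M - 1.2 * math.log(R))

def exceedance_rate(pga_level, n=500):
    """Annual rate of exceeding pga_level, integrated over magnitude."""
    dM = (Mmax - Mmin) / n
    rate = 0.0
    for i in range(n):
        M = Mmin + (i + 0.5) * dM
        if median_pga_g(M, R_km) > pga_level:   # scatter ignored
            rate += nu * gr_pdf(M) * dM
    return rate

rate_01g = exceedance_rate(0.1)
```

Repeating this for a grid of ground-motion levels yields a hazard curve; the UHM additionally averages such curves over each expert's separately elicited inputs.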

  2. Chemokine receptor CXCR4 downregulated by von Hippel-Lindau tumour suppressor pVHL

    DEFF Research Database (Denmark)

    Staller, Peter; Sulitkova, Jitka; Lisztwan, Joanna

    2003-01-01

    Organ-specific metastasis is governed, in part, by interactions between chemokine receptors on cancer cells and matching chemokines in target organs. For example, malignant breast cancer cells express the chemokine receptor CXCR4 and commonly metastasize to organs that are an abundant source of t...

  3. Development of methodology for the synthesis of poly(lactic acid-co-glycolic acid) for use in the production of radioactive sources; Desenvolvimento da metodologia para sintese do poli(acido latico-co-acido glicolico) para utilizacao na producao de fontes radioativas

    Energy Technology Data Exchange (ETDEWEB)

    Peleias Junior, Fernando dos Santos; Zeituni, Carlos Alberto; Rostelato, Maria Elisa Chuery Martins; Souza, Carla Daruich de; Mattos, Fabio Rodrigues de; Moura, Eduardo Santana de; Moura, Joao Augusto; Benega, Marcos Antonio Gimenes; Feher, Anselmo; Costa, Osvaldo Luiz da; Rodrigues, Bruna Teiga, E-mail: fernandopeleias@gmail.com [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP/CTR), Sao Paulo, SP (Brazil). Centro de Tecnologia das Radiacoes; Fechine, Guilhermino Jose [Universidade Presbiteriana Mackenzie, Sao Paulo, SP (Brazil). Escola de Engenharia

    2015-05-15

    According to the World Health Organization, cancer is a leading cause of death worldwide. A radiotherapy method extensively used in prostate cancer is brachytherapy, where the area requiring treatment receives radioactive seeds. Iodine-125 seeds can be inserted loose or stranded in bioabsorbable polymers produced from poly(lactic-co-glycolic acid) (PLGA). We developed the synthesis methodology for PLGA and the results obtained show that it was possible to determine the optimal reaction parameters (time and temperature) for PLGA in 80/20 (lactide/glycolide) ratio. The yield was higher than 90% using a temperature of 110 °C and reaction time of 72 hours; however, the molecular weight values obtained are very low compared to those obtained by other authors. New tests using previously synthesized dimers and nitrogen atmosphere are being performed. These conditions could potentially increase the molar mass of PLGA. All techniques used confirmed the expected structure of the polymer. (author)

  4. New sources and instrumentation for neutrons in biology

    DEFF Research Database (Denmark)

    Teixeira, S. C. M.; Zaccai, G.; Ankner, J.

    2008-01-01

    Neutron radiation offers significant advantages for the study of biological molecular structure and dynamics. A broad and significant effort towards instrumental and methodological development to facilitate biology experiments at neutron sources worldwide is reviewed.

  5. 28 CFR 104.47 - Collateral sources.

    Science.gov (United States)

    2010-07-01

    ... determining the appropriate collateral source offset for future benefit payments, the Special Master may employ an appropriate methodology for determining the present value of such future benefits. In... compensation, including life insurance, pension funds, death benefits programs, and payments by Federal, State...
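Discounting future benefit payments to present value, as the regulation contemplates, is commonly done as follows; since the rule leaves the "appropriate methodology" to the Special Master, the annual schedule and discount rate here are purely illustrative:

```python
# Present value of a stream of future benefit payments (e.g., pension
# or life-insurance benefits), discounted at an assumed annual rate.
def present_value(payments, rate):
    """Discount equal-spaced future payments made at years 1, 2, ..."""
    return sum(p / (1 + rate) ** t for t, p in enumerate(payments, start=1))

# 20 annual payments of 10,000 discounted at a hypothetical 4% rate
pv = present_value([10_000.0] * 20, rate=0.04)
```

The resulting present value, not the nominal sum of payments, is what would be used as the collateral source offset for future benefits.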

  6. System level modelling with open source tools

    DEFF Research Database (Denmark)

    Jakobsen, Mikkel Koefoed; Madsen, Jan; Niaki, Seyed Hosein Attarzadeh

    , called ForSyDe. ForSyDe is available under an open-source approach, which allows small and medium enterprises (SMEs) to get easy access to advanced modeling capabilities and tools. We give an introduction to the design methodology through the system level modeling of a simple industrial use case, and we...

  7. Prostaglandin Receptor Signaling in Disease

    Directory of Open Access Journals (Sweden)

    Toshiyuki Matsuoka

    2007-01-01

    Full Text Available Prostanoids, consisting of the prostaglandins (PGs and the thromboxanes (TXs, are a group of lipid mediators formed in response to various stimuli. They include PGD2, PGE2, PGF2α, PGI2, and TXA2. They are released outside of the cells immediately after synthesis, and exert their actions by binding to a G-protein coupled rhodopsin-type receptor on the surface of target cells. There are eight types of the prostanoid receptors conserved in mammals from mouse to human. They are the PGD receptor (DP, four subtypes of the PGE receptor (EP1, EP2, EP3, and EP4, the PGF receptor (FP, PGI receptor (IP, and TXA receptor (TP. Recently, mice deficient in each of these prostanoid receptors were generated and subjected to various experimental models of disease. These studies have revealed the roles of PG receptor signaling in various pathological conditions, and suggest that selective manipulation of the prostanoid receptors may be beneficial in treatment of the pathological conditions. Here we review these recent findings of roles of prostanoid receptor signaling and their therapeutic implications.

  8. Climate index for Switzerland - Methodology

    International Nuclear Information System (INIS)

    2006-01-01

    According to the U.S. Department of Energy, an estimated 25% of the GNP is affected by weather-related events. Variations in temperature, even small ones, can also have long-lasting effects on the operational results of a company. Among others, the energy supply sector is sensitive to weather risks: a milder or harsher than usual winter leads to a decrease or increase in energy consumption. The price of electricity on power trading facilities like Powernext is especially sensitive to odd changes in temperatures. Powernext and Meteo-France (the French meteorological agency) have joined expertise in order to promote the use of weather indices in terms of decision making or as underlyings of hedging tools for energy actors, end users from any other sector of activity, and specialists in weather risk hedging. The Powernext Weather indices are made from information collected by Meteo-France's main observation network according to the norms of international meteorology, in carefully selected areas. The gross data are submitted to a thorough review allowing the correction of abnormalities and the reconstitution of missing data. Each index is fashioned to take into account the economic activity in the various regions of the country, as represented by each region's population. This demographic information represents a fair approximation of the weight of the regional economic activity. This document presents the calculation methodology of average, minimum and maximum weather indexes, with the winter and summer regression equations for the different economical regions of Switzerland. (J.S.)
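The population weighting described above amounts to a weighted mean of regional temperatures; the regions and figures below are invented for illustration and are not Powernext's actual index inputs:

```python
# Population-weighted temperature index across regions (made-up data).
regions = {
    "Zurich": {"pop": 1_520_000, "temp_c": 2.1},
    "Geneva": {"pop":   500_000, "temp_c": 3.4},
    "Ticino": {"pop":   350_000, "temp_c": 5.0},
}

total_pop = sum(r["pop"] for r in regions.values())

# Each region's temperature contributes in proportion to its population,
# used here as a proxy for the weight of regional economic activity.
index = sum(r["pop"] * r["temp_c"] for r in regions.values()) / total_pop
```

The same weighting applied to daily minimum and maximum temperatures yields the minimum and maximum variants of the index mentioned in the document.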

  9. Methodology of site protection studies

    International Nuclear Information System (INIS)

    Farges, L.

    1980-01-01

    Preliminary studies preceding the building of a nuclear facility aim at assessing the choice of a site and establishing operating and control procedures. These studies are of two types: studies on the impact of the environment on the nuclear facility to be constructed, and studies on the impact of the nuclear facility on the environment. A methodology giving a framework to studies of the second type is presented. These studies are undertaken to choose suitable sites for nuclear facilities. After a preliminary selection of a site based on a first estimate, a detailed site study is undertaken. The procedure consists of five successive phases, namely: (1) an inquiry assessing the initial state of the site; (2) an initial synthesis of accumulated information for assessing the health and safety consequences of releases; (3) laboratory and field studies simulating the movement of waste products for a quantitative assessment of effects; (4) a final synthesis for laying down the release limits and radiological control methods; and (5) conclusions based on comparing the data of the final synthesis to the limits prescribed by regulations. These five phases are outlined. The role of periodic reassessments after the facility has been in operation for some time is explained. (M.G.B.)

  10. Climate index for Belgium - Methodology

    International Nuclear Information System (INIS)

    2006-01-01

    According to the U.S. Department of Energy, an estimated 25% of the GNP is affected by weather-related events. Variations in temperature, even small ones, can also have long-lasting effects on the operational results of a company. Among others, the energy supply sector is sensitive to weather risks: a milder or harsher than usual winter leads to a decrease or increase in energy consumption. The price of electricity on power trading facilities like Powernext is especially sensitive to odd changes in temperatures. Powernext and Meteo-France (the French meteorological agency) have joined expertise in order to promote the use of weather indices in terms of decision making or as underlyings of hedging tools for energy actors, end users from any other sector of activity, and specialists in weather risk hedging. The Powernext Weather indices are made from information collected by Meteo-France's main observation network according to the norms of international meteorology, in carefully selected areas. The gross data are submitted to a thorough review allowing the correction of abnormalities and the reconstitution of missing data. Each index is fashioned to take into account the economic activity in the various regions of the country, as represented by each region's population. This demographic information represents a fair approximation of the weight of the regional economic activity. This document presents the calculation methodology of average, minimum and maximum weather indexes, with the winter and summer regression equations for the different economical regions of Belgium. (J.S.)

  11. Zika detection: comparison of methodologies

    Directory of Open Access Journals (Sweden)

    Tatiana Elias Colombo

    Full Text Available ABSTRACT Many countries in the Americas have detected local transmission of multiple arboviruses that cause febrile illnesses. Laboratory testing has therefore become an important tool for confirming the etiology of these diseases. The present study aimed to compare the sensitivity and specificity of three different Zika virus detection assays. One hundred serum samples from patients presenting with acute febrile symptoms were tested using a previously reported TaqMan® RT-qPCR assay. SYBR® Green RT-qPCR and conventional PCR methodologies were used to compare the results. Of the samples determined to be negative by the TaqMan® RT-qPCR assay, 100% (Kappa = 0.670) were also found to be negative by SYBR® Green RT-qPCR based on Tm comparison; however, 14% (Kappa = 0.035) were found to be positive by conventional PCR followed by agarose gel electrophoresis. The differences between the ZIKV strains circulating worldwide and the low-viremia period can compromise diagnostic accuracy and thereby the accuracy of outbreak data. Therefore, improved assays are required to improve the diagnosis and surveillance of arboviruses.
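
    The Kappa values in this record measure chance-corrected agreement between assays. As a minimal sketch, Cohen's kappa for two assays' binary calls on the same samples can be computed as below; the sample data are invented for illustration and are not the study's results.

```python
# Cohen's kappa for inter-assay agreement on binary (positive/negative) calls.
# kappa = (observed agreement - chance agreement) / (1 - chance agreement)

def cohens_kappa(a, b):
    """Cohen's kappa for two equal-length lists of binary calls (0/1)."""
    n = len(a)
    observed = sum(x == y for x, y in zip(a, b)) / n
    p_a1 = sum(a) / n          # proportion positive, assay A
    p_b1 = sum(b) / n          # proportion positive, assay B
    # Chance agreement: both positive or both negative by coincidence.
    expected = p_a1 * p_b1 + (1 - p_a1) * (1 - p_b1)
    return (observed - expected) / (1 - expected)

# Made-up calls for ten samples by two hypothetical assays.
assay_a = [1, 1, 1, 0, 0, 0, 0, 0, 0, 0]
assay_b = [1, 1, 0, 0, 0, 0, 0, 0, 0, 1]
print(round(cohens_kappa(assay_a, assay_b), 3))  # -> 0.524
```

    Values near 1 indicate strong agreement beyond chance; values near 0 (like the 0.035 reported against conventional PCR) indicate agreement barely better than chance.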

  12. Climate index for Portugal - Methodology

    International Nuclear Information System (INIS)

    2006-01-01

    According to the U.S. Department of Energy, an estimated 25% of the GNP is affected by weather-related events. Variations in temperature - even small ones - can also have long-lasting effects on the operational results of a company. Among others, the energy supply sector is sensitive to weather risks: a milder or harsher than usual winter leads to a decrease or increase in energy consumption. The price of electricity on power trading facilities like Powernext is especially sensitive to unusual changes in temperature. Powernext and Meteo-France (the French meteorological agency) have joined expertise in order to promote the use of weather indices in decision making, or as underlyings of hedging tools, to energy players, end users from any other sector of activity, and specialists in weather risk hedging. The Powernext Weather indices are computed from information collected by Meteo-France's main observation network according to the norms of international meteorology, in carefully selected areas. The raw data are subjected to a thorough review allowing the correction of anomalies and the reconstitution of missing data. Each index is designed to take into account the economic activity in the various regions of the country, as represented by each region's population. This demographic information is a fair approximation of the weight of regional economic activity. This document presents the calculation methodology of the average, minimum and maximum weather indices, with the winter and summer regression equations for the different economic regions of Portugal. (J.S.)

  13. Methodological Aspects of Architectural Documentation

    Directory of Open Access Journals (Sweden)

    Arivaldo Amorim

    2011-12-01

    Full Text Available This paper discusses the methodological approach that has been developed in the state of Bahia, Brazil, since 2003 for the documentation of architectural and urban sites using extensive digital technologies. Bahia has a vast territory with important architectural ensembles ranging from the sixteenth century to the present day. As part of this heritage is constructed of raw earth and wood, it is very sensitive to various deleterious agents. It is therefore critical to document this collection, which is under threat. To conduct these activities, diverse digital technologies that could be used in the documentation process are being tried out. The task is being carried out as academic research, with few financial resources, by scholarship students and some volunteers. Several technologies are tested, ranging from the simplest to the more sophisticated, in the main stages of the documentation project: overall work planning, data acquisition, processing and management and, ultimately, control and evaluation of the work. The activities that motivated this paper are being conducted in the cities of Rio de Contas and Lençóis in the Chapada Diamantina, located 420 km and 750 km from Salvador respectively, in the city of Cachoeira in the Recôncavo Baiano area, 120 km from Salvador, the capital of the state of Bahia, and in the Pelourinho neighbourhood, located in the historic capital. Part of the material produced can be consulted on the website: <www.lcad.ufba.br>.

  14. Methodology of a systematic review.

    Science.gov (United States)

    Linares-Espinós, E; Hernández, V; Domínguez-Escrig, J L; Fernández-Pello, S; Hevia, V; Mayor, J; Padilla-Fernández, B; Ribal, M J

    2018-05-03

    The objective of evidence-based medicine is to apply the best available scientific information to clinical practice. Understanding and interpreting the scientific evidence involves understanding the available levels of evidence, where systematic reviews and meta-analyses of clinical trials are at the top of the levels-of-evidence pyramid. The review process should be well developed and planned to reduce biases and eliminate irrelevant and low-quality studies. The steps for implementing a systematic review include (i) correctly formulating the clinical question to answer (PICO), (ii) developing a protocol (inclusion and exclusion criteria), (iii) performing a detailed and broad literature search and (iv) screening the abstracts of the studies identified in the search and subsequently of the selected complete texts (PRISMA). Once the studies have been selected, we need to (v) extract the necessary data into a form designed in the protocol to summarise the included studies, (vi) assess the biases of each study, identifying the quality of the available evidence, and (vii) develop tables and text that synthesise the evidence. A systematic review involves a critical and reproducible summary of the results of the available publications on a particular topic or clinical question. To improve scientific writing, the methodology is shown in a structured manner for implementing a systematic review. Copyright © 2018 AEU. Published by Elsevier España, S.L.U. All rights reserved.

  15. Climate index for France - Methodology

    International Nuclear Information System (INIS)

    2006-01-01

    According to the U.S. Department of Energy, an estimated 25% of the GNP is affected by weather-related events. Variations in temperature - even small ones - can also have long-lasting effects on the operational results of a company. Among others, the energy supply sector is sensitive to weather risks: a milder or harsher than usual winter leads to a decrease or increase in energy consumption. The price of electricity on power trading facilities like Powernext is especially sensitive to unusual changes in temperature. Powernext and Meteo-France (the French meteorological agency) have joined expertise in order to promote the use of weather indices in decision making, or as underlyings of hedging tools, to energy players, end users from any other sector of activity, and specialists in weather risk hedging. The Powernext Weather indices are computed from information collected by Meteo-France's main observation network according to the norms of international meteorology, in carefully selected areas. The raw data are subjected to a thorough review allowing the correction of anomalies and the reconstitution of missing data. Each index is designed to take into account the economic activity in the various regions of the country, as represented by each region's population. This demographic information is a fair approximation of the weight of regional economic activity. This document presents the calculation methodology of the average, minimum and maximum weather indices, with the winter and summer regression equations for the different economic regions of France. (J.S.)

  16. Methodological Issues in Questionnaire Design.

    Science.gov (United States)

    Song, Youngshin; Son, Youn Jung; Oh, Doonam

    2015-06-01

    The process of designing a questionnaire is complicated. Many questionnaires on nursing phenomena have been developed and used by nursing researchers. The purpose of this paper was to discuss questionnaire design and the factors that should be considered when using existing scales. Methodological issues were discussed, such as factors in the design of questions, steps in developing questionnaires, wording and formatting methods for items, and administration methods. How to use existing scales, how to facilitate cultural adaptation, and how to prevent socially desirable responding were discussed. Moreover, the triangulation method in questionnaire development was introduced. Steps were recommended for designing questions, such as appropriately operationalizing key concepts for the target population, clearly formatting response options, generating items and confirming final items through face or content validity, sufficiently piloting the questionnaire using item analysis, demonstrating reliability and validity, finalizing the scale, and training the administrator. Psychometric properties and cultural equivalence should be evaluated prior to administration when using an existing questionnaire and performing cultural adaptation. In the context of well-defined nursing phenomena, logical and systematic methods will contribute to the development of simple and precise questionnaires.
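
    A common reliability check during the piloting and item-analysis step mentioned above is Cronbach's alpha. The sketch below computes it from first principles; the response matrix is invented for illustration (rows = respondents, columns = items on a 1-5 scale) and is not from any real pilot.

```python
# Cronbach's alpha: internal-consistency reliability of a multi-item scale.
# alpha = k/(k-1) * (1 - sum of item variances / variance of total scores)

def variance(xs):
    """Sample variance (n - 1 denominator)."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

def cronbach_alpha(rows):
    """rows: list of respondents, each a list of item scores."""
    k = len(rows[0])                                          # number of items
    item_vars = [variance([r[i] for r in rows]) for i in range(k)]
    total_var = variance([sum(r) for r in rows])              # total-score variance
    return k / (k - 1) * (1 - sum(item_vars) / total_var)

# Hypothetical pilot data: 5 respondents answering a 4-item scale.
responses = [
    [4, 5, 4, 4],
    [3, 3, 2, 3],
    [5, 5, 5, 4],
    [2, 2, 3, 2],
    [4, 4, 4, 5],
]
print(round(cronbach_alpha(responses), 3))  # -> 0.936
```

    Values above roughly 0.7 are conventionally taken as acceptable internal consistency for a pilot scale, although the appropriate threshold depends on the instrument's purpose.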

  17. Climate index for Spain - Methodology

    International Nuclear Information System (INIS)

    2006-01-01

    According to the U.S. Department of Energy, an estimated 25% of the GNP is affected by weather-related events. Variations in temperature - even small ones - can also have long-lasting effects on the operational results of a company. Among others, the energy supply sector is sensitive to weather risks: a milder or harsher than usual winter leads to a decrease or increase in energy consumption. The price of electricity on power trading facilities like Powernext is especially sensitive to unusual changes in temperature. Powernext and Meteo-France (the French meteorological agency) have joined expertise in order to promote the use of weather indices in decision making, or as underlyings of hedging tools, to energy players, end users from any other sector of activity, and specialists in weather risk hedging. The Powernext Weather indices are computed from information collected by Meteo-France's main observation network according to the norms of international meteorology, in carefully selected areas. The raw data are subjected to a thorough review allowing the correction of anomalies and the reconstitution of missing data. Each index is designed to take into account the economic activity in the various regions of the country, as represented by each region's population. This demographic information is a fair approximation of the weight of regional economic activity. This document presents the calculation methodology of the average, minimum and maximum weather indices, with the winter and summer regression equations for the different economic regions of Spain. (J.S.)

  18. Open Government Data Publication Methodology

    Directory of Open Access Journals (Sweden)

    Jan Kucera

    2015-04-01

    Full Text Available Public sector bodies hold a significant amount of data that might be of potential interest to citizens and businesses. However, the re-use potential of this data is still largely untapped because the data is not always published in a way that would allow its easy discovery, understanding and re-use. Open Government Data (OGD) initiatives aim at increasing the availability of machine-readable data provided under an open license, and these initiatives might therefore facilitate re-use of government data, which in turn might lead to increased transparency and economic growth. However, recent studies show that only a portion of the data provided by public sector bodies is truly open. Public sector bodies face a number of challenges when publishing OGD and need to address the relevant risks. There is therefore a need for best practices and methodologies for the publication of OGD that would provide the responsible persons with clear guidance on how OGD initiatives should be implemented and how the known challenges and risks should be addressed.

  19. Scoping paper on new CDM baseline methodology for cross-border power trade

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2011-07-01

    Poeyry has been sub-contracted by Carbon Limits, under the African Development Bank CDM Support Programme, to prepare a new CDM baseline methodology for cross-border trade, based on a transmission line from Ethiopia to Kenya. The first step in that process is to review the response of the UNFCCC, particularly the Methodologies Panel ('Meth Panel') of the CDM Executive Board, to the various proposals on cross-border trade and interconnection of grids. This report reviews the Methodology Panel and Executive Board decisions on 4 requests for revision of ACM2, 'Consolidated baseline methodology for grid-connected electricity generation from renewable sources', and 5 proposed new baseline methodologies (NM255, NM269, NM272, NM318, NM342), all of which were rejected. We analyse the reasons the methodologies were rejected, and whether the draft Approved Methodology (AM) that the Methodology Panel created in response to NM269 and NM272 is a suitable basis for a new methodology proposal. (auth)

  20. Safety analysis methodology for OPR 1000

    International Nuclear Information System (INIS)

    Hwang-Yong, Jun

    2005-01-01

    Full text: Korea Electric Power Research Institute (KEPRI) has been developing an in-house safety analysis methodology, based on the dedicated codes available to KEPRI, to overcome the problems arising from the currently used vendor-oriented methodologies. For the Loss of Coolant Accident (LOCA) analysis, the KREM (KEPRI Realistic Evaluation Methodology) has been developed based on the RELAP-5 code. The methodology was approved for Westinghouse 3-loop plants by the Korean regulatory organization, and a project to extend the methodology to the Optimized Power Reactor 1000 (OPR1000) has been ongoing since 2001. For the Non-LOCA analysis, the KNAP (Korea Non-LOCA Analysis Package) has been developed using the UNICORN-TM code system. To demonstrate the feasibility of these code systems and methodologies, some typical cases of the design basis accidents described in the final safety analysis report (FSAR) were analyzed. (author)