WorldWideScience

Sample records for methodology broadly defined

  1. Narrowly versus Broadly Defined Autism Spectrum Disorders: Differences in Pre- and Perinatal Risk Factors

    Science.gov (United States)

    Visser, Janne C.; Rommelse, Nanda; Vink, Lianne; Schrieken, Margo; Oosterling, Iris J.; Gaag, Rutger J.; Buitelaar, Jan K.

    2014-01-01

    This study examined the differential contribution of pre- and perinatal risks in narrowly versus broadly defined autism spectrum disorder (ASD) and across core symptom domains, IQ and co-morbid problems. Children with a DSM-IV diagnosis of autistic disorder (AD) (n = 121) or pervasive developmental disorder not otherwise specified (PDD-NOS)…

  2. Analytical methodologies for broad metabolite coverage of exhaled breath condensate.

    Science.gov (United States)

    Aksenov, Alexander A; Zamuruyev, Konstantin O; Pasamontes, Alberto; Brown, Joshua F; Schivo, Michael; Foutouhi, Soraya; Weimer, Bart C; Kenyon, Nicholas J; Davis, Cristina E

    2017-09-01

    Breath analysis has been gaining popularity as a non-invasive technique that is amenable to a broad range of medical uses. One of the persistent problems hampering the wide application of breath analysis is measurement variability of metabolite abundances stemming from differences in both the sampling and analysis methodologies used in various studies. Mass spectrometry has been a method of choice for comprehensive metabolomic analysis. In the present study, for the first time, we juxtapose the most commonly employed mass spectrometry-based analysis methodologies and directly compare the resultant coverages of detected compounds in exhaled breath condensate in order to guide methodology choices for exhaled breath condensate analysis studies. Four methods were explored to broaden the range of measured compounds across both the volatile and non-volatile domains. Liquid-phase sampling with a polyacrylate solid-phase microextraction (SPME) fiber, liquid-phase extraction with a polydimethylsiloxane patch, and headspace sampling using Carboxen/polydimethylsiloxane SPME, each followed by gas chromatography mass spectrometry, were tested for the analysis of the volatile fraction. Hydrophilic interaction liquid chromatography and reversed-phase high performance liquid chromatography mass spectrometry were used for analysis of the non-volatile fraction. We found that liquid-phase breath condensate extraction was notably superior to headspace extraction, and differences in the sorbents employed produced different metabolite coverages. The most pronounced effect was substantially enhanced capture of larger, higher-boiling compounds using polyacrylate SPME liquid-phase sampling. The analysis of the non-volatile fraction of breath condensate by hydrophilic-interaction and reversed-phase high performance liquid chromatography mass spectrometry indicated orthogonal metabolite coverage by these chromatography modes. We found that the metabolite coverage

  3. Research Network of Tehran Defined Population: Methodology and Establishment

    Directory of Open Access Journals (Sweden)

    Ali-Asghar Kolahi

    2015-12-01

    Background: We need a defined population for determining the prevalence and incidence of diseases, as well as for conducting interventional, cohort and longitudinal studies, calculating correct and timely public health indicators, assessing the actual health needs of the community, performing educational programs and interventions to promote a healthy lifestyle, and enhancing the quality of primary health services. The objective of this project was to determine a defined population which is representative of Tehran, the capital of Iran. This article reports the methodology and establishment of the research network of the Tehran defined population. Methods: This project started by selecting two urban health centers from each of the five district health centers affiliated with Shahid Beheshti University of Medical Sciences in 2012. Inside each selected urban health center, one defined population research station was established. Two new centers were added during 2013 and 2014. At present, the population covered by the network has reached 40,000 individuals. The most important criterion for the defined population has been to be representative of the population of Tehran. For this, we selected two urban health centers from 12 of the 22 municipality districts and from each of the five socioeconomic levels of Greater Tehran. In addition, 80,000 individuals living in the neighborhoods of the defined population research stations were considered as the control group of the project. Findings: In total, 12 defined population research stations were selected, and the population they cover constitutes a defined population that is representative of the Tehran population. Conclusion: A population laboratory is now available in metropolitan Tehran.

  4. Readiness and motivation for change among young women with broadly defined eating disorders.

    Science.gov (United States)

    Ålgars, Monica; Ramberg, Carin; Moszny, Josefine; Hagman, Jessica; Rintala, Hanna; Santtila, Pekka

    2015-01-01

    Readiness and motivation for change were examined in 32 women with broadly defined eating disorders who took part in a 10-week Cognitive Behavioral Therapy (CBT)-based group intervention. Readiness for change and eating disorder psychopathology were assessed before and after the intervention. The results revealed significant negative associations between degree of eating disorder symptoms and degree of readiness for change before the intervention started. In particular, higher levels of eating concern, shape concern, and body dissatisfaction were associated with lower motivation for change. No significant associations between degree of readiness for change before the intervention started and changes in eating disorder symptoms at the end of intervention were found. Readiness for change increased from the beginning to the end of the intervention, indicating that group CBT may be a cost-effective and time-efficient way of enhancing readiness and motivation for change in individuals with eating psychopathology.

  5. "Dermatitis" defined.

    Science.gov (United States)

    Smith, Suzanne M; Nedorost, Susan T

    2010-01-01

    The term "dermatitis" can be defined narrowly or broadly, clinically or histologically. A common and costly condition, dermatitis is underresourced compared to other chronic skin conditions. The lack of a collectively understood definition of dermatitis and its subcategories could be the primary barrier. To investigate how dermatologists define the term "dermatitis" and determine if a consensus on the definition of this term and other related terms exists. A seven-question survey of dermatologists nationwide was conducted. Of respondents (n  =  122), half consider dermatitis to be any inflammation of the skin. Nearly half (47.5%) use the term interchangeably with "eczema." Virtually all (> 96%) endorse the subcategory "atopic" under the terms "dermatitis" and "eczema," but the subcategories "contact," "drug hypersensitivity," and "occupational" are more highly endorsed under the term "dermatitis" than under the term "eczema." Over half (55.7%) personally consider "dermatitis" to have a broad meaning, and even more (62.3%) believe that dermatologists as a whole define the term broadly. There is a lack of consensus among experts in defining dermatitis, eczema, and their related subcategories.

  6. Using MFM methodology to generate and define major accident scenarios for quantitative risk assessment studies

    DEFF Research Database (Denmark)

    Hua, Xinsheng; Wu, Zongzhi; Lind, Morten

    2017-01-01

    Generating and defining Major Accident Scenarios (MAS) is commonly agreed to be the key step in quantitative risk assessment (QRA). The aim of the study is to explore the feasibility of using the Multilevel Flow Modeling (MFM) methodology for formulating MAS. Traditionally this is usually done based… to calculate the likelihood of each MAS. Combining the likelihood of each scenario with a qualitative risk matrix, each major accident scenario is then ranked for consideration for detailed consequence analysis. The methodology is successfully highlighted using part of the BMA process for production of hydrogen…

  7. A Case Study of Six Sigma Define-Measure-Analyze-Improve-Control (DMAIC) Methodology in Garment Sector

    Directory of Open Access Journals (Sweden)

    Abdur Rahman

    2017-12-01

    This paper demonstrates the empirical application of Six Sigma and the Define-Measure-Analyze-Improve-Control (DMAIC) methodology to reduce product defects within a garment manufacturing organization in Bangladesh. The DMAIC methodology was followed to investigate defects and root causes and to provide a solution to eliminate them. The analysis from employing Six Sigma and DMAIC indicated that broken stitches and open seams influenced the number of defective products. Design of experiments (DOE) and analysis of variance (ANOVA) techniques were combined to statistically determine the correlation of broken stitches and open seams with defects, as well as to define their optimum values needed to eliminate the defects. Thus, a reduction of about 35% in garment defects was achieved, which helped the organization studied to reduce its defects and thus improve its sigma level from 1.7 to 3.4.
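
    A note on the sigma-level figures quoted above: sigma levels are conventionally derived from the defect rate (defects per million opportunities, DPMO) with a 1.5-sigma shift. The sketch below shows that conversion; the defect counts are illustrative assumptions, not the paper's data.

    ```python
    # Hedged sketch: convert a defect count into a short-term sigma level using
    # the conventional DPMO -> sigma mapping with a 1.5-sigma shift.
    from scipy.stats import norm

    def sigma_level(defects: int, opportunities: int) -> float:
        dpmo = defects / opportunities * 1_000_000
        return norm.ppf(1 - dpmo / 1_000_000) + 1.5

    # Illustrative defect rates only (not taken from the paper):
    print(round(sigma_level(defects=460, opportunities=1_000), 1))  # ~1.6 sigma
    print(round(sigma_level(defects=46, opportunities=1_000), 1))   # ~3.2 sigma
    ```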

  8. A Methodology to Define Flood Resilience

    Science.gov (United States)

    Tourbier, J.

    2012-04-01

    Flood resilience has become an internationally used term with an ever-increasing number of entries on the Internet. The SMARTeST Project is looking at approaches to flood resilience through case studies at cities in various countries, including Washington D.C. in the United States. In light of U.S. experiences a methodology is being proposed by the author that is intended to meet ecologic, spatial, structural, social, disaster relief and flood risk aspects. It concludes that: "Flood resilience combines (1) spatial, (2) structural, (3) social, and (4) risk management levels of flood preparedness." Flood resilience should incorporate all four levels, but not necessarily with equal emphasis. Stakeholders can assign priorities within different flood resilience levels and the considerations they contain, dividing 100% emphasis into the four levels, as sketched below. This evaluation would be applied to planned and completed projects, considering existing conditions, goals and concepts. We have long known that the "road to market" for the implementation of flood resilience is linked to capacity building of stakeholders. It is a multidisciplinary enterprise, involving the integration of all the above aspects into the decision-making process. Traditional flood management has largely been influenced by what in the UK has been called "Silo Thinking", involving constituent organizations that are responsible for different elements and are interested only in their defined part of the system. This barrier to innovation also has been called the "entrapment effect". Flood resilience is being defined as (1) SPATIAL FLOOD RESILIENCE, implying the management of land by floodplain zoning, urban greening and management to reduce storm runoff through depression storage and by practicing Sustainable Urban Drainage (SUDs), Best Management Practices (BMPs), or Low Impact Development (LID). Ecologic processes and cultural elements are included. (2) STRUCTURAL FLOOD RESILIENCE referring to permanent flood defense
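
    Read as a scoring scheme, the "dividing 100% emphasis into four levels" idea can be expressed as a weighted sum over the four resilience levels. The weights, project scores and the weighted-sum form below are illustrative assumptions; the paper does not prescribe a specific formula.

    ```python
    # Hedged sketch: stakeholder-assigned emphasis (summing to 100%) applied to
    # per-level scores of a planned or completed project.
    weights = {"spatial": 0.35, "structural": 0.30, "social": 0.20, "risk_management": 0.15}
    assert abs(sum(weights.values()) - 1.0) < 1e-9  # emphasis must total 100%

    project_scores = {"spatial": 0.6, "structural": 0.8, "social": 0.5, "risk_management": 0.7}
    resilience_index = sum(weights[k] * project_scores[k] for k in weights)
    print(resilience_index)  # 0.655 for this hypothetical project
    ```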

  9. Patterns of remission, continuation and incidence of broadly defined eating disorders during early pregnancy in the Norwegian Mother and Child Cohort Study (MoBa).

    Science.gov (United States)

    Bulik, Cynthia M; Von Holle, Ann; Hamer, Robert; Knoph Berg, Cecilie; Torgersen, Leila; Magnus, Per; Stoltenberg, Camilla; Siega-Riz, Anna Maria; Sullivan, Patrick; Reichborn-Kjennerud, Ted

    2007-08-01

    We explored the course of broadly defined eating disorders during pregnancy in the Norwegian Mother and Child Cohort Study (MoBa) at the Norwegian Institute of Public Health. A total of 41,157 pregnant women, enrolled at approximately 18 weeks' gestation, had valid data from the Norwegian Medical Birth Registry. We collected questionnaire-based diagnostic information on broadly defined anorexia nervosa (AN), bulimia nervosa (BN) and eating disorders not otherwise specified (EDNOS). EDNOS subtypes included binge eating disorder (BED) and recurrent self-induced purging in the absence of binge eating (EDNOS-P). We explored rates of remission, continuation and incidence of BN, BED and EDNOS-P during pregnancy. Prepregnancy prevalence estimates were 0.1% for AN, 0.7% for BN, 3.5% for BED and 0.1% for EDNOS-P. During early pregnancy, estimates were 0.2% (BN), 4.8% (BED) and 0.1% (EDNOS-P). Proportions of individuals remitting during pregnancy were 78% (EDNOS-P), 40% (BN purging), 39% (BED), 34% (BN any type) and 29% (BN non-purging type). Additional individuals with BN achieved partial remission. Incident BN and EDNOS-P during pregnancy were rare. For BED, the incidence rate was 1.1 per 1000 person-weeks, equating to 711 new cases of BED during pregnancy. Incident BED was associated with indices of lower socio-economic status. Pregnancy appears to be a catalyst for remission of some eating disorders but also a vulnerability window for the new onset of broadly defined BED, especially in economically disadvantaged individuals. Vigilance by health-care professionals for continuation and emergence of eating disorders in pregnancy is warranted.
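
    As a consistency check on the incidence figures quoted above, the rate of 1.1 per 1,000 person-weeks and the 711 incident BED cases together imply the person-time at risk; assuming an early-pregnancy observation window of roughly 16 weeks (an assumption made here only for illustration) gives the same order of magnitude as the cohort of about 41,000 women.

    ```python
    # Hedged back-of-envelope check of the reported BED incidence figures.
    rate_per_person_week = 1.1 / 1000
    incident_cases = 711
    person_weeks = incident_cases / rate_per_person_week
    print(person_weeks)        # ~646,000 person-weeks at risk
    print(41_157 * 16)         # ~658,500 if each woman contributes ~16 weeks (assumed)
    ```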

  10. Methodology to Define Delivery Accuracy Under Current Day ATC Operations

    Science.gov (United States)

    Sharma, Shivanjli; Robinson, John E., III

    2015-01-01

    In order to enable arrival management concepts and solutions in a NextGen environment, ground-based sequencing and scheduling functions have been developed to support metering operations in the National Airspace System. These sequencing and scheduling algorithms and tools are designed to aid air traffic controllers in developing an overall arrival strategy. The ground systems being developed will support the management of aircraft to their Scheduled Times of Arrival (STAs) at flow-constrained meter points. This paper presents a methodology for determining the undelayed delivery accuracy of current-day air traffic control operations. This new method analyzes the undelayed delivery accuracy at meter points in order to understand changes in desired flow rates and to enable the definition of metrics that will allow near-future ground automation tools to successfully achieve desired separation at the meter points. This enables aircraft to meet their STAs while performing high-precision arrivals. The research presents a possible implementation that would allow the delivery performance of current tools to be estimated and delivery accuracy requirements for future tools to be defined, which allows analysis of Estimated Time of Arrival (ETA) accuracy for Time-Based Flow Management (TBFM) and the FAA's Traffic Management Advisor (TMA). TMA is a deployed system that generates scheduled time-of-arrival constraints for en route air traffic controllers in the US. This new method of automated analysis provides a repeatable evaluation of the delay metrics for current-day traffic, new releases of TMA, implementation of different tools, and across different airspace environments. This method utilizes a wide set of data from the Operational TMA-TBFM Repository (OTTR) system, which processes raw data collected by the FAA from operational TMA systems at all ARTCCs in the nation. The OTTR system generates daily reports concerning ATC status, intent and actions. Due to its
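
    The core metric described above can be sketched simply: for each undelayed flight, the delivery error is the actual meter-point crossing time minus the Scheduled Time of Arrival (STA), and delivery accuracy is summarized by the bias and spread of those errors. The times below are synthetic; the OTTR data fields and the filtering used to isolate undelayed flights are not reproduced here.

    ```python
    # Hedged sketch of the delivery-accuracy summary for undelayed flights.
    import statistics

    # (actual crossing time - STA) in seconds for a few hypothetical flights
    delivery_errors = [12.0, -8.5, 25.0, -3.0, 40.0, -15.5, 5.0]

    mean_error = statistics.mean(delivery_errors)   # bias at the meter point
    spread = statistics.stdev(delivery_errors)      # delivery accuracy (spread)
    print(f"mean = {mean_error:.1f} s, stdev = {spread:.1f} s")
    ```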

  11. Improving inferior vena cava filter retrieval rates with the define, measure, analyze, improve, control methodology.

    Science.gov (United States)

    Sutphin, Patrick D; Reis, Stephen P; McKune, Angie; Ravanzo, Maria; Kalva, Sanjeeva P; Pillai, Anil K

    2015-04-01

    To design a sustainable process to improve optional inferior vena cava (IVC) filter retrieval rates based on the Define, Measure, Analyze, Improve, Control (DMAIC) methodology of the Six Sigma process improvement paradigm. DMAIC, an acronym for Define, Measure, Analyze, Improve, and Control, was employed to design and implement a quality improvement project to increase IVC filter retrieval rates at a tertiary academic hospital. Retrievable IVC filters were placed in 139 patients over a 2-year period. The baseline IVC filter retrieval rate (n = 51) was reviewed through a retrospective analysis, and two strategies were devised to improve the filter retrieval rate: (a) mailing of letters to clinicians and patients for patients who had filters placed within 8 months of implementation of the project (n = 43) and (b) a prospective automated scheduling of a clinic visit at 4 weeks after filter placement for all new patients (n = 45). The effectiveness of these strategies was assessed by measuring the filter retrieval rates and estimated increase in revenue to interventional radiology. IVC filter retrieval rates increased from a baseline of 8% to 40% with the mailing of letters and to 52% with the automated scheduling of a clinic visit 4 weeks after IVC filter placement. The estimated revenue per 100 IVC filters placed increased from $2,249 to $10,518 with the mailing of letters and to $17,022 with the automated scheduling of a clinic visit. Using the DMAIC methodology, a simple and sustainable quality improvement intervention was devised that markedly improved IVC filter retrieval rates in eligible patients.

  12. Potential Adverse Effects of Broad-Spectrum Antimicrobial Exposure in the Intensive Care Unit.

    Science.gov (United States)

    Wiens, Jenna; Snyder, Graham M; Finlayson, Samuel; Mahoney, Monica V; Celi, Leo Anthony

    2018-02-01

    The potential adverse effects of empiric broad-spectrum antimicrobial use among patients with suspected but subsequently excluded infection have not been fully characterized. We sought novel methods to quantify the risk of adverse effects of broad-spectrum antimicrobial exposure among patients admitted to an intensive care unit (ICU). Among all adult patients admitted to ICUs at a single institution, we selected patients with negative blood cultures who also received ≥1 broad-spectrum antimicrobial. Broad-spectrum antimicrobials were categorized into ≥1 of 5 categories based on their spectrum of activity against potential pathogens. We performed, in serial, 5 cohort studies to measure the effect of each broad-spectrum category on patient outcomes. Exposed patients were defined as those receiving a specific category of broad-spectrum antimicrobial; nonexposed were all other patients in the cohort. The primary outcome was 30-day mortality. Secondary outcomes included length of hospital and ICU stay and nosocomial acquisition of antimicrobial-resistant bacteria (ARB) or Clostridium difficile within 30 days of admission. Among the study cohort of 1918 patients, 316 (16.5%) died within 30 days, 821 (42.8%) had either a length of hospital stay >7 days or an ICU length of stay >3 days, and 106 (5.5%) acquired either a nosocomial ARB or C. difficile. The short-term use of broad-spectrum antimicrobials in any of the defined broad-spectrum categories was not significantly associated with either the primary or secondary outcomes. The prompt and brief empiric use of defined categories of broad-spectrum antimicrobials could not be associated with additional patient harm.

  13. Psychosocial factors associated with broadly defined bulimia nervosa during early pregnancy: findings from the Norwegian Mother and Child Cohort Study.

    Science.gov (United States)

    Knoph Berg, Cecilie; Bulik, Cynthia M; Von Holle, Ann; Torgersen, Leila; Hamer, Robert; Sullivan, Patrick; Reichborn-Kjennerud, Ted

    2008-05-01

    The purpose of the present study was to investigate the relationship between psychosocial characteristics and broadly defined bulimia nervosa during early pregnancy, including factors associated with continuation, incidence and remission. A total of 41 157 women completed questionnaires at approximately gestational week 18, including items on eating disorders and psychosocial characteristics, as part of the Norwegian Mother and Child Cohort Study conducted by the Norwegian Institute of Public Health. Incident bulimia nervosa during the first trimester was significantly associated with symptoms of anxiety and depression and with low self-esteem and life satisfaction, whereas remission was significantly associated with higher self-esteem and life satisfaction. Continuation was not significantly related to any of the psychosocial variables tested. Onset of bulimia nervosa during pregnancy is associated with mood and anxiety symptoms. Remission of bulimic symptoms and new onset of bulimia nervosa are associated with opposite profiles on self-esteem and life satisfaction measures.

  14. Defining Hardwood Veneer Log Quality Attributes

    Science.gov (United States)

    Jan Wiedenbeck; Michael Wiemann; Delton Alderman; John Baumgras; William Luppold

    2004-01-01

    This publication provides a broad spectrum of information on the hardwood veneer industry in North America. Veneer manufacturers and their customers impose guidelines in specifying wood quality attributes that are very discriminating but poorly defined (e.g., exceptional color, texture, and/or figure characteristics). To better understand and begin to define the most...

  15. Defining and understanding healthy lifestyles choices for adolescents.

    Science.gov (United States)

    He, Ka; Kramer, Ellen; Houser, Robert F; Chomitz, Virginia R; Hacker, Karen A

    2004-07-01

    To: (a) establish criteria for defining positive health behaviors and lifestyle; and (b) identify characteristics of adolescents who practice a healthy lifestyle. Responses from a 1998 questionnaire survey of 1,487 students at a public high school in Cambridge, Massachusetts, were used to assess correlates of healthy lifestyle choices. Strict and broad assessments of healthy behaviors were defined for students covering use of alcohol, tobacco, and illegal drugs; sexual behavior; and attempted suicide. Whereas the "strict" criteria included only those adolescents who did not practice any of the behaviors in question, the broad criteria reflected experimentation and moderate risk-taking. The prevalence of positive behaviors was assessed by demographic and student characteristics. In addition, logistic regression models were created to predict determinants of teenagers' healthy lifestyles using both the strict and broad definitions. Using the strict criteria of a healthy lifestyle, significant predictors were being female, being born outside the United States, higher academic performance, and fewer stressful life events. Using the broad definition of a healthy lifestyle, significant predictors were being non-Caucasian, being in the lower grade levels at the school, higher academic performance, and fewer stressful life events. In both models, peers' approval of risky behaviors negatively influenced teens' lifestyles, whereas parents' disapproval of risky behaviors was a positive influence. These results reinforce the importance of school, peer, and parent support of positive behaviors. It is important for public health workers and families to understand and define healthy lifestyle choices for adolescents.
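
    The modelling step described above can be sketched as a logistic regression of a healthy-lifestyle indicator on student characteristics, fitted once under the strict definition and once under the broad one. The data and predictors below are synthetic stand-ins for the survey variables, included only to show the shape of the analysis.

    ```python
    # Hedged sketch: logistic regression under strict vs. broad outcome definitions.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(42)
    n = 1487
    X = np.column_stack([
        rng.integers(0, 2, n),   # female (0/1)
        rng.integers(0, 2, n),   # born outside the US (0/1)
        rng.normal(0, 1, n),     # academic performance (standardized)
        rng.poisson(2, n),       # stressful life events
    ])
    logit = -0.2 + 0.5 * X[:, 0] + 0.4 * X[:, 1] + 0.6 * X[:, 2] - 0.3 * X[:, 3]
    y_strict = rng.binomial(1, 1 / (1 + np.exp(-logit)))         # strict definition
    y_broad = rng.binomial(1, 1 / (1 + np.exp(-(logit + 1.0))))  # broad (more inclusive)

    for label, y in [("strict", y_strict), ("broad", y_broad)]:
        coefs = LogisticRegression(max_iter=1000).fit(X, y).coef_[0]
        print(label, np.round(coefs, 2))
    ```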

  16. Defining food literacy: A scoping review.

    Science.gov (United States)

    Truman, Emily; Lane, Daniel; Elliott, Charlene

    2017-09-01

    The term "food literacy" describes the idea of proficiency in food related skills and knowledge. This prevalent term is broadly applied, although its core elements vary from initiative to initiative. In light of its ubiquitous use-but varying definitions-this article establishes the scope of food literacy research by identifying all articles that define 'food literacy', analysing its key conceptualizations, and reporting outcomes/measures of this concept. A scoping review was conducted to identify all articles (academic and grey literature) using the term "food literacy". Databases included Medline, Pubmed, Embase, CAB Abstracts, CINAHL, Scopus, JSTOR, and Web of Science, and Google Scholar. Of 1049 abstracts, 67 studies were included. From these, data was extracted on country of origin, study type (methodological approach), primary target population, and the primary outcomes relating to food literacy. The majority of definitions of food literacy emphasize the acquisition of critical knowledge (information and understanding) (55%) over functional knowledge (skills, abilities and choices) (8%), although some incorporate both (37%). Thematic analysis of 38 novel definitions of food literacy reveals the prevalence of six themes: skills and behaviours, food/health choices, culture, knowledge, emotions, and food systems. Study outcomes largely focus on knowledge generating measures, with very few focusing on health related outcome measures. Current definitions of food literacy incorporate components of six key themes or domains and attributes of both critical and functional knowledge. Despite this broad definition of the term, most studies aiming to improve food literacy focus on knowledge related outcomes. Few articles address health outcomes, leaving an important gap (and opportunity) for future research in this field. Copyright © 2017 Elsevier Ltd. All rights reserved.

  17. Methodologies for defining quality of life

    Energy Technology Data Exchange (ETDEWEB)

    Glicken, J. [Ecological Planning and Toxicology, Inc., Albuquerque, NM (United States); Engi, D. [Sandia National Labs., Albuquerque, NM (United States)

    1996-10-10

    Quality of life as a concept has been used in many ways in the public policy arena. It can be used in summative evaluations to assess the impacts of policies or programs. Alternatively, it can be applied to formative evaluations to provide input to the formation of new policies. In short, it provides the context for the understanding needed to evaluate the results of choices that have been made in the public policy arena, or the potential of choices yet to be made. In either case, the public policy question revolves around the positive or negative impact the choice will have on quality of life, and the magnitude of that impact. This discussion will develop a conceptual framework that proposes that an assessment of quality of life is based on a comparison of expectations with experience. The framework defines four basic components from which these expectations arise: natural conditions, social conditions, the body, and the mind. Each of these components is generally described and associated with a general policy or rhetorical category which gives it its policy vocabulary: environmental quality, economic well-being, human health, and self-fulfillment.

  18. Defining a methodology for benchmarking spectrum unfolding codes

    International Nuclear Information System (INIS)

    Meyer, W.; Kirmser, P.G.; Miller, W.H.; Hu, K.K.

    1976-01-01

    It has long been recognized that different neutron spectrum unfolding codes will produce significantly different results when unfolding the same measured data. In reviewing the results of such analyses it has been difficult to determine which result, if any, is the best representation of what was measured by the spectrometer detector. A proposal to develop a benchmarking procedure for spectrum unfolding codes is presented. The objective of the procedure will be to begin to develop a methodology and a set of data with a well-established and documented result that could be used to benchmark and standardize the various unfolding methods and codes. It is further recognized that development of such a benchmark must involve a consensus of the technical community interested in neutron spectrum unfolding

  19. Shareholder, stakeholder-owner or broad stakeholder maximization

    DEFF Research Database (Denmark)

    Mygind, Niels

    2004-01-01

    With reference to the discussion about shareholder versus stakeholder maximization it is argued that the normal type of maximization is in fact stakeholder-owner maximization. This means maximization of the sum of the value of the shares and stakeholder benefits belonging to the dominating… including the shareholders of a company. Although it may be the ultimate goal for Corporate Social Responsibility to achieve this kind of maximization, it is quite difficult to give broad stakeholder maximization a precise definition. There is no one-dimensional measure to add different stakeholder benefits… not traded on the market, and therefore there is no possibility for practical application. Broad stakeholder maximization in practical applications instead becomes the satisfying of certain stakeholder demands, so that the practical application will be stakeholder-owner maximization under constraints defined…

  20. Surveillance of broad-spectrum antibiotic prescription in Singaporean hospitals: a 5-year longitudinal study.

    Directory of Open Access Journals (Sweden)

    Yi-Xin Liew

    BACKGROUND: Inappropriate prescription of antibiotics may contribute towards higher levels of antimicrobial resistance. A key intervention for improving appropriate antibiotic prescription is surveillance of prescription. This paper presents the results of a longitudinal surveillance of broad-spectrum antibiotic prescription in 5 public-sector hospitals in Singapore from 2006 to 2010. METHODOLOGY/PRINCIPAL FINDINGS: Quarterly antibiotic prescription data were obtained and converted to defined daily doses (DDD) per 1,000 inpatient-days. The presence of significant trends in antibiotic prescription over time, for both individual and combined hospitals, was tested by regression analysis and corrected for autocorrelation between time-points. Excluding fluoroquinolones, there was a significant increase in prescription of all monitored antibiotics, from an average of 233.12 DDD/1,000 inpatient-days in 2006 to 254.38 DDD/1,000 inpatient-days in 2010 (coefficient = 1.13, 95% CI: 0.16-2.09, p = 0.025). Increasing utilization of carbapenems, piperacillin/tazobactam, and Gram-positive agents was seen in the majority of the hospitals, while cephalosporins were prescribed less over time. The combined expenditure for the 5 hospitals increased from USD 9.9 million in 2006 to USD 16.7 million in 2010. CONCLUSIONS/SIGNIFICANCE: The rate of prescription of broad-spectrum antibiotics in Singaporean hospitals is much higher than in European hospitals. This may be due to high rates of antimicrobial resistance. The increase in expenditure on monitored antibiotics over the past 5 years outstripped the actual increase in DDD/1,000 inpatient-days of antibiotics prescribed. Longitudinal surveillance of antibiotic prescription at a hospital and countrywide level is important for detecting trends for formulating interventions or policies. Further research is needed to understand the causes for the various prescription trends and to act on these where
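
    For reference, the defined daily dose (DDD) denominator used above is computed by converting the quantity of each antibiotic dispensed into WHO-defined daily doses and normalizing by inpatient-days. The sketch below illustrates the arithmetic; the usage figures are hypothetical, and DDD values should be checked against the current WHO ATC/DDD index.

    ```python
    # Hedged sketch of the DDD/1,000 inpatient-days calculation.
    WHO_DDD_GRAMS = {"meropenem": 3.0}  # assumed value; verify against the ATC/DDD index

    def ddd_per_1000_inpatient_days(grams_dispensed: float, who_ddd_grams: float,
                                    inpatient_days: float) -> float:
        ddds = grams_dispensed / who_ddd_grams
        return ddds / inpatient_days * 1000

    # e.g. 1,200 g of meropenem dispensed over 25,000 inpatient-days (hypothetical):
    print(ddd_per_1000_inpatient_days(1200, WHO_DDD_GRAMS["meropenem"], 25000))  # 16.0
    ```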

  1. 'The methodology of positive economics' does not give us the methodology of positive economics

    NARCIS (Netherlands)

    U.I. Mäki (Uskali)

    2003-01-01

    It is argued that rather than a well-defined F-Twist, Milton Friedman's 'Methodology of positive economics' offers an F-Mix: a pool of ambiguous and inconsistent ingredients that can be used for putting together a number of different methodological positions. This concerns issues such as

  2. Comparative study on software development methodologies

    Directory of Open Access Journals (Sweden)

    Mihai Liviu DESPA

    2014-12-01

    This paper focuses on the current state of knowledge in the field of software development methodologies. It aims to set the stage for the formalization of a software development methodology dedicated to innovation-orientated IT projects. The paper starts by depicting specific characteristics of software development project management. Managing software development projects involves techniques and skills that are proprietary to the IT industry. The software development project manager also handles challenges and risks that are predominantly encountered in business and research areas involving state-of-the-art technology. Conventional software development stages are defined and briefly described. Development stages are the building blocks of any software development methodology, so it is important to properly research this aspect. Current software development methodologies are presented. Development stages are defined for every showcased methodology. For each methodology a graphic representation is provided in order to better individualize its structure. Software development methodologies are compared by highlighting strengths and weaknesses from the stakeholder's point of view. Conclusions are formulated and a research direction aimed at formalizing a software development methodology dedicated to innovation-orientated IT projects is enunciated.

  3. Resection methodology for PSP data processing: Recent ...

    Indian Academy of Sciences (India)

    PSP data processing, which primarily involves image alignment and image analysis, is a crucial element in obtaining accurate PSP results. There are two broad approaches to image alignment: the algebraic transformation technique, often called the image-warping technique, and the resection methodology, which uses ...

  4. Situating methodology within qualitative research.

    Science.gov (United States)

    Kramer-Kile, Marnie L

    2012-01-01

    Qualitative nurse researchers are required to make deliberate and sometimes complex methodological decisions about their work. Methodology in qualitative research is a comprehensive approach in which theory (ideas) and method (doing) are brought into close alignment. It can be difficult, at times, to understand the concept of methodology. The purpose of this research column is to: (1) define qualitative methodology; (2) illuminate the relationship between epistemology, ontology and methodology; (3) explicate the connection between theory and method in qualitative research design; and (4) highlight relevant examples of methodological decisions made within cardiovascular nursing research. Although there is no "one set way" to do qualitative research, all qualitative researchers should account for the choices they make throughout the research process and articulate their methodological decision-making along the way.

  5. Reliability Centered Maintenance - Methodologies

    Science.gov (United States)

    Kammerer, Catherine C.

    2009-01-01

    Journal article about Reliability Centered Maintenance (RCM) methodologies used by United Space Alliance, LLC (USA) in support of the Space Shuttle Program at Kennedy Space Center. The USA Reliability Centered Maintenance program differs from traditional RCM programs because various methodologies are utilized to take advantage of their respective strengths for each application. Based on operational experience, USA has customized the traditional RCM methodology into a streamlined lean logic path and has implemented the use of statistical tools to drive the process. USA RCM has integrated many of the L6S tools into both RCM methodologies. The tools utilized in the Measure, Analyze, and Improve phases of a Lean Six Sigma project lend themselves to application in the RCM process. All USA RCM methodologies meet the requirements defined in SAE JA 1011, Evaluation Criteria for Reliability-Centered Maintenance (RCM) Processes. The proposed article explores these methodologies.

  6. The Speaker Respoken: Material Rhetoric as Feminist Methodology.

    Science.gov (United States)

    Collins, Vicki Tolar

    1999-01-01

    Presents a methodology based on the concept of "material rhetoric" that can help scholars avoid problems as they reclaim women's historical texts. Defines material rhetoric and positions it theoretically in relation to other methodologies, including bibliographical studies, reception theory, and established feminist methodologies. Illustrates…

  7. Adaptation of Agile Project Management Methodology for Project Team

    Directory of Open Access Journals (Sweden)

    Rasnacis Arturs

    2015-12-01

    A project management methodology that defines the basic processes, tools, techniques, methods, resources and procedures used to manage a project is necessary for effective and successful IT project management. Each company needs to define its own methodology or adapt one of the existing ones. The purpose of the research is to evaluate the possibilities of adapting an IT project development methodology according to the company, the characteristics of its employees and their mutual relations. The adaptation process is illustrated with a case study at an IT company in Latvia where the developed methodology is based on Agile Scrum, one of the most widespread Agile methods.

  8. DEFINING THE CHEMICAL SPACE OF PUBLIC GENOMIC ...

    Science.gov (United States)

    The current project aims to chemically index the genomics content of public genomic databases to make these data accessible in relation to other publicly available, chemically-indexed toxicological information. By defining the chemical space of public genomic data, it is possible to identify classes of chemicals on which to develop methodologies for the integration of chemogenomic data into predictive toxicology. The chemical space of public genomic data will be presented as well as the methodologies and tools developed to identify this chemical space.

  9. A broad view of model validation

    International Nuclear Information System (INIS)

    Tsang, C.F.

    1989-10-01

    The safety assessment of a nuclear waste repository requires the use of models. Such models need to be validated to ensure, as much as possible, that they are a good representation of the actual processes occurring in the real system. In this paper we attempt to take a broad view by reviewing the modeling process step by step and bringing out the need to validate every step of this process. This model validation includes not only comparison of modeling results with data from selected experiments, but also evaluation of procedures for the construction of conceptual models and calculational models, as well as methodologies for studying data and parameter correlation. The need for advancing basic scientific knowledge in related fields, for multiple assessment groups, and for presenting our modeling efforts in the open literature for public scrutiny is also emphasized. 16 refs

  10. Researching virtual worlds: methodologies for studying emergent practices

    CERN Document Server

    Phillips, Louise

    2013-01-01

    This volume presents a wide range of methodological strategies that are designed to take into account the complex, emergent, and continually shifting character of virtual worlds. It interrogates how virtual worlds emerge as objects of study through the development and application of various methodological strategies. Virtual worlds are not considered objects that exist as entities with fixed attributes independent of our continuous engagement with them and interpretation of them. Instead, they are conceived of as complex ensembles of technology, humans, symbols, discourses, and economic structures, ensembles that emerge in ongoing practices and specific situations. A broad spectrum of perspectives and methodologies is presented: Actor-Network-Theory and post-Actor-Network-Theory, performativity theory, ethnography, discourse analysis, Sense-Making Methodology, visual ethnography, multi-sited ethnography, and Social Network Analysis.

  11. A GIS-based methodology to quantitatively define an Adjacent Protected Area in a shallow karst cavity: the case of Altamira cave.

    Science.gov (United States)

    Elez, J; Cuezva, S; Fernandez-Cortes, A; Garcia-Anton, E; Benavente, D; Cañaveras, J C; Sanchez-Moral, S

    2013-03-30

    Different types of land use are usually present in the areas adjacent to many shallow karst cavities. Over time, the increasing amount of potentially harmful matter and energy, of mainly anthropic origin or influence, that reaches the interior of a shallow karst cavity can modify the hypogeal ecosystem and increase the risk of damage to the Palaeolithic rock art often preserved within the cavity. This study proposes a new Protected Area status based on the geological processes that control these matter and energy fluxes into the Altamira cave karst system. Analysis of the geological characteristics of the shallow karst system shows that direct and lateral infiltration, internal water circulation, ventilation, gas exchange and transmission of vibrations are the processes that control these matter and energy fluxes into the cave. This study applies a comprehensive methodological approach based on Geographic Information Systems (GIS) to establish the area of influence of each transfer process. The stratigraphic and structural characteristics of the interior of the cave were determined using 3D Laser Scanning topography combined with classical field work, data gathering, cartography and a porosity-permeability analysis of host rock samples. As a result, it was possible to determine the hydrogeological behavior of the cave. In addition, by mapping and modeling the surface parameters it was possible to identify the main features restricting hydrological behavior and hence direct and lateral infiltration into the cave. These surface parameters included the shape of the drainage network and a geomorphological and structural characterization via digital terrain models. Geological and geomorphological maps and models integrated into the GIS environment defined the areas involved in gas exchange and ventilation processes. Likewise, areas that could potentially transmit vibrations directly into the cave were identified. This study shows that it is possible to define a

  12. Which sociodemographic factors are important on smoking behaviour of high school students? The contribution of classification and regression tree methodology in a broad epidemiological survey.

    Science.gov (United States)

    Ozge, C; Toros, F; Bayramkaya, E; Camdeviren, H; Sasmaz, T

    2006-08-01

    The purpose of this study is to evaluate the most important sociodemographic factors affecting the smoking status of high school students, using a broad randomised epidemiological survey. Using an in-class, self-administered questionnaire about their sociodemographic variables and smoking behaviour, a representative sample of 3304 students in the preparatory, 9th, 10th, and 11th grades from 22 randomly selected schools of Mersin was evaluated, and discriminative factors were determined using appropriate statistics. In addition to binary logistic regression analysis, the study evaluated the combined effects of these factors using classification and regression tree methodology, as a new statistical method. The data showed that 38% of the students reported lifetime smoking and 16.9% reported current smoking, with a male predominance and increasing prevalence with age. Second-hand smoke exposure was reported at a frequency of 74.3%, most often from fathers (56.6%). The significant factors affecting current smoking in these age groups were increased household size, late birth rank, certain school types, low academic performance, increased second-hand smoke exposure, and stress (especially reported as separation from a close friend or violence at home). Classification and regression tree methodology showed the importance of some neglected sociodemographic factors, with good classification capacity. It was concluded that, being closely related to sociocultural factors, smoking was a common problem in this young population, generating an important academic and social burden in youth life; with increasing data about this behaviour and the use of new statistical methods, effective coping strategies could be composed.
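
    The classification and regression tree step mentioned above can be sketched with a standard CART implementation; the synthetic data and feature names below are illustrative, not the study's actual survey coding.

    ```python
    # Hedged sketch of a classification tree for current-smoking status.
    import numpy as np
    from sklearn.tree import DecisionTreeClassifier, export_text

    rng = np.random.default_rng(0)
    n = 1000
    X = np.column_stack([
        rng.integers(3, 9, n),   # household size
        rng.integers(1, 5, n),   # birth rank
        rng.integers(0, 2, n),   # second-hand smoke at home (0/1)
        rng.integers(1, 6, n),   # academic performance (1 = low .. 5 = high)
    ])
    # Synthetic outcome loosely tied to the predictors, for demonstration only.
    p = 1 / (1 + np.exp(-(-2.5 + 0.2 * X[:, 0] + 0.8 * X[:, 2] - 0.3 * X[:, 3])))
    y = rng.binomial(1, p)

    tree = DecisionTreeClassifier(max_depth=3, min_samples_leaf=50).fit(X, y)
    print(export_text(tree, feature_names=[
        "household_size", "birth_rank", "second_hand_smoke", "academic_perf"]))
    ```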

  13. A hybrid design methodology for structuring an Integrated Environmental Management System (IEMS) for shipping business.

    Science.gov (United States)

    Celik, Metin

    2009-03-01

    The International Safety Management (ISM) Code defines a broad framework for the safe management and operation of merchant ships, maintaining high standards of safety and environmental protection. On the other hand, ISO 14001:2004 provides a generic, worldwide environmental management standard that has been utilized by several industries. Both the ISM Code and ISO 14001:2004 share the practical goal of establishing a sustainable Integrated Environmental Management System (IEMS) for shipping businesses. This paper presents a hybrid design methodology that shows how requirements from both standards can be combined into a single execution scheme. Specifically, the Analytic Hierarchy Process (AHP) and Fuzzy Axiomatic Design (FAD) are used to structure an IEMS for ship management companies. This research provides a decision aid to maritime executives in order to enhance environmental performance in the shipping industry.
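
    Of the two techniques named above, the AHP step is the more mechanical one: criterion weights are derived from a pairwise comparison matrix via its principal eigenvector and screened with a consistency ratio. The 3x3 matrix and criteria below are illustrative assumptions, not the paper's actual IEMS requirement structure.

    ```python
    # Hedged sketch of AHP weight derivation (Saaty's eigenvector method).
    import numpy as np

    A = np.array([
        [1.0, 3.0, 5.0],   # pairwise judgements for three hypothetical criteria
        [1/3, 1.0, 2.0],
        [1/5, 1/2, 1.0],
    ])

    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()                           # priority vector (weights sum to 1)

    n = A.shape[0]
    ci = (eigvals.real[k] - n) / (n - 1)   # consistency index
    cr = ci / 0.58                         # random index for n = 3
    print(np.round(w, 3), round(cr, 3))    # CR < 0.1 is the usual acceptability cutoff
    ```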

  14. Media Literacy, Education & (Civic) Capability: A Transferable Methodology

    Science.gov (United States)

    McDougall, Julian; Berger, Richard; Fraser, Pete; Zezulkova, Marketa

    2015-01-01

    This article explores the relationship between a formal media educational encounter in the UK and the broad objectives for media and information literacy education circulating in mainland Europe and the US. A pilot study, developed with a special interest group of the United Kingdom Literacy Association, applied a three-part methodology for…

  15. Animal Models of Virus-Induced Neurobehavioral Sequelae: Recent Advances, Methodological Issues, and Future Prospects

    Directory of Open Access Journals (Sweden)

    Marco Bortolato

    2010-01-01

    Converging lines of clinical and epidemiological evidence suggest that viral infections in early developmental stages may be a causal factor in neuropsychiatric disorders such as schizophrenia, bipolar disorder, and autism-spectrum disorders. This etiological link, however, remains controversial in view of the lack of consistent and reproducible associations between viruses and mental illness. Animal models of virus-induced neurobehavioral disturbances afford powerful tools to test etiological hypotheses and explore pathophysiological mechanisms. Prenatal or neonatal inoculations of neurotropic agents (such as herpes-, influenza-, and retroviruses) in rodents result in a broad spectrum of long-term alterations reminiscent of psychiatric abnormalities. Nevertheless, the complexity of these sequelae often poses methodological and interpretational challenges and thwarts their characterization. The recent conceptual advancements in psychiatric nosology and behavioral science may help determine new heuristic criteria to enhance the translational value of these models. A particularly critical issue is the identification of intermediate phenotypes, defined as quantifiable factors representing single neurochemical, neuropsychological, or neuroanatomical aspects of a diagnostic category. In this paper, we examine how the employment of these novel concepts may lead to new methodological refinements in the study of virus-induced neurobehavioral sequelae through animal models.

  16. Methodologies used in Project Management

    OpenAIRE

    UNGUREANU, Adrian; UNGUREANU, Anca

    2014-01-01

    Undoubtedly, a methodology properly defined and strictly followed for project management provides a firm guarantee that the work will be done on time, on budget and according to specifications. A project management methodology, in simple terms, is a "must-have" to avoid failure and reduce risks, because it is one of the critical success factors, alongside the basic skills of the management team. This is the simple way to guide the team through the design and execution phases, processes and tasks throughout...

  17. Defining the mobilome.

    Science.gov (United States)

    Siefert, Janet L

    2009-01-01

    This chapter defines the agents that provide for the movement of genetic material which fuels the adaptive potential of life on our planet. The chapter has been structured to be broadly comprehensive, arbitrarily categorizing the mobilome into four classes: (1) transposons, (2) plasmids, (3) bacteriophage, and (4) self-splicing molecular parasites. Our increasing understanding of the mobilome is as dynamic as the mobilome itself. With continuing discovery, it is clear that nature has not confined these genomic agents of change to neat categories, but rather the classification categories overlap and intertwine. Massive sequencing efforts and their published analyses are continuing to refine our understanding of the extent of the mobilome. This chapter provides a framework to describe our current understanding of the mobilome and a foundation on which appreciation of its impact on genome evolution can be understood.

  18. Computational thermal analysis of cylindrical fin design parameters and a new methodology for defining fin structure in LED automobile headlamp cooling applications

    International Nuclear Information System (INIS)

    Sökmen, Kemal Furkan; Yürüklü, Emrah; Yamankaradeniz, Nurettin

    2016-01-01

    Highlights: • In the study, the cooling of automotive LED headlamps is investigated. • The study is based on free-convection cooling of the LED module. • In addition to free convection, a Monte Carlo model is used as the radiation model. • A new algorithm is presented for designing the optimum fin structure. • The suggested algorithm for optimum design is verified by various simulations. - Abstract: In this study, the effects of fin design, fin material, and free and forced convection on junction temperature in automotive headlamp cooling applications of LED lights are researched by using ANSYS CFX 14 software. Furthermore, a new methodology is presented for defining the optimum cylindrical fin structure within the given limits. For measuring the performance of the methodology, analyses are carried out for various ambient temperatures (25 °C, 50 °C and 80 °C) and different LED power dissipations (0.5 W, 0.75 W, 1 W and 1.25 W). Then, analyses are repeated with different heat transfer coefficients and different fin materials to calculate the LED junction temperature and to check whether the fin structure proposed by the methodology is appropriate for staying below the given safety temperature limit. As a result, the suggested method always proposed proper fin structures with optimum characteristics for the given LED designs. As another result, for safe junction temperature ranges, it is seen that for all LED power dissipations, adding an aluminum or copper plate behind the printed circuit board at low ambient temperatures is sufficient. Also, as the ambient temperature increases, especially for high-powered LED lights, the addition of an aluminum plate is not sufficient and fin usage becomes essential. A high heat transfer coefficient and the use of copper fins affect the junction temperature positively.
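
    For orientation, the heat dissipated by a single cylindrical (pin) fin is commonly estimated with the textbook adiabatic-tip relation q = sqrt(h P k A_c) (T_b - T_inf) tanh(mL). The dimensions, convection coefficient and temperatures below are illustrative, not values from the paper.

    ```python
    # Hedged sketch: adiabatic-tip pin-fin heat rate estimate.
    import math

    def pin_fin_heat_rate(d, L, k, h, T_base, T_amb):
        """d, L in m; k in W/(m*K); h in W/(m^2*K); temperatures in deg C."""
        P = math.pi * d               # fin perimeter
        Ac = math.pi * d ** 2 / 4     # cross-sectional area
        m = math.sqrt(h * P / (k * Ac))
        return math.sqrt(h * P * k * Ac) * (T_base - T_amb) * math.tanh(m * L)

    # e.g. a 3 mm x 20 mm aluminium fin (k ~ 200 W/m.K) in natural convection (h ~ 10 W/m^2.K):
    print(pin_fin_heat_rate(d=0.003, L=0.02, k=200, h=10, T_base=85, T_amb=25))  # ~0.11 W
    ```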

  19. Standardized radiological hazard analysis for a broad based operational safety program

    International Nuclear Information System (INIS)

    Wadman, W.W. III; Andrews, L.L.

    1992-01-01

    The Radiological Hazard Analysis (RHA) Manual provides a methodology and detailed guidance for systematic analysis of radiological hazards over a broad spectrum of program functions, housed in a wide variety of facilities. Radiological programs at LANL include: research and experimentation; routine materials operations; production; non-destructive examination or testing; isotope and machine-produced radiations; chemistry; and metallurgy. The RHA permits uniform evaluation of hazard types over a range of several orders of magnitude of hazard severity. The results are used to estimate risk, evaluate types and level of resource allocations, identify deficiencies, and plan corrective actions for safe working environments. 2 refs

  20. Standardized radiological hazard analysis for a broad based operational safety program

    International Nuclear Information System (INIS)

    Wadman, W. III; Andrews, L.

    1992-01-01

    The Radiological Hazard Analysis (RHA) Manual provides a methodology and detailed guidance for systematic analysis of radiological hazards over a broad spectrum of program functions, housed in a wide variety of facilities. Radiological programs at LANL include: research and experimentation; routine materials operations; production; non-destructive examination or testing; isotope and machine-produced radiations; chemistry; and metallurgy. The RHA permits uniform evaluation of hazard types over a range of several orders of magnitude of hazard severity. The results are used to estimate risk, evaluate types and level of resource allocations, identify deficiencies, and plan corrective actions for safe working environments. (author)

  1. Genetic utility of broadly defined bipolar schizoaffective disorder as a diagnostic concept

    Science.gov (United States)

    Hamshere, M. L.; Green, E. K.; Jones, I. R.; Jones, L.; Moskvina, V.; Kirov, G.; Grozeva, D.; Nikolov, I.; Vukcevic, D.; Caesar, S.; Gordon-Smith, K.; Fraser, C.; Russell, E.; Breen, G.; St Clair, D.; Collier, D. A.; Young, A. H.; Ferrier, I. N.; Farmer, A.; McGuffin, P.; Holmans, P. A.; Owen, M. J.; O’Donovan, M. C.; Craddock, N.

    2009-01-01

    Background: Psychiatric phenotypes are currently defined according to sets of descriptive criteria. Although many of these phenotypes are heritable, it would be useful to know whether any of the various diagnostic categories in current use identify cases that are particularly helpful for biological-genetic research. Aims: To use genome-wide genetic association data to explore the relative genetic utility of seven different descriptive operational diagnostic categories relevant to bipolar illness within a large UK case-control bipolar disorder sample. Method: We analysed our previously published Wellcome Trust Case Control Consortium (WTCCC) bipolar disorder genome-wide association data-set, comprising 1868 individuals with bipolar disorder and 2938 controls genotyped for 276 122 single nucleotide polymorphisms (SNPs) that met stringent criteria for genotype quality. For each SNP we performed a test of association (bipolar disorder group v. control group) and used the number of independently associated SNPs reaching statistical significance as the index of genetic signal. The diagnostic categories compared included RDC schizoaffective disorder, bipolar type, and the DSM-IV categories bipolar I disorder, bipolar II disorder and schizoaffective disorder, bipolar type. Results: The RDC schizoaffective disorder, bipolar type subset (v. controls) stood out from the other diagnostic subsets as having a significant excess of independent association signals, suggesting that individuals with schizoaffective features have either a particularly strong genetic contribution or, as a group, are genetically more homogeneous than the other phenotypes tested. The results point to the importance of using diagnostic approaches that recognise this group of individuals. Our approach can be applied to similar data-sets for other psychiatric and non-psychiatric phenotypes. PMID:19567891
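
    The per-SNP association test and the "count of significant independent SNPs" metric described above can be sketched as below. The allele counts are synthetic, and neither the paper's significance threshold nor its linkage-disequilibrium pruning is reproduced here.

    ```python
    # Hedged sketch: allelic chi-square test per SNP, counting significant hits.
    import numpy as np
    from scipy.stats import chi2_contingency

    rng = np.random.default_rng(1)
    n_snps, n_cases, n_controls = 5000, 1868, 2938

    significant = 0
    for _ in range(n_snps):
        freq = rng.uniform(0.1, 0.9)                     # population allele frequency
        case_alt = rng.binomial(2 * n_cases, freq)       # alt-allele count in cases
        ctrl_alt = rng.binomial(2 * n_controls, freq)    # alt-allele count in controls
        table = [[case_alt, 2 * n_cases - case_alt],
                 [ctrl_alt, 2 * n_controls - ctrl_alt]]
        _, p, _, _ = chi2_contingency(table)
        if p < 1e-4:                                     # illustrative threshold only
            significant += 1

    print(significant)   # under the null, roughly n_snps * 1e-4
    ```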

  2. 2D nanomaterials assembled from sequence-defined molecules

    International Nuclear Information System (INIS)

    Mu, Peng; State University of New York; Zhou, Guangwen; Chen, Chun-Long

    2017-01-01

    Two dimensional (2D) nanomaterials have attracted broad interest owing to their unique physical and chemical properties with potential applications in electronics, chemistry, biology, medicine and pharmaceutics. Due to the current limitations of traditional 2D nanomaterials (e.g., graphene and graphene oxide) in tuning surface chemistry and compositions, 2D nanomaterials assembled from sequence-defined molecules (e.g., DNAs, proteins, peptides and peptoids) have recently been developed. They represent an emerging class of 2D nanomaterials with attractive physical and chemical properties. Here, we summarize the recent progress in the synthesis and applications of this type of sequence-defined 2D nanomaterials. We also discuss the challenges and opportunities in this new field.

  3. Safety at Work : Research Methodology

    NARCIS (Netherlands)

    Beurden, van K. (Karin); Boer, de J. (Johannes); Brinks, G. (Ger); Goering-Zaburnenko, T. (Tatiana); Houten, van Y. (Ynze); Teeuw, W. (Wouter)

    2012-01-01

    In this document, we provide the methodological background for the Safety at Work project. This document combines several project deliverables as defined in the overall project plan: validation techniques and methods (D5.1.1), performance indicators for safety at work (D5.1.2), personal protection

  4. Methodological Guidelines for Advertising Research

    DEFF Research Database (Denmark)

    Rossiter, John R.; Percy, Larry

    2017-01-01

    In this article, highly experienced advertising academics and advertising research consultants John R. Rossiter and Larry Percy present and discuss what they believe to be the seven most important methodological guidelines that need to be implemented to improve the practice of advertising research. … Their focus is on methodology, defined as first choosing a suitable theoretical framework to guide the research study and then identifying the advertising responses that need to be studied. Measurement of those responses is covered elsewhere in this special issue in the article by Bergkvist and Langner. Most…

  5. A Proposed Methodology for the Conceptualization, Operationalization, and Empirical Validation of the Concept of Information Need

    Science.gov (United States)

    Afzal, Waseem

    2017-01-01

    Introduction: The purpose of this paper is to propose a methodology to conceptualize, operationalize, and empirically validate the concept of information need. Method: The proposed methodology makes use of both qualitative and quantitative perspectives, and includes a broad array of approaches such as literature reviews, expert opinions, focus…

  6. The Theoretical and Methodological Crisis of the Afrocentric Conception.

    Science.gov (United States)

    Banks, W. Curtis

    1992-01-01

    Defines the theory of the Afrocentric conception, and comments on Afrocentric research methodology. The Afrocentric conception is likely to succeed if it constructs a particularist theory in contrast to cross-cultural relativism and because it relies on the methodology of the absolute rather than the comparative. (SLD)

  7. Relative Hazard and Risk Measure Calculation Methodology

    International Nuclear Information System (INIS)

    Stenner, Robert D.; Strenge, Dennis L.; Elder, Matthew S.; Andrews, William B.; Walton, Terry L.

    2003-01-01

    The RHRM equations, as represented in the methodology and code presented in this report, are primarily a collection of key factors normally used in risk assessment that are relevant to understanding the hazards and risks associated with projected mitigation, cleanup, and risk management activities. The RHRM code has broad application potential. For example, it can be used to compare one mitigation, cleanup, or risk management activity with another, instead of comparing it only to the fixed baseline. If the appropriate source term data are available, it can be used in its non-ratio form to estimate absolute values of the associated controlling hazards and risks. These estimated values of controlling hazards and risks can then be examined to help understand which mitigation, cleanup, or risk management activities are addressing the higher hazard conditions and risk reduction potential at a site. Graphics can be generated from these absolute controlling hazard and risk values to graphically compare these high hazard and risk reduction potential conditions. If the RHRM code is used in this manner, care must be taken to specifically define and qualify (e.g., identify which factors were considered and which ones tended to drive the hazard and risk estimates) the resultant absolute controlling hazard and risk values.
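
    As a rough illustration of the ratio versus non-ratio use described above, the sketch below computes a generic hazard measure as a product of factors and reports each activity both as an absolute value and relative to a baseline. The factor names and numbers are hypothetical; they are not the RHRM equations from the report.

    ```python
    # Hypothetical hazard measure: source term x transport factor x dose factor.
    # Values are illustrative only; the real RHRM factors are defined in the report.
    def hazard(source_term_ci, transport_factor, dose_factor_rem_per_ci):
        return source_term_ci * transport_factor * dose_factor_rem_per_ci

    baseline = hazard(source_term_ci=1.0e4, transport_factor=1.0e-3, dose_factor_rem_per_ci=0.5)

    activities = {
        "cleanup option A": hazard(2.0e3, 1.0e-3, 0.5),   # smaller source term
        "cleanup option B": hazard(1.0e4, 5.0e-5, 0.5),   # better confinement
    }

    for name, value in activities.items():
        print(f"{name}: absolute hazard = {value:.3g}, ratio to baseline = {value / baseline:.3g}")
    ```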

  8. A Methodology for Integrating Maintainability Using Software Metrics

    OpenAIRE

    Lewis, John A.; Henry, Sallie M.

    1989-01-01

    Maintainability must be integrated into software early in the development process. But for practical use, the techniques used must be as unobtrusive to the existing software development process as possible. This paper defines a methodology for integrating maintainability into large-scale software and describes an experiment which implemented the methodology into a major commercial software development environment.

  9. A Security Assessment Mechanism for Software-Defined Networking-Based Mobile Networks

    Directory of Open Access Journals (Sweden)

    Shibo Luo

    2015-12-01

    Full Text Available Software-Defined Networking-based Mobile Networks (SDN-MNs are considered the future of 5G mobile network architecture. With the evolving cyber-attack threat, security assessments need to be performed in the network management. Due to the distinctive features of SDN-MNs, such as their dynamic nature and complexity, traditional network security assessment methodologies cannot be applied directly to SDN-MNs, and a novel security assessment methodology is needed. In this paper, an effective security assessment mechanism based on attack graphs and an Analytic Hierarchy Process (AHP is proposed for SDN-MNs. Firstly, this paper discusses the security assessment problem of SDN-MNs and proposes a methodology using attack graphs and AHP. Secondly, to address the diversity and complexity of SDN-MNs, a novel attack graph definition and attack graph generation algorithm are proposed. In order to quantify security levels, the Node Minimal Effort (NME is defined to quantify attack cost and derive system security levels based on NME. Thirdly, to calculate the NME of an attack graph that takes the dynamic factors of SDN-MN into consideration, we use AHP integrated with the Technique for Order Preference by Similarity to an Ideal Solution (TOPSIS as the methodology. Finally, we offer a case study to validate the proposed methodology. The case study and evaluation show the advantages of the proposed security assessment mechanism.
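
    The AHP-plus-TOPSIS step can be sketched with a generic TOPSIS ranking. The alternatives, criteria scores and weights below are hypothetical stand-ins (the weights would normally come from the AHP pairwise-comparison step), so this illustrates the ranking technique rather than the paper's NME computation.

    ```python
    # Generic TOPSIS ranking sketch with hypothetical scores and AHP-style weights.
    import numpy as np

    # Rows: alternatives (e.g., candidate network configurations); columns: criteria.
    scores = np.array([
        [7.0, 0.8, 3.0],
        [5.0, 0.6, 6.0],
        [9.0, 0.9, 2.0],
    ])
    weights = np.array([0.5, 0.3, 0.2])       # assumed to come from an AHP step
    benefit = np.array([True, True, False])   # True = larger is better, False = smaller is better

    norm = scores / np.linalg.norm(scores, axis=0)   # vector normalisation per criterion
    weighted = norm * weights

    ideal_best = np.where(benefit, weighted.max(axis=0), weighted.min(axis=0))
    ideal_worst = np.where(benefit, weighted.min(axis=0), weighted.max(axis=0))

    d_best = np.linalg.norm(weighted - ideal_best, axis=1)
    d_worst = np.linalg.norm(weighted - ideal_worst, axis=1)
    closeness = d_worst / (d_best + d_worst)          # higher = closer to the ideal solution

    for i, c in enumerate(closeness):
        print(f"alternative {i}: closeness = {c:.3f}")
    ```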

  10. A Security Assessment Mechanism for Software-Defined Networking-Based Mobile Networks.

    Science.gov (United States)

    Luo, Shibo; Dong, Mianxiong; Ota, Kaoru; Wu, Jun; Li, Jianhua

    2015-12-17

    Software-Defined Networking-based Mobile Networks (SDN-MNs) are considered the future of 5G mobile network architecture. With the evolving cyber-attack threat, security assessments need to be performed in the network management. Due to the distinctive features of SDN-MNs, such as their dynamic nature and complexity, traditional network security assessment methodologies cannot be applied directly to SDN-MNs, and a novel security assessment methodology is needed. In this paper, an effective security assessment mechanism based on attack graphs and an Analytic Hierarchy Process (AHP) is proposed for SDN-MNs. Firstly, this paper discusses the security assessment problem of SDN-MNs and proposes a methodology using attack graphs and AHP. Secondly, to address the diversity and complexity of SDN-MNs, a novel attack graph definition and attack graph generation algorithm are proposed. In order to quantify security levels, the Node Minimal Effort (NME) is defined to quantify attack cost and derive system security levels based on NME. Thirdly, to calculate the NME of an attack graph that takes the dynamic factors of SDN-MN into consideration, we use AHP integrated with the Technique for Order Preference by Similarity to an Ideal Solution (TOPSIS) as the methodology. Finally, we offer a case study to validate the proposed methodology. The case study and evaluation show the advantages of the proposed security assessment mechanism.

  11. Methodology for astronaut reconditioning research.

    Science.gov (United States)

    Beard, David J; Cook, Jonathan A

    2017-01-01

    Space medicine offers some unique challenges, especially in terms of research methodology. A specific challenge for astronaut reconditioning involves identification of what aspects of terrestrial research methodology hold and which require modification. This paper reviews this area and presents appropriate solutions where possible. It is concluded that spaceflight rehabilitation research should remain question/problem driven and is broadly similar to the terrestrial equivalent on small populations, such as rare diseases and various sports. Astronauts and Medical Operations personnel should be involved at all levels to ensure feasibility of research protocols. There is room for creative and hybrid methodology but careful systematic observation is likely to be more achievable and fruitful than complex trial based comparisons. Multi-space agency collaboration will be critical to pool data from small groups of astronauts with the accepted use of standardised outcome measures across all agencies. Systematic reviews will be an essential component. Most limitations relate to the inherent small sample size available for human spaceflight research. Early adoption of a co-operative model for spaceflight rehabilitation research is therefore advised. Copyright © 2016 Elsevier Ltd. All rights reserved.

  12. A systematic methodology for design of tailor-made blended products

    DEFF Research Database (Denmark)

    Yunus, Nor Alafiza Binti; Gernaey, Krist; Woodley, John

    2014-01-01

    A systematic methodology for design of tailor-made blended products has been developed. In tailor-made blended products, one identifies the product needs and matches them by blending different chemicals. The systematic methodology has four main tasks. First, the design problem is defined: the pro… The methodology is highlighted through two case studies involving gasoline blends and lubricant base oils.

  13. A COMPREHENSIVE NONPOINT SOURCE FIELD STUDY FOR SEDIMENT, NUTRIENTS, AND PATHOGENS IN THE SOUTH FORK BROAD RIVER WATERSHED IN NORTHEAST GEORGIA

    Science.gov (United States)

    This technical report provides a description of the field project design, quality control, the sampling protocols and analysis methodology used, and standard operating procedures for the South Fork Broad River Watershed (SFBR) Total Maximum Daily Load (TMDL) project. This watersh...

  14. Development and application of a methodology for identifying and characterising scenarios

    International Nuclear Information System (INIS)

    Billington, D.; Bailey, L.

    1998-01-01

    This report forms part of a suite of documents describing the Nirex model development programme. The programme is designed to provide a clear audit trail from the identification of significant features, events and processes (FEPs) to the models and modelling processes employed within a detailed performance assessment. A scenario approach to performance assessment has been adopted. It is proposed that potential evolutions of a deep geological radioactive waste repository can be represented by a base scenario and a number of variant scenarios. It is intended that assessment of the base scenario would form the core of any future performance assessment. The base scenario is chosen to be broad-ranging and to represent the natural evolution of the repository system and its surrounding environment. The base scenario is defined to include all those FEPs which are certain to occur and those which are judged likely to occur for a significant period of the assessment timescale. Variant scenarios are defined by FEPs which represent a significant perturbation to the natural system evolution, for example the occurrence of a large seismic event. A variant scenario defined by a single initiating FEP is characterised by a sequence of events. This is represented as a 'timeline' which forms the basis for modelling that scenario. To generate a variant scenario defined by two initiating FEPs, a methodology is presented for combining the timelines for the two underlying 'single-FEP' variants. The resulting series of event sequences can be generated automatically. These sequences are then reviewed, in order to reduce the number of timelines requiring detailed consideration. This is achieved in two ways: by aggregating sequences which have similar consequences in terms of safety performance; and by combining successive intervals along a timeline where appropriate. In the context of a performance assessment, the aim is to determine the conditional risk and appropriate weight for each
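
    One way to read the timeline-combination step is as enumerating the order-preserving interleavings of two single-FEP event sequences. The sketch below does only that, with invented event names; the report's actual generation procedure and the subsequent screening of sequences are richer than this.

    ```python
    # Sketch: enumerate combined timelines of two single-FEP variants while preserving
    # each variant's internal event order. Event names are hypothetical.
    from itertools import combinations

    def interleavings(seq_a, seq_b):
        n, m = len(seq_a), len(seq_b)
        for positions in combinations(range(n + m), n):
            pos_set = set(positions)
            it_a, it_b = iter(seq_a), iter(seq_b)
            yield [next(it_a) if slot in pos_set else next(it_b) for slot in range(n + m)]

    fep1 = ["seismic event", "fault reactivation"]
    fep2 = ["glacial loading", "permafrost formation"]
    for timeline in interleavings(fep1, fep2):
        print(" -> ".join(timeline))
    ```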

  15. Methodological challenges in assessing the environmental status of a marine ecosystem: case study of the Baltic Sea.

    Directory of Open Access Journals (Sweden)

    Henn Ojaveer

    Full Text Available Assessments of the environmental status of marine ecosystems are increasingly needed to inform management decisions and regulate human pressures to meet the objectives of environmental policies. This paper addresses some generic methodological challenges and related uncertainties involved in marine ecosystem assessment, using the central Baltic Sea as a case study. The objectives of good environmental status of the Baltic Sea largely focus on biodiversity, eutrophication and hazardous substances. In this paper, we conduct comparative evaluations of the status of these three segments by applying different methodological approaches. Our analyses indicate that the assessment results are sensitive to the selection of indicators for ecological quality objectives that are affected by a broad spectrum of human activities and natural processes (biodiversity), and less so for objectives that are influenced by a relatively narrow array of drivers (eutrophication, hazardous substances). The choice of indicator aggregation rule appeared to be of essential importance for the assessment results of all three segments, whereas the hierarchical structure of indicators had only a minor influence. Trend-based assessment was shown to be a useful supplement to reference-based evaluation, being independent of the problems related to defining reference values and indicator aggregation methodologies. Results of this study will help in setting priorities for future efforts to improve environmental assessments in the Baltic Sea and elsewhere, and to ensure the transparency of the assessment procedure.
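
    The sensitivity to the choice of indicator aggregation rule noted above can be shown with a toy comparison of a one-out-all-out rule against simple averaging. The indicator names, scores and status threshold are invented for illustration and are not taken from the Baltic Sea assessment.

    ```python
    # Toy comparison of two indicator aggregation rules (scores: 1 = good, 0 = bad; threshold illustrative).
    import statistics

    indicators = {"chlorophyll-a": 0.72, "water clarity": 0.55, "oxygen debt": 0.64}
    threshold = 0.6

    one_out_all_out = min(indicators.values())        # the worst indicator drives the result
    averaged = statistics.mean(indicators.values())   # indicators can compensate for each other

    for rule, value in [("one-out-all-out", one_out_all_out), ("arithmetic mean", averaged)]:
        status = "good" if value >= threshold else "not good"
        print(f"{rule}: aggregated score = {value:.2f} -> {status}")
    ```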

  16. (Re)Defining Salesperson Motivation

    DEFF Research Database (Denmark)

    Khusainova, Rushana; de Jong, Ad; Lee, Nick

    2018-01-01

    The construct of motivation is one of the central themes in selling and sales management research. Yet, to date no review article exists that surveys the construct (both from an extrinsic and intrinsic motivation context), critically evaluates its current status, examines various key challenges apparent from the extant research, and suggests new research opportunities based on a thorough review of past work. The authors explore how motivation is defined, major theories underpinning motivation, how motivation has historically been measured, and key methodologies used over time. In addition, attention is given to principal drivers and outcomes of salesperson motivation. A summarizing appendix of key articles in salesperson motivation is provided.

  17. The 2010 Broad Prize

    Science.gov (United States)

    Education Digest: Essential Readings Condensed for Quick Review, 2011

    2011-01-01

    A new data analysis, based on data collected as part of The Broad Prize process, provides insights into which large urban school districts in the United States are doing the best job of educating traditionally disadvantaged groups: African-American, Hispanics, and low-income students. Since 2002, The Eli and Edythe Broad Foundation has awarded The…

  18. Perceiving beauty in all women: Psychometric evaluation of the Broad Conceptualization of Beauty Scale.

    Science.gov (United States)

    Tylka, Tracy L; Iannantuono, Amy C

    2016-06-01

    Women's ability to broadly conceptualize beauty (i.e., perceive many looks, appearances, body sizes/shapes, and inner characteristics as beautiful) has been identified as a facet of positive body image in qualitative research. A scale is needed to be able to assess this construct within quantitative research. Therefore, we developed the Broad Conceptualization of Beauty Scale (BCBS), which measures the extent women define female beauty widely within external and internal characteristics, and examined its psychometric properties among four community samples totaling 1086 women. Exploratory and confirmatory factor analyses revealed a unidimensional structure with nine items. The internal consistency, test-retest reliability, and construct (convergent, discriminant, and incremental) validity of its scores were upheld. Researchers and clinicians can use the BCBS alone to assess women's perceptions of female beauty, or they can use the BCBS alongside women's perceptions of self-beauty to more comprehensively explore women's ability to broadly conceptualize beauty for others and themselves. Copyright © 2016 Elsevier Ltd. All rights reserved.

  19. Generation of a Broad-Group HTGR Library for Use with SCALE

    International Nuclear Information System (INIS)

    Ellis, Ronald James; Lee, Deokjung; Wiarda, Dorothea; Williams, Mark L.; Mertyurek, Ugur

    2012-01-01

    With current and ongoing interest in high temperature gas reactors (HTGRs), the U.S. Nuclear Regulatory Commission (NRC) anticipates the need for nuclear data libraries appropriate for use in applications for modeling, assessing, and analyzing HTGR reactor physics and operating behavior. The objective of this work was to develop a broad-group library suitable for production analyses with SCALE for HTGR applications. Several interim libraries were generated from SCALE fine-group 238- and 999-group libraries, and the final broad-group library was created from Evaluated Nuclear Data File ENDF/B-VII Release 0 cross-section evaluations using new ORNL methodologies with AMPX, SCALE, and other codes. Furthermore, intermediate resonance (IR) methods were applied to the HTGR broad-group library, and lambda factors and f-factors were incorporated into the library's nuclear data files. A new version of the SCALE BONAMI module named BONAMI-IR was developed to process the IR data in the new library and, thus, eliminate the need for the CENTRM/PMC modules for resonance self-shielding. This report documents the development of the HTGR broad-group nuclear data library and the results of test and benchmark calculations using the new library with SCALE. The 81-group library is shown to model HTGR cases with similar accuracy to the SCALE 238-group library but with significantly faster computational times due to the reduced number of energy groups and the use of BONAMI-IR instead of BONAMI/CENTRM/PMC for resonance self-shielding calculations.
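
    At its core, a broad-group library is produced by collapsing fine-group cross sections with a weighting flux. The sketch below shows that flux-weighted collapse in its simplest form, with invented numbers and group boundaries; the actual library generation relied on AMPX/SCALE processing, resonance treatments and IR factors that are not represented here.

    ```python
    # Minimal flux-weighted group collapse: fine-group cross sections -> broad-group values.
    # Cross sections, flux spectrum and group boundaries are illustrative only.
    import numpy as np

    sigma_fine = np.array([12.0, 8.0, 5.0, 3.5, 2.0, 1.2])   # fine-group cross sections (barns)
    flux_fine  = np.array([0.5, 1.0, 2.0, 3.0, 2.5, 1.0])    # weighting flux spectrum
    broad_map  = [(0, 2), (2, 4), (4, 6)]                     # fine-group index ranges per broad group

    sigma_broad = [
        float(np.sum(sigma_fine[lo:hi] * flux_fine[lo:hi]) / np.sum(flux_fine[lo:hi]))
        for lo, hi in broad_map
    ]
    print([f"{s:.3f}" for s in sigma_broad])
    ```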

  20. Systems Approach to Tourism: A Methodology for Defining Complex Tourism System

    Directory of Open Access Journals (Sweden)

    Jere Jakulin Tadeja

    2017-08-01

    Full Text Available Background and Purpose: The complexity of the tourism system, as well as modelling in the frame of system dynamics, will be discussed in this paper. The phenomenon of tourism, which possesses the typical properties of global and local organisations, will be presented as an open complex system with all its elements, together with an optimal methodology for explaining the relations among them. Owing to its transparency, the approach we present is an excellent tool for finding systems solutions and also serves as a strategic decision-making assessment. We present systems complexity and develop three models of a complex tourism system: the first presents tourism as an open complex system whose elements operate inside a tourism market area. The elements of this system represent subsystems, whose relations and interdependencies will be explained with two models: a causal-loop diagram and a simulation model in the frame of system dynamics.
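
    A minimal stock-and-flow sketch can convey how a causal-loop structure such as the one described above behaves once simulated. The toy model below has one stock (annual visitors), a reinforcing word-of-mouth loop and a balancing crowding loop; all parameter values are invented, and this is not the authors' tourism model.

    ```python
    # Toy system-dynamics run: one stock, one reinforcing loop, one balancing loop.
    attractiveness_base = 1.0
    capacity = 100_000.0        # visitors the destination can absorb comfortably (assumed)
    word_of_mouth = 0.30        # reinforcing-loop strength (assumed)
    visitors = 10_000.0
    dt = 1.0                    # one-year time step

    for year in range(1, 11):
        crowding = visitors / capacity                        # balancing feedback
        attractiveness = attractiveness_base * max(0.0, 1.0 - crowding)
        growth = word_of_mouth * visitors * attractiveness    # reinforcing feedback
        visitors += growth * dt
        print(f"year {year}: visitors ~ {visitors:,.0f}")
    ```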

  1. Methodology for technical risk assessment

    International Nuclear Information System (INIS)

    Waganer, L.M.; Zuckerman, D.S.

    1983-01-01

    A methodology has been developed for and applied to the assessment of the technical risks associated with an evolving technology. This methodology, originally developed for fusion by K. W. Billman and F. R. Scott at EPRI, has been applied to assess the technical risk of a fuel system for a fusion reactor. Technical risk is defined as the risk that a particular technology or component which is currently under development will not achieve a set of required technical specifications (i.e. probability of failure). The individual steps in the technical risk assessment are summarized. The first step in this methodology is to clearly and completely quantify the technical requirements for the particular system being examined. The next step is to identify and define subsystems and various options which appear capable of achieving the required technical performance. The subsystem options are then characterized regarding subsystem functions, interface requirements with the subsystems and systems, important components, developmental obstacles and technical limitations. Key technical subsystem performance parameters are identified which directly or indirectly relate to the system technical specifications. Past, existing and future technical performance data from subsystem experts are obtained by using a Bayesian Interrogation technique. The input data is solicited in the form of probability functions. Thus the output performance of the system is expressed as probability functions
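
    The propagation of expert-elicited probability functions to a system-level probability of failure can be sketched with a small Monte Carlo calculation. The subsystem distributions and the required specification below are hypothetical; the Bayesian interrogation used to elicit the actual inputs is not reproduced here.

    ```python
    # Sketch: combine elicited subsystem performance distributions into a system-level
    # probability of (not) meeting a required specification. All numbers are hypothetical.
    import numpy as np

    rng = np.random.default_rng(1)
    n = 100_000

    subsystem_a = rng.normal(loc=0.92, scale=0.03, size=n)                    # e.g., elicited efficiency
    subsystem_b = rng.triangular(left=0.80, mode=0.90, right=0.95, size=n)    # e.g., elicited availability

    system_performance = subsystem_a * subsystem_b
    spec = 0.80                                     # required system-level performance (assumed)
    p_success = float(np.mean(system_performance >= spec))
    print(f"probability of meeting the specification: {p_success:.3f}")
    print(f"technical risk (probability of failure): {1 - p_success:.3f}")
    ```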

  2. IMPROVING METHODOLOGY OF RISK IDENTIFICATION OF OCCUPATIONAL DANGEROUS

    Directory of Open Access Journals (Sweden)

    A. P. BOCHKOVSKYI

    2018-04-01

    Full Text Available In the paper, based on an analysis of statistical data, the correlation between the number of occupational injuries and occupational diseases in Ukraine over the last 5 years is determined. In addition, using the methodology of the International Labour Organization, the correlation between the number of accident fatalities and the total number of accidents in Ukraine and in EU countries (Austria, Great Britain, Germany, Denmark, Norway, Poland, Hungary, Finland, France) is determined. It is shown that despite the positive trend of a decreasing number of occupational injuries, the number of occupational diseases in Ukraine continues to increase. A comparative analysis of the ratio of accident fatalities to the total number of registered accidents showed that, on average, Ukraine exceeds the EU countries on this indicator by a factor of 100. It is noted that such negative indicators (in particular, the increasing number of occupational diseases) may occur because of an imperfect methodology for identifying the risks of occupational hazards. It is also established that, based on the existing methodology, the identification of occupational hazards is quite subjective, which reduces the objectivity of quantitative assessment. To eliminate these drawbacks, it is proposed for the first time to use a corresponding integral criterion for quantitative risk assessment. To solve this problem, the authors formulate and propose an algorithm for improving the methodology for analysing dangerous and harmful production effects (DHPE), which are the main causes of occupational hazards. The proposed algorithm comprises four successive steps: DHPE identification; indication of their maximum allowed thresholds of concentrations (levels); identification of the sources of the identified DHPE; and estimation of the consequences of their manifestation. The improved methodology allows identification of the risks of occurrence of occupational hazards in systems

  3. Presentation of a methodology for measuring social acceptance of three hydrogen storage technologies and preliminary results

    Energy Technology Data Exchange (ETDEWEB)

    Noirot, I.; Bigay, C. N.

    2005-07-01

    Hydrogen storage is a key technology for the extensive use of H2 as an energy carrier. As none of the current technologies satisfies all of the hydrogen storage attributes required by manufacturers and end users, there are intense research efforts aimed at developing viable solutions. A broad objective of the StorHy European project is to provide technological storage solutions which are attractive from an economic, environmental and safety point of view. A specific sub-project is dedicated to the comparison of three different potential storage technologies for transport applications (compressed gas, cryogenic liquid, solid media). This evaluation is carried out in a harmonised way, based on common tools and assessment strategies that could be useful for decision makers and stakeholders. The assessment is carried out in a 'sustainable development' spirit, taking into consideration technical, environmental, economic, safety and social requirements. The latter have only recently appeared in such evaluations, based on the Quality Function Deployment (QFD) approach, and require further study. Hydrogen acceptability studies have been conducted in previous projects. They have been reviewed by LBST in the AcceptH2 project report 'Public acceptance of Hydrogen Transport Technologies: Analysis and comparisons of existing studies' (www.accepth2.com - August 2003). In these hydrogen acceptance surveys, mainly fuel cell bus passengers from demonstration projects around the world were questioned. The work presented in this paper goes further in refining the methodology, as it focuses on the evaluation of hydrogen storage solutions. It proposes a methodological tool for efficient social evaluation of new technologies, together with associated preliminary results for France. In a global approach to sustainable development, the CEA has developed a new methodology to evaluate its current research projects: Multicriteria Analysis for Sustainable Industrial

  4. 78 FR 20119 - Broad Stakeholder Survey

    Science.gov (United States)

    2013-04-03

    ... DEPARTMENT OF HOMELAND SECURITY [Docket No. DHS-2012-0042] Broad Stakeholder Survey AGENCY... concerning the Broad Stakeholder Survey. DHS previously published this ICR in the Federal Register on August... across the Nation. The Broad Stakeholder Survey is designed to gather stakeholder feedback on the...

  5. Broad band exciplex dye lasers

    International Nuclear Information System (INIS)

    Dienes, A.; Shank, C.V.; Trozzolo, A.M.

    1975-01-01

    The disclosure is concerned with exciplex dye lasers, i.e., lasers in which the emitting species is a complex formed only from a constituent in an electronically excited state. Noting that an exciplex laser, favorable from the standpoint of broad tunability, results from a broad shift in the peak emission wavelength for the exciplex relative to the unreacted species, a desirable class resulting in such broad shift is described. Preferred classes of laser media utilizing specified resonant molecules are set forth. (auth)

  6. Towards an MDA-based development methodology

    NARCIS (Netherlands)

    Gavras, Anastasius; Belaunde, Mariano; Ferreira Pires, Luis; Andrade Almeida, João; Oquendo, Flavio; Warboys, Brian C.; Morrison, Ron

    2004-01-01

    This paper proposes a development methodology for distributed applications based on the principles and concepts of the Model-Driven Architecture (MDA). The paper identifies phases and activities of an MDA-based development trajectory, and defines the roles and products of each activity in accordance

  7. Broad-Band Visually Evoked Potentials: Re(con)volution in Brain-Computer Interfacing.

    Directory of Open Access Journals (Sweden)

    Jordy Thielen

    Full Text Available Brain-Computer Interfaces (BCIs) allow users to control devices and communicate by using brain activity only. BCIs based on broad-band visual stimulation can outperform BCIs using other stimulation paradigms. Visual stimulation with pseudo-random bit-sequences evokes specific Broad-Band Visually Evoked Potentials (BBVEPs) that can be reliably used in BCI for high-speed communication in speller applications. In this study, we report a novel paradigm for a BBVEP-based BCI that utilizes a generative framework to predict responses to broad-band stimulation sequences. In this study we designed a BBVEP-based BCI using modulated Gold codes to mark cells in a visual speller BCI. We defined a linear generative model that decomposes full responses into overlapping single-flash responses. These single-flash responses are used to predict responses to novel stimulation sequences, which in turn serve as templates for classification. The linear generative model explains on average 50% and up to 66% of the variance of responses to both seen and unseen sequences. In an online experiment, 12 participants tested a 6 × 6 matrix speller BCI. On average, an online accuracy of 86% was reached with trial lengths of 3.21 seconds. This corresponds to an Information Transfer Rate of 48 bits per minute (approximately 9 symbols per minute). This study indicates the potential to model and predict responses to broad-band stimulation. These predicted responses are proven to be well-suited as templates for a BBVEP-based BCI, thereby enabling communication and control by brain activity only.
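
    The linear generative model described above treats a measured response as the superposition of one single-flash response placed at every '1' in the stimulation bit-sequence. The sketch below estimates that single-flash response by least squares from a toy training sequence and then classifies a new measurement against predicted templates; the sequence length, noise level and response shape are all invented rather than taken from the study.

    ```python
    # Toy (con)volution sketch: response = stimulus bit-sequence convolved with a single-flash response.
    import numpy as np

    rng = np.random.default_rng(2)
    flash_len = 8                        # samples spanned by one single-flash response (assumed)
    true_r = np.hanning(flash_len)       # "ground-truth" single-flash response (toy)

    def design_matrix(bits, flash_len):
        """Each '1' in the bit-sequence contributes a delayed copy of the single-flash response."""
        M = np.zeros((len(bits), flash_len))
        for t, b in enumerate(bits):
            if b:
                for k in range(flash_len):
                    if t + k < len(bits):
                        M[t + k, k] = 1.0
        return M

    train_bits = rng.integers(0, 2, size=126)
    measured = design_matrix(train_bits, flash_len) @ true_r + rng.normal(scale=0.3, size=126)
    r_hat, *_ = np.linalg.lstsq(design_matrix(train_bits, flash_len), measured, rcond=None)

    # Predict template responses for two candidate sequences and classify a new measurement by correlation.
    candidates = [rng.integers(0, 2, size=126) for _ in range(2)]
    new_measurement = design_matrix(candidates[1], flash_len) @ true_r + rng.normal(scale=0.3, size=126)
    scores = [np.corrcoef(design_matrix(c, flash_len) @ r_hat, new_measurement)[0, 1] for c in candidates]
    print("selected candidate:", int(np.argmax(scores)))
    ```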

  8. Interrogating discourse: the application of Foucault's methodological discussion to specific inquiry.

    Science.gov (United States)

    Fadyl, Joanna K; Nicholls, David A; McPherson, Kathryn M

    2013-09-01

    Discourse analysis following the work of Michel Foucault has become a valuable methodology in the critical analysis of a broad range of topics relating to health. However, it can be a daunting task, in that there seems to be both a huge number of possible approaches to carrying out this type of project, and an abundance of different, often conflicting, opinions about what counts as 'Foucauldian'. This article takes the position that methodological design should be informed by ongoing discussion and applied as appropriate to a particular area of inquiry. The discussion given offers an interpretation and application of Foucault's methodological principles, integrating a reading of Foucault with applications of his work by other authors, showing how this is then applied to interrogate the practice of vocational rehabilitation. It is intended as a contribution to methodological discussion in this area, offering an interpretation of various methodological elements described by Foucault, alongside specific application of these aspects.

  9. Laboratory experiments in innovation research: A methodological overview and a review of the current literature

    OpenAIRE

    Brüggemann, Julia; Bizer, Kilian

    2016-01-01

    Innovation research has developed a broad set of methodological approaches in recent decades. In this paper, we propose laboratory experiments as a fruitful methodological addition to the existing methods in innovation research. Therefore, we provide an overview of the existing methods, discuss the advantages and limitations of laboratory experiments, and review experimental studies dealing with different fields of innovation policy, namely intellectual property rights, financi...

  10. A methodology to enlarge narrow stability windows

    Energy Technology Data Exchange (ETDEWEB)

    Araujo, Ewerton M.P.; Pastor, Jorge A.S.C.; Fontoura, Sergio A.B. [Pontificia Univ. Catolica do Rio de Janeiro (PUC-Rio), RJ (Brazil). Dept. de Engenharia Civil. Grupo de Tecnologia e Engenharia de Petroleo

    2004-07-01

    The stability window in a wellbore design is defined by the difference between fracture pressure and collapse pressure. Deep water environments typically present narrow stability windows, because rocks have low strength due to the under-compaction process. Horizontal wells are also often drilled to obtain better development of reservoirs placed in thin sandstone layers. In this scenario, several challenges are faced when drilling in deep water. The traditional approach for predicting instabilities is to determine collapses and fractures at the borehole wall. However, the initiation of rupture does not indicate that the borehole fails to perform its function as a wellbore. Thus, a methodology by which the stability window may be enlarged is desirable. This paper presents a practical analytical methodology that consists in allowing wellbore pressures smaller than the conventional collapse pressure, i.e., the pressure based upon failure at the borehole wall. This means that a collapse region (shear failure) will develop around the borehole wall. This collapse region is pre-defined, and a failure criterion is used to estimate its size. The aforementioned methodology is implemented in user-friendly software, which can perform analyses of stress, pore pressure, formation failure, mud weight and mud salinity design for drilling in shale formations. Simulations of wellbore drilling in a narrow stability window environment are performed to demonstrate the improvements of using the methodology. (author)

  11. Pharmacy sales data versus ward stock accounting for the surveillance of broad-spectrum antibiotic use in hospitals.

    Science.gov (United States)

    Haug, Jon B; Myhr, Randi; Reikvam, Asmund

    2011-12-13

    Antibiotic consumption in hospitals is commonly measured using the accumulated amount of drugs delivered from the pharmacy to ward-held stocks. The reliability of this method, particularly the impact of the length of the registration periods, has not been evaluated, and such an evaluation was the aim of the study. During 26 weeks, we performed a weekly ward stock count of use of broad-spectrum antibiotics--that is, second- and third-generation cephalosporins, carbapenems, and quinolones--in five hospital wards and compared the data with corresponding pharmacy sales figures during the same period. Defined daily doses (DDDs) for antibiotics were used as measurement units (WHO ATC/DDD classification). Consumption figures obtained with the two methods for different registration intervals were compared by use of intraclass correlation analysis and Bland-Altman statistics. Broad-spectrum antibiotics accounted for a quarter to one-fifth of all systemic antibiotics (ATC group J01) used in the hospital and varied between wards, from 12.8 DDDs per 100 bed days in a urological ward to 24.5 DDDs in a pulmonary diseases ward. For the entire study period of 26 weeks, the pharmacy and ward defined daily dose figures for all broad-spectrum antibiotics differed by only 0.2%; however, for single wards deviations varied from -4.3% to 6.9%. The intraclass correlation coefficient, pharmacy versus ward data, increased from 0.78 to 0.94 for parenteral broad-spectrum antibiotics with increasing registration periods (1-4 weeks), whereas the corresponding figures for oral broad-spectrum antibiotics (ciprofloxacin) were from 0.46 to 0.74. For all broad-spectrum antibiotics and for parenteral antibiotics, limits of agreement between the two methods showed, according to Bland-Altman statistics, a deviation of ± 5% or less from the average mean DDDs at 3- and 4-week registration intervals. The corresponding deviation for oral antibiotics was ± 21% at a 4-week interval. There is a need for caution in
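
    The Bland-Altman part of the comparison can be sketched in a few lines: take the weekly differences between the two measurement methods and report their mean (bias) and the 95% limits of agreement. The DDD figures below are fabricated for illustration and are not the study data.

    ```python
    # Bland-Altman sketch: pharmacy-delivery DDDs vs. ward-count DDDs per registration period (fabricated data).
    import numpy as np

    pharmacy_ddd = np.array([42.0, 38.0, 51.0, 47.0, 39.0, 44.0, 50.0, 46.0])
    ward_ddd     = np.array([40.0, 39.0, 49.0, 48.0, 41.0, 43.0, 52.0, 45.0])

    diff = pharmacy_ddd - ward_ddd
    bias = diff.mean()
    sd = diff.std(ddof=1)
    lower, upper = bias - 1.96 * sd, bias + 1.96 * sd
    print(f"mean difference (bias): {bias:.2f} DDD")
    print(f"95% limits of agreement: {lower:.2f} to {upper:.2f} DDD")
    ```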

  12. Trends in broad-spectrum antibiotic prescribing for children with acute otitis media in the United States, 1998–2004

    Directory of Open Access Journals (Sweden)

    Gambler Angela S

    2009-06-01

    Full Text Available Abstract Background Overuse of broad-spectrum antibiotics is associated with antibiotic resistance. Acute otitis media (AOM) is responsible for a large proportion of antibiotics prescribed for US children. Rates of broad-spectrum antibiotic prescribing for AOM are unknown. Methods Analysis of the National Ambulatory Medical Care Survey and National Hospital Ambulatory Medical Care Survey, 1998 to 2004 (N = 6,878). Setting is office-based physicians, hospital outpatient departments, and emergency departments. Patients are children aged 12 years and younger prescribed antibiotics for acute otitis media. Main outcome measure is the percentage of broad-spectrum antibiotics, defined as amoxicillin/clavulanate, macrolides, cephalosporins and quinolones. Results Broad-spectrum prescribing for acute otitis media increased from 34% of visits in 1998 to 45% of visits in 2004 (P < …). Conclusion Prescribing of broad-spectrum antibiotics for acute otitis media has steadily increased from 1998 to 2004. Associations with non-clinical factors suggest potential for improvement in prescribing practice.

  13. IMSF: Infinite Methodology Set Framework

    Science.gov (United States)

    Ota, Martin; Jelínek, Ivan

    Software development is usually an integration task in an enterprise environment - few software applications work autonomously now. It is usually a collaboration of heterogeneous and unstable teams. One serious problem is a lack of resources, a popular result being outsourcing, ‘body shopping’, and indirectly team and team member fluctuation. Outsourced sub-deliveries easily become black boxes with no clear development method used, which has a negative impact on supportability. Such environments then often face problems of quality assurance and enterprise know-how management. The methodology used is one of the key factors. Each methodology was created as a generalization of a number of solved projects, and each methodology is thus more or less connected with a set of task types. When the methodology does not suit the task type, it causes problems that usually result in an undocumented ad-hoc solution. This was the motivation behind formalizing a simple process for collaborative software engineering. The Infinite Methodology Set Framework (IMSF) defines the ICT business process of adaptive use of methods for classified types of tasks. The article introduces IMSF and briefly comments on its meta-model.

  14. Defining Quality in Undergraduate Education

    Directory of Open Access Journals (Sweden)

    Alison W. Bowers

    2018-01-01

    Full Text Available Objectives: This research brief explores the literature addressing quality in undergraduate education to identify what previous research has said about quality and to offer future directions for research on quality in undergraduate education. Method: We conducted a scoping review to provide a broad overview of existing research. Using targeted search terms in academic databases, we identified and reviewed relevant academic literature to develop emergent themes and implications for future research. Results: The exploratory review of the literature revealed a range of thoughtful discussions and empirical studies attempting to define quality in undergraduate education. Many publications highlighted the importance of including different stakeholder perspectives and presented some of the varying perceptions of quality among different stakeholders. Conclusions: While a number of researchers have explored and written about how to define quality in undergraduate education, there is not a general consensus regarding a definition of quality in undergraduate education. Past research offers a range of insights, models, and data to inform future research. Implication for Theory and/or Practice: We provide four recommendations for future research to contribute to a high quality undergraduate educational experience. We suggest more comprehensive systematic reviews of the literature as a next step.

  15. Defining the "proven technology" technical criterion in the reactor technology assessment for Malaysia's nuclear power program

    Science.gov (United States)

    Anuar, Nuraslinda; Kahar, Wan Shakirah Wan Abdul; Manan, Jamal Abdul Nasir Abd

    2015-04-01

    Developing countries that are considering the deployment of nuclear power plants (NPPs) in the near future need to perform reactor technology assessment (RTA) in order to select the most suitable reactor design. The International Atomic Energy Agency (IAEA) reported in the Common User Considerations (CUC) document that "proven technology" is one of the most important technical criteria for newcomer countries in performing the RTA. The qualitative description of five desired features for "proven technology" is relatively broad and only provides a general guideline to its characterization. This paper proposes a methodology to define the "proven technology" term according to a specific country's requirements using a three-stage evaluation process. The first evaluation stage screens the available technologies in the market against a predefined minimum Technology Readiness Level (TRL) derived as a condition based on national needs and policy objectives. The result is a list of technology options, which are then assessed in the second evaluation stage against quantitative definitions of the CUC desired features for proven technology. The list of potential technology candidates produced from this evaluation is further narrowed down to obtain a list of proven technology candidates by assessing them against selected risk criteria and the established maximum allowable total score using a scoring matrix. The outcome of this methodology is a set of proven technology candidates selected using an accurate definition of "proven technology" that fulfills the policy objectives, national needs and risk, and country-specific CUC desired features of the country performing the assessment. A simplified assessment for Malaysia is carried out to demonstrate and suggest the use of the proposed methodology. In this exercise, the ABWR, AP1000, APR1400 and EPR designs took the top ranks of proven technology candidates according to Malaysia's definition of "proven technology".
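
    The three-stage screening logic can be expressed as a simple filter-and-score sketch. The TRL threshold, feature scores, risk scores and maximum allowable risk score below are hypothetical placeholders, not values from the Malaysian assessment.

    ```python
    # Sketch of the three-stage screening: (1) minimum TRL filter, (2) score against desired features,
    # (3) keep designs whose risk score stays within the maximum allowable value. All values hypothetical.
    MIN_TRL = 8
    MAX_RISK_SCORE = 10

    designs = {
        "Design A": {"trl": 9, "feature_score": 42, "risk_score": 7},
        "Design B": {"trl": 6, "feature_score": 50, "risk_score": 5},   # screened out at stage 1
        "Design C": {"trl": 8, "feature_score": 38, "risk_score": 12},  # screened out at stage 3
    }

    stage1 = {name: d for name, d in designs.items() if d["trl"] >= MIN_TRL}
    stage3 = {name: d for name, d in stage1.items() if d["risk_score"] <= MAX_RISK_SCORE}
    ranked = sorted(stage3.items(), key=lambda item: item[1]["feature_score"], reverse=True)
    print("proven-technology candidates:", [name for name, _ in ranked])
    ```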

  16. Methodological Capacity within the Field of "Educational Technology" Research: An Initial Investigation

    Science.gov (United States)

    Bulfin, Scott; Henderson, Michael; Johnson, Nicola F.; Selwyn, Neil

    2014-01-01

    The academic study of educational technology is often characterised by critics as methodologically limited. In order to test this assumption, the present paper reports on data collected from a survey of 462 "research active" academic researchers working in the broad areas of educational technology and educational media. The paper…

  17. Health economic assessment: a methodological primer.

    Science.gov (United States)

    Simoens, Steven

    2009-12-01

    This review article aims to provide an introduction to the methodology of health economic assessment of a health technology. Attention is paid to defining the fundamental concepts and terms that are relevant to health economic assessments. The article describes the methodology underlying a cost study (identification, measurement and valuation of resource use, calculation of costs), an economic evaluation (type of economic evaluation, the cost-effectiveness plane, trial- and model-based economic evaluation, discounting, sensitivity analysis, incremental analysis), and a budget impact analysis. Key references are provided for those readers who wish a more advanced understanding of health economic assessments.
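
    Two of the concepts listed above, discounting and incremental analysis, lend themselves to a short worked sketch. The cost and QALY streams and the 3% discount rate below are invented for illustration only.

    ```python
    # Sketch: discount yearly cost/QALY streams and compute an incremental cost-effectiveness ratio (ICER).
    def discounted_total(values, rate=0.03):
        return sum(v / (1 + rate) ** t for t, v in enumerate(values))

    # Hypothetical new technology vs. comparator, three years each
    cost_new, qaly_new = discounted_total([12_000, 2_000, 2_000]), discounted_total([0.80, 0.78, 0.76])
    cost_old, qaly_old = discounted_total([5_000, 1_500, 1_500]), discounted_total([0.75, 0.73, 0.70])

    icer = (cost_new - cost_old) / (qaly_new - qaly_old)
    print(f"ICER ~ {icer:,.0f} per QALY gained")
    ```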

  18. Health Economic Assessment: A Methodological Primer

    Directory of Open Access Journals (Sweden)

    Steven Simoens

    2009-11-01

    Full Text Available This review article aims to provide an introduction to the methodology of health economic assessment of a health technology. Attention is paid to defining the fundamental concepts and terms that are relevant to health economic assessments. The article describes the methodology underlying a cost study (identification, measurement and valuation of resource use, calculation of costs), an economic evaluation (type of economic evaluation, the cost-effectiveness plane, trial- and model-based economic evaluation, discounting, sensitivity analysis, incremental analysis), and a budget impact analysis. Key references are provided for those readers who wish a more advanced understanding of health economic assessments.

  19. Observation of Nonlinear Self-Trapping of Broad Beams in Defocusing Waveguide Arrays

    International Nuclear Information System (INIS)

    Bennet, Francis H.; Haslinger, Franz; Neshev, Dragomir N.; Kivshar, Yuri S.; Alexander, Tristram J.; Mitchell, Arnan

    2011-01-01

    We demonstrate experimentally the localization of broad optical beams in periodic arrays of optical waveguides with defocusing nonlinearity. This observation in optics is linked to nonlinear self-trapping of Bose-Einstein-condensed atoms in stationary periodic potentials being associated with the generation of truncated nonlinear Bloch states, existing in the gaps of the linear transmission spectrum. We reveal that unlike gap solitons, these novel localized states can have an arbitrary width defined solely by the size of the input beam while independent of nonlinearity.

  20. MICROLENSING OF QUASAR BROAD EMISSION LINES: CONSTRAINTS ON BROAD LINE REGION SIZE

    Energy Technology Data Exchange (ETDEWEB)

    Guerras, E.; Mediavilla, E. [Instituto de Astrofisica de Canarias, Via Lactea S/N, La Laguna E-38200, Tenerife (Spain); Jimenez-Vicente, J. [Departamento de Fisica Teorica y del Cosmos, Universidad de Granada, Campus de Fuentenueva, E-18071 Granada (Spain); Kochanek, C. S. [Department of Astronomy and the Center for Cosmology and Astroparticle Physics, The Ohio State University, 4055 McPherson Lab, 140 West 18th Avenue, Columbus, OH 43221 (United States); Munoz, J. A. [Departamento de Astronomia y Astrofisica, Universidad de Valencia, E-46100 Burjassot, Valencia (Spain); Falco, E. [Center for Astrophysics, 60 Garden Street, Cambridge, MA 02138 (United States); Motta, V. [Departamento de Fisica y Astronomia, Universidad de Valparaiso, Avda. Gran Bretana 1111, Valparaiso (Chile)

    2013-02-20

    We measure the differential microlensing of the broad emission lines between 18 quasar image pairs in 16 gravitational lenses. We find that the broad emission lines are in general weakly microlensed. The results show, at a modest level of confidence (1.8σ), that high ionization lines such as C IV are more strongly microlensed than low ionization lines such as Hβ, indicating that the high ionization line emission regions are more compact. If we statistically model the distribution of microlensing magnifications, we obtain estimates for the broad line region size of r_s = 24^{+22}_{-15} lt-day and r_s = 55^{+150}_{-35} lt-day (90% confidence) for the high and low ionization lines, respectively. When the samples are divided into higher and lower luminosity quasars, we find that the line emission regions of more luminous quasars are larger, with a slope consistent with the expected scaling from photoionization models. Our estimates also agree well with the results from local reverberation mapping studies.

  1. Two-year follow-up of the MOSAIC trial: A multicenter randomized controlled trial comparing two psychological treatments in adult outpatients with broadly defined anorexia nervosa.

    Science.gov (United States)

    Schmidt, Ulrike; Ryan, Elizabeth G; Bartholdy, Savani; Renwick, Bethany; Keyes, Alexandra; O'Hara, Caitlin; McClelland, Jessica; Lose, Anna; Kenyon, Martha; Dejong, Hannah; Broadbent, Hannah; Loomes, Rachel; Serpell, Lucy; Richards, Lorna; Johnson-Sabine, Eric; Boughton, Nicky; Whitehead, Linette; Bonin, Eva; Beecham, Jennifer; Landau, Sabine; Treasure, Janet

    2016-08-01

    This study reports follow-up data from a multicenter randomized controlled trial (n = 142) comparing the Maudsley Model of Anorexia Nervosa Treatment for Adults (MANTRA) with Specialist Supportive Clinical Management (SSCM) in outpatients with broadly defined anorexia nervosa (AN). At 12 months postrandomization, all patients had statistically significant improvements in body mass index (BMI), eating disorder (ED) symptomatology and other outcomes with no differences between groups. MANTRA was more acceptable to patients. The present study assessed whether gains were maintained at 24 months postrandomization. Follow-up data at 24 months were obtained from 73.2% of participants. Outcome measures included BMI, ED symptomatology, distress, impairment, and additional service utilization during the study period. Outcomes were analyzed using linear mixed models. There were few differences between groups. In both treatment groups, improvements in BMI, ED symptomatology, distress levels, and clinical impairment were maintained or increased further. Estimated mean BMI change from baseline to 24 months was 2.16 kg/m(2) for SSCM and 2.25 kg/m(2) for MANTRA (effect sizes of 1.75 and 1.83, respectively). Most participants (83%) did not require any additional intensive treatments (e.g., hospitalization). Two SSCM patients became overweight through binge-eating. Both treatments have value as outpatient interventions for patients with AN. © 2016 Crown copyright. International Journal of Eating Disorders. (Int J Eat Disord 2016; 49:793-800).

  2. Using proliferation assessment methodologies for Safeguards-by-Design

    International Nuclear Information System (INIS)

    Van der Meer, K.; Rossa, R.; Turcanu, C.; Borella, A.

    2013-01-01

    MYRRHA, an accelerator-driven system (ADS), is designed as a proton accelerator coupled to a liquid Pb-Bi spallation target, surrounded by a Pb-Bi-cooled sub-critical neutron-multiplying medium in a pool-type configuration. An assessment based on three methodologies was made of the proliferation risks of the MYRRHA ADS in comparison with the BR2 MTR, an existing research reactor at the Belgian Nuclear Research Centre SCK-CEN. The methodologies used were the TOPS (Technical Opportunities to Increase the Proliferation Resistance of Nuclear Power Systems), PR-PP and INPRO methodologies. The various features of the methodologies are described, and the results of the assessments are given and discussed. It is concluded that it would be useful to define one single methodology with two options, allowing either a quick or a more detailed assessment. The paper is followed by the slides of the presentation

  3. Big and broad social data and the sociological imagination: A collaborative response

    Directory of Open Access Journals (Sweden)

    William Housley

    2014-08-01

    Full Text Available In this paper, we reflect on the disciplinary contours of contemporary sociology, and social science more generally, in the age of ‘big and broad’ social data. Our aim is to suggest how sociology and social sciences may respond to the challenges and opportunities presented by this ‘data deluge’ in ways that are innovative yet sensitive to the social and ethical life of data and methods. We begin by reviewing relevant contemporary methodological debates and consider how they relate to the emergence of big and broad social data as a product, reflexive artefact and organizational feature of emerging global digital society. We then explore the challenges and opportunities afforded to social science through the widespread adoption of a new generation of distributed, digital technologies and the gathering momentum of the open data movement, grounding our observations in the work of the Collaborative Online Social Media ObServatory (COSMOS project. In conclusion, we argue that these challenges and opportunities motivate a renewed interest in the programme for a ‘public sociology’, characterized by the co-production of social scientific knowledge involving a broad range of actors and publics.

  4. Molecular evolution of broadly neutralizing Llama antibodies to the CD4-binding site of HIV-1.

    Science.gov (United States)

    McCoy, Laura E; Rutten, Lucy; Frampton, Dan; Anderson, Ian; Granger, Luke; Bashford-Rogers, Rachael; Dekkers, Gillian; Strokappe, Nika M; Seaman, Michael S; Koh, Willie; Grippo, Vanina; Kliche, Alexander; Verrips, Theo; Kellam, Paul; Fassati, Ariberto; Weiss, Robin A

    2014-12-01

    To date, no immunization of humans or animals has elicited broadly neutralizing sera able to prevent HIV-1 transmission; however, elicitation of broad and potent heavy chain only antibodies (HCAb) has previously been reported in llamas. In this study, the anti-HIV immune responses in immunized llamas were studied via deep sequencing analysis using broadly neutralizing monoclonal HCAbs as guides. Distinct neutralizing antibody lineages were identified in each animal, including two defined by novel antibodies (as variable regions called VHH) identified by robotic screening of over 6000 clones. The combined application of five VHH against viruses from clades A, B, C and CRF_AG resulted in neutralization as potent as any of the VHH individually and a predicted 100% coverage with a median IC50 of 0.17 µg/ml for the panel of 60 viruses tested. Molecular analysis of the VHH repertoires of two sets of immunized animals showed that each neutralizing lineage was only observed following immunization, demonstrating that they were elicited de novo. Our results show that immunization can induce potent and broadly neutralizing antibodies in llamas with features similar to human antibodies and provide a framework to analyze the effectiveness of immunization protocols.

  5. Self-Study as an Emergent Methodology in Career and Technical Education, Adult Education and Technology: An Invitation to Inquiry

    Science.gov (United States)

    Hawley, Todd S.; Hostetler, Andrew L.

    2017-01-01

    In this manuscript, the authors explore self-study as an emerging research methodology with the potential to open up spaces of inquiry for researchers, graduate students, and teachers in a broad array of fields. They argue that the fields of career and technical education (CTE), adult education and technology can leverage self-study methodology in…

  6. Safety Leadership Defined within the Australian Construction Industry

    Directory of Open Access Journals (Sweden)

    Luke Daniel

    2015-11-01

    Full Text Available This research explores the tenets of safety leadership within the Australian construction environment. The scope of this research aims to establish a universal definition of safety leadership and how it differs from other leadership disciplines. The literature review into this topic was governed by the parent disciplines of Safety and Leadership. Gaps were identified in the literature that indicated safety leadership is not a well-defined concept, and much of the work on safety leadership has been borrowed from other schools of leadership. An exploratory research methodology was utilised, rooting the research in a post-positivist methodology. Twenty interviews were conducted for this research, with participants coming from various leadership positions across multiple construction projects around Australia. Findings detailed a saturation of data that allowed an empirical definition of safety leadership to be established. As a person's scope of responsibility increases, their view of safety leadership becomes synonymous with leadership, although differences do exist. These differences were attributed to the importance of demonstrating safety and working within the legal framework of Australian construction projects. It is proposed that this research offers a substantial contribution to knowledge, based upon a well-defined definition of safety leadership.

  7. Pharmacy sales data versus ward stock accounting for the surveillance of broad-spectrum antibiotic use in hospitals

    Science.gov (United States)

    2011-01-01

    Background Antibiotic consumption in hospitals is commonly measured using the accumulated amount of drugs delivered from the pharmacy to ward-held stocks. The reliability of this method, particularly the impact of the length of the registration periods, has not been evaluated, and such an evaluation was the aim of the study. Methods During 26 weeks, we performed a weekly ward stock count of use of broad-spectrum antibiotics - that is, second- and third-generation cephalosporins, carbapenems, and quinolones - in five hospital wards and compared the data with corresponding pharmacy sales figures during the same period. Defined daily doses (DDDs) for antibiotics were used as measurement units (WHO ATC/DDD classification). Consumption figures obtained with the two methods for different registration intervals were compared by use of intraclass correlation analysis and Bland-Altman statistics. Results Broad-spectrum antibiotics accounted for a quarter to one-fifth of all systemic antibiotics (ATC group J01) used in the hospital and varied between wards, from 12.8 DDDs per 100 bed days in a urological ward to 24.5 DDDs in a pulmonary diseases ward. For the entire study period of 26 weeks, the pharmacy and ward defined daily dose figures for all broad-spectrum antibiotics differed by only 0.2%; however, for single wards deviations varied from -4.3% to 6.9%. The intraclass correlation coefficient, pharmacy versus ward data, increased from 0.78 to 0.94 for parenteral broad-spectrum antibiotics with increasing registration periods (1-4 weeks), whereas the corresponding figures for oral broad-spectrum antibiotics (ciprofloxacin) were from 0.46 to 0.74. For all broad-spectrum antibiotics and for parenteral antibiotics, limits of agreement between the two methods showed, according to Bland-Altman statistics, a deviation of ± 5% or less from the average mean DDDs at 3- and 4-week registration intervals. The corresponding deviation for oral antibiotics was ± 21% at a 4-week interval

  8. Mapping plant species ranges in the Hawaiian Islands: developing a methodology and associated GIS layers

    Science.gov (United States)

    Price, Jonathan P.; Jacobi, James D.; Gon, Samuel M.; Matsuwaki, Dwight; Mehrhoff, Loyal; Wagner, Warren; Lucas, Matthew; Rowe, Barbara

    2012-01-01

    This report documents a methodology for projecting the geographic ranges of plant species in the Hawaiian Islands. The methodology consists primarily of the creation of several geographic information system (GIS) data layers depicting attributes related to the geographic ranges of plant species. The most important spatial-data layer generated here is an objectively defined classification of climate as it pertains to the distribution of plant species. By examining previous zonal-vegetation classifications in light of spatially detailed climate data, broad zones of climate relevant to contemporary concepts of vegetation in the Hawaiian Islands can be explicitly defined. Other spatial-data layers presented here include the following: substrate age, as large areas of the island of Hawai'i, in particular, are covered by very young lava flows inimical to the growth of many plant species; biogeographic regions of the larger islands that are composites of multiple volcanoes, as many of their species are restricted to a given topographically isolated mountain or a specified group of them; and human impact, which can reduce the range of many species relative to where they formerly were found. Other factors influencing the geographic ranges of species that are discussed here but not developed further, owing to limitations in rendering them spatially, include topography, soils, and disturbance. A method is described for analyzing these layers in a GIS, in conjunction with a database of species distributions, to project the ranges of plant species, which include both the potential range prior to human disturbance and the projected present range. Examples of range maps for several species are given as case studies that demonstrate different spatial characteristics of range. Several potential applications of species-range maps are discussed, including facilitating field surveys, informing restoration efforts, studying range size and rarity, studying biodiversity, managing
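
    The layer-overlay logic described above can be pictured with a small, purely illustrative sketch: hypothetical rasters for climate zone, substrate age and human impact are combined into a potential range and a projected present range. The grids, species attributes and thresholds are invented; real work would use the report's actual GIS layers and raster tooling.

```python
# Illustrative sketch only: combining rasterized layers (climate zone, substrate age,
# human impact) into potential and projected present ranges, in the spirit of the
# GIS overlay described above.
import numpy as np

# Hypothetical 4x4 rasters covering the same grid.
climate_zone  = np.array([[1, 1, 2, 2],
                          [1, 2, 2, 3],
                          [2, 2, 3, 3],
                          [2, 3, 3, 3]])
substrate_age = np.array([[500, 500, 50, 50],
                          [500, 200, 50, 20],
                          [200, 200, 20, 20],
                          [200, 100, 20, 20]])   # years; young lava excludes many species
human_impact  = np.array([[0, 0, 1, 1],
                          [0, 0, 1, 1],
                          [0, 0, 0, 1],
                          [0, 0, 0, 0]], dtype=bool)

# Species attributes as they might come from a distribution database (hypothetical values).
suitable_zones = {2, 3}
min_substrate_age = 100

potential_range = np.isin(climate_zone, list(suitable_zones)) & (substrate_age >= min_substrate_age)
projected_present_range = potential_range & ~human_impact

print("potential range cells:", int(potential_range.sum()))
print("projected present range cells:", int(projected_present_range.sum()))
```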

  9. Methodology for plastic fracture - a progress report

    International Nuclear Information System (INIS)

    Wilkinson, J.P.D.; Smith, R.E.E.

    1977-01-01

    This paper describes the progress of a study to develop a methodology for plastic fracture. Such a fracture mechanics methodology, having application in the plastic region, is required to assess the margin of safety inherent in nuclear reactor pressure vessels. The initiation and growth of flaws in pressure vessels under overload conditions is distinguished by a number of unique features, such as large scale yielding, three-dimensional structural and flaw configurations, and failure instabilities that may be controlled by either toughness or plastic flow. In order to develop a broadly applicable methodology of plastic fracture, these features require the following analytical and experimental studies: development of criteria for crack initiation and growth under large scale yielding; the use of the finite element method to describe elastic-plastic behaviour of both the structure and the crack tip region; and extensive experimental studies on laboratory scale and large scale specimens, which attempt to reproduce the pertinent plastic flow and crack growth phenomena. This discussion centers on progress to date on the selection, through analysis and laboratory experiments, of viable criteria for crack initiation and growth during plastic fracture. (Auth.)

  10. Toward 5G software defined radio receiver front-ends

    CERN Document Server

    Spiridon, Silvian

    2016-01-01

    This book introduces a new intuitive design methodology for the optimal design path for next-generation software defined radio front-ends (SDRXs). The methodology described empowers designers to "attack" the multi-standard environment in a parallel way rather than serially, providing a critical tool for any design methodology targeting 5G circuits and systems. Throughout the book the SDRX design follows the key wireless standards of the moment (i.e., GSM, WCDMA, LTE, Bluetooth, WLAN), since a receiver compatible with these standards is the most likely candidate for the first design iteration in a 5G deployment. The author explains the fundamental choice the designer has to make regarding the optimal channel selection: how much of the blockers/interferers will be filtered in the analog domain and how much will remain to be filtered in the digital domain. In the system-level analysis the author describes, the direct sampling architecture is treated as a particular case of the mixer-based direct conversion archi...

  11. Globalization and social determinants of health: Introduction and methodological background (part 1 of 3)

    Science.gov (United States)

    Labonté, Ronald; Schrecker, Ted

    2007-01-01

    Globalization is a key context for the study of social determinants of health (SDH). Broadly stated, SDH are the conditions in which people live and work, and that affect their opportunities to lead healthy lives. In this first article of a three-part series, we describe the origins of the series in work conducted for the Globalization Knowledge Network of the World Health Organization's Commission on Social Determinants of Health and in the Commission's specific concern with health equity. We explain our rationale for defining globalization with reference to the emergence of a global marketplace, and the economic and political choices that have facilitated that emergence. We identify a number of conceptual milestones in studying the relation between globalization and SDH over the period 1987–2005, and then show that because globalization comprises multiple, interacting policy dynamics, reliance on evidence from multiple disciplines (transdisciplinarity) and research methodologies is required. So, too, is explicit recognition of the uncertainties associated with linking globalization – the quintessential "upstream" variable – with changes in SDH and in health outcomes. PMID:17578568

  12. A methodology for performing computer security reviews

    International Nuclear Information System (INIS)

    Hunteman, W.J.

    1991-01-01

    DOE Order 5637.1, ''Classified Computer Security,'' requires regular reviews of the computer security activities for an ADP system and for a site. Based on experiences gained in the Los Alamos computer security program through interactions with DOE facilities, we have developed a methodology to aid a site or security officer in performing a comprehensive computer security review. The methodology is designed to aid a reviewer in defining goals of the review (e.g., preparation for inspection), determining security requirements based on DOE policies, determining threats/vulnerabilities based on DOE and local threat guidance, and identifying critical system components to be reviewed. Application of the methodology will result in review procedures and checklists oriented to the review goals, the target system, and DOE policy requirements. The review methodology can be used to prepare for an audit or inspection and as a periodic self-check tool to determine the status of the computer security program for a site or specific ADP system. 1 tab
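
    As a rough illustration of how the methodology's inputs (review goals, policy requirements, threats, critical components) can be turned into goal-oriented review checklists, the toy sketch below assembles one; every entry is a made-up placeholder rather than an actual DOE requirement.

```python
# Toy sketch: organizing review inputs into a goal-oriented checklist, mirroring the
# methodology's steps. All entries are invented placeholders, not actual DOE policy.
review_goal = "preparation for inspection"

policy_requirements = ["access control documented", "audit logs retained"]
threats = ["insider misuse", "unauthorized network access"]
critical_components = ["authentication server", "classified file store"]

checklist = []
for component in critical_components:
    for requirement in policy_requirements:
        checklist.append({
            "goal": review_goal,
            "component": component,
            "requirement": requirement,
            "related_threats": threats,
            "status": "not reviewed",
        })

for item in checklist:
    print(f"[{item['status']}] {item['component']}: {item['requirement']}")
```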

  13. A methodology for performing computer security reviews

    International Nuclear Information System (INIS)

    Hunteman, W.J.

    1991-01-01

    This paper reports on DOE Order 5637.1, Classified Computer Security, which requires regular reviews of the computer security activities for an ADP system and for a site. Based on experiences gained in the Los Alamos computer security program through interactions with DOE facilities, the authors have developed a methodology to aid a site or security officer in performing a comprehensive computer security review. The methodology is designed to aid a reviewer in defining goals of the review (e.g., preparation for inspection), determining security requirements based on DOE policies, determining threats/vulnerabilities based on DOE and local threat guidance, and identifying critical system components to be reviewed. Application of the methodology will result in review procedures and checklists oriented to the review goals, the target system, and DOE policy requirements. The review methodology can be used to prepare for an audit or inspection and as a periodic self-check tool to determine the status of the computer security program for a site or specific ADP system

  14. Methodological remarks on studying prehistoric Greek religion

    Directory of Open Access Journals (Sweden)

    Petra Pakkanen

    1999-01-01

    Full Text Available This paper presents a methodological approach to the study of Greek religion of the period which lacks written documents, i.e. prehistory. The assumptions and interpretations of religion of that time have to be based on archaeological material. How do we define religion and cultic activity on the basis of primary archaeological material from this period, and what are the methodological tools for this difficult task? By asking questions on the nature and definition of religion and culture, scholars of religion have provided us with some methodological apparatus to approach religion of the past in general, but there are models developed by archaeologists as well. A critical combination of these methodological tools leads to the best possible result. Archaeology studies the material culture of the past. History of religion studies the spiritual culture of the past. In the background the two have important theoretical and even philosophical speculations, since they both deal with meanings (of things or practices) and with interpretation.

  15. A methodology for development of biocatalytic processes

    DEFF Research Database (Denmark)

    Lima Ramos, Joana

    are available. The first case study presents a rational approach for defining a development strategy for multi-enzymatic processes. The proposed methodology requires a profound and structured knowledge of the multi-enzyme systems, integrating chemistry, biological and process engineering. In order to suggest... These process metrics can often be attained by improvements in the reaction chemistry, the biocatalyst, and/or by process engineering, which often requires a complex process development strategy. Interestingly this complexity, which arises from the need for integration of biological and process technologies... and their relationship with the overall process is not clear. The work described in this thesis presents a methodological approach for early stage development of biocatalytic processes, understanding and dealing with the reaction, biocatalyst and process constraints. When applied, this methodology has a decisive role...

  16. RAMA Methodology for the Calculation of Neutron Fluence

    International Nuclear Information System (INIS)

    Villescas, G.; Corchon, F.

    2013-01-01

    The neutron fluence plays an important role in the study of the structural integrity of the reactor vessel after a certain time of neutron irradiation. The NRC defined, in Regulatory Guide 1.190, the way in which neutron fluence must be estimated, including uncertainty analysis of the validation process (uncertainty ≤ 20%). TRANSWARE Enterprises Inc. developed a methodology for calculating the neutron flux based on Guide 1.190, known as RAMA. Uncertainty values obtained with this methodology, for about 18 vessels, are less than 10%.

  17. Methodological proposal for defining mining vocation in Venezuelan land-use planning; Propuesta metodologica para definir la vocacion minera en el contexto del ordenamiento territorial venezolano

    Energy Technology Data Exchange (ETDEWEB)

    Valladares Salinas, R. Y.; Dall Pozzo, F.; Castillo Padron, A. J.

    2015-07-01

    Mining is an economic activity which is necessary to provide raw materials for the different socio-productive networks of the country. Its development depends on the mineral occurrences offered by the geological conditions of an area and requires planning, because it must be located in a geographical space whose environmental fragility, together with the socio-economic conditions and the political-institutional legislation in force at a given moment, has to be taken into account. Therefore, the objective of this research is to propose a methodological model for defining areas with mining vocation, adapted to the Venezuelan context of land-use planning, in order to assign mining uses to the most appropriate areas, considering the selection, assessment and integration of a series of variables and indicators adjusted to the thematic information available. (Author)

  18. Defining the “proven technology” technical criterion in the reactor technology assessment for Malaysia’s nuclear power program

    Energy Technology Data Exchange (ETDEWEB)

    Anuar, Nuraslinda, E-mail: nuraslinda@uniten.edu.my [College of Engineering, Universiti Tenaga Nasional, Jalan IKRAM-UNITEN, 43000 Kajang, Selangor (Malaysia); Kahar, Wan Shakirah Wan Abdul, E-mail: shakirah@tnb.com.my; Manan, Jamal Abdul Nasir Abd [Nuclear Energy Department, Regulatory Economics and Planning Division, Tenaga Nasional Berhad, No. 8 Jalan Tun Sambanthan, Brickfields, 50470 Kuala Lumpur (Malaysia)

    2015-04-29

    Developing countries that are considering the deployment of nuclear power plants (NPPs) in the near future need to perform reactor technology assessment (RTA) in order to select the most suitable reactor design. The International Atomic Energy Agency (IAEA) reported in the Common User Considerations (CUC) document that “proven technology” is one of the most important technical criteria for newcomer countries in performing the RTA. The qualitative description of five desired features for “proven technology” is relatively broad and only provides a general guideline to its characterization. This paper proposes a methodology to define the “proven technology” term according to a specific country’s requirements using a three-stage evaluation process. The first evaluation stage screens the available technologies in the market against a predefined minimum Technology Readiness Level (TRL) derived as a condition based on national needs and policy objectives. The result is a list of technology options, which are then assessed in the second evaluation stage against quantitative definitions of CUC desired features for proven technology. The list of potential technology candidates produced from this evaluation is further narrowed down to obtain a list of proven technology candidates by assessing them against selected risk criteria and the established maximum allowable total score using a scoring matrix. The outcome of this methodology is the set of proven technology candidates selected using an accurate definition of “proven technology” that fulfills the policy objectives, national needs and risk, and country-specific CUC desired features of the country that performs this assessment. A simplified assessment for Malaysia is carried out to demonstrate and suggest the use of the proposed methodology. In this exercise, the ABWR, AP1000, APR1400 and EPR designs assumed the top ranks among proven technology candidates according to Malaysia’s definition of “proven technology”.
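
    A schematic sketch of the three-stage screening is given below; the design names, TRLs, scores, thresholds and weights are invented for illustration and do not represent Malaysia's actual assessment.

```python
# Schematic sketch of the three-stage screening described above. All TRLs, scores and
# thresholds are invented for illustration, not the values used in the Malaysian exercise.
designs = {
    "Design A": {"trl": 9, "feature_score": 42, "risk_score": 12},
    "Design B": {"trl": 8, "feature_score": 37, "risk_score": 18},
    "Design C": {"trl": 6, "feature_score": 45, "risk_score": 9},
}

MIN_TRL = 8                # stage 1: minimum Technology Readiness Level
MIN_FEATURE_SCORE = 35     # stage 2: quantitative CUC desired-feature threshold
MAX_RISK_SCORE = 15        # stage 3: maximum allowable total on the risk scoring matrix

stage1 = {name: d for name, d in designs.items() if d["trl"] >= MIN_TRL}
stage2 = {name: d for name, d in stage1.items() if d["feature_score"] >= MIN_FEATURE_SCORE}
proven_candidates = [name for name, d in stage2.items() if d["risk_score"] <= MAX_RISK_SCORE]

print("proven technology candidates:", proven_candidates)
```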

  19. Defining the “proven technology” technical criterion in the reactor technology assessment for Malaysia’s nuclear power program

    International Nuclear Information System (INIS)

    Anuar, Nuraslinda; Kahar, Wan Shakirah Wan Abdul; Manan, Jamal Abdul Nasir Abd

    2015-01-01

    Developing countries that are considering the deployment of nuclear power plants (NPPs) in the near future need to perform reactor technology assessment (RTA) in order to select the most suitable reactor design. The International Atomic Energy Agency (IAEA) reported in the Common User Considerations (CUC) document that “proven technology” is one of the most important technical criteria for newcomer countries in performing the RTA. The qualitative description of five desired features for “proven technology” is relatively broad and only provides a general guideline to its characterization. This paper proposes a methodology to define the “proven technology” term according to a specific country’s requirements using a three-stage evaluation process. The first evaluation stage screens the available technologies in the market against a predefined minimum Technology Readiness Level (TRL) derived as a condition based on national needs and policy objectives. The result is a list of technology options, which are then assessed in the second evaluation stage against quantitative definitions of CUC desired features for proven technology. The list of potential technology candidates produced from this evaluation is further narrowed down to obtain a list of proven technology candidates by assessing them against selected risk criteria and the established maximum allowable total score using a scoring matrix. The outcome of this methodology is the set of proven technology candidates selected using an accurate definition of “proven technology” that fulfills the policy objectives, national needs and risk, and country-specific CUC desired features of the country that performs this assessment. A simplified assessment for Malaysia is carried out to demonstrate and suggest the use of the proposed methodology. In this exercise, the ABWR, AP1000, APR1400 and EPR designs assumed the top ranks among proven technology candidates according to Malaysia’s definition of “proven technology”

  20. Development of a reference biospheres methodology for radioactive waste disposal. Final report

    International Nuclear Information System (INIS)

    Dorp, F. van

    1996-09-01

    team in modelling, radioecology etc, can be readily incorporated. Iterative cross-checking of the interaction matrix and FEP List contents is regarded as an important part of the procedure. FEP Lists of the type referred to above can be developed for specific assessments, eg, through applying the interaction matrix methodology. An example of the type of software tool which can be used to maintain and extend a FEP List has been developed within the Group. It is called BIOFEP and is described in an Appendix. Detailed assumptions about migration and accumulation of radionuclides in biosphere media with which humans interact will be strongly dependent upon the assumptions which have to be made about the individuals or population groups for whom radiation doses are being assessed. The Working Group has reviewed these 'critical group' assumptions and found considerable variability in both regulatory specification and in performance assessments designed to meet regulatory objectives. Consistency in approach to regulation and assessment is to be desired. While the Working Group has been broadly successful in setting out an appropriate methodology and providing useful input to model development in terms of FEPs and application experience, further activities are recommended. In summary, these involve further testing and augmentation of the methodology: to consider a wider range of basic system descriptions; to more fully develop conceptual models according to the methodology; to examine other types of release from the geosphere; to develop principles for critical group definition; and to develop principles for applying field and other data in defining parameters used in models to represent processes

  1. Measuring the impact of methodological research: a framework and methods to identify evidence of impact.

    Science.gov (United States)

    Brueton, Valerie C; Vale, Claire L; Choodari-Oskooei, Babak; Jinks, Rachel; Tierney, Jayne F

    2014-11-27

    Providing evidence of impact highlights the benefits of medical research to society. Such evidence is increasingly requested by research funders and commonly relies on citation analysis. However, other indicators may be more informative. Although frameworks to demonstrate the impact of clinical research have been reported, no complementary framework exists for methodological research. Therefore, we assessed the impact of methodological research projects conducted or completed between 2009 and 2012 at the UK Medical Research Council Clinical Trials Unit Hub for Trials Methodology Research, with a view to developing an appropriate framework. Various approaches to the collection of data on research impact were employed. Citation rates were obtained using Web of Science (http://www.webofknowledge.com/) and analyzed descriptively. Semistructured interviews were conducted to obtain information on the rates of different types of research output that indicated impact for each project. Results were then pooled across all projects. Finally, email queries pertaining to methodology projects were collected retrospectively and their content analyzed. Simple citation analysis established the citation rates per year since publication for 74 methodological publications; however, further detailed analysis revealed more about the potential influence of these citations. Interviews that spanned 20 individual research projects demonstrated a variety of types of impact not otherwise collated, for example, applications and further developments of the research; release of software and provision of guidance materials to facilitate uptake; formation of new collaborations and broad dissemination. Finally, 194 email queries relating to 6 methodological projects were received from 170 individuals across 23 countries. They provided further evidence that the methodologies were impacting on research and research practice, both nationally and internationally. We have used the information
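
    The descriptive citation analysis amounts to little more than citations per year since publication; the sketch below shows that calculation with invented titles, years and counts.

```python
# Illustrative sketch of the descriptive citation analysis: citations per year since
# publication for each methodological output. Titles, years and counts are invented.
from datetime import date

publications = {            # title -> (publication year, total citations to date)
    "adaptive design paper": (2010, 48),
    "missing data guidance": (2011, 30),
    "software release note": (2012, 9),
}

current_year = date.today().year
for title, (year, citations) in publications.items():
    rate = citations / max(current_year - year, 1)
    print(f"{title}: {rate:.1f} citations per year since {year}")
```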

  2. Optimal Switch Configuration in Software-Defined Networks

    Directory of Open Access Journals (Sweden)

    Béla GENGE

    2016-06-01

    Full Text Available The emerging Software-Defined Networks (SDN paradigm facilitates innovative applications and enables the seamless provisioning of resilient communications. Nevertheless, the installation of communication flows in SDN requires careful planning in order to avoid configuration errors and to fulfill communication requirements. In this paper we propose an approach that installs automatically and optimally static flows in SDN switches. The approach aims to select high capacity links and shortest path routing, and enforces communication link and switch capacity limitations. Experimental results demonstrate the effectiveness and scalability of the developed methodology.
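
    The flow-installation idea, favouring high-capacity links via shortest-path routing while respecting capacity limits, can be sketched roughly as follows; this is a simplified stand-in for the paper's optimization approach, with an invented topology and only link (not switch) capacities enforced.

```python
# Simplified sketch (not the paper's actual optimization model): choose a flow path
# that favours high-capacity links via a weighted shortest path, then check that the
# residual capacity of every link along the path is sufficient before installing it.
import networkx as nx

g = nx.DiGraph()
# Hypothetical topology: (src switch, dst switch, link capacity in Mbit/s).
for u, v, cap in [("s1", "s2", 1000), ("s2", "s4", 1000),
                  ("s1", "s3", 100),  ("s3", "s4", 100)]:
    g.add_edge(u, v, capacity=cap, weight=1.0 / cap)   # prefer high-capacity links

def install_flow(graph, src, dst, demand):
    path = nx.shortest_path(graph, src, dst, weight="weight")
    links = list(zip(path, path[1:]))
    if any(graph[u][v]["capacity"] < demand for u, v in links):
        raise ValueError("insufficient residual capacity on chosen path")
    for u, v in links:                                  # reserve capacity for this flow
        graph[u][v]["capacity"] -= demand
    return path

print(install_flow(g, "s1", "s4", demand=200))          # -> ['s1', 's2', 's4']
```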

  3. A design methodology for unattended monitoring systems

    International Nuclear Information System (INIS)

    SMITH, JAMES D.; DELAND, SHARON M.

    2000-01-01

    The authors presented a high-level methodology for the design of unattended monitoring systems, focusing on a system to detect diversion of nuclear materials from a storage facility. The methodology is composed of seven, interrelated analyses: Facility Analysis, Vulnerability Analysis, Threat Assessment, Scenario Assessment, Design Analysis, Conceptual Design, and Performance Assessment. The design of the monitoring system is iteratively improved until it meets a set of pre-established performance criteria. The methodology presented here is based on other, well-established system analysis methodologies and hence they believe it can be adapted to other verification or compliance applications. In order to make this approach more generic, however, there needs to be more work on techniques for establishing evaluation criteria and associated performance metrics. They found that defining general-purpose evaluation criteria for verifying compliance with international agreements was a significant undertaking in itself. They finally focused on diversion of nuclear material in order to simplify the problem so that they could work out an overall approach for the design methodology. However, general guidelines for the development of evaluation criteria are critical for a general-purpose methodology. A poor choice in evaluation criteria could result in a monitoring system design that solves the wrong problem

  4. 77 FR 50144 - Broad Stakeholder Survey

    Science.gov (United States)

    2012-08-20

    ... DEPARTMENT OF HOMELAND SECURITY [Docket No. DHS-2012-0042] Broad Stakeholder Survey AGENCY... Information Collection Request: 1670-NEW. SUMMARY: The Department of Homeland Security (DHS), National... (Pub. L. 104-13, 44 U.S.C. Chapter 35). NPPD is soliciting comments concerning the Broad Stakeholder...

  5. 76 FR 34087 - Broad Stakeholder Survey

    Science.gov (United States)

    2011-06-10

    ... DEPARTMENT OF HOMELAND SECURITY [Docket No. DHS-2011-0027] Broad Stakeholder Survey AGENCY... Information Collection Request: 1670-NEW. SUMMARY: The Department of Homeland Security (DHS), National... (Pub. L. 104-13, 44 U.S.C. Chapter 35). NPPD is soliciting comments concerning the Broad Stakeholder...

  6. A systematic review of the methodology of telemedicine evaluation in patients with postural and movement disorders

    NARCIS (Netherlands)

    Huis in 't Veld, M.H.A.; van Dijk, H; Hermens, Hermanus J.; Vollenbroek-Hutten, Miriam Marie Rosé

    2006-01-01

    We reviewed the methodology used in telemedicine research concerning patients with postural and movement disorders. Literature searches were performed using various computerized databases through to October 2005. Twenty-two studies met the criteria for review. Two broad models of telemedicine

  7. Defining waste acceptance criteria for the Hanford Replacement Cross-Site Transfer System

    International Nuclear Information System (INIS)

    Hudson, J.D.

    1996-04-01

    This document provides a methodology for defining waste acceptance criteria for the Hanford Replacement Cross-Site Transfer System (RCSTS). This methodology includes characterization, transport analysis, and control. A framework is described for each of these functions. A tool was developed for performing the calculations associated with the transport analysis. This tool, a worksheet that is available in formats acceptable for a variety of PC spreadsheet programs, enables a comparison of the pressure required to transport a given slurry, at a rate at which particulate suspension is maintained, with the pressure drop available from the RCSTS
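
    To make the transport-analysis step concrete, the sketch below compares a required frictional pressure drop (a generic Darcy-Weisbach form) with an assumed available pressure; the simplified physics and every numerical value are placeholders, not the RCSTS worksheet's actual model or data.

```python
# Illustrative sketch only: comparing required pumping pressure against available
# pressure, in the spirit of the worksheet described above. The Darcy-Weisbach form
# and all values are generic placeholders, not the RCSTS worksheet's actual model.

def required_pressure_pa(length_m, diameter_m, velocity_m_s, density_kg_m3, friction_factor):
    """Darcy-Weisbach frictional pressure drop for a straight pipe run."""
    return friction_factor * (length_m / diameter_m) * density_kg_m3 * velocity_m_s**2 / 2

available_pa = 6.0e6                        # hypothetical pump pressure available
required_pa = required_pressure_pa(length_m=10_000, diameter_m=0.08,
                                   velocity_m_s=1.8,          # above an assumed critical velocity
                                   density_kg_m3=1350, friction_factor=0.02)

print(f"required {required_pa/1e6:.2f} MPa vs available {available_pa/1e6:.2f} MPa ->",
      "transportable" if required_pa <= available_pa else "not transportable")
```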

  8. Methodology for performing RF reliability experiments on a generic test structure

    NARCIS (Netherlands)

    Sasse, G.T.; de Vries, Rein J.; Schmitz, Jurriaan

    2007-01-01

    This paper discusses a new technique developed for generating well-defined, large-voltage-swing RF signals for on-wafer experiments. This technique can be employed for performing a broad range of different RF reliability experiments on one generic test structure. The frequency dependence of a

  9. Defining the Constructs of Expert Coaching: A Q-Methodological Study of Olympic Sport Coaches

    Science.gov (United States)

    DeWeese, Brad Heath

    2012-01-01

    The purpose of this study was to enhance the development of coaches for participation at International level competition through the improvement of coaching education programming. Although many studies have alluded to the benefit of various coaching education tactics, no study to date had set out to determine the constructs that define an expert…

  10. Stability studies needed to define the handling and transport conditions of sensitive pharmaceutical or biotechnological products.

    Science.gov (United States)

    Ammann, Claude

    2011-12-01

    Many pharmaceutical or biotechnological products require transport using temperature-controlled systems to keep their therapeutic properties. There are presently no official guidelines for testing pharmaceutical products in order to define suitable transport specifications. After reviewing the current guidance documents, this paper proposes a methodology for testing pharmaceutical products and defining appropriate transport conditions.

  11. Technical report on LWR design decision methodology. Phase I

    International Nuclear Information System (INIS)

    1980-03-01

    Energy Incorporated (EI) was selected by Sandia Laboratories to develop and test an LWR design decision methodology. Contract Number 42-4229 provided funding for Phase I of this work. This technical report on LWR design decision methodology documents the activities performed under that contract. Phase I was a short-term effort to thoroughly review the current LWR design decision process to assure complete understanding of current practices and to establish a well-defined interface for development of initial quantitative design guidelines

  12. Assessing species saturation: conceptual and methodological challenges.

    Science.gov (United States)

    Olivares, Ingrid; Karger, Dirk N; Kessler, Michael

    2018-05-07

    Is there a maximum number of species that can coexist? Intuitively, we assume an upper limit to the number of species in a given assemblage, or that a lineage can produce, but defining and testing this limit has proven problematic. Herein, we first outline seven general challenges of studies on species saturation, most of which are independent of the actual method used to assess saturation. Among these are the challenge of defining saturation conceptually and operationally, the importance of setting an appropriate referential system, and the need to discriminate among patterns, processes and mechanisms. Second, we list and discuss the methodological approaches that have been used to study species saturation. These approaches vary in time and spatial scales, and in the variables and assumptions needed to assess saturation. We argue that assessing species saturation is possible, but that many studies conducted to date have conceptual and methodological flaws that prevent us from currently attaining a good idea of the occurrence of species saturation. © 2018 Cambridge Philosophical Society.

  13. Applicability of the Directed Graph Methodology

    Energy Technology Data Exchange (ETDEWEB)

    Huszti, Jozsef [Institute of Isotope of the Hungarian Academy of Sciences, Budapest (Hungary); Nemeth, Andras [ESRI Hungary, Budapest (Hungary); Vincze, Arpad [Hungarian Atomic Energy Authority, Budapest (Hungary)

    2012-06-15

    Possible methods to construct, visualize and analyse the 'map' of the State's nuclear infrastructure based on different directed graph approaches are proposed. The transportation and the flow network models are described in detail. The use of the possible evaluation methodologies and the use of available software tools to construct and maintain the nuclear 'map' using pre-defined standard building blocks (nuclear facilities) are introduced and discussed.
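
    The flow-network view of a State's nuclear infrastructure can be illustrated with a minimal directed-graph sketch; the facility names and capacities are invented, and the maximum-flow query merely stands in for the richer evaluations the authors discuss.

```python
# Illustrative sketch: a State's nuclear infrastructure as a directed flow network
# built from pre-defined building blocks, with a maximum-flow query between two
# facilities. Facility names and capacities are invented for illustration.
import networkx as nx

g = nx.DiGraph()
edges = [("mine", "conversion", 100),
         ("conversion", "enrichment", 80),
         ("enrichment", "fuel_fabrication", 60),
         ("fuel_fabrication", "reactor", 60)]
for src, dst, cap in edges:
    g.add_edge(src, dst, capacity=cap)      # e.g. tonnes of material per year

flow_value, flow_dict = nx.maximum_flow(g, "mine", "reactor")
print("maximum annual throughput mine -> reactor:", flow_value)
```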

  14. The Future of Contextual Fear Learning for PTSD Research: A Methodological Review of Neuroimaging Studies.

    Science.gov (United States)

    Glenn, Daniel E; Risbrough, Victoria B; Simmons, Alan N; Acheson, Dean T; Stout, Daniel M

    2017-10-21

    There has been a great deal of recent interest in human models of contextual fear learning, particularly due to the use of such paradigms for investigating neural mechanisms related to the etiology of posttraumatic stress disorder. However, the construct of "context" in fear conditioning research is broad, and the operational definitions and methods used to investigate contextual fear learning in humans are wide ranging and lack specificity, making it difficult to interpret findings about neural activity. Here we will review neuroimaging studies of contextual fear acquisition in humans. We will discuss the methodology associated with four broad categories of how contextual fear learning is manipulated in imaging studies (colored backgrounds, static picture backgrounds, virtual reality, and configural stimuli) and highlight findings for the primary neural circuitry involved in each paradigm. Additionally, we will offer methodological recommendations for human studies of contextual fear acquisition, including using stimuli that distinguish configural learning from discrete cue associations and clarifying how context is experimentally operationalized.

  15. Methodology for Measuring the Complexity of Enterprise Information Systems

    Directory of Open Access Journals (Sweden)

    Ilja Holub

    2016-07-01

    Full Text Available The complexity of enterprise information systems is currently a challenge faced not only by IT professionals and project managers, but also by the users of such systems. Current methodologies and frameworks used to design and implement information systems do not specifically deal with the issue of their complexity and, apart from a few exceptions, do not at all attempt to simplify the complexity. This article presents the author's own methodology for managing complexity, which can be used to complement any other methodology and which helps limit the growth of complexity. It introduces its own definition and metric of complexity, which it defines as the sum of entities of the individual UML models of the given system, which are selected according to the MMDIS methodology so as to consistently describe all relevant content dimensions of the system. The main objective is to propose a methodology to manage information system complexity and to verify it in practice on a real-life SAP implementation project.
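
    The complexity metric, the sum of entities across the selected UML models, is simple enough to sketch directly; the model names and entity counts below are invented for illustration.

```python
# Minimal sketch of the complexity metric described above: the sum of entities across
# the UML models selected to describe the system. Model names and counts are invented.
uml_models = {
    "use case diagram": {"actors": 6, "use cases": 18},
    "class diagram": {"classes": 42, "associations": 57},
    "activity diagram": {"activities": 23, "transitions": 31},
}

complexity = sum(count for model in uml_models.values() for count in model.values())
print("system complexity =", complexity)   # sum of all counted model entities
```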

  16. The Maudsley Outpatient Study of Treatments for Anorexia Nervosa and Related Conditions (MOSAIC): Comparison of the Maudsley Model of Anorexia Nervosa Treatment for Adults (MANTRA) with specialist supportive clinical management (SSCM) in outpatients with broadly defined anorexia nervosa: A randomized controlled trial.

    Science.gov (United States)

    Schmidt, Ulrike; Magill, Nicholas; Renwick, Bethany; Keyes, Alexandra; Kenyon, Martha; Dejong, Hannah; Lose, Anna; Broadbent, Hannah; Loomes, Rachel; Yasin, Huma; Watson, Charlotte; Ghelani, Shreena; Bonin, Eva-Maria; Serpell, Lucy; Richards, Lorna; Johnson-Sabine, Eric; Boughton, Nicky; Whitehead, Linette; Beecham, Jennifer; Treasure, Janet; Landau, Sabine

    2015-08-01

    Anorexia nervosa (AN) in adults has poor outcomes, and treatment evidence is limited. This study evaluated the efficacy and acceptability of a novel, targeted psychological therapy for AN (Maudsley Model of Anorexia Nervosa Treatment for Adults; MANTRA) compared with Specialist Supportive Clinical Management (SSCM). One hundred forty-two outpatients with broadly defined AN (body mass index [BMI] ≤ 18.5 kg/m²) were randomly allocated to receive 20 to 30 weekly sessions (depending on clinical severity) plus add-ons (4 follow-up sessions, optional sessions with dietician and with carers) of MANTRA (n = 72) or SSCM (n = 70). Assessments were administered blind to treatment condition at baseline, 6 months, and 12 months after randomization. The primary outcome was BMI at 12 months. Secondary outcomes included eating disorders symptomatology, other psychopathology, neuro-cognitive and social cognition, and acceptability. Additional service utilization was also assessed. Outcomes were analyzed using linear mixed models. Both treatments resulted in significant improvements in BMI and reductions in eating disorders symptomatology, distress levels, and clinical impairment over time, with no statistically significant difference between groups at either 6 or 12 months. Improvements in neuro-cognitive and social-cognitive measures over time were less consistent. One SSCM patient died. Compared with SSCM, MANTRA patients rated their treatment as significantly more acceptable and credible at 12 months. There was no significant difference between groups in additional service consumption. Both treatments appear to have value as first-line outpatient interventions for patients with broadly defined AN. Longer term outcomes remain to be evaluated. (c) 2015 APA, all rights reserved).

  17. Methodology to define biological reference values in the environmental and occupational fields: the contribution of the Italian Society for Reference Values (SIVR).

    Science.gov (United States)

    Aprea, Maria Cristina; Scapellato, Maria Luisa; Valsania, Maria Carmen; Perico, Andrea; Perbellini, Luigi; Ricossa, Maria Cristina; Pradella, Marco; Negri, Sara; Iavicoli, Ivo; Lovreglio, Piero; Salamon, Fabiola; Bettinelli, Maurizio; Apostoli, Pietro

    2017-04-21

    Biological reference values (RVs) explore the relationships between humans and their environment and habits. RVs are fundamental in the environmental field for assessing illnesses possibly associated with environmental pollution, and also in the occupational field, especially in the absence of established biological or environmental limits. The Italian Society for Reference Values (SIVR) determined to test criteria and procedures for the definition of RVs to be used in the environmental and occupational fields. The paper describes the SIVR methodology for defining RVs of xenobiotics and their metabolites. Aspects regarding the choice of population sample, the quality of analytical data, statistical analysis and control of variability factors are considered. The simultaneous interlaboratory circuits involved can be expected to increasingly improve the quality of the analytical data. Examples of RVs produced by SIVR are presented. In particular, levels of chromium, mercury, ethylenethiourea, 3,5,6-trichloro-2-pyridinol, 2,5-hexanedione, 1-hydroxypyrene and t,t-muconic acid measured in urine and expressed in micrograms/g creatinine (μg/g creat) or micrograms/L (μg/L) are reported. With the proposed procedure, SIVR intends to make its activities known to the scientific community in order to increase the number of laboratories involved in the definition of RVs for the Italian population. More research is needed to obtain further RVs in different biological matrices, such as hair, nails and exhaled breath. It is also necessary to update and improve the present reference values and broaden the portfolio of chemicals for which RVs are available. In the near future, SIVR intends to expand its scientific activity by using a multivariate approach for xenobiotics that may have a common origin, and to define RVs separately for children who may be exposed more than adults and be more vulnerable.

  18. A Methodology for Safety Culture Impact Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Han, Kiyoon; Jae, Moosung [Hanyang Univ., Seoul (Korea, Republic of)

    2014-05-15

    The purpose of this study is to develop a methodology for assessing the impact of safety culture on nuclear power plants. A new methodology for assessing a safety culture impact index (SCII) has been developed and applied to reference nuclear power plants. The developed SCII model might contribute to comparing the level of safety culture among nuclear power plants as well as to improving their safety. Safety culture is defined as the fundamental attitudes and behaviors of the plant staff which demonstrate that nuclear safety is the most important consideration in all activities conducted in nuclear power operation. Through several nuclear power plant accidents, including the Fukushima Daiichi accident in 2011 and the Chernobyl accident in 1986, the safety of nuclear power plants has emerged as a matter of broad concern. From the accident review reports, it can easily be found that safety culture is important and is one of the dominant contributors to accidents. However, a methodology for assessing safety culture impact has not yet been established analytically, and it is difficult to develop such a methodology quantitatively.

  19. A Methodology for Safety Culture Impact Assessment

    International Nuclear Information System (INIS)

    Han, Kiyoon; Jae, Moosung

    2014-01-01

    The purpose of this study is to develop a methodology for assessing the impact of safety culture on nuclear power plants. A new methodology for assessing a safety culture impact index (SCII) has been developed and applied to reference nuclear power plants. The developed SCII model might contribute to comparing the level of safety culture among nuclear power plants as well as to improving their safety. Safety culture is defined as the fundamental attitudes and behaviors of the plant staff which demonstrate that nuclear safety is the most important consideration in all activities conducted in nuclear power operation. Through several nuclear power plant accidents, including the Fukushima Daiichi accident in 2011 and the Chernobyl accident in 1986, the safety of nuclear power plants has emerged as a matter of broad concern. From the accident review reports, it can easily be found that safety culture is important and is one of the dominant contributors to accidents. However, a methodology for assessing safety culture impact has not yet been established analytically, and it is difficult to develop such a methodology quantitatively

  20. 'Emerging technologies for the changing global market' - Prioritization methodology for chemical replacement

    Science.gov (United States)

    Cruit, Wendy; Schutzenhofer, Scott; Goldberg, Ben; Everhart, Kurt

    1993-01-01

    This project served to define an appropriate methodology for effective prioritization of technology efforts required to develop replacement technologies mandated by imposed and forecast legislation. The methodology used is a semiquantitative approach derived from quality function deployment techniques (QFD Matrix). This methodology aims to weight the full environmental, cost, safety, reliability, and programmatic implications of replacement technology development to allow appropriate identification of viable candidates and programmatic alternatives. The results will be implemented as a guideline for consideration for current NASA propulsion systems.
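
    A QFD-style weighted scoring matrix of the kind described can be sketched as follows; the criteria weights, candidate technologies and scores are invented placeholders, not the project's actual figures.

```python
# Illustrative sketch of a QFD-style weighted scoring matrix for ranking replacement
# technology efforts. Criteria weights and candidate scores are invented placeholders.
criteria_weights = {"environmental": 5, "cost": 3, "safety": 5, "reliability": 4, "programmatic": 2}

candidates = {
    "aqueous cleaner":   {"environmental": 4, "cost": 3, "safety": 4, "reliability": 3, "programmatic": 4},
    "supercritical CO2": {"environmental": 5, "cost": 2, "safety": 3, "reliability": 4, "programmatic": 2},
}

def weighted_score(scores):
    return sum(criteria_weights[criterion] * score for criterion, score in scores.items())

ranking = sorted(candidates, key=lambda name: weighted_score(candidates[name]), reverse=True)
for name in ranking:
    print(f"{name}: {weighted_score(candidates[name])}")
```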

  1. Broad Prize: Do the Successes Spread?

    Science.gov (United States)

    Samuels, Christina A.

    2011-01-01

    When the Broad Prize for Urban Education was created in 2002, billionaire philanthropist Eli Broad said he hoped the awards, in addition to rewarding high-performing school districts, would foster healthy competition; boost the prestige of urban education, long viewed as dysfunctional; and showcase best practices. Over the 10 years the prize has…

  2. ASSESSMENT OF QUALITY OF LIFE: PRESENT AND FUTURE METHODOLOGICAL CHALLENGES

    Directory of Open Access Journals (Sweden)

    Isabel Benítez

    2016-01-01

    Full Text Available The growing importance of quality of life in diverse domains, such as health, school performance and social participation, has led to the development of new conceptualisations and assessments of the construct. This diversity of perspectives brings about many benefits, but it also creates an obstacle for the formulation of a single unifying definition of the construct and, therefore, an agreed instrument or assessment framework. The aim of this study is to discuss the current methodological challenges in the measurement of quality of life. Firstly, we provide a brief description of the construct as defined in various areas, then we examine the new methodological developments and different applications. We also present an overview of the different possibilities for future developments in defining and measuring quality of life in national and international studies.

  3. Integration of infrared thermography into various maintenance methodologies

    Science.gov (United States)

    Morgan, William T.

    1993-04-01

    Maintenance methodologies are in developmental stages throughout the world as global competitiveness drives all industries to improve operational efficiencies. Rapid technical advancement has placed additional strain on maintenance organizations to change progressively. Accompanying the needs for advanced training and documentation is the demand for utilization of various analytical instruments and quantitative methods. Infrared thermography is one of the primary elements of engineered approaches to maintenance. Current maintenance methodologies can be divided into six categories: Routine ('Breakdown'), Preventive, Predictive, Proactive, Reliability-Based, and Total Productive (TPM) maintenance. Each of these methodologies has a distinctive approach to achieving improved operational efficiencies. The popular view is that infrared thermography is a Predictive maintenance tool. While this is true, it is also true that it can be effectively integrated into each of the maintenance methodologies to achieve desired results. The six maintenance strategies will be defined, and infrared applications integrated into each will be presented in tabular form.

  4. FINANCIAL ACCOUNTING QUALITY AND ITS DEFINING CHARACTERISTICS

    Directory of Open Access Journals (Sweden)

    Andra M. ACHIM

    2014-11-01

    Full Text Available The importance of high-quality financial statements is highlighted by the main standard-setting institutions active in the field of accounting and reporting, which have issued Conceptual Frameworks that state and describe the qualitative characteristics of accounting information. In this qualitative study, the research methodology consists of reviewing the literature related to the definition of accounting quality and striving to understand how it can be explained. The main objective of the study is to identify the characteristics information should possess in order to be of high quality. These characteristics also contribute to the way financial accounting quality is defined. The main conclusions of this research are that financial accounting quality cannot be uniquely defined and that financial information is of good quality when it enhances the characteristics incorporated in the conceptual frameworks issued by both the International Accounting Standards Board and the Financial Accounting Standards Board.

  5. PROBLEMS AND METHODOLOGY OF THE PETROLOGIC ANALYSIS OF COAL FACIES.

    Science.gov (United States)

    Chao, Edward C.T.

    1983-01-01

    This condensed synthesis gives a broad outline of the methodology of coal facies analysis, procedures for constructing sedimentation and geochemical formation curves, and micro- and macrostratigraphic analysis. The hypothetical coal bed profile has a 3-fold cycle of material characteristics. Based on studies of other similar profiles of the same coal bed, and on field studies of the sedimentary rock types and their facies interpretation, one can assume that the 3-fold subdivision is of regional significance.

  6. Implementing the cost-optimal methodology in EU countries

    DEFF Research Database (Denmark)

    Atanasiu, Bogdan; Kouloumpi, Ilektra; Thomsen, Kirsten Engelund

    This study presents three cost-optimal calculations. The overall aim is to provide a deeper analysis and to provide additional guidance on how to properly implement the cost-optimality methodology in Member States. Without proper guidance and lessons from exemplary case studies using realistic...... input data (reflecting the likely future development), there is a risk that the cost-optimal methodology may be implemented at sub-optimal levels. This could lead to a misalignment between the defined cost-optimal levels and the long-term goals, leaving a significant energy saving potential unexploited....... Therefore, this study provides more evidence on the implementation of the cost-optimal methodology and highlights the implications of choosing different values for key factors (e.g. discount rates, simulation variants/packages, costs, energy prices) at national levels. The study demonstrates how existing...

  7. Designing ordering and inventory management methodologies for purchased parts

    NARCIS (Netherlands)

    de Boer, L.; Looman, Arnold; Ruffini, F.A.J.

    2002-01-01

    This article presents a method for redesigning the ordering and inventory management methodologies for purchased parts in a manufacturing firm. The method takes the perspective of the purchasing and logistics manager, defines clusters of purchased items, and subsequently assigns each cluster to a

  8. Critical Race Design: An Emerging Methodological Approach to Anti-Racist Design and Implementation Research

    Science.gov (United States)

    Khalil, Deena; Kier, Meredith

    2017-01-01

    This article is about introducing Critical Race Design (CRD), a research methodology that centers race and equity at the nucleus of educational opportunities by design. First, the authors define design-based implementation research (DBIR; Penuel, Fishman, Cheng, & Sabelli, 2011) as an equity-oriented education research methodology where…

  9. A methodology for characterization and categorization of solutions for micro handling

    DEFF Research Database (Denmark)

    Gegeckaite, Asta; Hansen, Hans Nørgaard

    2005-01-01

    is in the range of 0.1-10 micrometers. The importance of considering the entire micro handling scenario is imperative if operational solutions should be designed. The methodology takes into consideration component design (dimension, geometry, material, weight etc.), type of handling operation (characteristics......This paper presents a methodology whereby solutions for micro handling are characterized and classified. The purpose of defining such a methodology is to identify different possible integrated solutions with respect to a specific micro handling scenario in a development phase. The typical accuracy......, tolerances, speed, lot sizes etc.) and handling/gripping principles (contact, non-contact etc.). The methodology will be applied to a case study in order to demonstrate the feasibility of the method....

  10. Reliability analysis for power supply system in a reprocessing facility based on GO methodology

    International Nuclear Information System (INIS)

    Wang Renze

    2014-01-01

    The GO methodology was applied to analyze the reliability of the power supply system in a typical reprocessing facility. Because tie breakers are installed in the system, a tie breaker operator was defined. GO methodology modeling and quantitative analysis were then performed sequentially, and the minimal cut sets and average unavailability of the system were obtained. A parallel analysis of the GO methodology and the fault tree methodology was also performed. The results showed that the setup of the tie breakers was rational and necessary, and that, compared with the fault tree methodology, the GO methodology made the modeling much easier and the chart much more succinct for analyzing the reliability of the power supply system. (author)
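
    For readers unfamiliar with how minimal cut sets yield an average unavailability, the sketch below applies the common rare-event approximation (summing, over the cut sets, the products of component unavailabilities); the cut sets and values are invented and are not the facility's data.

```python
# Illustrative sketch: estimating system unavailability from minimal cut sets with the
# rare-event approximation. Cut sets and component values are invented placeholders.
component_unavailability = {"grid": 1e-3, "diesel_A": 5e-2, "diesel_B": 5e-2, "tie_breaker": 1e-2}

minimal_cut_sets = [
    {"grid", "diesel_A", "diesel_B"},          # total loss of all supplies
    {"grid", "tie_breaker"},                   # grid lost and tie breaker fails to transfer
]

def cut_set_probability(cut_set):
    prob = 1.0
    for component in cut_set:
        prob *= component_unavailability[component]
    return prob

system_unavailability = sum(cut_set_probability(cs) for cs in minimal_cut_sets)
print(f"approximate system unavailability: {system_unavailability:.2e}")
```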

  11. Improvement of Safety Assessment Methodologies for Near Surface Disposal Facilities

    International Nuclear Information System (INIS)

    Batandjieva, B.; Torres-Vidal, C.

    2002-01-01

    The International Atomic Energy Agency (IAEA) Coordinated research program ''Improvement of Safety Assessment Methodologies for Near Surface Disposal Facilities'' (ISAM) has developed improved safety assessment methodology for near surface disposal facilities. The program has been underway for three years and has included around 75 active participants from 40 countries. It has also provided examples for application to three safety cases--vault, Radon type and borehole radioactive waste disposal facilities. The program has served as an excellent forum for exchange of information and good practices on safety assessment approaches and methodologies used worldwide. It also provided an opportunity for reaching broad consensus on the safety assessment methodologies to be applied to near surface low and intermediate level waste repositories. The methodology has found widespread acceptance and the need for its application on real waste disposal facilities has been clearly identified. The ISAM was finalized by the end of 2000, working material documents are available and an IAEA report will be published in 2002 summarizing the work performed during the three years of the program. The outcome of the ISAM program provides a sound basis for moving forward to a new IAEA program, which will focus on practical application of the safety assessment methodologies to different purposes, such as licensing radioactive waste repositories, development of design concepts, upgrading existing facilities, reassessment of operating repositories, etc. The new program will also provide an opportunity for development of guidance on application of the methodology that will be of assistance to both safety assessors and regulators

  12. Broad-band time-resolved near infrared spectroscopy in the TJ-II stellarator

    Energy Technology Data Exchange (ETDEWEB)

    Rodriguez, M.C.; Pastor, I.; Cal, E. de la; McCarthy, K.J. [Laboratorio Nacional de Fusion, CIEMAT, Madrid (Spain); Diaz, D. [Universidad Autonoma de Madrid, Dept Quimica Fisica Aplicada, Madrid (Spain)

    2014-11-15

    First experimental results on broad-band, time-resolved Near Infrared (NIR; here loosely defined as covering from 750 to 1650 nm) passive spectroscopy using a high-sensitivity InGaAs detector are reported for the TJ-II Stellarator. The experimental set-up is described together with its main characteristics, the most remarkable ones being its enhanced NIR response, broadband spectrum acquisition in a single shot, and time-resolved measurements with up to a 1.8 kHz spectral rate. Prospects for future work and more extended physics studies in this newly opened spectral region in TJ-II are discussed. (copyright 2014 WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim) (orig.)

  13. High-Level Design for Ultra-Fast Software Defined Radio Prototyping on Multi-Processors Heterogeneous Platforms

    OpenAIRE

    Moy, Christophe; Raulet, Mickaël

    2010-01-01

    International audience; The design of Software Defined Radio (SDR) equipment (terminals, base stations, etc.) is still very challenging. We propose here a design methodology for ultra-fast prototyping on heterogeneous platforms made of GPPs (General Purpose Processors), DSPs (Digital Signal Processors) and FPGAs (Field Programmable Gate Arrays). Relying on a component-based approach, the methodology mainly aims at automating the design as much as possible, from an algorithmic validation to a mul...

  14. A broad-scale structural classification of vegetation for practical purposes

    Directory of Open Access Journals (Sweden)

    E. Edwards

    1983-11-01

    Full Text Available An a priori system is presented for the broad structural classification of vegetation. The objectives are to provide a descriptive, consistent, easily applied system, with unambiguous, straight-forward terminology, which can be used in the field and with remote sensing and air photo techniques, and which can be used in conjunction with floristic and habitat terms to convey the essential physiognomy and structure of the vegetation. The attributes used are a primary set of four growth forms, a set of four projected crown cover classes, and a set of four height classes for each growth form. In addition, shrub substratum is used to define thicket and bushland. Special growth forms, substratal, leaf and other attributes can be readily incorporated to extend the two-way table system where such detail is needed.

  15. User needs for a standardized CO2 emission assessment methodology for intelligent transport systems

    NARCIS (Netherlands)

    Mans, D.; Rekiel, J.; Wolfermann, A.; Klunder, G.

    2012-01-01

    The Amitran FP7 project will define a reference methodology to assess the impact of intelligent transport systems on CO2 emissions. The methodology is intended to be used as a reference by future projects and covers both passenger and freight transport. The project will lead to a validated

  16. Towards an MDA-based development methodology for distributed applications

    NARCIS (Netherlands)

    van Sinderen, Marten J.; Gavras, A.; Belaunde, M.; Ferreira Pires, Luis; Andrade Almeida, João

    2004-01-01

    This paper proposes a development methodology for distributed applications based on the principles and concepts of the Model-Driven Architecture (MDA). The paper identifies phases and activities of an MDA-based development trajectory, and defines the roles and products of each activity in accordance

  17. Measuring service line competitive position. A systematic methodology for hospitals.

    Science.gov (United States)

    Studnicki, J

    1991-01-01

    To mount a broad effort aimed at improving their competitive position for some service or group of services, hospitals have begun to pursue product line management techniques. A few hospitals have even reorganized completely under the product line framework. The benefits include focusing accountability for operations and results, facilitating coordination between departments and functions, stimulating market segmentation, and promoting rigorous examination of new and existing programs. As part of its strategic planning process, a suburban Baltimore hospital developed a product line management methodology with six basic steps: (1) define the service lines (which they did by grouping all existing diagnosis-related groups into 35 service lines), (2) determine the contribution of each service line to total inpatient volume, (3) determine trends in service line volumes (by comparing data over time), (4) derive a useful comparison group (competing hospitals or groups of hospitals with comparable size, scope of services, payer mix, and financial status), (5) review multiple time frames, and (6) summarize the long- and short-term performance of the hospital's service lines to focus further analysis. This type of systematic and disciplined analysis can become part of a permanent strategic intelligence program. When hospitals have such a program in place, their market research, planning, budgeting, and operations will be tied together in a true management decision support system.
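
    As a purely numerical illustration of steps 1-3 of the methodology above, the Python sketch below groups discharges into service lines, computes each line's contribution to total inpatient volume, and derives a simple year-over-year trend. The DRG-to-service-line mapping and all volume figures are invented and are not the hospital's data.

```python
# Invented sketch of service line steps 1-3: define lines, compute each line's
# contribution to inpatient volume, and derive a volume trend.

from collections import defaultdict

# Step 1: map DRGs to service lines (toy mapping, not the actual 35-line grouping).
DRG_TO_LINE = {"470": "orthopedics", "871": "medicine", "291": "cardiology", "775": "obstetrics"}

# Discharges per DRG per year (invented figures).
discharges = {
    2021: {"470": 410, "871": 620, "291": 280, "775": 510},
    2022: {"470": 455, "871": 590, "291": 300, "775": 495},
}


def volume_by_line(year_counts):
    totals = defaultdict(int)
    for drg, n in year_counts.items():
        totals[DRG_TO_LINE[drg]] += n
    return dict(totals)


def report(year):
    # Step 2: contribution of each service line to total inpatient volume.
    lines = volume_by_line(discharges[year])
    total = sum(lines.values())
    for line, n in sorted(lines.items()):
        print(f"{year} {line:12s} {n:4d} discharges ({n / total:5.1%} of total)")


def trend(line):
    # Step 3: year-over-year change in service line volume.
    v_prev = volume_by_line(discharges[2021])[line]
    v_curr = volume_by_line(discharges[2022])[line]
    return (v_curr - v_prev) / v_prev


if __name__ == "__main__":
    report(2022)
    print(f"orthopedics volume trend 2021 to 2022: {trend('orthopedics'):+.1%}")
```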

  18. Theory of information warfare: basic framework, methodology and conceptual apparatus

    Directory of Open Access Journals (Sweden)

    Олександр Васильович Курбан

    2015-11-01

    Full Text Available A comprehensive theoretical study is conducted to determine the basic provisions of the modern theory of information warfare in on-line social networks. Three basic blocks that systematize the theoretical and methodological basis of the topic are established: information and psychological warfare, social off-line networks, and social on-line networks. According to these three blocks, theoretical concepts are defined and the methodological substantiation of information processes within information warfare in social on-line networks is formed

  19. [The importance of defining methodology for post-marketing observational studies on cardiovascular therapies].

    Science.gov (United States)

    Pelliccia, Francesco; Barillà, Francesco; Tanzilli, Gaetano; Viceconte, Nicola; Paravati, Vincenzo; Mangieri, Enrico; Gaudio, Carlo

    2017-01-01

    In recent years, a growing number of observational studies in cardiology have been carried out following the criticism that rigid design of randomized clinical trials produces information that is not applicable to the general patient. This approach is very common in several branches of medicine, first of all oncology, but has often been considered marginal in cardiology. The recent introduction of new oral anticoagulants (NOACs) on the market, however, has seen a proliferation of "real-life" studies, drawing the attention of cardiologists to the advantages and limitations of post-marketing studies. NOACs have been approved for use on the basis of large randomized clinical trials that have clearly documented their efficacy and safety. Since they have become available, the analysis of phase IV data has been considered crucial and therefore a great amount of information on the use of NOACs in daily practice has become available. It should be considered, however, that the possibility exists that results obtained from "real-world" studies, which do not apply rigid scientific criteria, may lead to incorrect conclusions. Accordingly, it is mandatory to fully define the operational standards of observational studies. All the protagonists of post-marketing analysis (physicians, epidemiologists, pharmacologists, statisticians) should handle the data strictly in order to ensure their reliability and comparability with other studies. To this end, it is crucial that researchers follow rigorous operational protocols for phase IV studies. Briefly, any "real-life" study should be prospective and adhere to what is prespecified by the research protocol - which must illustrate the background and rationale of the study, define its primary endpoint, and detail the methods, i.e. study design, population and variables.

  20. Utilization of critical group and representative person methodologies: differences and difficulties

    International Nuclear Information System (INIS)

    Ferreira, Nelson L.D.; Rochedo, Elaine R.R.; Mazzilli, Barbara P.

    2013-01-01

    In Brazil, the assessment of the environmental impact due to routine discharges of radionuclides, which is used for protection of the public, is normally based on the determination of the so-called 'critical group'. For the same purpose, the ICRP (2007) proposed the adoption of the 'representative person', defined as the individual receiving a dose representative of the members of the population who are subject to the higher exposures. In this work, the different characteristics of the two approaches (critical group and representative person) are discussed, related mainly to their methodologies and the data they demand. Some difficulties in obtaining site-specific data, mainly habit data, as well as the way these data are used, are also discussed. The critical group methodology uses, basically, average values, while the representative person methodology performs deterministic or probabilistic analyses using values obtained from distributions. As a reference, the predicted effluent releases from the Uranium Hexafluoride Production Plant (USEXA) and the effective doses calculated for the members of the previously defined critical group of Centro Experimental Aramar (CEA) were considered. (author)

  1. Using the Spatial Distribution of Installers to Define Solar Photovoltaic Markets

    Energy Technology Data Exchange (ETDEWEB)

    O'Shaughnessy, Eric [National Renewable Energy Lab. (NREL), Golden, CO (United States); Nemet, Gregory F. [Univ. of Wisconsin, Madison, WI (United States); Darghouth, Naim [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2016-09-01

    Solar PV market research to date has largely relied on arbitrary jurisdictional boundaries, such as counties, to study solar PV market dynamics. This paper seeks to improve solar PV market research by developing a methodology to define solar PV markets. The methodology is based on the spatial distribution of solar PV installers. An algorithm is developed and applied to a rich dataset of solar PV installations to study the outcomes of the installer-based market definitions. The installer-based approach exhibits several desirable properties. Specifically, the higher market granularity of the installer-based approach will allow future PV market research to study the relationship between market dynamics and pricing with more precision.
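
    The abstract does not reproduce the paper's algorithm; the sketch below only illustrates one assumed reading of an installer-based market definition, in which two locations fall in the same market if at least one installer is active in both, and markets are the connected components of that relation. All identifiers and records are invented.

```python
# Assumed illustration of an installer-based market definition: locations that share
# an installer are linked, and markets are the connected components of that graph.
# This is a stand-in for the paper's algorithm, not a reproduction of it.

from collections import defaultdict
from itertools import combinations

# Toy installation records: (installer_id, location_id).
installations = [
    ("inst_A", "loc_1"), ("inst_A", "loc_2"),
    ("inst_B", "loc_2"), ("inst_B", "loc_3"),
    ("inst_C", "loc_4"),
]


def installer_based_markets(records):
    # Link locations served by a common installer.
    by_installer = defaultdict(set)
    for installer, location in records:
        by_installer[installer].add(location)

    graph = defaultdict(set)
    for locations in by_installer.values():
        for a, b in combinations(sorted(locations), 2):
            graph[a].add(b)
            graph[b].add(a)
        for loc in locations:
            graph.setdefault(loc, set())
    graph = dict(graph)

    # Markets are the connected components of the location graph.
    seen, markets = set(), []
    for start in graph:
        if start in seen:
            continue
        component, stack = set(), [start]
        while stack:
            node = stack.pop()
            if node in component:
                continue
            component.add(node)
            stack.extend(graph[node] - component)
        seen |= component
        markets.append(component)
    return markets


if __name__ == "__main__":
    for i, market in enumerate(installer_based_markets(installations), 1):
        print(f"market {i}: {sorted(market)}")
```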

  2. What Synthesis Methodology Should I Use? A Review and Analysis of Approaches to Research Synthesis.

    Science.gov (United States)

    Schick-Makaroff, Kara; MacDonald, Marjorie; Plummer, Marilyn; Burgess, Judy; Neander, Wendy

    2016-01-01

    When we began this process, we were doctoral students and a faculty member in a research methods course. As students, we were facing a review of the literature for our dissertations. We encountered several different ways of conducting a review but were unable to locate any resources that synthesized all of the various synthesis methodologies. Our purpose is to present a comprehensive overview and assessment of the main approaches to research synthesis. We use 'research synthesis' as a broad overarching term to describe various approaches to combining, integrating, and synthesizing research findings. We conducted an integrative review of the literature to explore the historical, contextual, and evolving nature of research synthesis. We searched five databases, reviewed websites of key organizations, hand-searched several journals, and examined relevant texts from the reference lists of the documents we had already obtained. We identified four broad categories of research synthesis methodology including conventional, quantitative, qualitative, and emerging syntheses. Each of the broad categories was compared to the others on the following: key characteristics, purpose, method, product, context, underlying assumptions, unit of analysis, strengths and limitations, and when to use each approach. The current state of research synthesis reflects significant advancements in emerging synthesis studies that integrate diverse data types and sources. New approaches to research synthesis provide a much broader range of review alternatives available to health and social science students and researchers.

  3. CAGE IIIA Distributed Simulation Design Methodology

    Science.gov (United States)

    2014-05-01

    Acronym list fragments: VHF - Very High Frequency; VLC - Video LAN Codec, an open-source cross-platform multimedia player and framework; VM - Virtual Machine; VOIP - Voice Over IP. ... Implementing Defence Experimentation (GUIDEx). The key challenges for this methodology are with understanding how to design it, define the ... operation and to be available in the other nation's simulations. The challenge for the CAGE campaign of experiments is to continue to build upon this

  4. Fire Support Requirements Methodology Study, Phase 2 Proceedings of the Fire Support Methodology Workshop

    Science.gov (United States)

    1975-12-18

    It was not immediately clear that the approach would succeed in overcoming the deficiencies of present fire support methodologies which demand an ... support require analysis up to Level 6. They also felt that deficiencies in technique were most serious at Levels 3, 4 and 5. It was accepted that ... defined as: T_k,2 = T_k,1 ... (2); T_k,t = T_k,t-1 - ...M_k,t M_k,t-1 + ...M_k,t-2, for t > ... (3), where M_k,t refers to the number of type k targets killed in time

  5. Towards more sustainable management of European food waste: Methodological approach and numerical application.

    Science.gov (United States)

    Manfredi, Simone; Cristobal, Jorge

    2016-09-01

    Trying to respond to the latest policy needs, the work presented in this article aims at developing a life-cycle based framework methodology to quantitatively evaluate the environmental and economic sustainability of European food waste management options. The methodology is structured into six steps aimed at defining boundaries and scope of the evaluation, evaluating environmental and economic impacts and identifying best performing options. The methodology is able to accommodate additional assessment criteria, for example the social dimension of sustainability, thus moving towards a comprehensive sustainability assessment framework. A numerical case study is also developed to provide an example of application of the proposed methodology to an average European context. Different options for food waste treatment are compared, including landfilling, composting, anaerobic digestion and incineration. The environmental dimension is evaluated with the software EASETECH, while the economic assessment is conducted based on different indicators expressing the costs associated with food waste management. Results show that the proposed methodology allows for a straightforward identification of the most sustainable options for food waste, thus can provide factual support to decision/policy making. However, it was also observed that results markedly depend on a number of user-defined assumptions, for example on the choice of the indicators to express the environmental and economic performance. © The Author(s) 2016.
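
    As an invented illustration of the comparison step, the sketch below normalizes one environmental and one economic indicator per treatment option and ranks the options by an equally weighted sum. Neither the indicator values nor the weighting reflect the article's actual data or method.

```python
# Invented comparison of food waste treatment options on normalized environmental
# and economic indicators; not the article's data, indicators or weighting.

options = {
    # option: (environmental impact score, cost per tonne); lower is better for both
    "landfilling":         (100.0, 60.0),
    "composting":          (40.0, 75.0),
    "anaerobic digestion": (25.0, 90.0),
    "incineration":        (55.0, 110.0),
}

WEIGHTS = (0.5, 0.5)  # equal weighting of the two dimensions (assumed)


def combined_scores(data):
    env_max = max(env for env, _ in data.values())
    cost_max = max(cost for _, cost in data.values())
    return {
        name: WEIGHTS[0] * env / env_max + WEIGHTS[1] * cost / cost_max
        for name, (env, cost) in data.items()
    }


if __name__ == "__main__":
    for name, score in sorted(combined_scores(options).items(), key=lambda kv: kv[1]):
        print(f"{name:20s} combined score {score:.2f} (lower is better)")
```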

  6. A methodology for overall consequence modeling in chemical industry

    International Nuclear Information System (INIS)

    Arunraj, N.S.; Maiti, J.

    2009-01-01

    Risk assessment in the chemical process industry is a very important issue for safeguarding humans and the ecosystem from damages caused to them. Consequence assessment is an integral part of risk assessment. However, the commonly used consequence estimation methods involve time-consuming complex mathematical models or a simple assimilation of losses that does not consider all the consequence factors. This leads to a deterioration in the quality of the estimated risk value. Consequence modeling therefore has to be performed in detail, considering all major losses within an acceptable time, to improve the decisive value of risk. The losses can be broadly categorized into production loss, assets loss, human health and safety loss, and environment loss. In this paper, first, a conceptual framework is developed to assess the overall consequence considering all the important components of major losses. Secondly, a methodology is developed for the calculation of all the major losses, which are normalized to yield the overall consequence. Finally, as an illustration, the proposed methodology is applied to a case study plant involving benzene extraction. The case study result using the proposed consequence assessment scheme is compared with that from the existing methodologies.
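
    The normalization scheme is not given in the abstract; the sketch below is an assumed illustration of the idea only, normalizing the four major loss categories against reference worst-case values and aggregating them with equal weights into an overall consequence index. All names and numbers are invented.

```python
# Assumed illustration of the overall-consequence idea: calculate the four major
# loss categories, normalize each against a reference worst case, and aggregate.
# Normalization scheme, weights and all numbers are invented.

LOSS_CATEGORIES = ("production", "assets", "health_safety", "environment")

scenario_losses = {"production": 2.0e6, "assets": 5.0e5, "health_safety": 1.2e6, "environment": 3.0e5}
worst_case = {"production": 1.0e7, "assets": 4.0e6, "health_safety": 8.0e6, "environment": 2.0e6}

WEIGHTS = {category: 0.25 for category in LOSS_CATEGORIES}  # equal weights (assumed)


def overall_consequence(losses, reference, weights):
    """Weighted sum of losses, each normalized to its reference on a 0..1 scale."""
    return sum(weights[c] * min(losses[c] / reference[c], 1.0) for c in LOSS_CATEGORIES)


if __name__ == "__main__":
    index = overall_consequence(scenario_losses, worst_case, WEIGHTS)
    print(f"overall consequence index: {index:.3f}")
```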

  7. Definably compact groups definable in real closed fields. I

    OpenAIRE

    Barriga, Eliana

    2017-01-01

    We study definably compact definably connected groups definable in a sufficiently saturated real closed field $R$. We introduce the notion of group-generic point for $\bigvee$-definable groups and show the existence of group-generic points for definably compact groups definable in a sufficiently saturated o-minimal expansion of a real closed field. We use this notion along with some properties of generic sets to prove that for every definably compact definably connected group $G$ definable in...

  8. Defining Innovation: Using Soft Systems Methodology to Approach the Complexity of Innovation in Educational Technology

    Science.gov (United States)

    Cox, Glenda

    2010-01-01

    This paper explores what educational technologists in one South African Institution consider innovation to be. Ten educational technologists in various faculties across the university were interviewed and asked to define and answer questions about innovation. Their answers were coded and the results of the overlaps in coding have been assimilated…

  9. A theorem on the methodology of positive economics

    Directory of Open Access Journals (Sweden)

    Eduardo Pol

    2015-12-01

    Full Text Available It has long been recognized that Milton Friedman's 1953 essay on economic methodology (or F53, for short) displays open-ended unclarities. For example, the notion of "unrealistic assumption" plays a role of absolutely fundamental importance in his methodological framework, but the term itself was never unambiguously defined in any of Friedman's contributions to the economics discipline. As a result, F53 is appealing and liberating because the choice of premises in economic theorizing is not subject to any constraints concerning the degree of realisticness (or unrealisticness) of the assumptions. The question "Does the methodology of positive economics prevent the overlapping between economics and science fiction?" comes very naturally, indeed. In this paper, we show the following theorem: Friedman's methodology of positive economics does not exclude science fiction. This theorem is a positive statement, and consequently, it does not involve value judgements. However, it throws a wrench into the formulation of economic policy based on surreal models.

  10. Developing a Design Methodology for Web 2.0 Mediated Learning

    DEFF Research Database (Denmark)

    Buus, Lillian; Georgsen, Marianne; Ryberg, Thomas

    In this paper we discuss the notion of a learning methodology and situate this within the wider frame of learning design or 'Designing for Learning'. We discuss existing work within this broad area by trying to categorize different approaches and interpretations and we present our development...... of particular 'mediating design artefacts'. We discuss what can be viewed as a lack of attention paid to integrating the preferred teaching styles and learning philosophies of practitioners into design tools, and present a particular method for learning design; the COllaborative E-learning Design method (CoED). We describe how this method has been adopted as part of a learning methodology building on concepts and models presented in the other symposium papers, in particular those of active, problem based learning and web 2.0-technologies. The challenge of designing on the basis of an explicit learning...

  12. Defining key features of the broad autism phenotype: a comparison across parents of multiple- and single-incidence autism families.

    Science.gov (United States)

    Losh, Molly; Childress, Debra; Lam, Kristen; Piven, Joseph

    2008-06-05

    This study examined the frequency of personality, language, and social-behavioral characteristics believed to comprise the broad autism phenotype (BAP), across families differing in genetic liability to autism. We hypothesized that within this unique sample comprised of multiple-incidence autism families (MIAF), single-incidence autism families (SIAF), and control Down syndrome families (DWNS), a graded expression would be observed for the principal characteristics conferring genetic susceptibility to autism, in which such features would express most profoundly among parents from MIAFs, less strongly among SIAFs, and least of all among comparison parents from DWNS families, who should display population base rates. Analyses detected linear expression of traits in line with hypotheses, and further suggested differential intrafamilial expression across family types. In the vast majority of MIAFs both parents displayed BAP characteristics, whereas within SIAFs, it was equally likely that one, both, or neither parent show BAP features. The significance of these findings is discussed in relation to etiologic mechanisms in autism and relevance to molecular genetic studies. (c) 2007 Wiley-Liss, Inc.

  13. Staff and consumer perspectives on defining treatment success and failure in assertive community treatment.

    Science.gov (United States)

    Stull, Laura G; McGrew, John H; Salyers, Michelle P

    2010-09-01

    Although assertive community treatment (ACT) has been consistently recognized as effective, there has been little research as to what constitutes success in ACT. The purpose of this study was to understand how ACT consumers and staff define treatment success and failure and to examine whether definitions varied between staff and consumers. Investigators conducted semistructured interviews with 25 staff and 23 consumers from four ACT teams. Across perspectives, success and failure were most clearly related to consumer factors. Other themes included having basic needs met, being socially involved, and taking medications. Reduced hospitalizations were mentioned infrequently. Consumers were more likely than staff to identify the level or type of treatment as defining success and failure, whereas staff were more likely than consumers to discuss substance abuse when defining failure and improved symptoms when defining success. Success in ACT should be viewed more broadly than reduced hospitalizations and include domains such as social involvement.

  14. Validation, automatic generation and use of broad phonetic transcriptions

    NARCIS (Netherlands)

    Bael, Cristophe Patrick Jan Van

    2007-01-01

    Broad phonetic transcriptions represent the pronunciation of words as strings of characters from specifically designed symbol sets. In everyday life, broad phonetic transcriptions are often used as aids to pronounce (foreign) words. In addition, broad phonetic transcriptions are often used for

  15. Multicriteria methodology for decision aiding

    CERN Document Server

    Roy, Bernard

    1996-01-01

    This is the first comprehensive book to present, in English, the multicriteria methodology for decision aiding. In the foreword the distinctive features and main ideas of the European School of MCDA are outlined. The twelve chapters are essentially expository in nature, but scholarly in treatment. Some questions, which are too often neglected in the literature on decision theory, such as how is a decision made, who are the actors, what is a decision aiding model, how to define the set of alternatives, are discussed. Examples are used throughout the book to illustrate the various concepts. Ways to model the consequences of each alternative and building criteria taking into account the inevitable imprecisions, uncertainties and indeterminations are described and illustrated. The three classical operational approaches of MCDA: synthesis in one criterion (including MAUT), synthesis by outranking relations, and interactive local judgements, are studied. This methodology tries to be a theoretical or intellectual framework dire...

  16. The Methodology of the Process of Formation of Innovation Management of Enterprises’ Development

    Directory of Open Access Journals (Sweden)

    Prokhorova Viktoriia V.

    2017-12-01

    Full Text Available The article aims at forming a methodology for the process of innovation management of enterprises' development in modern conditions. A study of the formation of the essence of methodology was carried out, and the stages of development of methods and means of scientific cognition were analyzed. The basic components of the formation of a methodology for innovation management of enterprises' development have been defined: methods, types, principles, components, and their systematized aggregate. The relations between empirical and theoretical methods of scientific cognition were considered and defined. It has been determined that the growth in the volume and scope of scientific views, together with the deepening of scientific knowledge about the laws and regularities governing the real natural and social world, leads to scientists' desire to analyze the methods and means by which modern innovative knowledge and views in the enterprise management system can be acquired and formed.

  17. Summary of the IAEA's BIOMASS reference biosphere methodology for Environment Agency staff

    International Nuclear Information System (INIS)

    Coughtrey, P.

    2001-01-01

    The International Atomic Energy Agency (IAEA) programme on BIOsphere Modelling and ASSessment (BIOMASS) was launched in October 1996, and will complete during 2001. The BIOMASS programme aimed to develop and apply a methodology for defining biospheres for practical radiological assessment of releases from radioactive waste disposal. This report provides a summary description of the BIOMASS methodology. The BIOMASS methodology has been developed through international collaboration and represents a major milestone in biosphere modelling. It provides an approach supported by a wide range of developers, regulators, biosphere experts and safety assessment specialists. The Environment Agency participated actively in the BIOMASS programme

  18. Defining the baseline in social life cycle assessment

    DEFF Research Database (Denmark)

    Jørgensen, Andreas; Finkbeiner, Matthias; Jørgensen, Michael Søgaard

    2010-01-01

    A relatively broad consensus has formed that the purpose of developing and using the social life cycle assessment (SLCA) is to improve the social conditions for the stakeholders affected by the assessed product's life cycle. To create this effect, the SLCA, among other things, needs to provide...... valid assessments of the consequence of the decision that it is to support. The consequence of a decision to implement a life cycle of a product can be seen as the difference between the decision being implemented and 'non-implemented' product life cycle. This difference can to some extent be found...... using the consequential environmental life cycle assessment (ELCA) methodology to identify the processes that change as a consequence of the decision. However, if social impacts are understood as certain changes in the lives of the stakeholders, then social impacts are not only related to product life...

  19. Base line definitions and methodological lessons from Zimbabwe

    International Nuclear Information System (INIS)

    Maya, R.S.

    1995-01-01

    The UNEP Greenhouse Gas Abatement Costing Studies, carried out under the management of the UNEP Collaborating Centre on Energy and Environment at Risoe National Laboratories in Denmark, have placed effort in generating methodological approaches to assessing the cost of abatement activities to reduce CO2 emissions. These efforts have produced perhaps the most comprehensive set of methodological approaches to defining and assessing the cost of greenhouse gas abatement. Perhaps the most important aspect of the UNEP study, which involved teams of researchers from ten countries, is the mix of countries in which the studies were conducted and hence the representation of views and concepts from researchers in these countries, particularly those from developing countries, namely Zimbabwe, India, Venezuela, Brazil, Thailand and Senegal. Methodological lessons from Zimbabwe, therefore, would have benefited from the interactions with methodological experiences from the other participating countries. Methodological lessons from the Zimbabwean study can be placed in two categories. One relates to the modelling of tools to analyze economic trends and the various factors studied in order to determine the unit cost of CO2 abatement. The other is the definition of factors influencing the levels of emissions reducible and those realised under specific economic trends. (au)

  20. Genetic signatures in the envelope glycoproteins of HIV-1 that associate with broadly neutralizing antibodies.

    Directory of Open Access Journals (Sweden)

    S Gnanakaran

    Full Text Available A steady increase in knowledge of the molecular and antigenic structure of the gp120 and gp41 HIV-1 envelope glycoproteins (Env is yielding important new insights for vaccine design, but it has been difficult to translate this information to an immunogen that elicits broadly neutralizing antibodies. To help bridge this gap, we used phylogenetically corrected statistical methods to identify amino acid signature patterns in Envs derived from people who have made potently neutralizing antibodies, with the hypothesis that these Envs may share common features that would be useful for incorporation in a vaccine immunogen. Before attempting this, essentially as a control, we explored the utility of our computational methods for defining signatures of complex neutralization phenotypes by analyzing Env sequences from 251 clonal viruses that were differentially sensitive to neutralization by the well-characterized gp120-specific monoclonal antibody, b12. We identified ten b12-neutralization signatures, including seven either in the b12-binding surface of gp120 or in the V2 region of gp120 that have been previously shown to impact b12 sensitivity. A simple algorithm based on the b12 signature pattern was predictive of b12 sensitivity/resistance in an additional blinded panel of 57 viruses. Upon obtaining these reassuring outcomes, we went on to apply these same computational methods to define signature patterns in Env from HIV-1 infected individuals who had potent, broadly neutralizing responses. We analyzed a checkerboard-style neutralization dataset with sera from 69 HIV-1-infected individuals tested against a panel of 25 different Envs. Distinct clusters of sera with high and low neutralization potencies were identified. Six signature positions in Env sequences obtained from the 69 samples were found to be strongly associated with either the high or low potency responses. Five sites were in the CD4-induced coreceptor binding site of gp120, suggesting an

  1. A performance assessment methodology for low-level radioactive waste disposal

    International Nuclear Information System (INIS)

    Derring, L.R.

    1990-01-01

    To demonstrate compliance with the performance objectives governing protection of the general population in 10 CFR 61.41, applicants for land disposal of low-level radioactive waste are required to conduct a pathways analysis, or quantitative evaluation of radionuclide release, transport through environmental media, and dose to man. The Nuclear Regulatory Commission staff defined a strategy and initiated a project at Sandia National Laboratories to develop a methodology for independently evaluating an applicant's analysis of postclosure performance. This performance assessment methodology was developed in five stages: identification of environmental pathways, ranking the significance of the pathways, identification and integration of models for pathway analyses, identification and selection of computer codes and techniques for the methodology, and implementation of the codes and documentation of the methodology. This paper summarizes the NRC approach for conducting evaluations of license applications for low-level radioactive waste facilities. 23 refs

  2. Practical implementation of a methodology for digital images authentication using forensics techniques

    OpenAIRE

    Francisco Rodríguez-Santos; Guillermo Delgado-Gutierréz; Leonardo Palacios-Luengas; Rubén Vázquez Medina

    2015-01-01

    This work presents a forensics analysis methodology implemented to detect modifications in JPEG digital images by analyzing the image’s metadata, thumbnail, camera traces and compression signatures. Best practices related with digital evidence and forensics analysis are considered to determine if the technical attributes and the qualities of an image are consistent with each other. This methodology is defined according to the recommendations of the Good Practice Guide for Computer-Based Elect...

  3. Defining the cortical visual systems: "what", "where", and "how"

    Science.gov (United States)

    Creem, S. H.; Proffitt, D. R.; Kaiser, M. K. (Principal Investigator)

    2001-01-01

    The visual system historically has been defined as consisting of at least two broad subsystems subserving object and spatial vision. These visual processing streams have been organized both structurally as two distinct pathways in the brain, and functionally for the types of tasks that they mediate. The classic definition by Ungerleider and Mishkin labeled a ventral "what" stream to process object information and a dorsal "where" stream to process spatial information. More recently, Goodale and Milner redefined the two visual systems with a focus on the different ways in which visual information is transformed for different goals. They relabeled the dorsal stream as a "how" system for transforming visual information using an egocentric frame of reference in preparation for direct action. This paper reviews recent research from psychophysics, neurophysiology, neuropsychology and neuroimaging to define the roles of the ventral and dorsal visual processing streams. We discuss a possible solution that allows for both "where" and "how" systems that are functionally and structurally organized within the posterior parietal lobe.

  4. Systematic review of communication partner training in aphasia: methodological quality.

    Science.gov (United States)

    Cherney, Leora R; Simmons-Mackie, Nina; Raymer, Anastasia; Armstrong, Elizabeth; Holland, Audrey

    2013-10-01

    Twenty-three studies identified from a previous systematic review examining the effects of communication partner training on persons with aphasia and their communication partners were evaluated for methodological quality. Two reviewers rated the studies on defined methodological quality criteria relevant to each study design. There were 11 group studies, seven single-subject participant design studies, and five qualitative studies. Quality scores were derived for each study. The mean inter-rater reliability of scores for each study design ranged from 85-93%, with Cohen's Kappa indicating substantial agreement between raters. Methodological quality of research on communication partner training in aphasia was highly varied. Overall, group studies employed the least rigorous methodology as compared to single subject and qualitative research. Only two of 11 group studies complied with more than half of the quality criteria. No group studies reported therapist blinding and only one group study reported participant blinding. Across all types of studies, the criterion of treatment fidelity was most commonly omitted. Failure to explicitly report certain methodological quality criteria may account for low ratings. Using methodological rating scales specific to the type of study design may help improve the methodological quality of aphasia treatment studies, including those on communication partner training.
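
    Since the review reports inter-rater agreement with Cohen's kappa, a minimal worked example of that statistic (the standard two-rater, categorical form) is sketched below. The ratings are invented and do not represent the review's data.

```python
# Worked example of Cohen's kappa for two raters scoring quality criteria as
# met / not met. The ratings are invented; only the statistic itself is illustrated.

from collections import Counter

rater_a = ["met", "met", "not met", "met", "not met", "met", "met", "not met"]
rater_b = ["met", "not met", "not met", "met", "not met", "met", "met", "met"]


def cohens_kappa(a, b):
    assert len(a) == len(b) and a, "ratings must be paired and non-empty"
    n = len(a)
    p_observed = sum(x == y for x, y in zip(a, b)) / n                        # observed agreement
    freq_a, freq_b = Counter(a), Counter(b)
    categories = set(a) | set(b)
    p_expected = sum((freq_a[c] / n) * (freq_b[c] / n) for c in categories)   # chance agreement
    return (p_observed - p_expected) / (1 - p_expected)


if __name__ == "__main__":
    print(f"kappa = {cohens_kappa(rater_a, rater_b):.2f}")  # about 0.43 for these ratings
```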

  5. What Synthesis Methodology Should I Use? A Review and Analysis of Approaches to Research Synthesis

    Science.gov (United States)

    Schick-Makaroff, Kara; MacDonald, Marjorie; Plummer, Marilyn; Burgess, Judy; Neander, Wendy

    2016-01-01

    Background When we began this process, we were doctoral students and a faculty member in a research methods course. As students, we were facing a review of the literature for our dissertations. We encountered several different ways of conducting a review but were unable to locate any resources that synthesized all of the various synthesis methodologies. Our purpose is to present a comprehensive overview and assessment of the main approaches to research synthesis. We use ‘research synthesis’ as a broad overarching term to describe various approaches to combining, integrating, and synthesizing research findings. Methods We conducted an integrative review of the literature to explore the historical, contextual, and evolving nature of research synthesis. We searched five databases, reviewed websites of key organizations, hand-searched several journals, and examined relevant texts from the reference lists of the documents we had already obtained. Results We identified four broad categories of research synthesis methodology including conventional, quantitative, qualitative, and emerging syntheses. Each of the broad categories was compared to the others on the following: key characteristics, purpose, method, product, context, underlying assumptions, unit of analysis, strengths and limitations, and when to use each approach. Conclusions The current state of research synthesis reflects significant advancements in emerging synthesis studies that integrate diverse data types and sources. New approaches to research synthesis provide a much broader range of review alternatives available to health and social science students and researchers. PMID:29546155

  6. What Synthesis Methodology Should I Use? A Review and Analysis of Approaches to Research Synthesis.

    Directory of Open Access Journals (Sweden)

    Kara Schick-Makaroff

    2016-03-01

    Full Text Available Background: When we began this process, we were doctoral students and a faculty member in a research methods course. As students, we were facing a review of the literature for our dissertations. We encountered several different ways of conducting a review but were unable to locate any resources that synthesized all of the various synthesis methodologies. Our purpose is to present a comprehensive overview and assessment of the main approaches to research synthesis. We use ‘research synthesis’ as a broad overarching term to describe various approaches to combining, integrating, and synthesizing research findings. Methods: We conducted an integrative review of the literature to explore the historical, contextual, and evolving nature of research synthesis. We searched five databases, reviewed websites of key organizations, hand-searched several journals, and examined relevant texts from the reference lists of the documents we had already obtained. Results: We identified four broad categories of research synthesis methodology including conventional, quantitative, qualitative, and emerging syntheses. Each of the broad categories was compared to the others on the following: key characteristics, purpose, method, product, context, underlying assumptions, unit of analysis, strengths and limitations, and when to use each approach. Conclusions: The current state of research synthesis reflects significant advancements in emerging synthesis studies that integrate diverse data types and sources. New approaches to research synthesis provide a much broader range of review alternatives available to health and social science students and researchers.

  7. A Methodology to Implement an Information Security Management System

    Directory of Open Access Journals (Sweden)

    Alaíde Barbosa Martins

    2005-08-01

    Full Text Available Information security has actually been a major challenge to most organizations. Indeed, information security is an ongoing risk management process that covers all of the information that needs to be protected. ISO 17799 offers what companies need in order to better manage information security. The best way to implement this standard is to ease the security management process using a methodology that will define the guidelines, procedures and tools that will be needed along the way. Hence, this paper proposes a methodology to assist companies in assessing their compliance with BS 7799/ISO 17799 as well as planning and implementing the actions necessary to become compliant or certified to the standard. The concepts and ideas presented here have been applied in a case study involving Cetrel S/A - Company of Environmental Protection. For this company, responsible for the treatment of industrial residues generated by the Camaçari Petrochemical Complex and adjacent areas, assuring the confidentiality and integrity of customers' data is a basic requirement.

  8. Action research methodology in clinical pharmacy

    DEFF Research Database (Denmark)

    Nørgaard, Lotte Stig; Sørensen, Ellen Westh

    2016-01-01

    Introduction The focus in clinical pharmacy practice is, and has for the last 30-35 years been, on changing the role of pharmacy staff towards service orientation and patient counselling. One way of doing this is by involving staff in the change process and, as a researcher, taking part in the change process...... by establishing partnerships with staff. On the background of the authors' widespread action research (AR)-based experiences, recommendations and comments on how to conduct an AR study are described, and one of their AR-based studies illustrates the methodology and the research methods used. Methodology AR...... is defined as an approach to research which is based on a problem-solving relationship between researchers and clients, which aims both at solving a problem and at collaboratively generating new knowledge. Research questions relevant in AR studies are: what was the working process in this change oriented

  9. Methodological challenges in retailer buying behaviour research

    DEFF Research Database (Denmark)

    Hansen, Tommy Holm; Skytte, Hans

    This paper presents a review of studies on retailer buying behaviour with a focus on the methodological issues. It is argued that the researcher of retailer buying behaviour is faced with particular challenges regarding the sample frame, defining the unit of analysis, potentially small populations and low...... response rates, buying centres and product-specific behaviour. At the end, the authors propose a descriptive research design that will try to take account of the mentioned issues....

  10. Definably compact groups definable in real closed fields.II

    OpenAIRE

    Barriga, Eliana

    2017-01-01

    We continue the analysis of definably compact groups definable in a real closed field $\mathcal{R}$. In [3], we proved that for every definably compact definably connected semialgebraic group $G$ over $\mathcal{R}$ there are a connected $R$-algebraic group $H$, a definable injective map $\phi$ from a generic definable neighborhood of the identity of $G$ into the group $H\left(R\right)$ of $R$-points of $H$ such that $\phi$ acts as a group homomorphism inside its domain. The above result and o...

  11. Scaling up methodology for CO2 emissions in ICT applications in traffic and transport in Europe

    NARCIS (Netherlands)

    Mans, D.; Jonkers, E.; Giannelos, I.; Palanciuc, D.

    2013-01-01

    The Amitran project aims to define a reference methodology for evaluating the effects of ICT measures in traffic and transport on energy efficiency and consequently CO2 emissions. This methodology can be used as a reference by future projects and will address different modes for both passenger and

  12. Neural Networks Methodology and Applications

    CERN Document Server

    Dreyfus, Gérard

    2005-01-01

    Neural networks represent a powerful data processing technique that has reached maturity and broad application. When clearly understood and appropriately used, they are a mandatory component in the toolbox of any engineer who wants to make the best use of the available data in order to build models, make predictions, mine data, recognize shapes or signals, etc. Ranging from theoretical foundations to real-life applications, this book is intended to provide engineers and researchers with clear methodologies for taking advantage of neural networks in industrial, financial or banking applications, many instances of which are presented in the book. For the benefit of readers wishing to gain deeper knowledge of the topics, the book features appendices that provide theoretical details for greater insight, and algorithmic details for efficient programming and implementation. The chapters have been written by experts and seamlessly edited to present a coherent and comprehensive, yet not redundant, practically-oriented...

  13. A methodology for the transfer of probabilities between accident severity categories

    International Nuclear Information System (INIS)

    Whitlow, J.D.; Neuhauser, K.S.

    1991-01-01

    A methodology has been developed which allows the accident probabilities associated with one accident-severity category scheme to be transferred to another severity category scheme. The methodology requires that the schemes use a common set of parameters to define the categories. The transfer of accident probabilities is based on the relationships between probability of occurrence and each of the parameters used to define the categories. Because of the lack of historical data describing accident environments in engineering terms, these relationships may be difficult to obtain directly for some parameters. Numerical models or experienced judgement are often needed to obtain the relationships. These relationships, even if they are not exact, allow the accident probability associated with any severity category to be distributed within that category in a manner consistent with accident experience, which in turn will allow the accident probability to be appropriately transferred to a different category scheme
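
    As one assumed, single-parameter illustration of the transfer idea described above, the sketch below distributes category probabilities along an impact-speed axis according to an assumed exponential probability-versus-speed relationship and re-bins them into a second scheme's boundaries. The schemes, boundaries and decay rate are invented.

```python
# Assumed single-parameter illustration of transferring accident probabilities
# between two severity-category schemes defined on the same parameter (impact
# speed, km/h). The exponential probability-versus-speed relationship is a placeholder.

import math

SCHEME_A = [(0, 50, 0.90), (50, 100, 0.09), (100, 200, 0.01)]  # (low, high, probability)
SCHEME_B_EDGES = [(0, 30), (30, 80), (80, 200)]                # second scheme's boundaries

DECAY = 0.04  # assumed decay rate of occurrence probability with speed


def density(speed):
    """Assumed (unnormalized) relationship between probability and impact speed."""
    return math.exp(-DECAY * speed)


def integrate(lo, hi, steps=1000):
    dv = (hi - lo) / steps
    return sum(density(lo + (i + 0.5) * dv) for i in range(steps)) * dv


def transfer():
    probs_b = []
    for b_lo, b_hi in SCHEME_B_EDGES:
        p = 0.0
        for a_lo, a_hi, p_a in SCHEME_A:
            overlap_lo, overlap_hi = max(a_lo, b_lo), min(a_hi, b_hi)
            if overlap_hi <= overlap_lo:
                continue
            # Distribute category A's probability within A according to the assumed
            # parameter relationship, then attribute the overlapping share to B.
            p += p_a * integrate(overlap_lo, overlap_hi) / integrate(a_lo, a_hi)
        probs_b.append(p)
    return probs_b


if __name__ == "__main__":
    for (lo, hi), p in zip(SCHEME_B_EDGES, transfer()):
        print(f"scheme B category {lo}-{hi} km/h: probability {p:.4f}")
```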

  14. Relative Hazard Calculation Methodology

    International Nuclear Information System (INIS)

    DL Strenge; MK White; RD Stenner; WB Andrews

    1999-01-01

    The methodology presented in this document was developed to provide a means of calculating the RH ratios to use in developing useful graphic illustrations. The RH equation, as presented in this methodology, is primarily a collection of key factors relevant to understanding the hazards and risks associated with projected risk management activities. The RH equation has the potential for much broader application than generating risk profiles. For example, it can be used to compare one risk management activity with another, instead of just comparing it to a fixed baseline as was done for the risk profiles. If the appropriate source term data are available, it could be used in its non-ratio form to estimate absolute values of the associated hazards. These estimated values of hazard could then be examined to help understand which risk management activities are addressing the higher hazard conditions at a site. Graphics could be generated from these absolute hazard values to compare high-hazard conditions. If the RH equation is used in this manner, care must be taken to specifically define and qualify the estimated absolute hazard values (e.g., identify which factors were considered and which ones tended to drive the hazard estimation)
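
    The RH equation itself is not reproduced in the abstract; the sketch below is only an assumed illustration of the ratio idea, forming a hazard index as a product of key factors for an activity and expressing it relative to the same index for a fixed baseline. The factor names and values are invented.

```python
# Assumed illustration of a relative hazard (RH) ratio: a multiplicative hazard
# index over key factors for an activity, divided by the same index for a fixed
# baseline. Factor names and values are invented, not the report's actual equation.

from math import prod

FACTORS = ("source_term", "release_fraction", "exposure_potential")

baseline = {"source_term": 1.0e4, "release_fraction": 1e-3, "exposure_potential": 0.5}
activity = {"source_term": 2.0e3, "release_fraction": 5e-4, "exposure_potential": 0.8}


def hazard_index(factors):
    """Multiplicative hazard index over the key factors (assumed functional form)."""
    return prod(factors[name] for name in FACTORS)


def relative_hazard(activity_factors, baseline_factors):
    return hazard_index(activity_factors) / hazard_index(baseline_factors)


if __name__ == "__main__":
    print(f"RH ratio relative to baseline: {relative_hazard(activity, baseline):.3f}")
```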

  15. A Methodology to Support Decision Making in Flood Plan Mitigation

    Science.gov (United States)

    Biscarini, C.; di Francesco, S.; Manciola, P.

    2009-04-01

    The focus of the present document is on specific decision-making aspects of flood risk analysis. A flood is the result of runoff from rainfall in quantities too great to be confined in the low-water channels of streams. Little can be done to prevent a major flood, but we may be able to minimize damage within the flood plain of the river. This broad definition encompasses many possible mitigation measures. Floodplain management considers the integrated view of all engineering, nonstructural, and administrative measures for managing (minimizing) losses due to flooding on a comprehensive scale. The structural measures are the flood-control facilities designed according to flood characteristics and they include reservoirs, diversions, levees or dikes, and channel modifications. Flood-control measures that modify the damage susceptibility of floodplains are usually referred to as nonstructural measures and may require minor engineering works. On the other hand, those measures designed to modify the damage potential of permanent facilities are called non-structural and allow reducing potential damage during a flood event. Technical information is required to support the tasks of problem definition, plan formulation, and plan evaluation. The specific information needed and the related level of detail are dependent on the nature of the problem, the potential solutions, and the sensitivity of the findings to the basic information. Actions performed to set up and lay out the study are preliminary to the detailed analysis. They include: defining the study scope and detail, the field data collection, a review of previous studies and reports, and the assembly of needed maps and surveys. Risk analysis can be viewed as having many components: risk assessment, risk communication and risk management. Risk assessment comprises an analysis of the technical aspects of the problem, risk communication deals with conveying the information and risk management involves the decision process

  16. Study of methodology diversification in diagnostics

    International Nuclear Information System (INIS)

    Suda, Kazunori; Yonekawa, Tsuyoshi; Yoshikawa, Shinji; Hasegawa, Makoto

    1999-03-01

    There are several research activities to enhance the safety and reliability of nuclear power plant operation and maintenance. We are developing a concept of an autonomous operation system where the role of operators is replaced with artificial intelligence. The purpose of the study described in this report is to develop an operator support system for abnormal plant situations. Conventionally, diagnostic modules based on individual methodologies such as expert systems have been developed and verified. In this report, methodology diversification is considered to integrate diagnostic modules whose performance has been confirmed using information processing techniques. Technical issues to be considered in diagnostic methodology diversification are: 1) reliability of input data, 2) diversification of knowledge models, algorithms and reasoning schemes, and 3) mutual complement and robustness. The diagnostic modules, utilizing the different approaches defined along with the strategy of diversification, were evaluated using a fast breeder plant simulator. As a result, we confirmed that no single diagnostic module can meet the accuracy criteria for the entire set of anomaly events. In contrast, we confirmed that every abnormality could be precisely diagnosed by a mutual combination. In other words, the legitimacy of the approach selected by the strategy of diversification was shown, and methodology diversification attained clear efficiency for abnormality diagnosis. It has also been confirmed that the diversified diagnostic system implemented in this study is able to maintain its accuracy even in cases where the scale of the encountered abnormality differs from the reference cases embedded in the knowledge base. (author)

  17. A rare presentation of ruptured interstitial ectopic pregnancy with broad ligament hematoma: A case report

    Directory of Open Access Journals (Sweden)

    Ahmed M. Abbas

    2017-03-01

    Full Text Available Ectopic pregnancy is a major cause of maternal morbidity and mortality in the first trimester. Interstitial type is the most dangerous variety with a high risk of life-threatening internal hemorrhage. Obstetricians need a high index of suspicion to diagnose such rare type. We are reporting a rare case of ruptured interstitial ectopic pregnancy presented with a large broad ligament hematoma early in the first trimester. A 25-year-old woman was presented with gradual onset of increasing abdominal pain after 6 weeks of amenorrhea. She had a positive urinary pregnancy test. Abdominal ultrasound revealed bulky empty uterus and ill-defined mass at the right side of the uterus. On exploration, incision and drainage of broad ligament hematoma were performed in addition to right salpingectomy. Interstitial ectopic pregnancy represents a diagnostic and therapeutic challenge and frequently constitutes an obstetrical emergency. Its rupture early in the first trimester should be expected. Early diagnosis and proper management are the most important issues to avoid its catastrophic consequences.

  18. The "Critical" Elements of Illness Management and Recovery: Comparing Methodological Approaches.

    Science.gov (United States)

    McGuire, Alan B; Luther, Lauren; White, Dominique; White, Laura M; McGrew, John; Salyers, Michelle P

    2016-01-01

    This study examined three methodological approaches to defining the critical elements of Illness Management and Recovery (IMR), a curriculum-based approach to recovery. Sixty-seven IMR experts rated the criticality of 16 IMR elements on three dimensions: defining, essential, and impactful. Three elements (Recovery Orientation, Goal Setting and Follow-up, and IMR Curriculum) met all criteria for essential and defining and all but the most stringent criteria for impactful. Practitioners should consider competence in these areas as preeminent. The remaining 13 elements met varying criteria for essential and impactful. Findings suggest that criticality is a multifaceted construct, necessitating judgments about model elements across different criticality dimensions.

  19. CHARACTERIZATION OF SMALL AND MEDIUM ENTERPRISES (SMES) OF POMERANIAN REGION IN SIX SIGMA METHODOLOGY APPLICATION

    Directory of Open Access Journals (Sweden)

    2011-12-01

    Full Text Available Background: Six Sigma is related to a product's characteristics and to the parameters of the actions needed to obtain these products. On the other hand, it is a multi-step, cyclic process aimed at improvements leading to a global standard, close to perfection. There is a growing interest in Six Sigma methodology among smaller organizations, but there are still too few publications presenting its use in the sector of small and medium enterprises, especially ones based on good empirical results. It was already noticed during the preliminary research phase that only a small part of the companies from this sector in the Pomeranian region use elements of this methodology. Methods: The companies were divided into groups by the type of their activities as well as by employment size. The questionnaires were sent to 150 randomly selected organizations in two steps and were addressed to senior managers. The questionnaire contained questions about basic information about the company, the level of knowledge and practical application of the Six Sigma methodology, opinions about improvements of processes occurring in the company, and opinions about trainings in the Six Sigma methodology. Results: The following hypotheses were proposed and statistically verified, with these outcomes: The lack of adequate knowledge of the Six Sigma methodology in SMEs limits the possibility to effectively monitor and improve processes - accepted. The use of statistical tools of the Six Sigma methodology requires broad action to popularize this knowledge among national SMEs - accepted. The level of awareness of the importance as well as practical use of the Six Sigma methodology in manufacturing SMEs is higher than in SMEs providing services - rejected, the level is equal. The level of knowledge and use of the Six Sigma methodology in medium manufacturing companies is significantly higher than in small manufacturing companies - accepted. The level of the knowledge and the application

  20. Safety on Judo Children: Methodology and Results

    OpenAIRE

    Sacripanti, Attilio; De Blasis, Tania

    2017-01-01

    Many doctors, although they have no firsthand experience of judo, describe it as a sport unsuitable for children. Theoretically speaking, falls resulting from judo throwing techniques could be potentially dangerous, especially for kids, if poorly managed. A lot of research has focused on trauma or injuries taking place in judo, both during training and competition. The goal of this research is to define and apply a scientific methodology to evaluate the hazard in falls caused by judo throws for children...

  1. Calibration Modeling Methodology to Optimize Performance for Low Range Applications

    Science.gov (United States)

    McCollum, Raymond A.; Commo, Sean A.; Parker, Peter A.

    2010-01-01

    Calibration is a vital process in characterizing the performance of an instrument in an application environment and seeks to obtain acceptable accuracy over the entire design range. Often, project requirements specify a maximum total measurement uncertainty, expressed as a percent of full-scale. However in some applications, we seek to obtain enhanced performance at the low range, therefore expressing the accuracy as a percent of reading should be considered as a modeling strategy. For example, it is common to desire to use a force balance in multiple facilities or regimes, often well below its designed full-scale capacity. This paper presents a general statistical methodology for optimizing calibration mathematical models based on a percent of reading accuracy requirement, which has broad application in all types of transducer applications where low range performance is required. A case study illustrates the proposed methodology for the Mars Entry Atmospheric Data System that employs seven strain-gage based pressure transducers mounted on the heatshield of the Mars Science Laboratory mission.
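
    To make the percent-of-full-scale versus percent-of-reading distinction concrete, the short sketch below compares the allowable error at several readings under the two specifications. The 0.5% figures and the 100-unit full scale are invented.

```python
# Invented numerical comparison of two accuracy specifications for a transducer
# with a 100-unit full-scale range: 0.5% of full scale versus 0.5% of reading.

FULL_SCALE = 100.0
PCT_FULL_SCALE = 0.005  # assumed percent-of-full-scale specification
PCT_READING = 0.005     # assumed percent-of-reading specification

for reading in (2.0, 10.0, 50.0, 100.0):
    err_fs = PCT_FULL_SCALE * FULL_SCALE  # constant absolute error band over the range
    err_rd = PCT_READING * reading        # error band that tightens at low readings
    print(f"reading {reading:6.1f}: +/-{err_fs:.3f} (%FS spec) vs +/-{err_rd:.3f} (%reading spec)")
```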

  2. Methodology for comparing the health effects of electricity generation from uranium and coal fuels

    International Nuclear Information System (INIS)

    Rhyne, W.R.; El-Bassioni, A.A.

    1981-01-01

    A methodology was developed for comparing the health risks of electricity generation from uranium and coal fuels. The health effects attributable to the construction, operation, and decommissioning of each facility in the two fuel cycles were considered. The methodology is based on defining (1) requirement variables for the materials, energy, etc., (2) effluent variables associated with the requirement variables as well as with the fuel cycle facility operation, and (3) health impact variables for effluents and accidents. The materials, energy, etc., required for construction, operation, and decommissioning of each fuel cycle facility are defined as primary variables. The materials, energy, etc., needed to produce a primary variable are defined as secondary requirement variables. Each requirement variable (primary, secondary, etc.) has associated effluent variables and health impact variables. A diverging chain or tree is formed for each primary variable. Fortunately, most elements recur frequently, which reduces the level of analysis complexity. 6 references, 11 figures, 6 tables
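
    As a data-structure illustration of the diverging requirement-variable chain described above, the sketch below models each requirement variable as a node with a direct health-impact contribution and secondary requirements, and sums impacts recursively over the tree. All names and numbers are invented.

```python
# Assumed data-structure sketch of the requirement-variable tree: each node carries
# a direct health-impact contribution plus secondary requirement variables, and the
# total impact is accumulated recursively. Names and numbers are invented.

from dataclasses import dataclass, field


@dataclass
class RequirementVariable:
    name: str
    direct_impact: float                     # impact from this variable's own effluents
    secondary: list = field(default_factory=list)

    def total_impact(self):
        return self.direct_impact + sum(child.total_impact() for child in self.secondary)


steel = RequirementVariable("steel", 0.8, [RequirementVariable("iron ore mining", 0.3),
                                           RequirementVariable("coke production", 0.5)])
concrete = RequirementVariable("concrete", 0.6, [RequirementVariable("cement production", 0.9)])
plant_construction = RequirementVariable("plant construction", 1.2, [steel, concrete])

if __name__ == "__main__":
    print(f"total health impact of the construction chain: {plant_construction.total_impact():.1f}")
```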

  3. Vertical Footbridge Vibrations: The Response Spectrum Methodology

    DEFF Research Database (Denmark)

    Georgakis, Christos; Ingólfsson, Einar Thór

    2008-01-01

    In this paper, a novel, accurate and readily codifiable methodology for the prediction of vertical footbridge response is presented. The methodology is based on the well-established response spectrum approach used in the majority of the world’s current seismic design codes of practice. The concept...... of a universally applicable reference response spectrum is introduced, from which the pedestrian-induced vertical response of any footbridge may be determined, based on a defined “event” and the probability of occurrence of that event. A series of Monte Carlo simulations are undertaken for the development...... period is introduced and its implication on the calculation of footbridge response is discussed. Finally, a brief comparison is made between the theoretically predicted pedestrian-induced vertical response of an 80m long RC footbridge (as an example) and actual field measurements. The comparison shows...

  4. EMISSION SIGNATURES FROM SUB-PARSEC BINARY SUPERMASSIVE BLACK HOLES. I. DIAGNOSTIC POWER OF BROAD EMISSION LINES

    Energy Technology Data Exchange (ETDEWEB)

    Nguyen, Khai; Bogdanović, Tamara [Center for Relativistic Astrophysics, School of Physics, Georgia Institute of Technology, Atlanta GA 30332 (United States)

    2016-09-10

    Motivated by advances in observational searches for sub-parsec supermassive black hole binaries (SBHBs) made in the past few years, we develop a semi-analytic model to describe spectral emission-line signatures of these systems. The goal of this study is to aid the interpretation of spectroscopic searches for binaries and to help test one of the leading models of binary accretion flows in the literature: SBHB in a circumbinary disk. In this work, we present the methodology and a comparison of the preliminary model with the data. We model SBHB accretion flows as a set of three accretion disks: two mini-disks that are gravitationally bound to the individual black holes and a circumbinary disk. Given a physically motivated parameter space occupied by sub-parsec SBHBs, we calculate a synthetic database of nearly 15 million broad optical emission-line profiles and explore the dependence of the profile shapes on characteristic properties of SBHBs. We find that the modeled profiles show distinct statistical properties as a function of the semimajor axis, mass ratio, eccentricity of the binary, and the degree of alignment of the triple disk system. This suggests that the broad emission-line profiles from SBHB systems can in principle be used to infer the distribution of these parameters and as such merit further investigation. Calculated profiles are more morphologically heterogeneous than the broad emission lines in observed SBHB candidates and we discuss improved treatment of radiative transfer effects, which will allow a direct statistical comparison of the two groups.

  5. EMISSION SIGNATURES FROM SUB-PARSEC BINARY SUPERMASSIVE BLACK HOLES. I. DIAGNOSTIC POWER OF BROAD EMISSION LINES

    International Nuclear Information System (INIS)

    Nguyen, Khai; Bogdanović, Tamara

    2016-01-01

    Motivated by advances in observational searches for sub-parsec supermassive black hole binaries (SBHBs) made in the past few years, we develop a semi-analytic model to describe spectral emission-line signatures of these systems. The goal of this study is to aid the interpretation of spectroscopic searches for binaries and to help test one of the leading models of binary accretion flows in the literature: SBHB in a circumbinary disk. In this work, we present the methodology and a comparison of the preliminary model with the data. We model SBHB accretion flows as a set of three accretion disks: two mini-disks that are gravitationally bound to the individual black holes and a circumbinary disk. Given a physically motivated parameter space occupied by sub-parsec SBHBs, we calculate a synthetic database of nearly 15 million broad optical emission-line profiles and explore the dependence of the profile shapes on characteristic properties of SBHBs. We find that the modeled profiles show distinct statistical properties as a function of the semimajor axis, mass ratio, eccentricity of the binary, and the degree of alignment of the triple disk system. This suggests that the broad emission-line profiles from SBHB systems can in principle be used to infer the distribution of these parameters and as such merit further investigation. Calculated profiles are more morphologically heterogeneous than the broad emission lines in observed SBHB candidates and we discuss improved treatment of radiative transfer effects, which will allow a direct statistical comparison of the two groups.

  6. Methodological issues in the study of risk perception and human behavior

    International Nuclear Information System (INIS)

    Rathbun, P.F.

    1983-01-01

    The purpose of this paper is to provide a broad perspective on the use of the methods and techniques of the behavioral and social sciences as they pertain to the work of the Nuclear Regulatory Commission, particularly in issues of risk perception. Four major topics or themes are discussed: (1) a brief overview of the classic theories of risk perception; (2) current contractor work in the area of risk perception and cognitive psychology; (3) other uses of the social and behavioral sciences in the Agency; and (4) methodological considerations in using the techniques

  7. Analysing Everyday Online Political Talk in China: Theoretical and Methodological Reflections

    NARCIS (Netherlands)

    Wright, Scott; Graham, Todd; Sun, Yu; Yang Wang, Wilfred; Luo, Xiantian; Carson, Andrea

    2016-01-01

    This article explores the theoretical and methodological challenges of collecting and analysing everyday online political talk in China, and our approach to defining and coding such talk. In so doing, the article is designed to encourage further research in this area, taking forward a new agenda for

  8. A new way of looking at the risk in defined benefit pension plans.

    Science.gov (United States)

    Fore, D

    2000-01-01

    The portability feature of a defined contribution (DC) pension greatly reduces the risk to the accumulation of pension wealth. Conversely, defined benefit (DB) pensions have a variety of default risks that decrease the expected value of DB pension wealth. This paper examines those risks. Accrual of DB pension wealth is characterized in terms of purchases of risky bonds. Changing jobs triggers default on these bonds. Simulations are presented to show the potential loss in pension wealth from default. In addition, a methodology used to price corporate bonds is applied to generate estimates of the implied risk premiums of DB pension bonds over comparable riskless bonds.
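
    The bond-pricing analogy in the abstract can be made concrete with a minimal sketch (all numbers are hypothetical, and this is not the paper's model): if accrued DB pension wealth is treated as a zero-coupon bond that "defaults" (through a job change) with some probability and recovery, the implied risk premium is the extra discount rate that equates the promised face value to the default-adjusted price.

    ```python
    # Minimal, hypothetical sketch of an implied risk premium on a "DB pension bond":
    # the pension accrual is modeled as a zero-coupon bond of face value F that
    # defaults (job change) with probability p and then pays only a fraction R.

    F = 100_000.0   # hypothetical promised pension wealth at the horizon
    T = 20          # years to the horizon
    r = 0.04        # riskless rate
    p = 0.30        # hypothetical probability of a job change (default) over the horizon
    R = 0.60        # hypothetical recovery: fraction of promised wealth kept after default

    expected_payoff = (1 - p) * F + p * R * F
    risky_price = expected_payoff / (1 + r) ** T

    # Implied yield on the risky bond, and the premium over the riskless rate.
    implied_yield = (F / risky_price) ** (1 / T) - 1
    risk_premium = implied_yield - r

    print(f"risky price          : {risky_price:,.0f}")
    print(f"implied yield        : {implied_yield:.3%}")
    print(f"implied risk premium : {risk_premium:.3%}")
    ```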

  9. Emerging Methodologies in Pediatric Palliative Care Research: Six Case Studies

    Directory of Open Access Journals (Sweden)

    Katherine E. Nelson

    2018-02-01

    Full Text Available Given the broad focus of pediatric palliative care (PPC on the physical, emotional, and spiritual needs of children with potentially life-limiting illnesses and their families, PPC research requires creative methodological approaches. This manuscript, written by experienced PPC researchers, describes issues encountered in our own areas of research and the novel methods we have identified to target them. Specifically, we discuss potential approaches to: assessing symptoms among nonverbal children, evaluating medical interventions, identifying and treating problems related to polypharmacy, addressing missing data in longitudinal studies, evaluating longer-term efficacy of PPC interventions, and monitoring for inequities in PPC service delivery.

  10. Emerging Methodologies in Pediatric Palliative Care Research: Six Case Studies

    Science.gov (United States)

    Nelson, Katherine E.; Gerhardt, Cynthia A.; Rosenberg, Abby R.; Widger, Kimberley; Faerber, Jennifer A.; Feudtner, Chris

    2018-01-01

    Given the broad focus of pediatric palliative care (PPC) on the physical, emotional, and spiritual needs of children with potentially life-limiting illnesses and their families, PPC research requires creative methodological approaches. This manuscript, written by experienced PPC researchers, describes issues encountered in our own areas of research and the novel methods we have identified to target them. Specifically, we discuss potential approaches to: assessing symptoms among nonverbal children, evaluating medical interventions, identifying and treating problems related to polypharmacy, addressing missing data in longitudinal studies, evaluating longer-term efficacy of PPC interventions, and monitoring for inequities in PPC service delivery. PMID:29495384

  11. Production and testing of the VITAMIN-B6 fine-group and the BUGLE-93 broad-group neutron/photon cross-section libraries derived from ENDF/B-VI nuclear data

    International Nuclear Information System (INIS)

    Ingersoll, D.T.; White, J.E.; Wright, R.Q.; Hunter, H.T.; Slater, C.O.; Greene, N.M.; MacFarlane, R.E.

    1993-01-01

    A new multigroup cross-section library based on ENDF/B-VI data has been produced and tested for light water reactor shielding and reactor pressure vessel dosimetry applications. The broad-group library is designated BUGLE-93. The processing methodology is consistent with ANSI/ANS 6.1.2, since the ENDF data were first processed into a fine-group, ''pseudo problem-independent'' format and then collapsed into the final broad-group format. The fine-group library is designated VITAMIN-B6. An extensive integral data testing effort was also performed. In general, results using the new data show significant improvements relative to earlier ENDF data
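
    The two-step processing described above (a fine-group library collapsed into a broad-group library) rests on flux-weighted averaging. Below is a minimal sketch of that collapse; the fine-group cross sections, weighting spectrum, and group boundaries are hypothetical and are not the VITAMIN-B6/BUGLE-93 data.

    ```python
    # Minimal sketch of collapsing fine-group cross sections into broad groups
    # using a flux weighting:  sigma_G = sum_g(phi_g * sigma_g) / sum_g(phi_g).
    # The fine-group values and the weighting spectrum below are hypothetical.

    fine_sigma = [2.1, 2.3, 2.6, 3.0, 3.4, 3.9]   # fine-group cross sections (barns)
    fine_flux  = [1.0, 0.8, 0.6, 0.5, 0.3, 0.2]   # weighting flux per fine group
    broad_groups = [(0, 3), (3, 6)]               # fine-group index ranges per broad group

    def collapse(sigma, flux, groups):
        broad = []
        for lo, hi in groups:
            num = sum(s * f for s, f in zip(sigma[lo:hi], flux[lo:hi]))
            den = sum(flux[lo:hi])
            broad.append(num / den)
        return broad

    print(collapse(fine_sigma, fine_flux, broad_groups))
    ```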

  12. Are Some Attitudes More Self-Defining Than Others? Assessing Self-Related Attitude Functions and Their Consequences.

    Science.gov (United States)

    Zunick, Peter V; Teeny, Jacob D; Fazio, Russell H

    2017-08-01

    Attitudes serve multiple functions, some related to the self-concept. We call attitudes that help people define who they are "self-defining." Across four studies, we tested a brief self-report measure of the extent to which an attitude is self-defining. Studies 1 and 2 showed that self-defining attitudes tend to be extreme, positive, and unambivalent. Studies 3 and 4 produced two main findings. First, self-definition was related to, but not redundant with, a number of other characteristics of the attitude (e.g., attitude certainty). Second, self-definition predicted participants' intentions to spontaneously advocate and, in Study 4, their reactions to an opportunity to advocate behaviorally (i.e., writing about their attitude in an optional response box) following a self-threat. Overall, the results highlight the utility of this approach and, more broadly, demonstrate the value of considering the role of the self in attitudinal processes, and vice versa.

  13. Using Lean Six Sigma Methodology to Improve Quality of the Anesthesia Supply Chain in a Pediatric Hospital.

    Science.gov (United States)

    Roberts, Renée J; Wilson, Ashley E; Quezado, Zenaide

    2017-03-01

    Six Sigma and Lean methodologies are effective quality improvement tools in many health care settings. We applied the DMAIC methodology (define, measure, analyze, improve, control) to address deficiencies in our pediatric anesthesia supply chain. We defined supply chain problems by mapping existing processes and soliciting comments from those involved. We used daily distance walked by anesthesia technicians and number of callouts for missing supplies as measurements that we analyzed before and after implementing improvements (anesthesia cart redesign). We showed improvement in the metrics after those interventions were implemented, and those improvements were sustained and thus controlled 1 year after implementation.

  14. Methodological guide: management of industrial sites potentially contaminated by radioactive substances

    International Nuclear Information System (INIS)

    2001-01-01

    At the request of the Ministries of Health and the Environment, IPSN is preparing and publishing the first version of the methodological guide devoted to managing industrial sites potentially contaminated by radioactive substances. This guide describes a procedure for defining and choosing strategies for rehabilitating such industrial sites. (author)

  15. Formation of the methodological matrix of the strategic analysis of the enterprise

    Directory of Open Access Journals (Sweden)

    N.H. Vygovskaya

    2018-04-01

    Full Text Available The article is devoted to the study of the methodological matrix of the strategic analysis of the enterprise. The aim of this article is to analyze the influence of methodological changes in the 20th century on the methodology of strategic analysis, and to critically assess and generalize scientific approaches to its methods. Evaluation of scientific works on analysis made it possible to identify the following problems in the methodology of strategic analysis: the features of strategic analysis are not taken into account when its methods are formed, which often leads to confusion with the methods of financial (economic, thrifty) analysis; the fact that strategic analysis contains, besides the methods of analyzing the internal and external environment, methods of forecast analysis aimed at forming the strategy for the development of the enterprise, is not used; the concepts of «image», «reception» and «method» of analysis are treated as identical; the criteria for classifying methods of strategic analysis are multidirectional and indistinct; and foreign techniques and methods of strategic analysis are copied blindly, without taking into account the specifics of domestic economic conditions. The expediency of using the system approach in forming the methodological design of strategic analysis is proved; this approach makes it possible to combine methodology as a science of methods (a broad approach to the methods of strategic analysis) with methodology as a set of applied methods and techniques of analysis (a narrow approach to methodology). The use of the system approach allowed three levels of the methodology of strategic analysis to be distinguished. The first and second levels of the methodology correspond to the level of science, the third level to practice. When developing the third level of special methods of strategic analysis, an approach is applied that differentiates them depending on the stages of strategic analysis (methods of the stage

  16. Hybrid probabilistic and possibilistic safety assessment. Methodology and application

    International Nuclear Information System (INIS)

    Kato, Kazuyuki; Amano, Osamu; Ueda, Hiroyoshi; Ikeda, Takao; Yoshida, Hideji; Takase, Hiroyasu

    2002-01-01

    This paper presents a unified methodology to handle variability and ignorance by using probabilistic and possibilistic techniques respectively. The methodology has been applied to the safety assessment of geological disposal of high-level radioactive waste. Uncertainties associated with scenarios, models and parameters were defined in terms of fuzzy membership functions derived through a series of interviews to the experts, while variability was formulated by means of probability density functions (pdfs) based on available data sets. The exercise demonstrated the applicability of the new methodology and, in particular, its advantage in quantifying uncertainties based on expert opinion and in providing information on the dependence of assessment results on the level of conservatism. In addition, it was shown that sensitivity analysis can identify key parameters contributing to uncertainties associated with results of the overall assessment. The information mentioned above can be utilized to support decision-making and to guide the process of disposal system development and optimization of protection against potential exposure. (author)
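
    A minimal sketch of the hybrid propagation idea described above is given below. It is not the authors' implementation: the model, probability distribution, and fuzzy membership function are hypothetical. Variability is sampled from a pdf by Monte Carlo, while an ignorance-type parameter is carried as a fuzzy number via alpha-cuts, so each alpha level yields an interval-valued result.

    ```python
    import random

    # Hypothetical model: result = k * x, where
    #   x ~ variability, sampled from a normal pdf (Monte Carlo), and
    #   k ~ ignorance, a triangular fuzzy number (low, mode, high) handled by alpha-cuts.

    K_LOW, K_MODE, K_HIGH = 0.5, 1.0, 2.0   # hypothetical triangular membership for k

    def alpha_cut(alpha):
        # Interval of k values whose membership is at least alpha.
        return (K_LOW + alpha * (K_MODE - K_LOW),
                K_HIGH - alpha * (K_HIGH - K_MODE))

    random.seed(1)
    samples = [random.gauss(10.0, 2.0) for _ in range(10_000)]   # variability in x
    mean_x = sum(samples) / len(samples)

    for alpha in (0.0, 0.5, 1.0):
        k_lo, k_hi = alpha_cut(alpha)
        # Propagate: at each alpha level the mean result is an interval.
        print(f"alpha={alpha:.1f}  mean result in [{k_lo * mean_x:.2f}, {k_hi * mean_x:.2f}]")
    ```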

  17. Broad-range PCR: past, present, or future of bacteriology?

    Science.gov (United States)

    Renvoisé, A; Brossier, F; Sougakoff, W; Jarlier, V; Aubry, A

    2013-08-01

    PCR targeting the gene encoding 16S ribosomal RNA (commonly named broad-range PCR or 16S PCR) has been used for 20 years as a polyvalent tool to study prokaryotes. Broad-range PCR was first used as a taxonomic tool, then in clinical microbiology. We will describe the use of broad-range PCR in clinical microbiology. The first application was identification of bacterial strains obtained by culture but whose phenotypic or proteomic identification remained difficult or impossible. This changed bacterial taxonomy and allowed discovering many new species. The second application of broad-range PCR in clinical microbiology is the detection of bacterial DNA from clinical samples; we will review the clinical settings in which the technique proved useful (such as endocarditis) and those in which it did not (such as characterization of bacteria in ascites, in cirrhotic patients). This technique allowed identifying the etiological agents for several diseases, such as Whipple disease. This review is a synthesis of data concerning the applications, assets, and drawbacks of broad-range PCR in clinical microbiology. Copyright © 2013 Elsevier Masson SAS. All rights reserved.

  18. Introduction to the “Research Tools” for Research Methodology course

    OpenAIRE

    Ebrahim, Nader Ale

    2016-01-01

    “Research Tools” can be defined as vehicles that broadly facilitate research and related activities. “Research Tools” enable researchers to collect, organize, analyze, visualize and publicize research outputs. Dr. Nader has collected over 800 tools that enable students to follow the correct path in research and to ultimately produce high-quality research outputs with more accuracy and efficiency. It is assembled as an interactive Web-based mind map, titled “Research Tools”, which is updated...

  19. Risk management methodology for RBMN project

    International Nuclear Information System (INIS)

    Borssatto, Maria F.B.; Tello, Cledola C.O.; Uemura, George

    2013-01-01

    The RBMN Project has been developed to design, construct and commission a national repository to dispose of the low- and intermediate-level radioactive wastes from the operation of nuclear power plants and other industries that use radioactive sources and materials. Risk is a characteristic of all projects. The risks arise from uncertainties due to assumptions associated with the project and the environment in which it is executed. Risk management is the method by which these uncertainties are systematically monitored to ensure that the objectives of the project will be achieved. Considering the peculiarities of the Project, that is, its comprehensive scope, multidisciplinary team, and apparently polemical character owing to stakeholders', and especially the community's, unfamiliarity with the subject, a specific methodology for risk management of this Project is being developed. This methodology will be critical for future generations who will be responsible for the final stages of the repository. It will provide greater assurance for the processes already implemented and will maintain a specific list of risks and solutions for this Project, ensuring the safety and security of the repository throughout its life cycle, which is planned to last at least three hundred years. This paper presents the tools and processes already defined, along with management actions aimed at developing a proactive risk culture, in order to minimize threats to this Project and promote actions that bring opportunities for its success. The methodology is based on solid research on the subject, considering methodologies already established and globally recognized as best practices for project management. (author)

  20. A Design Methodology for Medical Processes

    Science.gov (United States)

    Bonacina, Stefano; Pozzi, Giuseppe; Pinciroli, Francesco; Marceglia, Sara

    2016-01-01

    Summary Background Healthcare processes, especially those belonging to the clinical domain, are acknowledged as complex and characterized by the dynamic nature of the diagnosis, the variability of the decisions made by experts driven by their experiences, the local constraints, the patient’s needs, the uncertainty of the patient’s response, and the indeterminacy of patient’s compliance to treatment. Also, the multiple actors involved in patient’s care need clear and transparent communication to ensure care coordination. Objectives In this paper, we propose a methodology to model healthcare processes in order to break out complexity and provide transparency. Methods The model is grounded on a set of requirements that make the healthcare domain unique with respect to other knowledge domains. The modeling methodology is based on three main phases: the study of the environmental context, the conceptual modeling, and the logical modeling. Results The proposed methodology was validated by applying it to the case study of the rehabilitation process of stroke patients in the specific setting of a specialized rehabilitation center. The resulting model was used to define the specifications of a software artifact for the digital administration and collection of assessment tests that was also implemented. Conclusions Despite being only an example, our case study showed the ability of process modeling to answer the actual needs in healthcare practices. Independently from the medical domain in which the modeling effort is done, the proposed methodology is useful to create high-quality models, and to detect and take into account relevant and tricky situations that can occur during process execution. PMID:27081415

  1. A Design Methodology for Medical Processes.

    Science.gov (United States)

    Ferrante, Simona; Bonacina, Stefano; Pozzi, Giuseppe; Pinciroli, Francesco; Marceglia, Sara

    2016-01-01

    Healthcare processes, especially those belonging to the clinical domain, are acknowledged as complex and characterized by the dynamic nature of the diagnosis, the variability of the decisions made by experts driven by their experiences, the local constraints, the patient's needs, the uncertainty of the patient's response, and the indeterminacy of patient's compliance to treatment. Also, the multiple actors involved in patient's care need clear and transparent communication to ensure care coordination. In this paper, we propose a methodology to model healthcare processes in order to break out complexity and provide transparency. The model is grounded on a set of requirements that make the healthcare domain unique with respect to other knowledge domains. The modeling methodology is based on three main phases: the study of the environmental context, the conceptual modeling, and the logical modeling. The proposed methodology was validated by applying it to the case study of the rehabilitation process of stroke patients in the specific setting of a specialized rehabilitation center. The resulting model was used to define the specifications of a software artifact for the digital administration and collection of assessment tests that was also implemented. Despite being only an example, our case study showed the ability of process modeling to answer the actual needs in healthcare practices. Independently from the medical domain in which the modeling effort is done, the proposed methodology is useful to create high-quality models, and to detect and take into account relevant and tricky situations that can occur during process execution.

  2. Contextual assessment of organisational culture - methodological development in two case studies

    International Nuclear Information System (INIS)

    Reiman, T.; Oedewald, P.

    2002-01-01

    Despite the acknowledged significance of organisational culture in the nuclear field, previous cultural studies have concentrated on purely safety related matters, or been only descriptive in nature. New kinds of methods, taking into account the overall objectives of the organisation, were needed to assess culture and develop its working practices appropriately. VTT developed the Contextual Assessment of Organisational Culture (CAOC) methodology during the FINNUS programme. The methodology utilises two concepts, organisational culture and core task. The core task can be defined as the core demands and content of work that the organisation has to accomplish in order to be effective. The core task concept is used in assessing the central dimensions of the organisation's culture. Organisational culture is defined as a solution the company has generated in order to fulfil the perceived demands of its core task. The CAOC-methodology was applied in two case studies, in the Radiation and Nuclear Safety Authority of Finland and in the maintenance unit of Loviisa NPP. The aim of the studies was not only to assess the given culture, but also to give the personnel new concepts and new tools for reflecting on their organisation, their jobs and on appropriate working practices. The CAOC-methodology contributes to the design and redesign of work in complex sociotechnical systems. It strives to enhance organisations' capability to assess their current working practices and the meanings attached to them and compare these to the actual demands of their basic mission and so change unadaptive practices. (orig.)

  3. THE COMPETITIVENESS OF THE SOUTH AFRICAN AND AUSTRALIAN FLOWER INDUSTRIES: An application of three methodologies.

    OpenAIRE

    van Rooyen, I.M.; Kirsten, Johann F.; van Rooyen, C.J.; Collins, Ray

    2001-01-01

    Competitiveness is defined to include both comparative and competitive advantage. Three different methodologies are applied in the analysis of the flower industries of South Africa and Australia: "Determinants of competitive advantage" methodology of Michael Porter (1990) describes the factors influencing competitive advantage; "Revealed comparative advantage" states the relative importance of flower trade in each country; and the "Policy Analyses Matrix" calculates the comparative advantage ...
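
    The "revealed comparative advantage" measure named above is commonly computed as the Balassa index, RCA = (country share of flowers in its exports) / (world share of flowers in world exports). The sketch below uses entirely hypothetical export figures, not the study's data; values greater than 1 indicate a revealed comparative advantage in flowers.

    ```python
    # Balassa revealed comparative advantage (RCA) index with hypothetical export data.
    # RCA = (country's flower exports / country's total exports)
    #       / (world flower exports / world total exports)

    def rca(country_flower_exports, country_total_exports,
            world_flower_exports, world_total_exports):
        country_share = country_flower_exports / country_total_exports
        world_share = world_flower_exports / world_total_exports
        return country_share / world_share

    # Hypothetical figures (millions of USD), for illustration only.
    print("South Africa:", round(rca(60.0, 30_000.0, 8_000.0, 6_000_000.0), 2))
    print("Australia   :", round(rca(25.0, 65_000.0, 8_000.0, 6_000_000.0), 2))
    ```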

  4. Energy indicators for sustainable development: Guidelines and methodologies

    International Nuclear Information System (INIS)

    2005-04-01

    This publication is the product of an international initiative to define a set of Energy Indicators for Sustainable Development (EISD) and corresponding methodologies and guidelines. The successful completion of this work is the result of an intensive effort led by the International Atomic Energy Agency (IAEA) in cooperation with the United Nations Department of Economic and Social Affairs (UNDESA), the International Energy Agency (IEA), Eurostat and the European Environment Agency (EEA). The thematic framework, guidelines, methodology sheets and energy indicators set out in this publication reflect the expertise of these various agencies, recognized worldwide as leaders in energy and environmental statistics and analysis. While each agency has an active indicator programme, one goal of this joint endeavour has been to provide users with a consensus by leading experts on definitions, guidelines and methodologies for the development and worldwide use of a single set of energy indicators. No set of energy indicators can be final and definitive. To be useful, indicators must evolve over time to fit country-specific conditions, priorities and capabilities. The purpose of this publication is to present one set of EISD for consideration and use, particularly at the national level, and to serve as a starting point in the development of a more comprehensive and universally accepted set of energy indicators relevant to sustainable development. It is hoped that countries will use the EISD to assess their energy systems and to track their progress towards nationally defined sustainable development goals and objectives. It is also hoped that users of the information presented in this publication will contribute to refinements of energy indicators for sustainable development by adding their own unique perspectives to what is presented herein

  5. Energy indicators for sustainable development: Guidelines and methodologies

    International Nuclear Information System (INIS)

    2008-01-01

    This publication is the product of an international initiative to define a set of Energy Indicators for Sustainable Development (EISD) and corresponding methodologies and guidelines. The successful completion of this work is the result of an intensive effort led by the International Atomic Energy Agency (IAEA) in cooperation with the United Nations Department of Economic and Social Affairs (UNDESA), the International Energy Agency (IEA), Eurostat and the European Environment Agency (EEA). The thematic framework, guidelines, methodology sheets and energy indicators set out in this publication reflect the expertise of these various agencies, recognized worldwide as leaders in energy and environmental statistics and analysis. While each agency has an active indicator programme, one goal of this joint endeavour has been to provide users with a consensus by leading experts on definitions, guidelines and methodologies for the development and worldwide use of a single set of energy indicators. No set of energy indicators can be final and definitive. To be useful, indicators must evolve over time to fit country-specific conditions, priorities and capabilities. The purpose of this publication is to present one set of EISD for consideration and use, particularly at the national level, and to serve as a starting point in the development of a more comprehensive and universally accepted set of energy indicators relevant to sustainable development. It is hoped that countries will use the EISD to assess their energy systems and to track their progress towards nationally defined sustainable development goals and objectives. It is also hoped that users of the information presented in this publication will contribute to refinements of energy indicators for sustainable development by adding their own unique perspectives to what is presented herein

  6. A Methodological Approach to Assess the Impact of Smarting Action on Electricity Transmission and Distribution Networks Related to Europe 2020 Targets

    Directory of Open Access Journals (Sweden)

    Andrea Bonfiglio

    2017-01-01

    Full Text Available The achievement of the so-called 2020 targets requested by the European Union (EU) has driven a significant growth of proposals of solutions and of technical projects aiming at reducing CO2 emissions and increasing energy efficiency, as well as the penetration of Renewable Energy Sources (RES) in the electric network. As many of them ask for funding from the EU itself, there is a need to define a methodology to rank them and decide which projects should be sponsored to obtain the maximum effect on the EU 2020 targets. The present paper aims at (i) defining a set of Key Performance Indicators (KPIs) to compare different proposals, (ii) proposing an analytical methodology to evaluate the defined KPIs and (iii) evaluating the maximum impact that the considered action is capable of producing. The proposed methodology is applied to a set of possible interventions performed on a benchmark transmission network test case, in order to show that the defined indicators can be either calculated or measured and that they are useful to rank different “smarting actions”.
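
    A minimal sketch of the ranking step described above is shown below. The KPI names, weights, maximum impacts, and proposal scores are hypothetical and are not the indicators defined in the paper; each KPI is normalized against a maximum achievable impact and combined into a weighted score used to rank the "smarting actions".

    ```python
    # Hypothetical KPI-based ranking of proposed "smarting actions".
    # Each KPI is normalized by the maximum achievable impact, then combined
    # with weights into a single score per proposal.

    KPI_WEIGHTS = {"co2_reduction": 0.4, "res_penetration": 0.35, "energy_efficiency": 0.25}
    MAX_IMPACT = {"co2_reduction": 100.0, "res_penetration": 50.0, "energy_efficiency": 20.0}

    proposals = {
        "storage_at_substation": {"co2_reduction": 40.0, "res_penetration": 30.0, "energy_efficiency": 5.0},
        "dynamic_line_rating":   {"co2_reduction": 25.0, "res_penetration": 20.0, "energy_efficiency": 12.0},
    }

    def score(kpis):
        return sum(KPI_WEIGHTS[k] * kpis[k] / MAX_IMPACT[k] for k in KPI_WEIGHTS)

    ranked = sorted(((name, score(kpis)) for name, kpis in proposals.items()),
                    key=lambda item: item[1], reverse=True)
    for name, s in ranked:
        print(f"{name:22s} score = {s:.3f}")
    ```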

  7. Development of a comprehensive management site evaluation methodology

    International Nuclear Information System (INIS)

    Rodgers, J.C.; Onishi, Y.

    1981-01-01

    The Nuclear Regulatory Commission is in the process of preparing regulations that will define the necessary conditions for adequate disposal of low-level waste (LLW) by confinement in an LLW disposal facility. These proposed regulations form the context in which the motivation for the joint Los Alamos National Laboratory Battelle Pacific Northwest Laboratory program to develop a site-specific, LLW site evaluation methodology is discussed. The overall effort is divided into three development areas: land-use evaluation, environmental transport modelling, and long term scenario development including long-range climatology projections. At the present time four steps are envisioned in the application of the methodology to a site: site land use suitability assessment, land use-ecosystem interaction, contaminant transport simulation, and sensitivity analysis. Each of these steps is discussed in the paper. 12 refs

  8. Methodology for Validation of Housing Standards - the Kitchen as an Example

    DEFF Research Database (Denmark)

    Helle, Tina; Iwarsson, Susanne; Brandt, Åse

    We will present a novel methodology to validate housing standards. Person-environment fit among older persons with different levels of mobility device dependence was investigated during performance of everyday activities in realistic environments. Observational and self-report data were compared...... difficulties in rating accessibility problems. The actual use of mobility devices form part of the activity performance. Some standards are not defined properly to enable activity performance in realistic environments. After attending this activity, participants will be: • Aware of some of the methodological...... challenges concerning sampling and the measurement of accessibility problems among older people with different levels of dependence in mobility device use • Aware of the implications of using observation versus self-report data in research on person-environment fit • Consider the methodological implications...

  9. How to Define Nearly Net Zero Energy Buildings nZEB

    DEFF Research Database (Denmark)

    Kurnitski, Jarek; Allard, Francis; Braham, Derrick

    2011-01-01

    This REHVA Task Force proposes a technical definition for nearly zero energy buildings required in the implementation of the Energy performance of buildings directive recast. Energy calculation framework and system boundaries associated with the definition are provided to specify which energy flows...... in which way are taken into account in the energy performance assessment. The intention of the Task Force is to help the experts in the Member States in defining the nearly zero energy buildings in a uniform way. The directive requires nearly zero energy buildings, but since it does not give minimum or maximum harmonized requirements as well as details of energy performance calculation framework, it will be up to the Member State to define what these for them exactly constitute. In the definition, local conditions are to be obviously taken into account, but the uniform methodology can be used in all......

  10. Grounded theory methodology - has it become a movement?

    OpenAIRE

    Berterö, Carina

    2012-01-01

    There is an ongoing debate regarding the nature of grounded theory, and an examination of many studies claiming to follow grounded theory indicates a wide range of approaches. In 1967 Glaser and Strauss’s ‘‘The Discovery of Grounded Theory; Strategies for Qualitative Research’’ was published and represented a breakthrough in qualitative research; it offered methodological consensus and systematic strategies for qualitative research practice. The defining characteristics of grounded theory inc...

  11. A hierarchical modeling methodology for the definition and selection of requirements

    Science.gov (United States)

    Dufresne, Stephane

    This dissertation describes the development of a requirements analysis methodology that takes into account the concept of operations and the hierarchical decomposition of aerospace systems. At the core of the methodology, the Analytic Network Process (ANP) is used to ensure the traceability between the qualitative and quantitative information present in the hierarchical model. The proposed methodology is applied to the requirements definition of a hurricane tracker Unmanned Aerial Vehicle. Three research objectives are identified in this work: (1) improve the requirements mapping process by matching the stakeholder expectations with the concept of operations, systems and available resources; (2) reduce the epistemic uncertainty surrounding the requirements and requirements mapping; and (3) improve the requirements down-selection process by taking into account the level of importance of the criteria and the available resources. Several challenges are associated with the identification and definition of requirements. The complexity of the system implies that a large number of requirements are needed to define the systems. These requirements are defined early in the conceptual design, where the level of knowledge is relatively low and the level of uncertainty is large. The proposed methodology intends to increase the level of knowledge and reduce the level of uncertainty by guiding the design team through a structured process. To address these challenges, a new methodology is created to flow-down the requirements from the stakeholder expectations to the systems alternatives. A taxonomy of requirements is created to classify the information gathered during the problem definition. Subsequently, the operational and systems functions and measures of effectiveness are integrated into a hierarchical model to allow the traceability of the information. Monte Carlo methods are used to evaluate the variations of the hierarchical model elements and consequently reduce the
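
    A basic building block of the ANP (shared with the AHP) is deriving priority weights from a pairwise comparison matrix. The sketch below is not the dissertation's model; the 3x3 comparison matrix is hypothetical, and power iteration is used to approximate the principal eigenvector that gives the priorities.

    ```python
    # Minimal sketch: derive priority weights from a pairwise comparison matrix,
    # as used in AHP/ANP, via power iteration toward the principal eigenvector.
    # The 3x3 matrix below (criteria compared on a 1-9 scale) is hypothetical.

    A = [
        [1.0,   3.0,   5.0],
        [1/3.0, 1.0,   2.0],
        [1/5.0, 1/2.0, 1.0],
    ]

    def priority_vector(matrix, iterations=50):
        n = len(matrix)
        w = [1.0 / n] * n
        for _ in range(iterations):
            # Multiply by the matrix and renormalize so the weights sum to 1.
            w = [sum(matrix[i][j] * w[j] for j in range(n)) for i in range(n)]
            total = sum(w)
            w = [x / total for x in w]
        return w

    print([round(x, 3) for x in priority_vector(A)])
    ```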

  12. Augmented Fotonovelas: A Visual Methodology for Community Engaged Research

    OpenAIRE

    Hidalgo, LeighAnna Grace

    2014-01-01

    Augmented Fotonovelas draw upon the aesthetic of traditional fotonovelas, but incorporate new technologies--such as video interviews, interactive mapping, smart phone technology, and Augmented Reality (AR). Augmented Fotonovelas also make the most of the classic form, utilizing photographs, text, and bubble captions. Through this methodology, new and old come together to produce Augmented Scholarship. I define Augmented Scholarship as knowledge production bridging the gap between communities ...

  13. Methodology for proliferation resistance and physical protection of Generation IV nuclear energy systems

    International Nuclear Information System (INIS)

    Bari, R.; Peterson, P.; Nishimura, R.; Roglans-Ribas, J.

    2005-01-01

    Enhanced proliferation resistance and physical protection (PR and PP) is one of the technology goals for advanced nuclear concepts. Under the auspices of the Generation IV International Forum, an international experts group has been chartered to develop an evaluation methodology for PR and PP. This methodology will permit an objective PR and PP comparison between alternative nuclear systems and support design optimization to enhance robustness against proliferation, theft and sabotage. The assessment framework consists of identifying the threats to be considered, defining the PR and PP measures required to evaluate the resistance of a nuclear system to proliferation, theft or sabotage, and establishing quantitative methods to evaluate the proposed measures. The defined PR and PP measures are based on the design of the system (e.g., materials, processes, facilities), and institutional measures (e.g., safeguards, access control). The assessment methodology uses analysis of pathways with respect to specific threats to determine the PR and PP measures. Analysis requires definition of the threats (i.e. objective, capability, strategy), decomposition of the system into its relevant elements (e.g., reactor core, fuel recycle facility, fuel storage), and identification of targets. (author)

  14. An Integrated Safety Assessment Methodology for Generation IV Nuclear Systems

    International Nuclear Information System (INIS)

    Leahy, Timothy J.

    2010-01-01

    The Generation IV International Forum (GIF) Risk and Safety Working Group (RSWG) was created to develop an effective approach for the safety of Generation IV advanced nuclear energy systems. Early work of the RSWG focused on defining a safety philosophy founded on lessons learned from current and prior generations of nuclear technologies, and on identifying technology characteristics that may help achieve Generation IV safety goals. More recent RSWG work has focused on the definition of an integrated safety assessment methodology for evaluating the safety of Generation IV systems. The methodology, tentatively called ISAM, is an integrated 'toolkit' consisting of analytical techniques that are available and matched to appropriate stages of Generation IV system concept development. The integrated methodology is intended to yield safety-related insights that help actively drive the evolving design throughout the technology development cycle, potentially resulting in enhanced safety, reduced costs, and shortened development time.

  15. Disclosure, apology, and offer programs: stakeholders' views of barriers to and strategies for broad implementation.

    Science.gov (United States)

    Bell, Sigall K; Smulowitz, Peter B; Woodward, Alan C; Mello, Michelle M; Duva, Anjali Mitter; Boothman, Richard C; Sands, Kenneth

    2012-12-01

    The Disclosure, Apology, and Offer (DA&O) model, a response to patient injuries caused by medical care, is an innovative approach receiving national attention for its early success as an alternative to the existing inherently adversarial, inefficient, and inequitable medical liability system. Examples of DA&O programs, however, are few. Through key informant interviews, we investigated the potential for more widespread implementation of this model by provider organizations and liability insurers, defining barriers to implementation and strategies for overcoming them. Our study focused on Massachusetts, but we also explored themes that are broadly generalizable to other states. We found strong support for the DA&O model among key stakeholders, who cited its benefits for both the liability system and patient safety. The respondents did not perceive any insurmountable barriers to broad implementation, and they identified strategies that could be pursued relatively quickly. Such solutions would permit a range of organizations to implement the model without legislative hurdles. Although more data are needed about the outcomes of DA&O programs, the model holds considerable promise for transforming the current approach to medical liability and patient safety. © 2012 Milbank Memorial Fund.

  16. Methodology features and problems of definition of the term «green business»

    OpenAIRE

    Stepanenko, Bohdana

    2010-01-01

    The term «green business» is defined. The theoretical and methodological principles underlying the functioning of this type of activity are outlined. The basic aspects and main development stages of green business are identified. A classification of green business enterprises is presented.

  17. How to conduct a qualitative meta-analysis: Tailoring methods to enhance methodological integrity.

    Science.gov (United States)

    Levitt, Heidi M

    2018-05-01

    Although qualitative research has long been of interest in the field of psychology, meta-analyses of qualitative literatures (sometimes called meta-syntheses) are still quite rare. Like quantitative meta-analyses, these methods function to aggregate findings and identify patterns across primary studies, but their aims, procedures, and methodological considerations may vary. This paper explains the function of qualitative meta-analyses and their methodological development. Recommendations have broad relevance but are framed with an eye toward their use in psychotherapy research. Rather than arguing for the adoption of any single meta-method, this paper advocates for considering how procedures can best be selected and adapted to enhance a meta-study's methodological integrity. Through the paper, recommendations are provided to help researchers identify procedures that can best serve their studies' specific goals. Meta-analysts are encouraged to consider the methodological integrity of their studies in relation to central research processes, including identifying a set of primary research studies, transforming primary findings into initial units of data for a meta-analysis, developing categories or themes, and communicating findings. The paper provides guidance for researchers who desire to tailor meta-analytic methods to meet their particular goals while enhancing the rigor of their research.

  18. Story-Making as Methodology: Disrupting Dominant Stories through Multimedia Storytelling.

    Science.gov (United States)

    Rice, Carla; Mündel, Ingrid

    2018-05-01

    In this essay, we discuss multimedia story-making methodologies developed through Re•Vision: The Centre for Art and Social Justice that investigates the power of the arts, especially story, to positively influence decision makers in diverse sectors. Our story-making methodology brings together majority and minoritized creators to represent previously unattended experiences (e.g., around mind-body differences, queer sexuality, urban Indigenous identity, and Inuit cultural voice) with an aim to building understanding and shifting policies/practices that create barriers to social inclusion and justice. We analyze our ongoing efforts to rework our storytelling methodology, spotlighting acts of revising carried out by facilitators and researchers as they/we redefine methodological terms for each storytelling context, by researcher-storytellers as they/we rework material from our lives, and by receivers of the stories as we revise our assumptions about particular embodied histories and how they are defined within dominant cultural narratives and institutional structures. This methodology, we argue, contributes to the existing qualitative lexicon by providing innovative new approaches not only for chronicling marginalized/misrepresented experiences and critically researching selves, but also for scaffolding intersectional alliances and for imagining more just futures. © 2018 Canadian Sociological Association/La Société canadienne de sociologie.

  19. A performance assessment methodology for low-level radioactive waste disposal

    International Nuclear Information System (INIS)

    Deering, L.R.; Kozak, M.W.

    1990-01-01

    To demonstrate compliance with the performance objectives governing protection of the general population in 10 CFR 61.41, applicants for land disposal of low-level radioactive waste are required to conduct a pathways analysis, or quantitative evaluation of radionuclide release, transport through environmental media, and dose to man. The Nuclear Regulatory Commission staff defined a strategy and initiated a project at Sandia National Laboratories to develop a methodology for independently evaluating an applicant's analysis of postclosure performance. This performance assessment methodology was developed in five stages: (1) identification of environmental pathways, (2) ranking the significance of the pathways, (3) identification and integration of models for pathway analyses, (4) identification and selection of computer codes and techniques for the methodology, and (5) implementation of the codes and documentation of the methodology. The final methodology implements analytical and simple numerical solutions for source term, ground-water flow and transport, surface water transport, air transport, food chain, and dosimetry analyses, as well as more complex numerical solutions for multidimensional or transient analyses when more detailed assessments are needed. The capability to perform both simple and complex analyses is accomplished through modular modeling, which permits substitution of various models and codes to analyze system components
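
    As a toy illustration of the "simple analytical solutions" character of the pathway analyses listed above (the numbers and coefficients are hypothetical and not from the methodology), a drinking-water ingestion pathway can be reduced to dose = concentration x intake rate x dose coefficient:

    ```python
    # Toy drinking-water pathway dose estimate (all values hypothetical):
    # dose (Sv/yr) = concentration (Bq/L) * intake (L/yr) * dose coefficient (Sv/Bq)

    well_concentration_bq_per_l = 0.5      # radionuclide concentration in well water
    intake_l_per_year = 730.0              # roughly 2 L/day drinking-water intake
    dose_coefficient_sv_per_bq = 2.8e-8    # hypothetical ingestion dose coefficient

    annual_dose_sv = (well_concentration_bq_per_l
                      * intake_l_per_year
                      * dose_coefficient_sv_per_bq)
    print(f"annual dose: {annual_dose_sv * 1e6:.2f} microsievert")
    ```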

  20. Behavioral issues in operations management new trends in design, management, and methodologies

    CERN Document Server

    2013-01-01

    Behavioral Operations Management  has been identified in the last years as one of the most promising emerging fields in Operations Management. Behavioral Issues in Operations Management  explains and examines up-to-date research in this field, which works to analyze the impact of human behavior on the management of complex operating systems.   A collection of studies from leading scholars presents different methodologies and approaches, supported by real data and case studies. Issues such as building trust and strong cooperative relationships with suppliers, enhancing motivation and designing proper incentives for stimulating more effective decision maker behaviours are considered. The main decision-making processes affected by behavioral issues are also analyzed with a focus on new product development, logistics, and supply chain integration.   The broad coverage of methodologies and practical implications makes Behavioral Issues in Operations Management an ideal reference for both researchers developing...

  1. A simple nanoindentation-based methodology to assess the strength of brittle thin films

    International Nuclear Information System (INIS)

    Borrero-Lopez, Oscar; Hoffman, Mark; Bendavid, Avi; Martin, Phil J.

    2008-01-01

    In this work, we report a simple methodology to assess the mechanical strength of sub-micron brittle films. Nanoindentation of as-deposited tetrahedral amorphous carbon (ta-C) and Ti-Si-N nanocomposite films on silicon substrates followed by cross-sectional examination of the damage with a focused ion beam (FIB) miller allows the occurrence of cracking to be assessed in comparison with discontinuities (pop-ins) in the load-displacement curves. Strength is determined from the critical loads at which first cracking occurs using the theory of plates on a soft foundation. This methodology enables Weibull plots to be readily obtained, avoiding complex freestanding-film machining processes. This is of great relevance, since the mechanical strength of thin films ultimately controls their reliable use in a broad range of functional uses such as tribological coatings, magnetic drives, MEMS and biomedical applications
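
    The Weibull plots mentioned above can be produced from a set of film strengths by the usual median-rank linearization, ln ln(1/(1-P)) = m ln(sigma) - m ln(sigma_0). The sketch below is a generic illustration, not the authors' analysis: the strength values are hypothetical and the median-rank estimator is just one common choice.

    ```python
    import math

    # Hypothetical fracture strengths (GPa) obtained from critical indentation loads.
    strengths = sorted([4.1, 4.6, 5.0, 5.2, 5.5, 5.9, 6.1, 6.4, 6.8, 7.3])
    n = len(strengths)

    # Median-rank failure probabilities and linearized Weibull coordinates.
    xs, ys = [], []
    for i, s in enumerate(strengths, start=1):
        p = (i - 0.5) / n
        xs.append(math.log(s))
        ys.append(math.log(-math.log(1.0 - p)))

    # Ordinary least-squares fit of the linearized data.
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    intercept = mean_y - slope * mean_x

    m = slope                                # Weibull modulus
    sigma_0 = math.exp(-intercept / slope)   # characteristic strength
    print(f"Weibull modulus m = {m:.2f}, characteristic strength = {sigma_0:.2f} GPa")
    ```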

  2. Successful Technology Commercialization – Yes or No? Improving the Odds. The Quick Look Methodology and Process

    OpenAIRE

    Pletcher, Gary; Zehner II, William Bradley

    2017-01-01

    This article explores the relationships which transform new scientific knowledge into new commercial products, services, and ventures to create wealth. The major technology and marketing commercialization dilemmas are defined and addressed. The Quicklook methodology and related processes to quickly assess the commercial viability and potential of a scientific research project are explained. Using the Quicklook methodology and process early in the research and development process i...

  3. Methodological issues in studies of air pollution and reproductive health.

    Science.gov (United States)

    Woodruff, Tracey J; Parker, Jennifer D; Darrow, Lyndsey A; Slama, Rémy; Bell, Michelle L; Choi, Hyunok; Glinianaia, Svetlana; Hoggatt, Katherine J; Karr, Catherine J; Lobdell, Danelle T; Wilhelm, Michelle

    2009-04-01

    In the past decade there have been an increasing number of scientific studies describing possible effects of air pollution on perinatal health. These papers have mostly focused on commonly monitored air pollutants, primarily ozone (O(3)), particulate matter (PM), sulfur dioxide (SO(2)), carbon monoxide (CO), and nitrogen dioxide (NO(2)), and various indices of perinatal health, including fetal growth, pregnancy duration, and infant mortality. While most published studies have found some marker of air pollution related to some types of perinatal outcomes, variability exists in the nature of the pollutants and outcomes associated. Synthesis of the findings has been difficult for various reasons, including differences in study design and analysis. A workshop was held in September 2007 to discuss methodological differences in the published studies as a basis for understanding differences in study findings and to identify priorities for future research, including novel approaches for existing data. Four broad topic areas were considered: confounding and effect modification, spatial and temporal exposure variations, vulnerable windows of exposure, and multiple pollutants. Here we present a synopsis of the methodological issues and challenges in each area and make recommendations for future study. Two key recommendations include: (1) parallel analyses of existing data sets using a standardized methodological approach to disentangle true differences in associations from methodological differences among studies; and (2) identification of animal studies to inform important mechanistic research gaps. This work is of critical public health importance because of widespread exposure and because perinatal outcomes are important markers of future child and adult health.

  4. The Evaluation Methodology of Information Support

    Directory of Open Access Journals (Sweden)

    Lubos Necesal

    2016-01-01

    Full Text Available Knowledge, information and people are the motive force in today's organizations. Successful organizations need to find the right employees and provide them with the right, high-quality information. This is a complex problem. In a world where information plays a more and more important role, employees have to be skilled at information activities (searching, processing, saving, etc.) and at using the information system(s) (IS) they work with. Organizations have to cover both of these areas. Therefore, we need an effective instrument which can be used to evaluate new employees at recruitment or to regularly evaluate current employees, to evaluate whether the information system is an appropriate tool for fulfilling the employee's tasks within the organization, and to evaluate how the organization covers the foregoing areas. Such an instrument is the “Evaluation methodology of information support in organization”. This paper defines the term “information support” and its role in the organization. The body of the paper proposes the “Evaluation methodology of information support in organization”. The conclusion discusses the contributions of information support evaluation

  5. Gate-defined Quantum Confinement in Suspended Bilayer Graphene

    Science.gov (United States)

    Allen, Monica

    2013-03-01

    Quantum confined devices in carbon-based materials offer unique possibilities for applications ranging from quantum computation to sensing. In particular, nanostructured carbon is a promising candidate for spin-based quantum computation due to the ability to suppress hyperfine coupling to nuclear spins, a dominant source of spin decoherence. Yet graphene lacks an intrinsic bandgap, which poses a serious challenge for the creation of such devices. We present a novel approach to quantum confinement utilizing tunnel barriers defined by local electric fields that break sublattice symmetry in suspended bilayer graphene. This technique electrostatically confines charges via band structure control, thereby eliminating the edge and substrate disorder that hinders on-chip etched nanostructures to date. We report clean single electron tunneling through gate-defined quantum dots in two regimes: at zero magnetic field using the energy gap induced by a perpendicular electric field and at finite magnetic fields using Landau level confinement. The observed Coulomb blockade periodicity agrees with electrostatic simulations based on local top-gate geometry, a direct demonstration of local control over the band structure of graphene. This technology integrates quantum confinement with pristine device quality and access to vibrational modes, enabling wide applications from electromechanical sensors to quantum bits. More broadly, the ability to externally tailor the graphene bandgap over nanometer scales opens a new unexplored avenue for creating quantum devices.

  6. Comparative Studies: historical, epistemological and methodological notes

    Directory of Open Access Journals (Sweden)

    Juan Ignacio Piovani

    2017-09-01

    Full Text Available In this article some historical, epistemological and methodological issues related to comparative studies in the social sciences are addressed, with specific reference to the field of education. The starting point is a discussion of the meaning of comparison, its logical structure and its presence in science and in everyday life. This is followed by the presentation and critical appraisal of the perspectives regarding comparison as a scientific method. It is argued that, even rejecting this restrictive meaning of comparison as a method, there is some consensus on the specificity of comparative studies within the social sciences. In relation to them, the article addresses in more detail those studies that can be defined as trans-contextual (cross-national and cross-cultural), with emphasis on the main methodological and technical challenges they face. The socio-historical comparative perspective, which has gained importance in recent years in the field of education, is also discussed.

  7. Gamma ray auto absorption correction evaluation methodology

    International Nuclear Information System (INIS)

    Gugiu, Daniela; Roth, Csaba; Ghinescu, Alecse

    2010-01-01

    Neutron activation analysis (NAA) is a well established nuclear technique, suited to investigating the microstructural or elemental composition, and can be applied to studies of a large variety of samples. Work with large samples involves, besides the development of large irradiation devices with well-known neutron field characteristics, knowledge of perturbing phenomena and adequate evaluation of correction factors such as neutron self-shielding, extended source correction and gamma ray auto absorption. The objective of the work presented in this paper is to validate an appropriate methodology for gamma ray auto absorption correction evaluation for large inhomogeneous samples. For this purpose a benchmark experiment has been defined - a simple gamma ray transmission experiment, easy to reproduce. The gamma ray attenuation in pottery samples has been measured and computed using the MCNP5 code. The results show a good agreement between the computed and measured values, proving that the proposed methodology is able to evaluate the correction factors. (authors)

  8. Methodological proposal for the definition of improvement strategies in logistics of SME

    Directory of Open Access Journals (Sweden)

    Yeimy Liseth Becerra

    2014-12-01

    Full Text Available A methodological proposal for defining improvement strategies in the logistics of SMEs is presented as a means to fulfill a specific objective of the project Methodological design on storage logistics, acquisition, and ownership of information and communication systems for Colombian SMEs, bakery subsector, currently run by the research group SEPRO of the Universidad Nacional de Colombia and supported by Colciencias. The work corresponds to the completion of the last stage of the base project and aims to implement the corresponding objective raised in the research project that the SEPRO group has been developing. To do this, a review was made of the methodology used during the execution of the base project, as well as of the state of the art of the techniques used in similar research for the evaluation and definition of improvement strategies in SME logistics. The reviewed techniques were compared and a proposed methodology was configured, consisting of the techniques that offered the greatest advantages for the development of the research.

  9. Defining effective community support for long-term psychiatric patients according to behavioural principles.

    Science.gov (United States)

    Evans, I M; Moltzen, N L

    2000-08-01

    The purpose of this article is to define the characteristics of effective support in community mental health settings for patients with serious and persistent mental illness. A broad literature providing empirical evidence on competent caregiver behaviours and styles is selectively reviewed. Relevant findings from family caregiver research and studies of social environments that enhance skill development in people with intellectual disabilities are incorporated, within a cognitive-behavioural framework. Six important domains are identified which represent positive caregiver styles: acceptance, creating a positive atmosphere, expectations of change, responsiveness, normalisation and educativeness. The characteristics hypothesised to be critical for caregivers and support workers are defined in a general way that can allow for individualisation according to the goals of the programs and the cultural priorities of staff and patients. Further empirical validation of these characteristics would enable community mental health services to provide more specialised clinical treatments.

  10. Methodological procedures and analytical instruments to evaluate an indicators integrated archive for urban management

    International Nuclear Information System (INIS)

    Del Ciello, R.; Napoleoni, S.

    1998-01-01

    This guide provides the results of research carried out at the ENEA (National Agency for New Technology, Energy and the Environment) Casaccia center (Rome, Italy) aimed at defining the methodological procedures and analytical instruments needed to build an integrated archive of indicators for urban management. The guide also defines the scheme of a negotiation process aimed at gathering and exchanging data and information among governmental and local administrations, non-governmental organizations and scientific bodies. [it]

  11. A PART OF RESEARCH METHODOLOGY COURSE: Introduction to the Research Tools

    OpenAIRE

    Ebrahim, Nader Ale

    2016-01-01

    “Research Tools” can be defined as vehicles that broadly facilitate research and related activities. “Research Tools” enable researchers to collect, organize, analyze, visualize and publicize research outputs. Dr. Nader has collected over 800 tools that enable students to follow the correct path in research and to ultimately produce high-quality research outputs with more accuracy and efficiency. It is assembled as an interactive Web-based mind map, titled “Research Tools”, which is updated...

  12. Defining recovery in chronic fatigue syndrome: a critical review.

    Science.gov (United States)

    Adamowicz, Jenna L; Caikauskaite, Indre; Friedberg, Fred

    2014-11-01

    In chronic fatigue syndrome (CFS), the lack of consensus on how recovery should be defined or interpreted has generated controversy and confusion. The purpose of this paper was to systematically review, compare, and evaluate the definitions of recovery reported in the CFS literature and to make recommendations about the scope of recovery assessments. A search was done using the MEDLINE, PubMed, PsycINFO, CINAHL, and Cochrane databases for peer-reviewed papers that contained the search terms "chronic fatigue syndrome" and "recovery," "reversal," "remission," and/or "treatment response." From the 22 extracted studies, recovery was operationally defined by reference to one or more of these domains: (1) pre-morbid functioning; (2) both fatigue and function; (3) fatigue (or related symptoms) alone; (4) function alone; and/or (5) brief global assessment. Almost all of the studies measuring recovery in CFS did so differently. The brief global assessment was the most common outcome measure used to define recovery. Estimates of recovery ranged from 0 to 66 % in intervention studies and 2.6 to 62 % in naturalistic studies. Given that the term "recovery" was often based on limited assessments and less than full restoration of health, other more precise and accurate labels (e.g., clinically significant improvement) may be more appropriate and informative. In keeping with common understandings of the term recovery, we recommend a consistent definition that captures a broad-based return to health with assessments of both fatigue and function as well as the patient's perceptions of his/her recovery status.

  13. Interdisciplinary Evaluation of Broadly-Reactive HLA Class II Restricted Epitopes Eliciting HIV-Specific CD4+T Cell Responses

    DEFF Research Database (Denmark)

    Buggert, M.; Norström, M.; Lundegaard, Claus

    2011-01-01

    ..., the functional and immunodominant discrepancies of CD4+ T cell responses targeting promiscuous MHC II restricted HIV epitopes remain poorly defined. Thus, utilization of interdisciplinary approaches might aid in revealing broadly-reactive peptides eliciting CD4+ T cell responses. Methods: We utilized the novel...... bioinformatic prediction program NetMHCIIpan to select 64 optimized MHC II restricted epitopes located in the HIV Gag, Pol, Env, Nef and Tat regions. The epitopes were selected to cover the global diversity of the virus (multiple subtypes) and the human immune system (diverse MHC II types). Optimized...

  14. Application of broad-spectrum resequencing microarray for genotyping rhabdoviruses.

    Science.gov (United States)

    Dacheux, Laurent; Berthet, Nicolas; Dissard, Gabriel; Holmes, Edward C; Delmas, Olivier; Larrous, Florence; Guigon, Ghislaine; Dickinson, Philip; Faye, Ousmane; Sall, Amadou A; Old, Iain G; Kong, Katherine; Kennedy, Giulia C; Manuguerra, Jean-Claude; Cole, Stewart T; Caro, Valérie; Gessain, Antoine; Bourhy, Hervé

    2010-09-01

    The rapid and accurate identification of pathogens is critical in the control of infectious disease. To this end, we analyzed the capacity for viral detection and identification of a newly described high-density resequencing microarray (RMA), termed PathogenID, which was designed for multiple pathogen detection using database similarity searching. We focused on one of the largest and most diverse viral families described to date, the family Rhabdoviridae. We demonstrate that this approach has the potential to identify both known and related viruses for which precise sequence information is unavailable. In particular, we demonstrate that a strategy based on consensus sequence determination for analysis of RMA output data enabled successful detection of viruses exhibiting up to 26% nucleotide divergence with the closest sequence tiled on the array. Using clinical specimens obtained from rabid patients and animals, this method also shows a high species level concordance with standard reference assays, indicating that it is amenable for the development of diagnostic assays. Finally, 12 animal rhabdoviruses which were currently unclassified, unassigned, or assigned as tentative species within the family Rhabdoviridae were successfully detected. These new data allowed an unprecedented phylogenetic analysis of 106 rhabdoviruses and further suggest that the principles and methodology developed here may be used for the broad-spectrum surveillance and the broader-scale investigation of biodiversity in the viral world.

  15. Application of Broad-Spectrum Resequencing Microarray for Genotyping Rhabdoviruses

    Science.gov (United States)

    Dacheux, Laurent; Berthet, Nicolas; Dissard, Gabriel; Holmes, Edward C.; Delmas, Olivier; Larrous, Florence; Guigon, Ghislaine; Dickinson, Philip; Faye, Ousmane; Sall, Amadou A.; Old, Iain G.; Kong, Katherine; Kennedy, Giulia C.; Manuguerra, Jean-Claude; Cole, Stewart T.; Caro, Valérie; Gessain, Antoine; Bourhy, Hervé

    2010-01-01

    The rapid and accurate identification of pathogens is critical in the control of infectious disease. To this end, we analyzed the capacity for viral detection and identification of a newly described high-density resequencing microarray (RMA), termed PathogenID, which was designed for multiple pathogen detection using database similarity searching. We focused on one of the largest and most diverse viral families described to date, the family Rhabdoviridae. We demonstrate that this approach has the potential to identify both known and related viruses for which precise sequence information is unavailable. In particular, we demonstrate that a strategy based on consensus sequence determination for analysis of RMA output data enabled successful detection of viruses exhibiting up to 26% nucleotide divergence with the closest sequence tiled on the array. Using clinical specimens obtained from rabid patients and animals, this method also shows a high species level concordance with standard reference assays, indicating that it is amenable for the development of diagnostic assays. Finally, 12 animal rhabdoviruses which were currently unclassified, unassigned, or assigned as tentative species within the family Rhabdoviridae were successfully detected. These new data allowed an unprecedented phylogenetic analysis of 106 rhabdoviruses and further suggest that the principles and methodology developed here may be used for the broad-spectrum surveillance and the broader-scale investigation of biodiversity in the viral world. PMID:20610710

  16. A Practical Methodology for Disaggregating the Drivers of Drug Costs Using Administrative Data.

    Science.gov (United States)

    Lungu, Elena R; Manti, Orlando J; Levine, Mitchell A H; Clark, Douglas A; Potashnik, Tanya M; McKinley, Carol I

    2017-09-01

    Prescription drug expenditures represent a significant component of health care costs in Canada, with estimates of $28.8 billion spent in 2014. Identifying the major cost drivers and the effect they have on prescription drug expenditures allows policy makers and researchers to interpret current cost pressures and anticipate future expenditure levels. To identify the major drivers of prescription drug costs and to develop a methodology to disaggregate the impact of each of the individual drivers. The methodology proposed in this study uses the Laspeyres approach for cost decomposition. This approach isolates the effect of the change in a specific factor (e.g., price) by holding the other factor(s) (e.g., quantity) constant at the base-period value. The Laspeyres approach is expanded to a multi-factorial framework to isolate and quantify several factors that drive prescription drug cost. Three broad categories of effects are considered: volume, price and drug-mix effects. For each category, important sub-effects are quantified. This study presents a new and comprehensive methodology for decomposing the change in prescription drug costs over time including step-by-step demonstrations of how the formulas were derived. This methodology has practical applications for health policy decision makers and can aid researchers in conducting cost driver analyses. The methodology can be adjusted depending on the purpose and analytical depth of the research and data availability. © 2017 Journal of Population Therapeutics and Clinical Pharmacology. All rights reserved.
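
    To make the Laspeyres idea above concrete, the sketch below (Python) decomposes a hypothetical two-drug change in expenditure into price, volume and residual (drug-mix/interaction) effects by holding the other factors at their base-period values. The drug names, prices and quantities are invented for illustration; this is not the study's full multi-factorial framework.

        # Hypothetical two-period, two-drug Laspeyres-style decomposition.
        base = {"drug_a": {"price": 10.0, "qty": 100}, "drug_b": {"price": 50.0, "qty": 20}}
        curr = {"drug_a": {"price": 12.0, "qty": 110}, "drug_b": {"price": 48.0, "qty": 30}}

        def total_cost(period):
            return sum(d["price"] * d["qty"] for d in period.values())

        # Price effect: price change valued at base-period quantities.
        price_effect = sum((curr[k]["price"] - base[k]["price"]) * base[k]["qty"] for k in base)
        # Volume effect: quantity change valued at base-period prices.
        volume_effect = sum((curr[k]["qty"] - base[k]["qty"]) * base[k]["price"] for k in base)
        # Residual: the remaining (interaction / drug-mix) part of the total change.
        residual = (total_cost(curr) - total_cost(base)) - price_effect - volume_effect
        print(price_effect, volume_effect, residual)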

  17. Sewage treatment processes: The methodology for the resort communities; Tecnologias de Depuracion: la metodologia de seleccion para poblaciones turisticas

    Energy Technology Data Exchange (ETDEWEB)

    Nieves de la Vega, G.; Kovacs, Z. [AQUA/PLAN, S.A. (Spain)

    1995-06-01

    The selection of adequate sewage treatment processes for resort communities has to be based upon a detailed knowledge of the characteristics of sewerage discharges. In order to define a methodology, the most representative variables such as climatology, seasonal variation, required treatment efficiency, sewage characteristics and availability of land, are identified. A wide range of available treatment processes is defined and the relationship between variables and priority criteria is analysed. Finally, a decision-diagram allowing the selection of the most adequate treatment process in each particular case is presented. The methodology is applied to mountain resort communities. (Author)

  18. СONTENTS OF THE METHODOLOGICAL AND TECHNOLOGICAL SUPPORT OF THE EDUCATION QUALITY MANAGEMENT INFORMATION SYSTEM FOR FUTURE ECONOMISTS

    Directory of Open Access Journals (Sweden)

    Kostiantyn S. Khoruzhyi

    2014-12-01

    Full Text Available In the article, the content and nature of organizational activities within the scope of methodological and technological support of the education quality management information system (EQMIS) for future economists are described. The content of the organizational activities for the implementation of methodological and technological support of EQMIS for future economists includes four stages (preparatory, instructional/adaptational, methodological/basic, and experimental/evaluational) and contains a set of methodological and technological measures for each of the stages of the EQMIS implementation. A study of the pedagogical impact of the proposed methodology of using EQMIS in the formation of professional competence of economics students was also conducted. The main stages, methods and sequence of implementation arrangements for the methodological and technological support of EQMIS are defined.

  19. The evaluation framework for business process management methodologies

    Directory of Open Access Journals (Sweden)

    Sebastian Lahajnar

    2016-06-01

    Full Text Available Amid intense competition in the global market, organisations seek to take advantage of all their internal and external potentials, advantages, and resources. It has been found that, in addition to competitive products and services, a good business also requires effective management of business processes, which is the discipline of business process management (BPM). The introduction of BPM in an organisation requires a thoughtful selection of an appropriate methodological approach, since the latter will formalize the activities, products, applications and other efforts of the organisation in this field. Despite the many technology-driven solutions of software companies, recommendations of consulting companies, techniques, good practices and tools, the decision on which methodology to choose is anything but simple. The aim of this article is to simplify the adoption of such decisions by building a framework for the evaluation of BPM methodologies based on a qualitative multi-attribute decision-making method. The framework defines a hierarchical decision-making model, formalizes the decision-making process and thus contributes significantly to an independent, credible final decision that is the most appropriate for a specific organisation.
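
    As a minimal illustration of a hierarchical multi-attribute evaluation of BPM methodologies (the criteria, weights and ratings below are invented and do not reproduce the article's actual decision model), candidate methodologies can be scored by weighting ratings on each criterion:

        # Toy multi-attribute scoring of candidate BPM methodologies.
        weights = {"lifecycle_coverage": 0.4, "tool_support": 0.3, "organisational_fit": 0.3}

        def score(ratings):
            # ratings: criterion -> value on a 1-5 scale
            return sum(weights[c] * ratings[c] for c in weights)

        candidates = {
            "methodology_A": {"lifecycle_coverage": 4, "tool_support": 3, "organisational_fit": 5},
            "methodology_B": {"lifecycle_coverage": 5, "tool_support": 4, "organisational_fit": 2},
        }
        best = max(candidates, key=lambda name: score(candidates[name]))
        print(best, {n: round(score(r), 2) for n, r in candidates.items()})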

  20. Methodology for Monitoring Sustainable Development of Isolated Microgrids in Rural Communities

    Directory of Open Access Journals (Sweden)

    Claudia Rahmann

    2016-11-01

    Full Text Available Microgrids are a rapidly evolving and increasingly common form of local power generation used to serve the needs of both rural and urban communities. In this paper, we present a methodology to evaluate the evolution of the sustainability of stand-alone microgrid projects. The proposed methodology considers a composite sustainability index (CSI) that includes both positive and negative impacts of the operation of the microgrid in a given community. The CSI is constructed along the environmental, social, economic and technical dimensions of the microgrid. The sub-indexes of each dimension are aggregated into the CSI via a set of adaptive weighting factors, which indicate the relative importance of the corresponding dimension in the sustainability goals. The proposed methodology aims to be a support instrument for policy makers, especially when defining sound corrective measures to guarantee the sustainability of small, isolated microgrid projects. To validate the performance of the proposed methodology, a microgrid installed in the northern part of Chile (Huatacondo) has been used as a benchmarking project.
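
    A minimal sketch of the aggregation step described above, assuming a simple weighted sum of dimension sub-indexes; the weights and sub-index values are illustrative only (in the proposed methodology the weighting factors are adaptive rather than fixed):

        # Toy composite sustainability index (CSI) from dimension sub-indexes.
        weights = {"environmental": 0.25, "social": 0.25, "economic": 0.25, "technical": 0.25}
        sub_index = {"environmental": 0.8, "social": 0.6, "economic": 0.7, "technical": 0.9}

        csi = sum(weights[d] * sub_index[d] for d in weights)
        print(round(csi, 2))  # 0.75 with these made-up values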

  1. Broad-band hard X-ray reflectors

    DEFF Research Database (Denmark)

    Joensen, K.D.; Gorenstein, P.; Hoghoj, P.

    1997-01-01

    Interest in optics for hard X-ray broad-band application is growing. In this paper, we compare the hard X-ray (20-100 keV) reflectivity obtained with an energy-dispersive reflectometer, of a standard commercial gold thin-film with that of a 600 bilayer W/Si X-ray supermirror. The reflectivity...... of the multilayer is found to agree extraordinarily well with theory (assuming an interface roughness of 4.5 Angstrom), while the agreement for the gold film is less. The overall performance of the supermirror is superior to that of gold, extending the band of reflection at least a factor of 2.8 beyond...... that of the gold. Various other design options are discussed, and we conclude that continued interest in the X-ray supermirror for broad-band hard X-ray applications is warranted....

  2. Feminist approaches to social science: epistemological and methodological tenets.

    Science.gov (United States)

    Campbell, R; Wasco, S M

    2000-12-01

    This paper is a primer for community psychologists on feminist research. Much like the field of community psychology, feminist scholarship is defined by its values and process. Informed by the political ideologies of the 1970s women's movement (liberal, radical, socialist feminism, and womanism), feminist scholars reinterpreted classic concepts in philosophy of science to create feminist epistemologies and methodologies. Feminist epistemologies, such as feminist empiricism, standpoint theory, and postmodernism, recognize women's lived experiences as legitimate sources of knowledge. Feminist methodologies attempt to eradicate sexist bias in research and find ways to capture women's voices that are consistent with feminist ideals. Practically, the process of feminist research is characterized by four primary features: (1) expanding methodologies to include both quantitative and qualitative methods, (2) connecting women for group-level data collection, (3) reducing the hierarchical relationship between researchers and their participants to facilitate trust and disclosure, and (4) recognizing and reflecting upon the emotionality of women's lives. Recommendations for how community psychologists can integrate feminist scholarship into their practice are discussed.

  3. Quantile arithmetic methodology for uncertainty propagation in fault trees

    International Nuclear Information System (INIS)

    Abdelhai, M.; Ragheb, M.

    1986-01-01

    A methodology based on quantile arithmetic, the probabilistic analog of interval analysis, is proposed for the computation of uncertainty propagation in fault tree analysis. The basic events' continuous probability density functions (pdf's) are represented by equivalent discrete distributions by dividing them into a number of quantiles N. Quantile arithmetic is then used to perform the binary arithmetical operations corresponding to the logical gates in the Boolean expression of the top event of a given fault tree. The computational advantage of the present methodology as compared with the widely used Monte Carlo method was demonstrated for the case of summation of M normal variables through the efficiency ratio, defined as the product of the labor and error ratios. The efficiency ratio values obtained by the suggested methodology for M = 2 were 2279 for N = 5, 445 for N = 25, and 66 for N = 45 when compared with the results for 19,200 Monte Carlo samples at the 40th percentile point. Another advantage of the approach is that the exact analytical value of the median is always obtained for the top event.
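
    The sketch below illustrates one plausible reading of the quantile-arithmetic idea (an assumed interpretation, not the authors' exact algorithm): each basic-event probability is represented by N equiprobable quantile values, the gate operation is applied to all pairwise combinations, and the result is collapsed back to N quantiles. Independence of the basic events is assumed.

        import numpy as np

        N = 25

        def quantiles(samples, n=N):
            # n equiprobable quantile points of a sample
            return np.quantile(samples, (np.arange(n) + 0.5) / n)

        def combine(qa, qb, op, n=N):
            pairs = op(qa[:, None], qb[None, :]).ravel()  # all n*n combinations
            return quantiles(pairs, n)                    # re-discretize to n quantiles

        rng = np.random.default_rng(0)
        a = quantiles(rng.lognormal(np.log(1e-3), 0.5, 100_000))  # basic event A probability
        b = quantiles(rng.lognormal(np.log(2e-3), 0.5, 100_000))  # basic event B probability

        and_gate = combine(a, b, lambda x, y: x * y)                 # P(A and B)
        or_gate = combine(a, b, lambda x, y: 1 - (1 - x) * (1 - y))  # P(A or B)
        print(np.median(and_gate), np.median(or_gate))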

  4. Test Automation Process Improvement A case study of BroadSoft

    OpenAIRE

    Gummadi, Jalendar

    2016-01-01

    This master's thesis research is about improvement of the test automation process at BroadSoft Finland as a case study. A test automation project recently started at BroadSoft, but the project is not properly integrated into the existing process. The project is about converting manual test cases into automated test cases. The aim of this thesis is to study the existing BroadSoft test process and different test automation frameworks. In this thesis different test automation processes are studied ...

  5. Review Essay: Grenzgänger Seeks Reflexive Methodology

    Directory of Open Access Journals (Sweden)

    Wolff-Michael Roth

    2002-09-01

    Full Text Available Reflexive Methodology reviews major strands of current thought in epistemology, philosophy, social science, and interpretive methods. The book falls short in that it neither does a thorough job reviewing the literature nor does it provide method-related advice useful to students. Grenzgängerin constitutes a collection of essays on a broad range of topics, but which are only loosely connected if at all. Drawing on DERRIDA and the notion of a historical science of the historical subject, I attempt to practice method, something I missed in both texts. I make explicit the historical nature of my own writing and the historical nature of my subject. I make explicit intertextuality and in the process practice reflexivity in the particular way I am writing. URN: urn:nbn:de:0114-fqs020328

  6. Antiviral Therapy by HIV-1 Broadly Neutralizing and Inhibitory Antibodies

    Directory of Open Access Journals (Sweden)

    Zhiqing Zhang

    2016-11-01

    Full Text Available Human immunodeficiency virus type 1 (HIV-1) infection causes acquired immune deficiency syndrome (AIDS), a global epidemic for more than three decades. HIV-1 replication is primarily controlled through antiretroviral therapy (ART), but this treatment does not cure HIV-1 infection. Furthermore, there is increasing viral resistance to ART, and side effects associated with long-term therapy. Consequently, there is a need for alternative candidates for HIV-1 prevention and therapy. Recent advances have discovered multiple broadly neutralizing antibodies against HIV-1. In this review, we describe the key epitopes on the HIV-1 Env protein and the corresponding broadly neutralizing antibodies, and discuss the ongoing clinical trials of broadly neutralizing and inhibitory antibody therapy as well as antibody combinations, bispecific antibodies, and methods that improve therapeutic efficacy by combining broadly neutralizing antibodies (bNAbs) with latency reversing agents. Compared with ART, HIV-1 therapeutics that incorporate these broadly neutralizing and inhibitory antibodies offer the advantage of decreasing virus load and clearing infected cells, which is a promising prospect in HIV-1 prevention and treatment.

  7. Data development technical support document for the aircraft crash risk analysis methodology (ACRAM) standard

    International Nuclear Information System (INIS)

    Kimura, C.Y.; Glaser, R.E.; Mensing, R.W.; Lin, T.; Haley, T.A.; Barto, A.B.; Stutzke, M.A.

    1996-01-01

    The Aircraft Crash Risk Analysis Methodology (ACRAM) Panel has been formed by the US Department of Energy Office of Defense Programs (DOE/DP) for the purpose of developing a standard methodology for determining the risk from aircraft crashes onto DOE ground facilities. In order to accomplish this goal, the ACRAM Panel has been divided into four teams: the data development team, the model evaluation team, the structural analysis team, and the consequence team. Each team, consisting of at least one member of the ACRAM Panel plus additional DOE and DOE contractor personnel, specializes in the development of the methodology assigned to that team. This report documents the work performed by the data development team and provides the technical basis for the data used by the ACRAM Standard for determining the aircraft crash frequency. This report should be used to provide the generic data needed to calculate the aircraft crash frequency onto the facility under consideration as part of the process for determining the aircraft crash risk to ground facilities as given by the DOE Standard Aircraft Crash Risk Assessment Methodology (ACRAM). Some broad guidance is presented on how to obtain the needed site-specific and facility-specific data, but these data are not provided by this document

  8. GENERIC APPROACH IN CHOICE OF ADEQUATE METHODOLOGY FOR THE ASSESSMENT OF IT INVESTMENTS

    Directory of Open Access Journals (Sweden)

    Melita Kozina

    2012-07-01

    Full Text Available Investments into information technology (IT) (hereinafter: IT investments) have reached very high figures, which are still continually on the rise. IT potentials are being used in an increasing number of ways. Various company managers have different approaches to this issue. A large number of methods/models for the assessment of IT investments is available, so the question is posed of how to choose the adequate assessment category. The said reasons have initiated a need for defining a generic approach to the choice of adequate methodology for the assessment of IT investments, which was indeed the goal of this paper. General ideas of this approach stem from the fact that each IT investment has its purpose and belongs to a certain type of IT investment (decision-making aspect) which demands its relevant methodology for assessing IT investments. Two groups of demands (conditions) have been defined for choosing a relevant methodology. The first group pertains to methodology analysis and determination of its compatibility with characteristics of the defined decision-making aspect. The second group of conditions pertains to methodology analysis with respect to its possibilities (abilities) of integrating quantity, quality and risk factors of an IT decision. Conducted field research shows that the assessment of IT investments has been done mainly using simpler methods/models and their combinations, and is focused on quantity aspects of IT values.

  9. Development of enterprise architecture management methodology for teaching purposes

    Directory of Open Access Journals (Sweden)

    Dmitry V. Kudryavtsev

    2017-01-01

    the main methods and technologies of the enterprise architecture in practice. The requirements for the simplified methodology were defined. These requirements were taken into account during methodology development and can be used when applying the methodology – it will be possible to understand the goals that can be achieved with its help, the assumptions made and the existing limitations. The proposed methodology defines the layers, aspects and objects of the enterprise architecture, describes the tasks of managing enterprise architecture and the artifacts created (lists, matrices, diagrams). The proposed methodology can be used to conduct training projects in which methods and tools of enterprise architecture management are used to optimize the company’s work based on the capabilities of information technology. Such projects will help not only to develop students’ skills, but also to establish a dialogue between universities and industry – the provider of problems and tasks to be solved within students’ projects. The proposed simplified methodology for managing the architecture of the enterprise is tested and used by the authors when conducting courses in leading universities of the Russian Federation: the Graduate School of Management (St. Petersburg State University), St. Petersburg State University, the Finance University under the Government of the Russian Federation, the Higher School of Economics and the Bonch-Bruevich Saint-Petersburg State University of Telecommunications.

  10. Safeguards Automated Facility Evaluation (SAFE) methodology

    International Nuclear Information System (INIS)

    Chapman, L.D.; Grady, L.M.; Bennett, H.A.; Sasser, D.W.; Engi, D.

    1978-01-01

    The SAFE procedure is an efficient method of evaluating the physical protection system of a nuclear facility. Since the algorithms used in SAFE for path generation and evaluation are analytical, many paths can be evaluated with a modest investment in computer time. SAFE is easy to use because the information required is well-defined and the interactive nature of this procedure lends itself to straightforward operation. The modular approach that has been taken allows other functionally equivalent modules to be substituted as they become available. The SAFE procedure has broad applications in the nuclear facility safeguards field as well as in the security field in general. Any fixed facility containing valuable materials or components to be protected from theft or sabotage could be analyzed using this same automated evaluation technique

  11. Coal resources available for development; a methodology and pilot study

    Science.gov (United States)

    Eggleston, Jane R.; Carter, M. Devereux; Cobb, James C.

    1990-01-01

    Coal accounts for a major portion of our Nation's energy supply in projections for the future. A demonstrated reserve base of more than 475 billion short tons, as the Department of Energy currently estimates, indicates that, on the basis of today's rate of consumption, the United States has enough coal to meet projected energy needs for almost 200 years. However, the traditional procedures used for estimating the demonstrated reserve base do not account for many environmental and technological restrictions placed on coal mining. A new methodology has been developed to determine the quantity of coal that might actually be available for mining under current and foreseeable conditions. This methodology is unique in its approach, because it applies restrictions to the coal resource before it is mined. Previous methodologies incorporated restrictions into the recovery factor (a percentage), which was then globally applied to the reserve (minable coal) tonnage to derive a recoverable coal tonnage. None of the previous methodologies define the restrictions and their area and amount of impact specifically. Because these restrictions and their impacts are defined in this new methodology, it is possible to achieve more accurate and specific assessments of available resources. This methodology has been tested in a cooperative project between the U.S. Geological Survey and the Kentucky Geological Survey on the Matewan 7.5-minute quadrangle in eastern Kentucky. Pertinent geologic, mining, land-use, and technological data were collected, assimilated, and plotted. The National Coal Resources Data System was used as the repository for data, and its geographic information system software was applied to these data to eliminate restricted coal and quantify that which is available for mining. This methodology does not consider recovery factors or the economic factors that would be considered by a company before mining. Results of the pilot study indicate that, of the estimated

  12. Formation of broad Balmer wings in symbiotic stars

    International Nuclear Information System (INIS)

    Chang, Seok-Jun; Heo, Jeong-Eun; Hong, Chae-Lin; Lee, Hee-Won

    2016-01-01

    Symbiotic stars are binary systems composed of a hot white dwarf and a mass-losing giant. In addition to many prominent emission lines, symbiotic stars exhibit Raman scattered O VI features at 6825 and 7088 Å. Another notable feature present in the spectra of many symbiotics is the broad wings around Balmer lines. Astrophysical mechanisms that can produce broad wings include Thomson scattering by free electrons and Raman scattering of Lyβ and higher series by neutral hydrogen. In this poster presentation we produce broad wings around Hα and Hβ adopting a Monte Carlo technique in order to make a quantitative comparison of these two mechanisms. Thomson wings are characterized by the exponential cutoff given by the thermal width, whereas the Raman wings depend on the column density and the continuum shape in the far UV region. A brief discussion is provided. (paper)

  13. On-line process failure diagnosis: The necessity and a comparative review of the methodologies

    International Nuclear Information System (INIS)

    Kim, I.S.

    1991-01-01

    Three basic approaches to process failure management are defined and discussed to elucidate the role of diagnosis in the operation of nuclear power plants. The rationale for the necessity of diagnosis is given from various perspectives. A comparative review of some representative diagnostic methodologies is presented and their shortcomings are discussed. Based on the insights from the review, the desirable characteristics of advanced diagnostic methodologies are derived from the viewpoints of failure detection, diagnosis, and correction. 11 refs.

  14. 33 CFR 117.921 - Broad River.

    Science.gov (United States)

    2010-07-01

    ... OPERATION REGULATIONS Specific Requirements South Carolina § 117.921 Broad River. (a) The draw of the S170 bridge, mile 14.0 near Beaufort, shall open on signal if at least 24 hours notice is given. (b) The draw...

  15. Evidence for a Broad Autism Phenotype

    NARCIS (Netherlands)

    K. de Groot (Kristel); J.W. van Strien (Jan)

    2017-01-01

    textabstractThe broad autism phenotype implies the existence of a continuum ranging from individuals displaying almost no autistic traits to severely impaired diagnosed individuals. Recent studies have linked this variation in autistic traits to several domains of functioning. However, studies

  16. Analysis of gaming community using Soft System Methodology

    OpenAIRE

    Hurych, Jan

    2015-01-01

    This diploma thesis aims to analyse a virtual gaming community and its problems in the case of the community belonging to the EU server of the game World of Tanks. To solve these problems, Soft System Methodology (SSM) by P. Checkland is used. The thesis includes an analysis of the significance of gaming communities for the gaming industry as a whole. The gaming community is then defined as a soft system. Three problems are analysed in the practical part of the thesis using a newer version of SSM. One iteration of...

  17. Methodology for flammable gas evaluations

    Energy Technology Data Exchange (ETDEWEB)

    Hopkins, J.D., Westinghouse Hanford

    1996-06-12

    There are 177 radioactive waste storage tanks at the Hanford Site. The waste generates flammable gases. The waste releases gas continuously, but in some tanks the waste has shown a tendency to trap these flammable gases. When enough gas is trapped in a tank's waste matrix, it may be released in a way that renders part or all of the tank atmosphere flammable for a period of time. Tanks must be evaluated against previously defined criteria to determine whether they can present a flammable gas hazard. This document presents the methodology for evaluating tanks in two areas of concern in the tank headspace: the steady-state flammable-gas concentration resulting from continuous release, and the concentration resulting from an episodic gas release.
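
    As a rough illustration of the first area of concern, a generic well-mixed dilution model (an assumption for illustration, not necessarily the document's exact formulation) estimates the steady-state headspace concentration from an assumed continuous release rate and ventilation flow:

        # Hypothetical numbers: continuous hydrogen release into a ventilated headspace.
        release_cfm = 0.02      # assumed flammable-gas release rate (ft^3/min)
        ventilation_cfm = 10.0  # assumed headspace ventilation flow (ft^3/min)
        lfl = 0.04              # lower flammability limit of hydrogen in air (~4 vol%)

        steady_state = release_cfm / (release_cfm + ventilation_cfm)  # volume fraction
        print(f"{steady_state:.2%} of headspace, {100 * steady_state / lfl:.0f}% of the LFL")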

  18. Defining operational taxonomic units using DNA barcode data.

    Science.gov (United States)

    Blaxter, Mark; Mann, Jenna; Chapman, Tom; Thomas, Fran; Whitton, Claire; Floyd, Robin; Abebe, Eyualem

    2005-10-29

    The scale of diversity of life on this planet is a significant challenge for any scientific programme hoping to produce a complete catalogue, whatever means is used. For DNA barcoding studies, this difficulty is compounded by the realization that any chosen barcode sequence is not the gene 'for' speciation and that taxa have evolutionary histories. How are we to disentangle the confounding effects of reticulate population genetic processes? Using the DNA barcode data from meiofaunal surveys, here we discuss the benefits of treating the taxa defined by barcodes without reference to their correspondence to 'species', and suggest that using this non-idealist approach facilitates access to taxon groups that are not accessible to other methods of enumeration and classification. Major issues remain, in particular the methodologies for taxon discrimination in DNA barcode data.
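
    One common way to operationalise such barcode-defined taxa (often called MOTUs) is to cluster sequences whose pairwise distance falls below a cutoff. The toy single-linkage sketch below uses invented distances and a 3% threshold; it illustrates the general idea only, not the authors' specific pipeline.

        # Toy single-linkage clustering of sequences into MOTUs at a 3% distance cutoff.
        threshold = 0.03
        seqs = ["s1", "s2", "s3", "s4"]
        dist = {("s1", "s2"): 0.01, ("s1", "s3"): 0.08, ("s1", "s4"): 0.09,
                ("s2", "s3"): 0.07, ("s2", "s4"): 0.10, ("s3", "s4"): 0.02}

        clusters = [{s} for s in seqs]
        for (a, b), d in dist.items():
            if d <= threshold:
                ca = next(c for c in clusters if a in c)
                cb = next(c for c in clusters if b in c)
                if ca is not cb:          # merge the two clusters
                    ca |= cb
                    clusters.remove(cb)
        print(clusters)  # two MOTUs: {'s1', 's2'} and {'s3', 's4'}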

  19. Application and licensing requirements of the Framatome ANP RLBLOCA methodology

    International Nuclear Information System (INIS)

    Martin, R.P.; Dunn, B.M.

    2004-01-01

    The Framatome ANP Realistic Large-Break LOCA methodology (FANP RLBLOCA) is an analysis approach approved by the US NRC for supporting the licensing basis of 3- and 4-loop Westinghouse PWRs and CE 2x4 PWRs. It was developed consistent with the NRC's Code Scaling, Applicability, and Uncertainty (CSAU) methodology for performing best-estimate large-break LOCAs. The CSAU methodology consists of three key elements with the second and third element addressing uncertainty identification and application. Unique to the CSAU methodology is the use of engineering judgment and the Process Identification and Ranking Table (PIRT) defined in the first element to lay the groundwork for achieving the ultimate goal of quantifying the total uncertainty in predicted measures of interest associated with the large-break LOCA. It is the PIRT that not only directs the methodology development, but also directs the methodology review. While the FANP RLBLOCA methodology was generically approved, a plant-specific application is customized in two ways addressing how the unique plant characterization 1) is translated to code input and 2) relates to the unique methodology licensing requirements. Related to the former, plants are required by 10 CFR 50.36 to define a technical specification limiting condition for operation based on the following criteria: 1. Installed instrumentation that is used in the control room to detect, and indicate, a significant abnormal degradation of the reactor coolant pressure boundary. 2. A process variable, design feature, or operating restriction that is an initial condition of a design basis accident or transient analysis that either assumes the failure of or presents a challenge to the integrity of a fission product barrier. 3. A structure, system, or component that is part of the primary success path and which functions or actuates to mitigate a design basis accident or transient that either assumes the failure of or presents a challenge to the integrity of a

  20. Examining Approaches to Research on Self-Regulated Learning: Conceptual and Methodological Considerations

    Science.gov (United States)

    Karabenick, Stuart A.; Zusho, Akane

    2015-01-01

    We provide a conceptual commentary on the articles in this special issue, first by describing the unique features of each study, focusing on what we consider to be their theoretical and methodological contributions, and then by highlighting significant crosscutting themes and future directions in the study of SRL. Specifically, we define SRL to be…

  1. A methodology for evaluating social impact of Environmental Education Master Training Program

    Directory of Open Access Journals (Sweden)

    Loret de Mola, E.

    2014-01-01

    Full Text Available The paper is intended to describe a methodology for evaluating the social impact of an Environmental Education master training program by presenting its main stages. The theoretical framework serving as the starting point, together with other empirical methods, led to systematizing and defining the terms environmental professional training, professional performance of the environmental educator, evaluation, evaluation of the professional performance of environmental educators, and impact evaluation, as well as to distinguishing the functions of impact evaluation in the postgraduate program that favor the development of professors, tutors and trainees. Previously appraised by consulting experts, who gave it high marks, this methodology is currently being used in evaluating the second and third editions of the program.

  2. Methodology for assessing laser-based equipment

    Science.gov (United States)

    Pelegrina-Bonilla, Gabriel; Hermsdorf, Jörg; Thombansen, Ulrich; Abels, Peter; Kaierle, Stefan; Neumann, Jörg

    2017-10-01

    Methodologies for the assessment of a technology's maturity are widely used in industry and research. Probably the best known are technology readiness levels (TRLs), initially pioneered by the National Aeronautics and Space Administration (NASA). At the beginning, only descriptively defined TRLs existed, but over time, automated assessment techniques in the form of questionnaires emerged in order to determine TRLs. Originally TRLs targeted equipment for space applications, but the demands on industrially relevant equipment are partly different in terms of, for example, overall costs, product quantities, or the presence of competitors. Therefore, we present a generally valid assessment methodology with the aim of assessing laser-based equipment for industrial use, in general. The assessment is carried out with the help of a questionnaire, which allows for a user-friendly and easily accessible way to monitor the progress from the lab-proven state to the application-ready product throughout the complete development period. The assessment result is presented in a multidimensional metric in order to reveal the current specific strengths and weaknesses of the equipment development process, which can be used to direct the remaining development process of the equipment in the right direction.

  3. A diagnostic methodology for refrigerating systems; Methodologie de diagnostic des installations frigorifiques

    Energy Technology Data Exchange (ETDEWEB)

    Vrinat, G. [Association Francaise du Froid (AFF), 75 - Paris (France)

    1997-12-31

    A diagnostic methodology for refrigerating machines, equipment and plants has been defined and evaluated for EDF, the French national power utility, and ADEME, the French Agency for Energy Conservation, in the framework of energy conservation objectives: the diagnostic method should make it possible to identify malfunctions, assess the cost efficiency of the equipment, identify limiting factors, and consider corrective measures.

  4. A Constrained Genetic Algorithm with Adaptively Defined Fitness Function in MRS Quantification

    Science.gov (United States)

    Papakostas, G. A.; Karras, D. A.; Mertzios, B. G.; Graveron-Demilly, D.; van Ormondt, D.

    MRS signal quantification is a rather involved procedure and has attracted the interest of the medical engineering community regarding the development of computationally efficient methodologies. Significant contributions based on Computational Intelligence tools, such as Neural Networks (NNs), demonstrated good performance but not without drawbacks, already discussed by the authors. On the other hand, preliminary application of Genetic Algorithms (GA) has already been reported in the literature by the authors regarding the peak detection problem encountered in MRS quantification using the Voigt line shape model. This paper investigates a novel constrained genetic algorithm involving a generic and adaptively defined fitness function which extends the simple genetic algorithm methodology to the case of noisy signals. The applicability of this new algorithm is scrutinized through experimentation on artificial MRS signals interleaved with noise, regarding its signal fitting capabilities. Although extensive experiments with real-world MRS signals are necessary, the performance shown herein illustrates the method's potential to be established as a generic MRS metabolite quantification procedure.
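
    The sketch below shows a minimal constrained, real-coded genetic algorithm fitting a single Gaussian line to a noisy synthetic signal, with parameter bounds acting as the constraints. It is only an illustration of the general approach: the paper's Voigt line-shape model and its adaptively defined fitness function are not reproduced here.

        import numpy as np

        rng = np.random.default_rng(1)
        t = np.linspace(-1.0, 1.0, 200)
        true_p = (1.0, 0.1, 0.15)  # amplitude, position, width of the synthetic peak
        signal = true_p[0] * np.exp(-((t - true_p[1]) / true_p[2]) ** 2) + rng.normal(0, 0.05, t.size)

        bounds = np.array([[0.1, 2.0], [-0.5, 0.5], [0.05, 0.5]])  # parameter constraints

        def model(p):
            return p[0] * np.exp(-((t - p[1]) / p[2]) ** 2)

        def fitness(p):
            return -np.sum((signal - model(p)) ** 2)  # higher is better

        pop = rng.uniform(bounds[:, 0], bounds[:, 1], size=(50, 3))
        for _ in range(100):
            scores = np.array([fitness(p) for p in pop])
            parents = pop[np.argsort(scores)[-25:]]                    # keep the fittest half
            children = parents[rng.integers(0, 25, 25)] + rng.normal(0, 0.02, (25, 3))
            children = np.clip(children, bounds[:, 0], bounds[:, 1])   # enforce the constraints
            pop = np.vstack([parents, children])
        best = pop[np.argmax([fitness(p) for p in pop])]
        print(best)  # should approach the true parameters (1.0, 0.1, 0.15)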

  5. Analysis of Feedback processes in Online Group Interaction: a methodological model

    Directory of Open Access Journals (Sweden)

    Anna Espasa

    2013-06-01

    Full Text Available The aim of this article is to present a methodological model to analyze students' group interaction to improve their essays in online learning environments based on asynchronous and written communication. In these environments, teacher and student scaffolds for discussion are essential to promote interaction. One of these scaffolds can be feedback. Research on feedback processes has predominantly focused on feedback design rather than on how students utilize feedback to improve learning. This methodological model fills this gap, contributing to analysing the implementation of feedback processes while students discuss collaboratively in the specific case of writing assignments. A review of different methodological models was carried out to define a framework adjusted to the analysis of the relationship between written and asynchronous group interaction, students' activity and the changes incorporated into the final text. The model proposed includes the following dimensions: (1) student participation, (2) nature of student learning and (3) quality of student learning. The main contribution of this article is to present the methodological model and also to ascertain the model's operativity regarding how students incorporate such feedback into their essays.

  6. An intelligent design methodology for nuclear power systems

    International Nuclear Information System (INIS)

    Nassersharif, B.; Martin, R.P.; Portal, M.G.; Gaeta, M.J.

    1989-01-01

    The goal of this investigation is to research possible methodologies for automating the design of, specifically, nuclear power facilities; however, it is relevant to all thermal power systems. The strategy of this research has been to concentrate on individual areas of the thermal design process, investigate the procedures performed, develop methodology to emulate that behavior, and prototype it in the form of a computer program. The design process has been generalized as follows: problem definition, design definition, component selection procedure, optimization and engineering analysis, testing, and final design, with the problem definition defining constraints that are applied to the selection procedure as well as to the design definition. The result of this research is a prototype computer program applying an original procedure for the selection of the best set of real components that would be used in constructing a system with desired performance characteristics. The mathematical model used for the selection procedure is possibility theory.

  7. Defining European Wholesale Electricity Markets. An 'And/Or' Approach

    International Nuclear Information System (INIS)

    Dijkgraaf, E.; Janssen, M.C.W.

    2009-09-01

    An important question in the dynamic European wholesale markets for electricity is whether to define the geographical market at the level of an individual member state or more broadly. We show that if we currently take the traditional approach by considering for each member state whether there is one single other country that provides a substitute for domestic production, the market in each separate member state has still to be considered a separate market. However, if we allow for the possibility that at different moments in time there is another country that provides a substitute for domestic production, then the conclusion should be that certain member states do not constitute a separate geographical market. This is in particular true for Belgium, but also for The Netherlands, France, and to some extent also for Germany and Austria. We call this alternative approach the 'and/or' approach.

  8. The broad autism phenotype in parents of individuals with autism: a systematic review of the literature

    Directory of Open Access Journals (Sweden)

    Lidia Prata Cruz

    2013-12-01

    Full Text Available The broad autism phenotype (BAP) is a milder manifestation of the defining symptoms of the syndrome in individuals without autism. This study conducted a systematic review of studies about behavioral characteristics of interpersonal relationships, communication and rigidity, as well as about three cognitive models, Theory of Mind, central coherence and executive function, in parents of individuals with autism. The indexed databases were LILACS, IBECS, Web of Science, and MEDLINE, and the studies retrieved were published between 1991 and March 2012. Parents of individuals with autism have more difficulties in interpersonal relationships and in pragmatic language use and have more rigidity traits. Findings regarding the inclusion of the cognitive theories in the group of BAP characteristics were inconclusive.

  9. An Introduction to Replication Research in Gifted Education: Shiny and New Is Not the Same as Useful

    Science.gov (United States)

    Makel, Matthew C.; Plucker, Jonathan A.

    2015-01-01

    This methodological brief introduces readers to replication methods and their uses. Broadly defined, replication is the duplication of previously conducted research to verify or expand the original findings. Replication is particularly useful in the gifted education context because so much education theory and research are based on general…

  10. Defining the Intrinsic Cardiac Risks of Operations to Improve Preoperative Cardiac Risk Assessments.

    Science.gov (United States)

    Liu, Jason B; Liu, Yaoming; Cohen, Mark E; Ko, Clifford Y; Sweitzer, Bobbie J

    2018-02-01

    Current preoperative cardiac risk stratification practices group operations into broad categories, which might inadequately consider the intrinsic cardiac risks of individual operations. We sought to define the intrinsic cardiac risks of individual operations and to demonstrate how grouping operations might lead to imprecise estimates of perioperative cardiac risk. Elective operations (based on Current Procedural Terminology codes) performed from January 1, 2010 to December 31, 2015 at hospitals participating in the American College of Surgeons National Surgical Quality Improvement Program were studied. A composite measure of perioperative adverse cardiac events was defined as either cardiac arrest requiring cardiopulmonary resuscitation or acute myocardial infarction. Operations' intrinsic cardiac risks were derived from mixed-effects models while controlling for patient mix. Resultant risks were sorted into low-, intermediate-, and high-risk categories, and the most commonly performed operations within each category were identified. Intrinsic operative risks were also examined using a representative grouping of operations to portray within-group variation. Sixty-six low, 30 intermediate, and 106 high intrinsic cardiac risk operations were identified. Excisional breast biopsy had the lowest intrinsic cardiac risk (overall rate, 0.01%; odds ratio, 0.11; 95% CI, 0.02 to 0.25) relative to the average, whereas aorto-bifemoral bypass grafting had the highest (overall rate, 4.1%; odds ratio, 6.61; 95% CI, 5.54 to 7.90). There was wide variation in the intrinsic cardiac risks of operations within the representative grouping (median odds ratio, 1.40; interquartile range, 0.88 to 2.17). A continuum of intrinsic cardiac risk exists among operations. Grouping operations into broad categories inadequately accounts for the intrinsic cardiac risk of individual operations.

  11. Research methodological issues in evaluating herbal interventions

    Directory of Open Access Journals (Sweden)

    Dipika Bansal

    2010-02-01

    Full Text Available Dipika Bansal, Debasish Hota, Amitava Chakrabarti; Postgraduate Institute of Medical Education and Research, Chandigarh, India. Abstract: Randomized controlled trials provide the best evidence and are seen as the gold standard for allopathic research. Herbal therapies are not an integral part of conventional care, although they are still used by patients in their health care management. These medicines need to be subjected to rigorous research to establish their effectiveness and safety. Clearly defined treatments are required and should be recorded in a manner that enables other suitably trained researchers to reproduce them reliably. Quality control of herbal products is also a prerequisite of credible clinical trials. Methodological strategies for investigating herbal interventions and the issues regarding appropriate patient selection, randomization and blinding, placebo effects and choice of comparator, occupational standardization, and the selection of appropriate study endpoints to prove efficacy are discussed. This paper will review research options and propose some suggestions for future research design. Keywords: CAM research, herbal therapies, methodology, clinical trial

  12. Climate and desertification: indicators for an assessment methodology

    International Nuclear Information System (INIS)

    Sciortino, M.; Caiaffa, E.; Fattoruso, G.; Donolo, R.; Salvetti, G.

    2009-01-01

    This work aims to define a methodology that, on the basis of commonly available surface climate records, assesses indicators of the increase or decrease of the extension of territories vulnerable to desertification and land degradation. The definition and quantification of environmental-policy-relevant indicators aim to improve the understanding and the decision-making processes in dry lands. The results of this study show that since 1931 changes of climate have involved 90% of the territory of the Sicilian region, with stronger intensity in the internal areas of the Enna, Caltanissetta and Palermo provinces. (Author) 9 refs.

  13. Deficient motion-defined and texture-defined figure-ground segregation in amblyopic children.

    Science.gov (United States)

    Wang, Jane; Ho, Cindy S; Giaschi, Deborah E

    2007-01-01

    Motion-defined form deficits in the fellow eye and the amblyopic eye of children with amblyopia implicate possible direction-selective motion processing or static figure-ground segregation deficits. Deficient motion-defined form perception in the fellow eye of amblyopic children may not be fully accounted for by a general motion processing deficit. This study investigates the contribution of figure-ground segregation deficits to the motion-defined form perception deficits in amblyopia. Performances of 6 amblyopic children (5 anisometropic, 1 anisostrabismic) and 32 control children with normal vision were assessed on motion-defined form, texture-defined form, and global motion tasks. Performance on motion-defined and texture-defined form tasks was significantly worse in amblyopic children than in control children. Performance on global motion tasks was not significantly different between the 2 groups. Faulty figure-ground segregation mechanisms are likely responsible for the observed motion-defined form perception deficits in amblyopia.

  14. Evolving paradigms for desensitization in managing broadly HLA sensitized transplant candidates.

    Science.gov (United States)

    Reinsmoen, Nancy L; Lai, Chi-Hung; Vo, Ashley; Jordan, Stanley C

    2012-04-01

    The broadly human leukocyte antigen (HLA) sensitized patient awaiting organ transplantation remains a persistent and significant problem for transplant medicine. Sensitization occurs as a consequence of exposure to HLA antigens through pregnancy, blood and platelet transfusions, and previous transplants. Early experience with desensitization protocols coupled with improved diagnostics for donor-specific antibodies (DSAs) and renal pathology have greatly improved transplant rates and outcomes for patients once considered un-transplantable or at high risk for poor outcomes. More recent advances have occurred through implementation of a national allocation system requiring the entering of unacceptable antigens that reduces the rate of crossmatch positivity. Current desensitization therapies include high-dose intravenous immunoglobulin (IVIG), plasma exchange (PLEX) with low-dose IVIG, and IVIG combined with rituximab. Developing therapies include proteasome inhibitors aimed at plasma cells and modifiers of complement-mediated injury. Here we discuss the important advancements in desensitization including defining the risk for antibody-mediated rejection prior to transplantation and the evolution of therapies aimed at reducing the impact of antibody injury on allografts.

  15. Defining Disability: Understandings of and Attitudes Towards Ableism and Disability

    Directory of Open Access Journals (Sweden)

    Carli Friedman

    2017-03-01

    Full Text Available Disabled people, amidst political and social gains, continue to experience discrimination in multiple areas. Understanding how such discrimination, named here as ableism, operates is important and may require studying perspectives of people who do not claim a disability identity. Ableism may be expressed in a number of ways, and examining how a particular group, in this case siblings of disabled people, understand and value disability may contribute to overall understandings about how ableism works. Thus, the purpose of this study is to explore relationships between siblings of disabled people's broad societal understandings of disability and their attitudes towards it. In order to tease out this relationship further we have also examined factors that impact how people define disability. Using both social psychological and sociological approaches, we have contextualized individual attitudes as providing additional new information about social meanings of disability, and set this study's results against the larger backdrops of debates over meanings of disability within Disability Studies. In our research, participants revealed complex understandings of disability, but most often defined disability as preventing or slowing action, as an atypical function, a lack of independence, and as a socially constructed obstacle. Participants' unconscious (implicit) disability attitudes significantly related to their understandings of disability as lacking independence, impairment, and/or in relation to the norm, and their conscious (explicit) disability attitudes. Moreover, longer employment in a disability-related industry was correlated with defining disability as a general difference, rather than as slowing or limiting of tasks.

  16. Survey of Dynamic PSA Methodologies

    International Nuclear Information System (INIS)

    Lee, Hansul; Kim, Hyeonmin; Heo, Gyunyoung; Kim, Taewan

    2015-01-01

    Event Tree (ET)/Fault Tree (FT) analysis is a central methodology in Probabilistic Safety Assessment (PSA) for Nuclear Power Plants (NPPs). The ET/FT methodology has the advantage that users can easily learn and apply it, and it enables better communication between engineers engaged in the same field. However, conventional methodologies have difficulty coping with dynamic behavior (e.g. operation mode changes or sequence-dependent failures) and with integrated situations of mechanical failure and human error. Meanwhile, new possibilities for improved PSA are emerging by virtue of the dramatic development of digital hardware, software, information technology, and data analysis. More specifically, the computing environment has greatly improved compared to the past, so we are able to conduct risk analysis with the large amount of data actually available. One method which can take advantage of the technological developments mentioned above is dynamic PSA, in which conventional ET/FT is extended to capture time- and condition-dependent behaviors in accident scenarios. In this paper, we investigated the various enabling techniques for dynamic PSA. Even though its history and academic achievements are substantial, it has attracted less interest from an industrial and regulatory viewpoint. The authors expect this survey to contribute to a better understanding of dynamic PSA in terms of algorithm, practice, and applicability. An overview of dynamic PSA was conducted; most of the methodologies share similar concepts. Among them, the DDET (discrete dynamic event tree) appears to be a backbone for most methodologies since it can be applied to large problems. The common characteristics sharing the concept of DDET are as follows: • Both deterministic and stochastic approaches • Improves the identification of PSA success criteria • Helps to limit detrimental effects of sequence binning (normally adopted in PSA) • Helps to avoid defining non-optimal success criteria that may distort the risk • Framework for comprehensively considering

  17. Survey of Dynamic PSA Methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Hansul; Kim, Hyeonmin; Heo, Gyunyoung [Kyung Hee University, Yongin (Korea, Republic of); Kim, Taewan [KEPCO International Nuclear Graduate School, Ulsan (Korea, Republic of)

    2015-05-15

    Event Tree (ET)/Fault Tree (FT) analysis is a central methodology in Probabilistic Safety Assessment (PSA) for Nuclear Power Plants (NPPs). The ET/FT methodology has the advantage that users can easily learn and apply it, and it enables better communication between engineers engaged in the same field. However, conventional methodologies have difficulty coping with dynamic behavior (e.g. operation mode changes or sequence-dependent failures) and with integrated situations of mechanical failure and human error. Meanwhile, new possibilities for improved PSA are emerging by virtue of the dramatic development of digital hardware, software, information technology, and data analysis. More specifically, the computing environment has greatly improved compared to the past, so we are able to conduct risk analysis with the large amount of data actually available. One method which can take advantage of the technological developments mentioned above is dynamic PSA, in which conventional ET/FT is extended to capture time- and condition-dependent behaviors in accident scenarios. In this paper, we investigated the various enabling techniques for dynamic PSA. Even though its history and academic achievements are substantial, it has attracted less interest from an industrial and regulatory viewpoint. The authors expect this survey to contribute to a better understanding of dynamic PSA in terms of algorithm, practice, and applicability. An overview of dynamic PSA was conducted; most of the methodologies share similar concepts. Among them, the DDET (discrete dynamic event tree) appears to be a backbone for most methodologies since it can be applied to large problems. The common characteristics sharing the concept of DDET are as follows: • Both deterministic and stochastic approaches • Improves the identification of PSA success criteria • Helps to limit detrimental effects of sequence binning (normally adopted in PSA) • Helps to avoid defining non-optimal success criteria that may distort the risk • Framework for comprehensively considering

  18. A survey of dynamic methodologies for probabilistic safety assessment of nuclear power plants

    International Nuclear Information System (INIS)

    Aldemir, Tunc

    2013-01-01

    Highlights: ► Dynamic methodologies for probabilistic safety assessment (PSA) are surveyed. ► These methodologies overcome the limitations of the traditional approach to PSA. ► They are suitable for PSA using a best estimate plus uncertainty approach. ► They are highly computation intensive and produce very large number of scenarios. ► Use of scenario clustering can assist the analysis of the results. -- Abstract: Dynamic methodologies for probabilistic safety assessment (PSA) are defined as those which use a time-dependent phenomenological model of system evolution along with its stochastic behavior to account for possible dependencies between failure events. Over the past 30 years, numerous concerns have been raised in the literature regarding the capability of the traditional static modeling approaches such as the event-tree/fault-tree methodology to adequately account for the impact of process/hardware/software/firmware/human interactions on the stochastic system behavior. A survey of the types of dynamic PSA methodologies proposed to date is presented, as well as a brief summary of an example application for the PSA modeling of a digital feedwater control system of an operating pressurized water reactor. The use of dynamic methodologies for PSA modeling of passive components and phenomenological uncertainties are also discussed.
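
    As a concrete illustration of the dynamic PSA concepts surveyed above, and of the discrete dynamic event tree (DDET) idea highlighted in the preceding records, the toy sketch below couples a deterministic state model to stochastic branching at fixed times. All component names, probabilities, and the single-variable "plant state" are invented; a real DDET tool would use full thermal-hydraulic models and far richer branching rules.

```python
# Toy discrete dynamic event tree (DDET): deterministic state evolution between
# branch points, stochastic branching on a cooling component at each point.
# All numbers and the single-variable plant state are invented.
from dataclasses import dataclass, replace

@dataclass
class Branch:
    time: float         # s
    temperature: float  # degC, toy scalar "plant state"
    cooling_on: bool
    prob: float
    history: tuple

def advance(b: Branch, dt: float) -> Branch:
    """Deterministic phenomenology: cool down with cooling, heat up without."""
    delta = -5.0 if b.cooling_on else 15.0   # degC per branching interval (toy values)
    return replace(b, time=b.time + dt, temperature=b.temperature + delta)

N_BRANCHINGS = 3        # branch every 60 s
P_FAIL = 0.05           # per-demand failure probability of the cooling function
PRUNE = 1e-4            # drop branches below this probability
DAMAGE_T = 120.0        # degC, toy "damage" criterion

branches = [Branch(0.0, 100.0, True, 1.0, ())]
for _ in range(N_BRANCHINGS):
    expanded = []
    for b in branches:
        b = advance(b, 60.0)
        ok = replace(b, cooling_on=True,  prob=b.prob * (1 - P_FAIL), history=b.history + ("ok",))
        ko = replace(b, cooling_on=False, prob=b.prob * P_FAIL,       history=b.history + ("fail",))
        expanded.extend(x for x in (ok, ko) if x.prob >= PRUNE)
    branches = expanded

p_damage = sum(b.prob for b in branches if b.temperature > DAMAGE_T)
print(f"{len(branches)} scenarios kept, damage probability ~ {p_damage:.2e}")
```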

  19. Methodological Proposal for Optimal Location of Emergency Operation Centers through Multi-Criteria Approach

    Directory of Open Access Journals (Sweden)

    Umberto Di Matteo

    2016-01-01

    Full Text Available Territorial vulnerability and risk analysis play a fundamental role in urban planning and emergency management. Analysis of these aspects makes it possible to define increasingly effective risk mitigation strategies and to provide efficient response plans to events. Many mitigation strategies, as well as many response plans, share the purpose of minimizing response time in order to decrease the level of vulnerability of the area concerned. The response time to a perturbing event is in fact an essential parameter in defining the hazard of the considered site, and the literature is unanimous in considering it. In this context, the article proposes a methodology for optimizing the location of emergency operation centers (EOCs) on the territory, reducing response times and thereby mitigating the vulnerability of the area. The proposed methodology is based on a hybrid multi-criteria decision making (MCDM) approach combining AHP (Analytic Hierarchy Process) and ELECTRE. The method has been applied in the territory of Bressanone and Vipiteno (Bolzano, Italy), simulating the need to build a new Fire Department barracks. A campaign of interviews with operators and industry experts, together with the collection of spatial data from the portals of the authorities concerned, was carried out in order to obtain the data necessary for the implementation of the proposed methodology.
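
    As a small illustration of the multi-criteria machinery named here, the sketch below carries out the AHP step of an AHP-ELECTRE hybrid: criterion weights are derived from a pairwise comparison matrix via its principal eigenvector and checked for consistency. The criteria and comparison values are purely hypothetical, and the ELECTRE outranking of candidate EOC sites is not shown.

```python
# Hypothetical AHP step of an AHP-ELECTRE hybrid: derive criterion weights from
# a pairwise comparison matrix (Saaty scale) and check its consistency.
import numpy as np

criteria = ["response_time", "land_cost", "road_access", "hazard_exposure"]
# A[i, j] = relative importance of criterion i over criterion j (invented values).
A = np.array([
    [1,   5,   3,   4  ],
    [1/5, 1,   1/2, 1/2],
    [1/3, 2,   1,   1  ],
    [1/4, 2,   1,   1  ],
])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()

# Consistency ratio: values below ~0.10 are conventionally considered acceptable.
n = A.shape[0]
CI = (eigvals.real[k] - n) / (n - 1)
CR = CI / 0.90  # 0.90 is Saaty's random consistency index for n = 4

for name, w in zip(criteria, weights):
    print(f"{name:16s} weight = {w:.3f}")
print(f"consistency ratio = {CR:.3f}")
```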

  20. In Vitro Methodologies to Evaluate the Effects of Hair Care Products on Hair Fiber

    Directory of Open Access Journals (Sweden)

    Robson Miranda da Gama

    2017-01-01

    Full Text Available Consumers use different hair care products to change the physical appearance of their hair, such as shampoos, conditioners, hair dye and hair straighteners. They expect cosmetics products to be available in the market to meet their needs in a broad and effective manner. Evaluating efficacy of hair care products in vitro involves the use of highly accurate equipment. This review aims to discuss in vitro methodologies used to evaluate the effects of hair care products on hair fiber, which can be assessed by various methods, such as Scanning Electron Microscopy, Transmission Electron Microscopy, Atomic Force Microscopy, Optical Coherence Tomography, Infrared Spectroscopy, Raman Spectroscopy, Protein Loss, Electrophoresis, color and brightness, thermal analysis and measuring mechanical resistance to combing and elasticity. The methodology used to test hair fibers must be selected according to the property being evaluated, such as sensory characteristics, determination of brightness, resistance to rupture, elasticity and integrity of hair strain and cortex, among others. If equipment is appropriate and accurate, reproducibility and ease of employment of the analytical methodology will be possible. Normally, the data set must be discussed in order to obtain conclusive answers to the test.

  1. An elastic-plastic fracture mechanics based methodology to characterize cracking behavior and its application to environmental assisted processes

    International Nuclear Information System (INIS)

    Alvarez, J.A.; Gutierrez-Solana, F.

    1999-01-01

    Cracking processes suffered by new structural and piping steels when used in petroleum or other energy installations have demonstrated the need for a cracking resistance characterization methodology. This methodology, valid for both elastic and elastoplastic regimes, should be able to define crack propagation kinetics as a function of their controlling local parameters. This work summarizes an experimental and analytical methodology that has been shown to be suitable for characterizing cracking processes using compact tensile specimens, especially subcritical environmentally assisted ones, such as those induced by hydrogen in microalloyed steels. The applied and validated methodology has been shown to offer quantitative results of cracking behavior and to correlate these with the existing fracture micromechanisms. (orig.)

  2. Advanced piloted aircraft flight control system design methodology. Volume 1: Knowledge base

    Science.gov (United States)

    Mcruer, Duane T.; Myers, Thomas T.

    1988-01-01

    The development of a comprehensive and eclectic methodology for conceptual and preliminary design of flight control systems is presented and illustrated. The methodology is focused on the design stages starting with the layout of system requirements and ending when some viable competing system architectures (feedback control structures) are defined. The approach is centered on the human pilot and the aircraft as both the sources of, and the keys to the solution of, many flight control problems. The methodology relies heavily on computational procedures which are highly interactive with the design engineer. To maximize effectiveness, these techniques, as selected and modified to be used together in the methodology, form a cadre of computational tools specifically tailored for integrated flight control system preliminary design purposes. While theory and associated computational means are an important aspect of the design methodology, the lore, knowledge and experience elements, which guide and govern applications are critical features. This material is presented as summary tables, outlines, recipes, empirical data, lists, etc., which encapsulate a great deal of expert knowledge. Much of this is presented in topical knowledge summaries which are attached as Supplements. The composite of the supplements and the main body elements constitutes a first cut at a Mark 1 Knowledge Base for manned-aircraft flight control.

  3. Methodology applied to develop the DHIE: applied methodology

    CSIR Research Space (South Africa)

    Herselman, Marlien

    2016-12-01

    Full Text Available This section will address the methodology that was applied to develop the South African Digital Health Innovation Ecosystem (DHIE). Each chapter under Section B represents a specific phase in the methodology....

  4. Broad-Band Spectroscopy of Hercules X-1 with Suzaku

    Science.gov (United States)

    Asami, Fumi; Enoto, Teruaki; Iwakiri, Wataru; Yamada, Shin'ya; Tamagawa, Toru; Mihara, Tatehiro; Nagase, Fumiaki

    2014-01-01

    Hercules X-1 was observed with Suzaku in the main-on state from 2005 to 2010. The 0.4-100 keV wide-band spectra obtained in four observations showed a broad hump around 4-9 keV in addition to narrow Fe lines at 6.4 and 6.7 keV. The hump was seen in all the four observations regardless of the selection of the continuum models. Thus it is considered a stable and intrinsic spectral feature in Her X-1. The broad hump lacked a sharp structure like an absorption edge. Thus it was represented by two different spectral models: an ionized partial covering or an additional broad line at 6.5 keV. The former required a persistently existing ionized absorber, whose origin was unclear. In the latter case, the Gaussian fitting of the 6.5-keV line needs a large width of sigma = 1.0-1.5 keV and a large equivalent width of 400-900 eV. If the broad line originates from Fe fluorescence of accreting matter, its large width may be explained by the Doppler broadening in the accretion flow. However, the large equivalent width may be inconsistent with a simple accretion geometry.
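
    The spectral modelling step mentioned here, fitting a broad Gaussian at 6.5 keV on top of a continuum and quoting its width and equivalent width, can be sketched numerically. The example below uses an invented power-law continuum and simulated fluxes, and it ignores the instrument response that a real Suzaku analysis would fold in.

```python
# Toy fit of a broad 6.5-keV Gaussian line on a power-law continuum; simulated
# fluxes only, with no instrument response folded in.
import numpy as np
from scipy.optimize import curve_fit

def model(E, norm, gamma, amp, e0, sigma):
    continuum = norm * E ** (-gamma)
    line = amp * np.exp(-0.5 * ((E - e0) / sigma) ** 2)
    return continuum + line

E = np.linspace(2.0, 10.0, 400)                          # keV
rng = np.random.default_rng(1)
flux = model(E, 1.0, 1.0, 0.03, 6.5, 1.2) * rng.normal(1.0, 0.02, E.size)

popt, _ = curve_fit(model, E, flux, p0=(1.0, 1.0, 0.02, 6.4, 1.0))
norm, gamma, amp, e0, sigma = popt

# Equivalent width: integrated line flux over the continuum level at the line energy.
line_flux = amp * sigma * np.sqrt(2.0 * np.pi)
ew_eV = 1000.0 * line_flux / (norm * e0 ** (-gamma))
print(f"sigma = {sigma:.2f} keV, equivalent width = {ew_eV:.0f} eV")
```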

  5. Broad-Band Analysis of Polar Motion Excitations

    Science.gov (United States)

    Chen, J.

    2016-12-01

    Earth rotational changes, i.e. polar motion and length-of-day (LOD), are driven by two types of geophysical excitations: 1) mass redistribution within the Earth system, and 2) angular momentum exchange between the solid Earth (more precisely the crust) and other components of the Earth system. Accurate quantification of Earth rotational excitations has been difficult, due to the lack of global-scale observations of mass redistribution and angular momentum exchange. More than 14 years of time-variable gravity measurements from the Gravity Recovery and Climate Experiment (GRACE) have provided a unique means for quantifying Earth rotational excitations from mass redistribution in different components of the climate system. Comparisons between observed Earth rotational changes and geophysical excitations estimated from GRACE, satellite laser ranging (SLR) and climate models show that GRACE-derived excitations agree remarkably well with polar motion observations over a broad band of frequencies. GRACE estimates also suggest that accelerated polar region ice melting in recent years and corresponding sea level rise have played an important role in driving long-term polar motion. With several estimates of polar motion excitations, it is possible to estimate broad-band noise variance and noise power spectra in each, given reasonable assumptions about noise independence. Results based on GRACE CSR RL05 solutions clearly outperform other estimates with the lowest noise levels over a broad band of frequencies.
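
    The abstract notes that, given several independent excitation series and an assumption of noise independence, the noise variance of each series can be estimated. One standard way of doing this is a three-cornered-hat calculation; whether the cited work uses exactly this method is an assumption here, and the sketch below only demonstrates the idea on simulated series.

```python
# Three-cornered hat: with three series observing the same signal and carrying
# mutually independent noise, pairwise difference variances isolate each
# series' own noise variance. Signal and noise levels are invented.
import numpy as np

rng = np.random.default_rng(5)
n = 2000
signal = np.cumsum(rng.normal(0.0, 0.1, n))     # common "true" excitation series

noise_sd = {"GRACE": 0.5, "SLR": 0.8, "model": 1.2}
series = {k: signal + rng.normal(0.0, sd, n) for k, sd in noise_sd.items()}

def dvar(a, b):
    return np.var(series[a] - series[b])

v12, v13, v23 = dvar("GRACE", "SLR"), dvar("GRACE", "model"), dvar("SLR", "model")
estimates = {
    "GRACE": 0.5 * (v12 + v13 - v23),
    "SLR":   0.5 * (v12 + v23 - v13),
    "model": 0.5 * (v13 + v23 - v12),
}
for k, v in estimates.items():
    print(f"{k:6s} noise variance ~ {v:.2f} (true {noise_sd[k] ** 2:.2f})")
```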

  6. Six methodological steps to build medical data warehouses for research.

    Science.gov (United States)

    Szirbik, N B; Pelletier, C; Chaussalet, T

    2006-09-01

    We propose a simple methodology for heterogeneous data collection and central repository-style database design in healthcare. Our method can be used with or without other software development frameworks, and we argue that its application can save a considerable amount of implementation effort. We also believe that the method can be used in other fields of research, especially those with a strongly interdisciplinary nature. The idea emerged during a healthcare research project which consisted, among other things, of grouping information from heterogeneous and distributed information sources. We developed this methodology from the lessons learned while building a data repository containing information about elderly patient flows in the UK's long-term care (LTC) system. We explain thoroughly the aspects that influenced the building of the methodology. The methodology is defined by six steps, which can be aligned with various iterative development frameworks. We describe here the alignment of our methodology with the RUP (Rational Unified Process) framework. The methodology emphasizes current trends, such as early identification of critical requirements, data modelling, close and timely interaction with users and stakeholders, ontology building, quality management, and exception handling. Of special interest is the ontological engineering aspect, which had the highest-impact effects after the project. That is, it helped stakeholders to perform better collaborative negotiations that brought better solutions for the overall system investigated. An insight into the problems faced by others helps to lead the negotiators to win-win situations. We consider that this should be the social result of any project that collects data for better decision making that finally leads to enhanced global outcomes.

  7. Education and Broad Concepts of Agency

    Science.gov (United States)

    Winch, Christopher

    2014-01-01

    Drawing on recent debates about the relationship between propositional and practical knowledge, this article is concerned with broad concepts of agency. Specifically, it is concerned with agency that involves the forming and putting into effect of intentions over relatively extended periods, particularly in work contexts (called, for want of a…

  8. Broad supernatural punishment but not moralizing high gods precede the evolution of political complexity in Austronesia.

    Science.gov (United States)

    Watts, Joseph; Greenhill, Simon J; Atkinson, Quentin D; Currie, Thomas E; Bulbulia, Joseph; Gray, Russell D

    2015-04-07

    Supernatural belief presents an explanatory challenge to evolutionary theorists: it is both costly and prevalent. One influential functional explanation claims that the imagined threat of supernatural punishment can suppress selfishness and enhance cooperation. Specifically, morally concerned supreme deities or 'moralizing high gods' (MHGs) have been argued to reduce free-riding in large social groups, enabling believers to build the kind of complex societies that define modern humanity. Previous cross-cultural studies claiming to support the MHG hypothesis rely on correlational analyses only and do not correct for the statistical non-independence of sampled cultures. Here we use a Bayesian phylogenetic approach with a sample of 96 Austronesian cultures to test the MHG hypothesis as well as an alternative supernatural punishment hypothesis that allows punishment by a broad range of moralizing agents. We find evidence that broad supernatural punishment drives political complexity, whereas MHGs follow political complexity. We suggest that the concept of MHGs diffused as part of a suite of traits arising from cultural exchange between complex societies. Our results show the power of phylogenetic methods to address long-standing debates about the origins and functions of religion in human society. © 2015 The Author(s) Published by the Royal Society. All rights reserved.

  9. 33 CFR 110.27 - Lynn Harbor in Broad Sound, Mass.

    Science.gov (United States)

    2010-07-01

    33 CFR 110.27 (Title 33, Navigation and Navigable Waters; Coast Guard, Department of Homeland Security; Anchorages; Anchorage Regulations; Special Anchorage Areas), 2010-07-01 edition: Lynn Harbor in Broad Sound, Mass. North of...

  10. The broad line region of AGN: Kinematics and physics

    Directory of Open Access Journals (Sweden)

    Popović L.Č.

    2006-01-01

    Full Text Available In this paper a discussion of kinematics and physics of the Broad Line Region (BLR) is given. The possible physical conditions in the BLR and problems in determination of the physical parameters (electron temperature and density) are considered. Moreover, one analyses the geometry of the BLR and the probability that (at least a fraction of) the radiation in the Broad Emission Lines (BELs) originates from a relativistic accretion disk.

  11. FDAAA legislation is working, but methodological flaws undermine the reliability of clinical trials: a cross-sectional study

    OpenAIRE

    Douglas H. Marin dos Santos; Álvaro N. Atallah

    2015-01-01

    The relationship between clinical research and the pharmaceutical industry has placed clinical trials in jeopardy. According to the medical literature, more than 70% of clinical trials are industry-funded. Many of these trials remain unpublished or have methodological flaws that distort their results. In 2007, the Food and Drug Administration Amendments Act (FDAAA) was signed into law, aiming to provide public access to a broad range of biomedical information to be made available on the ...

  12. Final report of the accident phenomenology and consequence (APAC) methodology evaluation. Spills Working Group

    Energy Technology Data Exchange (ETDEWEB)

    Brereton, S.; Shinn, J. [Lawrence Livermore National Lab., CA (United States); Hesse, D [Battelle Columbus Labs., OH (United States); Kaninich, D. [Westinghouse Savannah River Co., Aiken, SC (United States); Lazaro, M. [Argonne National Lab., IL (United States); Mubayi, V. [Brookhaven National Lab., Upton, NY (United States)

    1997-08-01

    The Spills Working Group was one of six working groups established under the Accident Phenomenology and Consequence (APAC) methodology evaluation program. The objectives of APAC were to assess methodologies available in the accident phenomenology and consequence analysis area and to evaluate their adequacy for use in preparing DOE facility safety basis documentation, such as Basis for Interim Operation (BIO), Justification for Continued Operation (JCO), Hazard Analysis Documents, and Safety Analysis Reports (SARs). Additional objectives of APAC were to identify development needs and to define standard practices to be followed in the analyses supporting facility safety basis documentation. The Spills Working Group focused on methodologies for estimating four types of spill source terms: liquid chemical spills and evaporation, pressurized liquid/gas releases, solid spills and resuspension/sublimation, and resuspension of particulate matter from liquid spills.

  13. Assessing the impact of healthcare research: A systematic review of methodological frameworks.

    Directory of Open Access Journals (Sweden)

    Samantha Cruz Rivera

    2017-08-01

    Full Text Available Increasingly, researchers need to demonstrate the impact of their research to their sponsors, funders, and fellow academics. However, the most appropriate way of measuring the impact of healthcare research is subject to debate. We aimed to identify the existing methodological frameworks used to measure healthcare research impact and to summarise the common themes and metrics in an impact matrix. Two independent investigators systematically searched the Medical Literature Analysis and Retrieval System Online (MEDLINE), the Excerpta Medica Database (EMBASE), the Cumulative Index to Nursing and Allied Health Literature (CINAHL+), the Health Management Information Consortium, and the Journal of Research Evaluation from inception until May 2017 for publications that presented a methodological framework for research impact. We then summarised the common concepts and themes across methodological frameworks and identified the metrics used to evaluate differing forms of impact. Twenty-four unique methodological frameworks were identified, addressing 5 broad categories of impact: (1) 'primary research-related impact', (2) 'influence on policy making', (3) 'health and health systems impact', (4) 'health-related and societal impact', and (5) 'broader economic impact'. These categories were subdivided into 16 common impact subgroups. Authors of the included publications proposed 80 different metrics aimed at measuring impact in these areas. The main limitation of the study was the potential exclusion of relevant articles, as a consequence of the poor indexing of the databases searched. The measurement of research impact is an essential exercise to help direct the allocation of limited research resources, to maximise research benefit, and to help minimise research waste. This review provides a collective summary of existing methodological frameworks for research impact, which funders may use to inform the measurement of research impact and researchers may use to inform

  14. Seismic reliability assessment methodology for CANDU concrete containment structures

    International Nuclear Information System (INIS)

    Stephens, M.J.; Nessim, M.A.; Hong, H.P.

    1995-05-01

    A study was undertaken to develop a reliability-based methodology for the assessment of existing CANDU concrete containment structures with respect to seismic loading. The focus of the study was on defining appropriate specified values and partial safety factors for earthquake loading and resistance parameters. Key issues addressed in the work were the identification of an approach to select design earthquake spectra that satisfy consistent safety levels, and the use of structure-specific data in the evaluation of structural resistance. (author). 23 refs., 9 tabs., 15 figs

  15. Assessment of methodologies for radioactive waste management

    International Nuclear Information System (INIS)

    Hoos, I.R.

    1978-01-01

    No quantitative methodology is adequate to encompass and assess all the risks, no risk/benefit calculation is fine-tuned enough to supply decision-makers with the full range and all of the dimensions. Quality assurance cannot be conceived in terms of systems design alone, but must be maintained vigilantly and with integrity throughout the process. The responsibility of the NRC is fairly well established with respect to overall reactor safety. With respect to the management of radioactive wastes, its mission is not yet so clearly delineated. Herein lies a challenge and an opportunity. Where the known quantitative methodologies are restrictive and likely to have negative feedback effect on authority and public support, the broader lens and the bolder thrust are called for. The cozy cocoon of figures ultimately protects no one. The Commission, having acknowledged that the management of radioactive wastes is not merely a technological matter can now take the socially responsible position of exploring as fully and confronting as candidly as possible the total range of dimensions involved. Paradoxically, it is Charles J. Hitch, intellectual progenitor of the methodology, who observes that we may be missing the meaning of his message by relying too heavily on quantitative analysis and thus defining our task too narrowly. We live in a closed system, in which science and technology, politics and economics, and, above all, social and human elements interact, sometimes to create the problems, sometimes to articulate the questions, and sometimes to find viable solutions

  16. Putting Foucault to work: an approach to the practical application of Foucault's methodological imperatives

    Directory of Open Access Journals (Sweden)

    DAVID A. NICHOLLS

    2009-01-01

    Full Text Available This paper presents an overview of the methodological approach taken in a recently completed Foucauldian discourse analysis of physiotherapy practice. In keeping with other approaches common to postmodern research, this paper resists the temptation to define a proper or ‘correct’ interpretation of Foucault’s methodological oeuvre, preferring instead to apply a range of Foucauldian propositions to examples drawn directly from the thesis. In the paper I elucidate the blended archaeological and genealogical approach I took and unpack some of the key imperatives, principles and rules I grappled with in completing the thesis.

  17. Analysis of mammalian gene function through broad based phenotypic screens across a consortium of mouse clinics

    Science.gov (United States)

    Adams, David J; Adams, Niels C; Adler, Thure; Aguilar-Pimentel, Antonio; Ali-Hadji, Dalila; Amann, Gregory; André, Philippe; Atkins, Sarah; Auburtin, Aurelie; Ayadi, Abdel; Becker, Julien; Becker, Lore; Bedu, Elodie; Bekeredjian, Raffi; Birling, Marie-Christine; Blake, Andrew; Bottomley, Joanna; Bowl, Mike; Brault, Véronique; Busch, Dirk H; Bussell, James N; Calzada-Wack, Julia; Cater, Heather; Champy, Marie-France; Charles, Philippe; Chevalier, Claire; Chiani, Francesco; Codner, Gemma F; Combe, Roy; Cox, Roger; Dalloneau, Emilie; Dierich, André; Di Fenza, Armida; Doe, Brendan; Duchon, Arnaud; Eickelberg, Oliver; Esapa, Chris T; El Fertak, Lahcen; Feigel, Tanja; Emelyanova, Irina; Estabel, Jeanne; Favor, Jack; Flenniken, Ann; Gambadoro, Alessia; Garrett, Lilian; Gates, Hilary; Gerdin, Anna-Karin; Gkoutos, George; Greenaway, Simon; Glasl, Lisa; Goetz, Patrice; Da Cruz, Isabelle Goncalves; Götz, Alexander; Graw, Jochen; Guimond, Alain; Hans, Wolfgang; Hicks, Geoff; Hölter, Sabine M; Höfler, Heinz; Hancock, John M; Hoehndorf, Robert; Hough, Tertius; Houghton, Richard; Hurt, Anja; Ivandic, Boris; Jacobs, Hughes; Jacquot, Sylvie; Jones, Nora; Karp, Natasha A; Katus, Hugo A; Kitchen, Sharon; Klein-Rodewald, Tanja; Klingenspor, Martin; Klopstock, Thomas; Lalanne, Valerie; Leblanc, Sophie; Lengger, Christoph; le Marchand, Elise; Ludwig, Tonia; Lux, Aline; McKerlie, Colin; Maier, Holger; Mandel, Jean-Louis; Marschall, Susan; Mark, Manuel; Melvin, David G; Meziane, Hamid; Micklich, Kateryna; Mittelhauser, Christophe; Monassier, Laurent; Moulaert, David; Muller, Stéphanie; Naton, Beatrix; Neff, Frauke; Nolan, Patrick M; Nutter, Lauryl MJ; Ollert, Markus; Pavlovic, Guillaume; Pellegata, Natalia S; Peter, Emilie; Petit-Demoulière, Benoit; Pickard, Amanda; Podrini, Christine; Potter, Paul; Pouilly, Laurent; Puk, Oliver; Richardson, David; Rousseau, Stephane; Quintanilla-Fend, Leticia; Quwailid, Mohamed M; Racz, Ildiko; Rathkolb, Birgit; Riet, Fabrice; Rossant, Janet; Roux, Michel; Rozman, Jan; Ryder, Ed; Salisbury, Jennifer; Santos, Luis; Schäble, Karl-Heinz; Schiller, Evelyn; Schrewe, Anja; Schulz, Holger; Steinkamp, Ralf; Simon, Michelle; Stewart, Michelle; Stöger, Claudia; Stöger, Tobias; Sun, Minxuan; Sunter, David; Teboul, Lydia; Tilly, Isabelle; Tocchini-Valentini, Glauco P; Tost, Monica; Treise, Irina; Vasseur, Laurent; Velot, Emilie; Vogt-Weisenhorn, Daniela; Wagner, Christelle; Walling, Alison; Weber, Bruno; Wendling, Olivia; Westerberg, Henrik; Willershäuser, Monja; Wolf, Eckhard; Wolter, Anne; Wood, Joe; Wurst, Wolfgang; Yildirim, Ali Önder; Zeh, Ramona; Zimmer, Andreas; Zimprich, Annemarie

    2015-01-01

    The function of the majority of genes in the mouse and human genomes remains unknown. The mouse ES cell knockout resource provides a basis for characterisation of relationships between gene and phenotype. The EUMODIC consortium developed and validated robust methodologies for broad-based phenotyping of knockouts through a pipeline comprising 20 disease-orientated platforms. We developed novel statistical methods for pipeline design and data analysis aimed at detecting reproducible phenotypes with high power. We acquired phenotype data from 449 mutant alleles, representing 320 unique genes, of which half had no prior functional annotation. We captured data from over 27,000 mice finding that 83% of the mutant lines are phenodeviant, with 65% demonstrating pleiotropy. Surprisingly, we found significant differences in phenotype annotation according to zygosity. Novel phenotypes were uncovered for many genes with unknown function providing a powerful basis for hypothesis generation and further investigation in diverse systems. PMID:26214591

  18. INPRO Methodology for Sustainability Assessment of Nuclear Energy Systems: Environmental Impact of Stressors. INPRO Manual

    International Nuclear Information System (INIS)

    2016-01-01

    This publication provides guidance on assessing the sustainability of a nuclear energy system (NES) in the area of environmental impact of stressors. The INPRO methodology is a comprehensive tool for the assessment of sustainability of an NES. Basic principles, user requirements and criteria have been defined in the different areas of the INPRO methodology. These include economics, infrastructure, waste management, proliferation resistance, environmental impact of stressors, environmental impact from depletion of resources, and safety of nuclear reactors and fuel cycle facilities. The ultimate goal of the application of the INPRO methodology is to check whether the assessed NES fulfils all the criteria, and hence the user requirements and basic principles, and therefore presents a system for a Member State that is sustainable in the long term.

  19. The development of a checklist to enhance methodological quality in intervention programs

    Directory of Open Access Journals (Sweden)

    Salvador Chacón-Moscoso

    2016-11-01

    Full Text Available The methodological quality of primary studies is an important issue when performing meta-analyses or systematic reviews. Nevertheless, there are no clear criteria for how methodological quality should be analyzed. Controversies emerge when considering the various theoretical and empirical definitions, especially in relation to three interrelated problems: the lack of representativeness, utility, and feasibility. In this article, we (a) systematize and summarize the available literature about methodological quality in primary studies; (b) propose a specific, parsimonious, 12-item checklist to empirically define the methodological quality of primary studies based on a content validity study; and (c) present an inter-coder reliability study for the resulting 12 items. This paper provides a precise and rigorous description of the development of this checklist, highlighting the clearly specified criteria for the inclusion of items and a substantial inter-coder agreement in the different items. Rather than simply proposing another checklist, however, it then argues that the list constitutes an assessment tool with respect to the representativeness, utility, and feasibility of the most frequent methodological quality items in the literature, one that provides practitioners and researchers with clear criteria for choosing items that may be adequate to their needs. We propose individual methodological features as indicators of quality, arguing that these need to be taken into account when designing, implementing, or evaluating an intervention program. This enhances methodological quality of intervention programs and fosters the cumulative knowledge based on meta-analyses of these interventions. Future development of the checklist is discussed.
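
    Inter-coder reliability of the kind reported for the 12 checklist items is commonly summarised with a chance-corrected agreement statistic such as Cohen's kappa; the specific statistic used in the study is not stated here, so the per-item kappa sketch below, computed on invented codings, is only illustrative.

```python
# Per-item inter-coder agreement for a yes/no methodological-quality checklist.
# Ratings are invented; kappa corrects raw agreement for chance agreement.
import numpy as np
from sklearn.metrics import cohen_kappa_score

rng = np.random.default_rng(7)
n_studies, n_items = 40, 12

coder_a = rng.integers(0, 2, size=(n_studies, n_items))
flips = rng.random((n_studies, n_items)) < 0.10          # coder B disagrees ~10% of the time
coder_b = np.where(flips, 1 - coder_a, coder_a)

for item in range(n_items):
    kappa = cohen_kappa_score(coder_a[:, item], coder_b[:, item])
    print(f"item {item + 1:2d}: kappa = {kappa:.2f}")
```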

  20. Methodology to assess the radiological sensitivity of soils: Application to Spanish soils

    International Nuclear Information System (INIS)

    Trueba Alonso, C.

    2005-01-01

    A methodology, based on standard physical and chemical soil properties, has been developed to estimate the radiological sensitivity of soils to a 137Cs and 90Sr contamination. In this framework, the soil radiological sensitivity is defined as the soil capability to mobilise or to retain these radionuclides. The purpose of this methodology is to assess, in terms of radiological sensitivity indexes, the behaviour of 137Cs and 90Sr in soils and their fluxes to man, considering two exposure pathways, the external irradiation exposure and the internal exposure from ingestion. The methodology is applied to the great variety of soil types found in Spain, where the soil profile is the reference unit for the assessment. The results for these soil types show that their basic soil properties are the key to categorise the radiological sensitivity according to the risks considered. The final categorisation allows the identification of especially sensitive soils and improves the radiological impact assessment predictions. (Author)

  1. An Improved Cambridge Filter Pad Extraction Methodology to Obtain More Accurate Water and “Tar” Values: In Situ Cambridge Filter Pad Extraction Methodology

    Directory of Open Access Journals (Sweden)

    Ghosh David

    2014-07-01

    Full Text Available Previous investigations by others and internal investigations at Philip Morris International (PMI) have shown that the standard trapping and extraction procedure used for conventional cigarettes, defined in the International Standard ISO 4387 (Cigarettes -- Determination of total and nicotine-free dry particulate matter using a routine analytical smoking machine), is not suitable for high-water content aerosols. Errors occur because of water losses during the opening of the Cambridge filter pad holder to remove the filter pad as well as during the manual handling of the filter pad, and because the commercially available filter pad holder, which is constructed out of plastic, may adsorb water. This results in inaccurate values for the water content, and erroneous and overestimated values for Nicotine Free Dry Particulate Matter (NFDPM). A modified 44 mm Cambridge filter pad holder and extraction equipment which supports in situ extraction methodology has been developed and tested. The principle of the in situ extraction methodology is to avoid any of the above mentioned water losses by extracting the loaded filter pad while kept in the Cambridge filter pad holder which is hermetically sealed by two caps. This is achieved by flushing the extraction solvent numerous times through the hermetically sealed Cambridge filter pad holder by means of an in situ extractor. The in situ methodology showed a significantly more complete water recovery, resulting in more accurate NFDPM values for high-water content aerosols compared to the standard ISO methodology. The work presented in this publication demonstrates that the in situ extraction methodology applies to a wider range of smoking products and smoking regimens, whereas the standard ISO methodology only applies to a limited range of smoking products and smoking regimens, e.g., conventional cigarettes smoked under ISO smoking regimen. In cases where a comparison of yields between the PMI HTP and

  2. Territory development as economic and geographical activity (theory, methodology, practice

    Directory of Open Access Journals (Sweden)

    Vitaliy Nikolaevich Lazhentsev

    2013-03-01

    Full Text Available Accents in the description of the theory and methodology of territory development are displaced from the distribution of national benefits to the formation of territorial natural and economic systems and the organization of economic and geographical activity. The author reveals the concept of «territory development» and reviews its place in the theory and methodology of human geography and regional economy. In the article the individual directions of economic activity are considered. The author has made an attempt to define the subject matter of five levels of «ideal» territorial and economic systems as a part of objects of nature, society, population settlement, production, infrastructure and management. The author's position on the interpretation of the sequence of mechanisms of territory development, working according to a Nested Doll principle (mechanism of economy, economic management mechanism, controlling mechanism of economy), is presented. The author shows the indicators which authentically define territory development.

  3. Using Web-Based Instruction to Teach Music Theory in the Piano Studio: Defining, Designing, and Implementing an Integrative Approach

    Science.gov (United States)

    Carney, Robert D.

    2010-01-01

    This dissertation rationalizes the best use of Web-based instruction (WBI) for teaching music theory to private piano students in the later primary grades. It uses an integrative research methodology for defining, designing, and implementing a curriculum that includes WBI. Research from the fields of music education, educational technology,…

  4. Spectral zone selection methodology for pebble bed reactors

    International Nuclear Information System (INIS)

    Mphahlele, Ramatsemela; Ougouag, Abderrafi M.; Ivanov, Kostadin N.; Gougar, Hans D.

    2011-01-01

    A methodology is developed for determining boundaries of spectral zones for pebble bed reactors. A spectral zone is defined as a region made up of a number of nodes whose characteristics are collectively similar and that are assigned the same few-group diffusion constants. The spectral zones are selected in such a manner that the difference (error) between the reference transport solution and the diffusion code solution takes a minimum value. This is achieved by choosing spectral zones through optimally minimizing this error. The objective function for the optimization algorithm is the total reaction rate error, which is defined as the sum of the leakage, absorption and fission reaction rates errors in each zone. The selection of these spectral zones is such that the core calculation results based on diffusion theory are within an acceptable tolerance as compared to a proper transport reference solution. Through this work, a consistent approach for identifying spectral zones that yield more accurate diffusion results is introduced.
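
    The objective function described here, the total leakage, absorption, and fission reaction-rate error summed over candidate zones, can be written down compactly. In the sketch below the node-wise reference (transport) and diffusion rates are invented arrays, and the actual search over zone boundaries that would minimise the objective is left out.

```python
# Objective for spectral zone selection: total leakage + absorption + fission
# reaction-rate error between a reference transport solution and a diffusion
# solution, summed over the nodes grouped into each candidate zone.
import numpy as np

n_nodes = 12
rng = np.random.default_rng(3)
reference = {k: rng.uniform(0.5, 1.5, n_nodes) for k in ("leakage", "absorption", "fission")}
diffusion = {k: v * rng.normal(1.0, 0.03, n_nodes) for k, v in reference.items()}

def total_error(zone_of_node):
    """Sum of absolute zone-wise reaction-rate errors for a candidate zoning."""
    err = 0.0
    for z in np.unique(zone_of_node):
        nodes = np.flatnonzero(zone_of_node == z)
        for k in reference:
            err += abs(reference[k][nodes].sum() - diffusion[k][nodes].sum())
    return err

# Two candidate zonings of the 12 nodes into 3 zones; an optimiser would search
# over such assignments and keep the one with the smallest total error.
zoning_a = np.repeat([0, 1, 2], 4)
zoning_b = np.array([0, 0, 0, 1, 1, 1, 1, 1, 2, 2, 2, 2])
print("zoning A error:", round(total_error(zoning_a), 4))
print("zoning B error:", round(total_error(zoning_b), 4))
```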

  5. The Impact of Consumer Knowledge Bias on Narrow-Scope Trust, Broad-Scope Trust, and Relationship Satisfaction

    DEFF Research Database (Denmark)

    Hansen, Torben; Grønholdt, Lars; Josiassen, Alexander

    2016-01-01

    This study investigates how consumer knowledge bias - defined as knowledge over/underconfidence (O/U) - influences two types of trust (broad-scope trust and narrow-scope trust) and consumer relationship satisfaction. Based on a survey comprising 756 mutual fund investors, the contribution...... of this study to the marketing literature is twofold. First, taking a marketing relationship approach, this study suggests and demonstrates that knowledge O/U positively influences relationship satisfaction and narrow-scope trust such that the more knowledge O/U a customer becomes, the higher/lower the level...... is low compared to high. Notably, the study findings strongly suggest that marketing managers should carry out their relationship satisfaction and trust improvement efforts relative to the combination of customers' subjective and objective knowledge....

  6. A methodology for creating greenways through multidisciplinary sustainable landscape planning.

    Science.gov (United States)

    Pena, Selma Beatriz; Abreu, Maria Manuela; Teles, Rui; Espírito-Santo, Maria Dalila

    2010-01-01

    This research proposes a methodology for defining greenways via sustainable planning. This approach includes the analysis and discussion of culture and natural processes that occur in the landscape. The proposed methodology is structured in three phases: eco-cultural analysis; synthesis and diagnosis; and proposal. An interdisciplinary approach provides an assessment of the relationships between landscape structure and landscape dynamics, which are essential to any landscape management or land use. The landscape eco-cultural analysis provides a biophysical, dynamic (geomorphologic rate), vegetation (habitats from directive 92/43/EEC) and cultural characterisation. The knowledge obtained by this analysis then supports the definition of priority actions to stabilise the landscape and the management measures for the habitats. After the analysis and diagnosis phases, a proposal for the development of sustainable greenways can be achieved. This methodology was applied to a study area of the Azambuja Municipality in the Lisbon Metropolitan Area (Portugal). The application of the proposed methodology to the study area shows that landscape stability is crucial for greenway users in order to appreciate the landscape and its natural and cultural elements in a sustainable and healthy way, both by cycling or by foot. A balanced landscape will increase the value of greenways and in return, they can develop socio-economic activities with benefits for rural communities. Copyright 2009 Elsevier Ltd. All rights reserved.

  7. High-frequency measurements of aeolian saltation flux: Field-based methodology and applications

    Science.gov (United States)

    Martin, Raleigh L.; Kok, Jasper F.; Hugenholtz, Chris H.; Barchyn, Thomas E.; Chamecki, Marcelo; Ellis, Jean T.

    2018-02-01

    Aeolian transport of sand and dust is driven by turbulent winds that fluctuate over a broad range of temporal and spatial scales. However, commonly used aeolian transport models do not explicitly account for such fluctuations, likely contributing to substantial discrepancies between models and measurements. Underlying this problem is the absence of accurate sand flux measurements at the short time scales at which wind speed fluctuates. Here, we draw on extensive field measurements of aeolian saltation to develop a methodology for generating high-frequency (up to 25 Hz) time series of total (vertically-integrated) saltation flux, namely by calibrating high-frequency (HF) particle counts to low-frequency (LF) flux measurements. The methodology follows four steps: (1) fit exponential curves to vertical profiles of saltation flux from LF saltation traps, (2) determine empirical calibration factors through comparison of LF exponential fits to HF number counts over concurrent time intervals, (3) apply these calibration factors to subsamples of the saltation count time series to obtain HF height-specific saltation fluxes, and (4) aggregate the calibrated HF height-specific saltation fluxes into estimates of total saltation fluxes. When coupled to high-frequency measurements of wind velocity, this methodology offers new opportunities for understanding how aeolian saltation dynamics respond to variability in driving winds over time scales from tens of milliseconds to days.
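
    The four calibration steps lend themselves to a short numerical sketch. The trap heights, fluxes, and counter data below are invented, and an exponential flux profile q(z) = q0 exp(-z/zbar) is assumed, following the profile fitting described in the abstract.

```python
# Sketch of the four calibration steps on invented data:
# (1) fit an exponential profile to low-frequency (LF) trap fluxes,
# (2) derive a calibration factor linking HF particle counts to LF flux,
# (3) apply it to the HF count series, (4) aggregate to total saltation flux.
import numpy as np
from scipy.optimize import curve_fit

def q_profile(z, q0, zbar):
    return q0 * np.exp(-z / zbar)

# (1) LF trap data: height-specific fluxes over one averaging interval (invented).
z_traps = np.array([0.05, 0.10, 0.20, 0.35, 0.50])           # m
q_traps = np.array([4.1, 3.0, 1.6, 0.62, 0.25])              # g m^-2 s^-1
(q0, zbar), _ = curve_fit(q_profile, z_traps, q_traps, p0=(5.0, 0.1))
Q_lf = q0 * zbar                                              # height-integrated flux, g m^-1 s^-1

# (2) Calibration factor: LF flux at the counter height per HF count per sample.
z_sensor = 0.10                                               # m, height of the particle counter
n_samples = 750                                               # e.g. a 30 s interval sampled at 25 Hz
counts_lf_interval = 1.8e4                                    # HF counts summed over that interval
cal = q_profile(z_sensor, q0, zbar) / (counts_lf_interval / n_samples)

# (3) + (4) Apply to the 25 Hz count series and scale to total flux via the profile.
hf_counts = np.random.default_rng(2).poisson(counts_lf_interval / n_samples, size=n_samples)
q_hf_sensor = cal * hf_counts                                 # HF height-specific flux at z_sensor
Q_hf = q_hf_sensor * zbar / np.exp(-z_sensor / zbar)          # HF total (height-integrated) flux
print(f"LF total flux {Q_lf:.2f} g m^-1 s^-1; mean HF total flux {Q_hf.mean():.2f}")
```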

  8. Broad beam ion sources and some surface processes

    International Nuclear Information System (INIS)

    Neumann, H.; Scholze, F.; Tarz, M.; Schindler, A.; Wiese, R.; Nestler, M.; Blum, T.

    2005-01-01

    Modern broad-beam multi-aperture ion sources are widely used in material and surface technology applications. Customizing the generated ion beam properties (i.e., the ion current density profile) for the specific demands of the application is a main challenge in the improvement of ion beam technologies. First, we briefly introduce ion sources based on different plasma excitation principles. An overview of source plasma and ion beam measurement methods delivers input data for the modelling methods. Beam profile modelling using numerical trajectory codes, and the validation of the results by Faraday cup measurements, are described as a basis for ion beam profile design. Furthermore, possibilities for ex situ and in situ beam profile control are demonstrated, such as a special method for in situ control of a linear ion source beam profile, a grid modification for circular beam profile design and a cluster principle for broad beam sources. By means of these methods, the beam shape may be adapted to specific technological demands. Examples of broad beam source application in ion beam figuring of optical surfaces, modification of stainless steel, photovoltaic processes and deposition of EUVL-multilayer stacks are finally presented. (Author)

  9. Plasma IL-8 and IL-6 levels can be used to define a group with low risk of septicaemia among cancer patients with fever and neutropenia

    NARCIS (Netherlands)

    de Bont, ESJM; Vellenga, E; Swaanenburg, JCJM; Fidler, [No Value; Visser-van Brummen, PJ; Kamps, WA

    The standard therapy for patients with fever and chemotherapy-related neutropenia is hospitalization and infusion of broad-spectrum antibiotics. Early discharge of a defined group of patients at low risk for septicaemia would be of great advantage for these patients. In this study plasma

  10. Plasma IL-8 and IL-6 levels can be used to define a group with low risk of septicaemia among cancer patients with fever and neutropenia

    NARCIS (Netherlands)

    de Bont, ESJM; Vellenga, E; Swaanenburg, JCJM; Fidler, [No Value; Visser-van Brummen, PJ; Kamps, WA

    1999-01-01

    The standard therapy for patients with fever and chemotherapy-related neutropenia is hospitalization and infusion of broad-spectrum antibiotics. Early discharge of a defined group of patients at low risk for septicaemia would be of great advantage for these patients. In this study plasma

  11. Subjective and objective outcomes in randomized clinical trials

    DEFF Research Database (Denmark)

    Moustgaard, Helene; Bello, Segun; Miller, Franklin G

    2014-01-01

    explicitly defined the terms. CONCLUSION: The terms "subjective" and "objective" are ambiguous when used to describe outcomes in randomized clinical trials. We suggest that the terms should be defined explicitly when used in connection with the assessment of risk of bias in a clinical trial......OBJECTIVES: The degree of bias in randomized clinical trials varies depending on whether the outcome is subjective or objective. Assessment of the risk of bias in a clinical trial will therefore often involve categorization of the type of outcome. Our primary aim was to examine how the concepts...... "subjective outcome" and "objective outcome" are defined in methodological publications and clinical trial reports. To put this examination into perspective, we also provide an overview of how outcomes are classified more broadly. STUDY DESIGN AND SETTING: A systematic review of methodological publications...

  12. Scenario development methodologies

    International Nuclear Information System (INIS)

    Eng, T.; Hudson, J.; Stephansson, O.

    1994-11-01

    In the period 1981-1994, SKB has studied several methodologies to systematize and visualize all the features, events and processes (FEPs) that can influence a repository for radioactive waste in the future. All the work performed is based on the terminology and basic findings in the joint SKI/SKB work on scenario development presented in the SKB Technical Report 89-35. The methodologies studied are a) Event tree analysis, b) Influence diagrams and c) Rock Engineering Systems (RES) matrices. Each one of the methodologies is explained in this report as well as examples of applications. One chapter is devoted to a comparison between the two most promising methodologies, namely: Influence diagrams and the RES methodology. In conclusion a combination of parts of the Influence diagram and the RES methodology is likely to be a promising approach. 26 refs

  13. A systematic methodology review of phase I radiation dose escalation trials

    International Nuclear Information System (INIS)

    Pijls-Johannesma, Madelon; Mastrigt, Ghislaine van; Hahn, Steve M.; De Ruysscher, Dirk; Baumert, Brigitta G.; Lammering, Guido; Buijsen, Jeroen; Bentzen, Soren M.; Lievens, Yolande; Kramar, Andrew; Lambin, Philippe

    2010-01-01

    Background and purpose: The purpose of this review is to evaluate the methodology used in published phase I radiotherapy (RT) dose escalation trials. A specific emphasis was placed on the frequency of reporting late complications as endpoint. Materials and methods: We performed a systematic literature review using a predefined search strategy to identify all phase I trials reporting on external radiotherapy dose escalation in cancer patients. Results: Fifty-three trials (phase I: n = 36, phase I-II: n = 17) fulfilled the inclusion criteria. Of these, 20 used a modified Fibonacci design for the RT dose escalation, but 32 did not specify a design. Late toxicity was variously defined as >3 months (n = 43) or > 6 months (n = 3) after RT, or not defined (n = 7). In only nine studies the maximum tolerated dose (MTD) was related to late toxicity, while only half the studies reported the minimum follow-up period for dose escalation (n = 26). Conclusion: In phase I RT trials, late complications are often not taken into account and there is currently no consensus on the methodology used for radiation dose escalation studies. We therefore propose a decision-tree algorithm which depends on the endpoint selected and whether a validated early surrogate endpoint is available, in order to choose the most appropriate study design.
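
    For readers unfamiliar with the design most often reported in these trials, a modified Fibonacci escalation defines each dose level as a diminishing multiple of the previous one. The sketch below uses one commonly quoted sequence of increments; actual factors, starting doses, and their translation to radiotherapy protocols vary, so the numbers are only illustrative.

```python
# Modified Fibonacci escalation schedule (illustrative increments only; real
# protocols differ in the factors used, the starting dose, and its units).
def modified_fibonacci_levels(start_dose, n_levels):
    # Diminishing escalation factors; later steps often settle near +33%.
    factors = [2.0, 1.67, 1.5, 1.4] + [1.33] * max(0, n_levels - 5)
    levels = [start_dose]
    for f in factors[: n_levels - 1]:
        levels.append(round(levels[-1] * f, 1))
    return levels

for i, dose in enumerate(modified_fibonacci_levels(start_dose=100.0, n_levels=6), start=1):
    print(f"level {i}: {dose} (arbitrary dose units)")
```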

  14. Development and demonstration of a validation methodology for vehicle lateral dynamics simulation models

    Energy Technology Data Exchange (ETDEWEB)

    Kutluay, Emir

    2013-02-01

    In this thesis a validation methodology to be used in the assessment of vehicle dynamics simulation models is presented. Simulation of vehicle dynamics is used to estimate the dynamic responses of existing or proposed vehicles and has a wide array of applications in the development of vehicle technologies. Although simulation environments, measurement tools and mathematical theories on vehicle dynamics are well established, the methodical link between the experimental test data and validity analysis of the simulation model is still lacking. The developed validation paradigm has a top-down approach to the problem. It is ascertained that vehicle dynamics simulation models can only be validated using test maneuvers, although they are aimed at real-world maneuvers. Test maneuvers are determined according to the requirements of the real event at the start of the model development project, and data handling techniques, validation metrics, and criteria are declared for each of the selected maneuvers. If the simulation results satisfy these criteria, then the simulation is deemed "not invalid". If the simulation model fails to meet the criteria, the model is deemed invalid, and model iteration should be performed. The results are analyzed to determine whether they indicate a modeling error or a modeling inadequacy, and whether a conditional validity in terms of system variables can be defined. Three test cases are used to demonstrate the application of the methodology. The developed methodology successfully identified the shortcomings of the tested simulation model, and defined the limits of application. The tested simulation model is found to be acceptable but valid only in a certain dynamical range. Several insights into the deficiencies of the model are reported in the analysis but the iteration step of the methodology is not demonstrated. Utilizing the proposed methodology will help to achieve more time- and cost-efficient simulation projects with

  15. Using Procedure Codes to Define Radiation Toxicity in Administrative Data: The Devil is in the Details.

    Science.gov (United States)

    Meyer, Anne-Marie; Kuo, Tzy-Mey; Chang, YunKyung; Carpenter, William R; Chen, Ronald C; Sturmer, Til

    2017-05-01

    Systematic coding systems are used to define clinically meaningful outcomes when leveraging administrative claims data for research. How and when these codes are applied within a research study has implications for study validity, and their specificity can vary significantly depending on the treatment received. Data are from the Surveillance, Epidemiology, and End Results-Medicare linked dataset. We use propensity score methods in a retrospective cohort of prostate cancer patients first examined in a recently published radiation oncology comparative effectiveness study. With the narrowly defined outcome definition, the toxicity event outcome rate ratio was 0.88 per 100 person-years (95% confidence interval, 0.71-1.08). With the broadly defined outcome, the rate ratio was comparable, at 0.89 per 100 person-years (95% confidence interval, 0.76-1.04), although individual event rates were doubled. Some evidence of surveillance bias was suggested by a higher rate of endoscopic procedures in the first year of follow-up in patients who received proton therapy compared with those receiving intensity-modulated radiation treatment (11.15 vs. 8.90, respectively). This study demonstrates the risk of introducing bias through subjective application of procedure codes. Careful consideration is required when using procedure codes to define outcomes in administrative data.
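
    A toy illustration of why the breadth of the code set matters: counting outcome events against a narrow versus a broad set of procedure codes changes the event rate. The codes, claims and follow-up time below are invented placeholders, not the codes or data used in the study.

```python
# Hypothetical claims data and placeholder procedure codes, for illustration only.
narrow_codes = {"PROC_A"}                            # e.g. a single diagnostic procedure
broad_codes  = narrow_codes | {"PROC_B", "PROC_C"}   # adds related therapeutic variants

claims = [
    {"patient": 1, "code": "PROC_A"},
    {"patient": 2, "code": "PROC_C"},
    {"patient": 3, "code": "OFFICE_VISIT"},          # unrelated encounter, never counted
]

def events(claims, code_set):
    return sum(claim["code"] in code_set for claim in claims)

person_years = 6.0                                   # made-up follow-up time
for label, codes in [("narrow", narrow_codes), ("broad", broad_codes)]:
    rate = 100.0 * events(claims, codes) / person_years
    print(f"{label} definition: {rate:.1f} events per 100 person-years")
```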

  16. Structural Determination of the Broadly Reactive Anti-IGHV1-69 Anti-idiotypic Antibody G6 and Its Idiotope.

    Science.gov (United States)

    Avnir, Yuval; Prachanronarong, Kristina L; Zhang, Zhen; Hou, Shurong; Peterson, Eric C; Sui, Jianhua; Zayed, Hatem; Kurella, Vinodh B; McGuire, Andrew T; Stamatatos, Leonidas; Hilbert, Brendan J; Bohn, Markus-Frederik; Kowalik, Timothy F; Jensen, Jeffrey D; Finberg, Robert W; Wang, Jennifer P; Goodall, Margaret; Jefferis, Roy; Zhu, Quan; Kurt Yilmaz, Nese; Schiffer, Celia A; Marasco, Wayne A

    2017-12-12

    The heavy chain IGHV1-69 germline gene exhibits a high level of polymorphism and shows biased use in protective antibody (Ab) responses to infections and vaccines. It is also highly expressed in several B cell malignancies and autoimmune diseases. G6 is an anti-idiotypic monoclonal Ab that selectively binds to IGHV1-69 heavy chain germline gene 51p1 alleles that have been implicated in these Ab responses and disease processes. Here, we determine the co-crystal structure of humanized G6 (hG6.3) in complex with anti-influenza hemagglutinin stem-directed broadly neutralizing Ab D80. The core of the hG6.3 idiotope is a continuous string of CDR-H2 residues starting with M53 and ending with N58. G6 binding studies demonstrate the remarkable breadth of binding to 51p1 IGHV1-69 Abs with diverse CDR-H3, light chain, and antigen binding specificities. These studies detail the broad expression of the G6 cross-reactive idiotype (CRI) that further define its potential role in precision medicine. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.

  17. Structural Determination of the Broadly Reactive Anti-IGHV1-69 Anti-idiotypic Antibody G6 and Its Idiotope

    Directory of Open Access Journals (Sweden)

    Yuval Avnir

    2017-12-01

    Full Text Available The heavy chain IGHV1-69 germline gene exhibits a high level of polymorphism and shows biased use in protective antibody (Ab) responses to infections and vaccines. It is also highly expressed in several B cell malignancies and autoimmune diseases. G6 is an anti-idiotypic monoclonal Ab that selectively binds to IGHV1-69 heavy chain germline gene 51p1 alleles that have been implicated in these Ab responses and disease processes. Here, we determine the co-crystal structure of humanized G6 (hG6.3) in complex with anti-influenza hemagglutinin stem-directed broadly neutralizing Ab D80. The core of the hG6.3 idiotope is a continuous string of CDR-H2 residues starting with M53 and ending with N58. G6 binding studies demonstrate the remarkable breadth of binding to 51p1 IGHV1-69 Abs with diverse CDR-H3, light chain, and antigen binding specificities. These studies detail the broad expression of the G6 cross-reactive idiotype (CRI) that further define its potential role in precision medicine.

  18. Safety class methodology

    International Nuclear Information System (INIS)

    Donner, E.B.; Low, J.M.; Lux, C.R.

    1992-01-01

    DOE Order 6430.1A, General Design Criteria (GDC), requires that DOE facilities be evaluated with respect to ''safety class items.'' Although the GDC defines safety class items, it does not provide a methodology for selecting safety class items. The methodology described in this paper was developed to assure that Safety Class Items at the Savannah River Site (SRS) are selected in a consistent and technically defensible manner. Safety class items are those in the highest of four categories determined to be of special importance to nuclear safety and merit appropriately higher-quality design, fabrication, and industrial test standards and codes. The identification of safety class items is approached using a cascading strategy that begins at the ''safety function'' level (i.e., a cooling function, ventilation function, etc.) and proceeds down to the system, component, or structure level. Thus, the items that are required to support a safety function are SCIs. The basic steps in this procedure apply to the determination of SCIs both for new project activities and for operating facilities. The GDC lists six characteristics of SCIs to be considered as a starting point for safety item classification. They are as follows: 1. Those items whose failure would produce exposure consequences that would exceed the guidelines in Section 1300-1.4, ''Guidance on Limiting Exposure of the Public,'' at the site boundary or nearest point of public access. 2. Those items required to maintain operating parameters within the safety limits specified in the Operational Safety Requirements during normal operations and anticipated operational occurrences. 3. Those items required for nuclear criticality safety. 4. Those items required to monitor the release of radioactive material to the environment during and after a Design Basis Accident. 5. Those items required to achieve and maintain the facility in a safe shutdown condition. 6. Those items that control Safety Class Items listed above

  19. Minimizing communication cost among distributed controllers in software defined networks

    Science.gov (United States)

    Arlimatti, Shivaleela; Elbreiki, Walid; Hassan, Suhaidi; Habbal, Adib; Elshaikh, Mohamed

    2016-08-01

    Software Defined Networking (SDN) is a new paradigm that increases the flexibility of today's networks by promising a programmable network. The fundamental idea behind this new architecture is to simplify network complexity by decoupling the control plane and the data plane of the network devices, and by making the control plane centralized. Recently, controllers have been distributed to solve the problem of a single point of failure and to increase scalability and flexibility during workload distribution. Even though distributed controllers are flexible and scalable enough to accommodate a larger number of network switches, the intercommunication cost between distributed controllers is still a challenging issue in the Software Defined Network environment. This paper aims to fill the gap by proposing a new mechanism that minimizes intercommunication cost with a graph partitioning algorithm, an NP-hard problem. The methodology proposed in this paper is the swapping of network elements between controller domains to minimize communication cost by calculating the communication gain. The swapping of elements minimizes the inter- and intra-domain communication cost among network domains. We validate our work with the OMNeT++ simulation environment. Simulation results show that the proposed mechanism minimizes the inter-domain communication cost among controllers compared to traditional distributed controllers.
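
    The gain-driven reassignment idea can be sketched as follows: a switch is moved to another controller domain only when the move yields a positive communication gain, i.e. it reduces the traffic that crosses domain boundaries. The traffic values, switch names and the simple greedy pass are illustrative assumptions; the paper's actual algorithm partitions the network graph and swaps elements between domains.

```python
# Illustrative sketch of gain-based element reassignment between controller domains.
# Traffic values and the greedy single-move strategy are invented for the example.
import itertools

traffic = {                       # symmetric switch-to-switch traffic (arbitrary units)
    ("s1", "s2"): 2, ("s1", "s3"): 1,
    ("s2", "s3"): 8, ("s2", "s4"): 9,
    ("s3", "s4"): 12,
}
domain = {"s1": "C1", "s2": "C1", "s3": "C2", "s4": "C2"}

def load(a, b):
    return traffic.get((a, b), traffic.get((b, a), 0))

def inter_domain_cost(assignment):
    return sum(load(a, b) for a, b in itertools.combinations(assignment, 2)
               if assignment[a] != assignment[b])

def gain(switch, target, assignment):
    trial = dict(assignment, **{switch: target})
    return inter_domain_cost(assignment) - inter_domain_cost(trial)

# Greedy pass: apply any single move with a positive communication gain.
for switch, current in list(domain.items()):
    for target in {"C1", "C2"} - {current}:
        if gain(switch, target, domain) > 0:
            domain[switch] = target

print(domain, "inter-domain cost:", inter_domain_cost(domain))
```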

  20. Defining geographic coal markets using price data and shipments data

    International Nuclear Information System (INIS)

    Waarell, Linda

    2005-01-01

    Given the importance of coal in world energy supply, an analysis of the relevant geographic market is essential for consumers and producers, as well as for competition policy. The purpose of this paper is to define the relevant economic market for steam and coking coal, and to test the hypothesis of single world markets for these coal products. Methodologically, the paper relies on two different tests for defining markets, using both shipments data and price data. The results from both methods point in the same direction. In the case of coking coal, the results indicate that the market is essentially global in scope, and also that the market has become more integrated over time. The results for steam coal show that the market is more regional in scope, and there are no clear tendencies towards increased integration over time. One policy implication of the finding that the steam coal market is more regional in scope, and thus that the market boundary is smaller than it would be if the market were international, is that a merger or acquisition in this market would likely be of more concern to antitrust authorities than the same activity in the coking coal market

  1. Quick-E-scan: A methodology for the energy scan of SMEs

    International Nuclear Information System (INIS)

    Cagno, E.; Trucco, P.; Trianni, A.; Sala, G.

    2010-01-01

    This paper introduces the Quick-E-Scan methodology, developed to improve the operational energy efficiency of small and medium enterprises (SMEs), which are typically reluctant to undergo long energy audits and have a limited budget for energy management programs. On one side, by dividing the firm into functional units - either service (lighting, HVAC, etc.) or production units - the main consuming areas are identified and a criticality index is defined; on the other, an enhancement index highlights the gap of each unit with respect to the best available techniques (BATs) in energy management programs. Finally, a priority index, created by combining the two indexes, points out the most profitable areas in which energy saving measures should be implemented. The methodology, particularly quick and simple, has been successfully tested in 38 SMEs in Northern Italy.
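
    A minimal sketch of how the two indexes could be joined into a priority ranking of functional units; the index formulas and figures are assumptions made for illustration, not the paper's actual definitions.

```python
# Hypothetical functional units with (criticality, enhancement) scores in [0, 1]:
# criticality ~ share of total energy consumption, enhancement ~ gap to the BATs.
units = {
    "lighting":    (0.10, 0.60),
    "HVAC":        (0.35, 0.40),
    "compressors": (0.25, 0.70),
    "production":  (0.30, 0.20),
}

def priority(criticality: float, enhancement: float) -> float:
    # one simple way to join the two indexes: their product
    return criticality * enhancement

for unit in sorted(units, key=lambda u: priority(*units[u]), reverse=True):
    print(f"{unit:12s} priority = {priority(*units[unit]):.3f}")
```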

  2. Concept mapping methodology and community-engaged research: A perfect pairing.

    Science.gov (United States)

    Vaughn, Lisa M; Jones, Jennifer R; Booth, Emily; Burke, Jessica G

    2017-02-01

    Concept mapping methodology as refined by Trochim et al. is uniquely suited to engage communities in all aspects of research from project set-up to data collection to interpreting results to dissemination of results, and an increasing number of research studies have utilized the methodology for exploring complex health issues in communities. In the current manuscript, we present the results of a literature search of peer-reviewed articles in health-related research where concept mapping was used in collaboration with the community. A total of 103 articles met the inclusion criteria. We first address how community engagement was defined in the articles and then focus on the articles describing high community engagement and the associated community outcomes/benefits and methodological challenges. A majority (61%; n=63) of the articles were classified as low to moderate community engagement and participation while 38% (n=39) of the articles were classified as high community engagement and participation. The results of this literature review enhance our understanding of how concept mapping can be used in direct collaboration with communities and highlights the many potential benefits for both researchers and communities. Copyright © 2016 Elsevier Ltd. All rights reserved.

  3. Exploring the Central Nervous System: methodological state of the art

    International Nuclear Information System (INIS)

    Darcourt, Jacques; Koulibaly, Pierre-Malick; Migneco, Octave

    2005-01-01

    The analysis of the clinical use of brain SPECT demonstrates a gap between recently published methodological developments and its current use in clinical practice. We review recent methodological developments that could be useful in three classical clinical applications: the diagnosis of Alzheimer's disease, the evaluation of dopaminergic neurotransmission in Parkinson's disease, and the study of epilepsy. In Alzheimer's disease, the methods of spatial standardization and comparison to a normative database are most useful to the least experienced observers; for this purpose, methodological approaches oriented to routine use work better and are simpler than SPM. Quantification is essential in the study of dopaminergic neurotransmission, and the measurement of binding potential appears biased due to septal penetration, attenuation, scatter and the partial volume effect. The partial volume effect introduces the most error and its correction is difficult because of the co-registration precision required with magnetic resonance images. The study of epilepsy by subtraction of ictal and interictal SPECT has demonstrated its clinical value. It is an image fusion operation for which methods are now very well defined (au)

  4. Prioritization methodology for chemical replacement

    Science.gov (United States)

    Cruit, Wendy; Goldberg, Ben; Schutzenhofer, Scott

    1995-01-01

    Since United States federal legislation has required ozone-depleting chemicals (class 1 & 2) to be banned from production, the National Aeronautics and Space Administration (NASA) and industry have been required to find other chemicals and methods to replace these target chemicals. This project was initiated as the development of a prioritization methodology suitable for assessing and ranking existing processes for replacement 'urgency.' The methodology was produced in the form of a workbook (NASA Technical Paper 3421). The final workbook contains two tools, one for evaluation and one for prioritization. The two tools are interconnected in that they were developed from one central theme - chemical replacement due to imposed laws and regulations. This workbook provides matrices, detailed explanations of how to use them, and a detailed methodology for prioritization of replacement technology. The main objective is to provide a GUIDELINE to help direct the research for replacement technology. The approach for prioritization called for a system which would result in a numerical rating for the chemicals and processes being assessed. A Quality Function Deployment (QFD) technique was used in order to determine numerical values which would correspond to the concerns raised and their respective importance to the process. This workbook defines the approach and the application of the QFD matrix. This technique: (1) provides a standard database for technology that can be easily reviewed, and (2) provides a standard format for information when requesting resources for further research for chemical replacement technology. Originally, this workbook was to be used for Class 1 and Class 2 chemicals, but it was specifically designed to be flexible enough to be used for any chemical used in a process (if the chemical and/or process needs to be replaced). The methodology consists of comparison matrices (and the smaller comparison components) which allow replacement technology
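
    A QFD-style rating of the kind described can be sketched as a weighted scoring matrix; the concerns, weights and process scores below are invented for illustration and are not taken from NASA Technical Paper 3421.

```python
# Hypothetical QFD-style weighted scoring: each process is rated against each concern,
# and the concern weights express their relative importance (all values invented).
concerns = {"regulatory deadline": 5, "worker exposure": 4, "usage volume": 3}

processes = {
    "solvent cleaning, process A": {"regulatory deadline": 5, "worker exposure": 3, "usage volume": 4},
    "solvent cleaning, process B": {"regulatory deadline": 3, "worker exposure": 2, "usage volume": 2},
}

def qfd_rating(scores: dict) -> int:
    """Weighted sum of scores: a higher rating means higher replacement urgency."""
    return sum(concerns[c] * s for c, s in scores.items())

for name, scores in sorted(processes.items(), key=lambda kv: qfd_rating(kv[1]), reverse=True):
    print(f"{name}: rating = {qfd_rating(scores)}")
```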

  5. A methodology for Manufacturing Execution Systems (MES) implementation

    Science.gov (United States)

    Govindaraju, Rajesri; Putra, Krisna

    2016-02-01

    A manufacturing execution system (MES) is an information systems (IS) application that bridges the gap between IS at the top level, namely enterprise resource planning (ERP), and IS at the lower levels, namely the automation systems. MES provides a medium for optimizing the manufacturing process as a whole on a real-time basis. By the use of MES in combination with the implementation of ERP and other automation systems, a manufacturing company is expected to have high competitiveness. In implementing MES, functional integration - making all the components of the manufacturing system able to work well together - is the most difficult challenge. For this, there is an industry standard that specifies the sub-systems of a manufacturing execution system and defines the boundaries between ERP systems, MES, and other automation systems. The standard is known as ISA-95. Although the advantages of using MES have been stated in some studies, not much research has been done on how to implement MES effectively. The purpose of this study is to develop a methodology describing how an MES implementation project should be managed, utilising the support of the ISA-95 reference model in the system development process. A proposed methodology was developed based on a general IS development methodology. The developed methodology was then revisited based on an understanding of the specific characteristics of MES implementation projects found in an implementation case at an Indonesian steel manufacturing company. The case study highlighted the importance of applying an effective requirement elicitation method during the initial system assessment process, managing system interfaces and labor division in the design process, and performing a pilot deployment before putting the whole system into operation.

  6. The decade 1989-1998 in Spanish psychology: an analysis of research in statistics, methodology, and psychometric theory.

    Science.gov (United States)

    García-Pérez, M A

    2001-11-01

    This paper presents an analysis of research published in the decade 1989-1998 by Spanish faculty members in the areas of statistical methods, research methodology, and psychometric theory. Database search and direct correspondence with faculty members in Departments of Methodology across Spain rendered a list of 193 papers published in these broad areas by 82 faculty members. These and other faculty members had actually published 931 papers over the decade of analysis, but 738 of them addressed topics not appropriate for description in this report. Classification and analysis of these 193 papers revealed topics that have attracted the most interest (psychophysics, item response theory, analysis of variance, sequential analysis, and meta-analysis) as well as other topics that have received less attention (scaling, factor analysis, time series, and structural models). A significant number of papers also dealt with various methodological issues (software, algorithms, instrumentation, and techniques). A substantial part of this report is devoted to describing the issues addressed across these 193 papers--most of which are written in the Spanish language and published in Spanish journals--and some representative references are given.

  7. Soft system methodology and decision making in community planning system

    OpenAIRE

    Křupka, Jiří; Kašparová, Miloslava; Jirava, Pavel; Mandys, Jan; Ferynová, Lenka; Duplinský, Josef

    2013-01-01

    A model of community planning was defined in this paper. The model was designed for the city of Pardubice and works with real questionnaire research data sets in its evaluation phase. Questionnaires were submitted to be filled in by users, providers and sponsors of social services. Checkland's soft system methodology was used when creating the model. Soft computing methods and decision trees were also used to create the model. The model was implemented in the data mining tool IBM SPSS Modeler 14.

  8. Distributing the Corporate Income Tax: Revised U.S. Treasury Methodology

    OpenAIRE

    Cronin, Julie Anne; Lin, Emily Y.; Power, Laura; Cooper, Michael

    2013-01-01

    The purpose of this analysis is to improve the U.S. Department of the Treasury’s distributional model and methodology by defining new model parameters. We compute the percentage of capital income attributable to normal versus supernormal return, the percentage of normal return attributable to the "cash flow tax" portion of the tax that does not impose a tax burden, and the portion of the burdensome tax on the normal return to capital borne by capital income versus labor income. In summary, 82...

  9. Archetype modeling methodology.

    Science.gov (United States)

    Moner, David; Maldonado, José Alberto; Robles, Montserrat

    2018-03-01

    Clinical Information Models (CIMs) expressed as archetypes play an essential role in the design and development of current Electronic Health Record (EHR) information structures. Although many experiences of using archetypes are reported in the literature, a comprehensive and formal methodology for archetype modeling does not exist. Having a modeling methodology is essential to develop quality archetypes, in order to guide the development of EHR systems and to allow the semantic interoperability of health data. In this work, an archetype modeling methodology is proposed. This paper describes its phases, the inputs and outputs of each phase, and the involved participants and tools. It also includes the description of the possible strategies to organize the modeling process. The proposed methodology is inspired by existing best practices of CIMs, software and ontology development. The methodology has been applied and evaluated in regional and national EHR projects. The application of the methodology provided useful feedback and improvements, and confirmed its advantages. The conclusion of this work is that having a formal methodology for archetype development facilitates the definition and adoption of interoperable archetypes, improves their quality, and facilitates their reuse among different information systems and EHR projects. Moreover, the proposed methodology can also be a reference for CIM development using any other formalism. Copyright © 2018 Elsevier Inc. All rights reserved.

  10. Quantifying the Accuracy of Digital Hemispherical Photography for Leaf Area Index Estimates on Broad-Leaved Tree Species.

    Science.gov (United States)

    Gilardelli, Carlo; Orlando, Francesca; Movedi, Ermes; Confalonieri, Roberto

    2018-03-29

    Digital hemispherical photography (DHP) has been widely used to estimate leaf area index (LAI) in forestry. Despite the advancement in the processing of hemispherical images with dedicated tools, several steps are still manual and thus easily affected by the user's experience and sensibility. The purpose of this study was to quantify the impact of the user's subjectivity on DHP LAI estimates for broad-leaved woody canopies using the software Can-Eye. Following the ISO 5725 protocol, we quantified the repeatability and reproducibility of the method, thus defining its precision for a wide range of broad-leaved canopies markedly differing in their structure. To get a complete evaluation of the method accuracy, we also quantified its trueness using artificial canopy images with known canopy cover. Moreover, the effect of the segmentation method was analysed. The best results for precision (restrained limits of repeatability and reproducibility) were obtained for high LAI values (>5), with limits corresponding to a variation of 22% in the estimated LAI values. Poorer results were obtained for medium and low LAI values, with a variation of the estimated LAI values that exceeded 40%. Regardless of the LAI range explored, satisfactory results were achieved for trees in row-structured plantations (limits almost equal to 30% of the estimated LAI). Satisfactory results were achieved for trueness, regardless of the canopy structure. The paired t-test revealed that the effect of the segmentation method on LAI estimates was significant. Despite a non-negligible user effect, the accuracy metrics for DHP are consistent with those determined for other indirect methods for LAI estimates, confirming the overall reliability of DHP in broad-leaved woody canopies.
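
    The ISO 5725 precision quantities referred to above can be computed from replicate LAI estimates roughly as sketched below; the data are invented, and the estimator shown is the simple balanced-design case with the conventional factor 2.8 for the repeatability and reproducibility limits.

```python
import statistics as st

# Invented LAI estimates of the same canopy: rows = operators, columns = repeats.
lai = [[5.1, 5.3, 4.9],
       [5.6, 5.8, 5.5],
       [4.8, 5.0, 4.9]]

n = len(lai[0])                                                # repeats per operator
within_var = st.mean(st.variance(row) for row in lai)          # s_r^2 (repeatability)
means = [st.mean(row) for row in lai]
between_var = max(0.0, st.variance(means) - within_var / n)    # s_L^2 (between operators)

s_r = within_var ** 0.5
s_R = (within_var + between_var) ** 0.5                        # reproducibility std. dev.

print(f"repeatability limit r   = {2.8 * s_r:.2f}")            # 2.8 ~ 1.96 * sqrt(2)
print(f"reproducibility limit R = {2.8 * s_R:.2f}")
```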

  11. Quantifying the Accuracy of Digital Hemispherical Photography for Leaf Area Index Estimates on Broad-Leaved Tree Species

    Directory of Open Access Journals (Sweden)

    Carlo Gilardelli

    2018-03-01

    Full Text Available Digital hemispherical photography (DHP) has been widely used to estimate leaf area index (LAI) in forestry. Despite the advancement in the processing of hemispherical images with dedicated tools, several steps are still manual and thus easily affected by the user's experience and sensibility. The purpose of this study was to quantify the impact of the user's subjectivity on DHP LAI estimates for broad-leaved woody canopies using the software Can-Eye. Following the ISO 5725 protocol, we quantified the repeatability and reproducibility of the method, thus defining its precision for a wide range of broad-leaved canopies markedly differing in their structure. To get a complete evaluation of the method accuracy, we also quantified its trueness using artificial canopy images with known canopy cover. Moreover, the effect of the segmentation method was analysed. The best results for precision (restrained limits of repeatability and reproducibility) were obtained for high LAI values (>5), with limits corresponding to a variation of 22% in the estimated LAI values. Poorer results were obtained for medium and low LAI values, with a variation of the estimated LAI values that exceeded 40%. Regardless of the LAI range explored, satisfactory results were achieved for trees in row-structured plantations (limits almost equal to 30% of the estimated LAI). Satisfactory results were achieved for trueness, regardless of the canopy structure. The paired t-test revealed that the effect of the segmentation method on LAI estimates was significant. Despite a non-negligible user effect, the accuracy metrics for DHP are consistent with those determined for other indirect methods for LAI estimates, confirming the overall reliability of DHP in broad-leaved woody canopies.

  12. Partial costs of global climate change adaptation for the supply of raw industrial and municipal water: a methodology and application

    International Nuclear Information System (INIS)

    Ward, Philip J; Pauw, W Pieter; Brander, Luke M; Aerts, Jeroen C J H; Strzepek, Kenneth M; Hughes, Gordon A

    2010-01-01

    Despite growing recognition of the importance of climate change adaptation, few global estimates of the costs involved are available for the water supply sector. We present a methodology for estimating partial global and regional adaptation costs for raw industrial and domestic water supply, for a limited number of adaptation strategies, and apply the method using results of two climate models. In this paper, adaptation costs are defined as those for providing enough raw water to meet future industrial and municipal water demand, based on country-level demand projections to 2050. We first estimate costs for a baseline scenario excluding climate change, and then additional climate change adaptation costs. Increased demand is assumed to be met through a combination of increased reservoir yield and alternative backstop measures. Under such controversial measures, we project global adaptation costs of $12 bn p.a., with 83-90% in developing countries; the highest costs are in Sub-Saharan Africa. Globally, adaptation costs are low compared to baseline costs ($73 bn p.a.), which supports the notion of mainstreaming climate change adaptation into broader policy aims. The method provides a tool for estimating broad costs at the global and regional scale; such information is of key importance in international negotiations.

  13. Partial costs of global climate change adaptation for the supply of raw industrial and municipal water: a methodology and application

    Energy Technology Data Exchange (ETDEWEB)

    Ward, Philip J; Pauw, W Pieter; Brander, Luke M; Aerts, Jeroen C J H [Institute for Environmental Studies (IVM), VU University Amsterdam (Netherlands); Strzepek, Kenneth M [Joint Program on the Science and Policy of Global Change, Massachusetts Institute of Technology, MA (United States); Hughes, Gordon A, E-mail: philip.ward@ivm.vu.nl [School of Economics, University of Edinburgh (United Kingdom)

    2010-10-15

    Despite growing recognition of the importance of climate change adaptation, few global estimates of the costs involved are available for the water supply sector. We present a methodology for estimating partial global and regional adaptation costs for raw industrial and domestic water supply, for a limited number of adaptation strategies, and apply the method using results of two climate models. In this paper, adaptation costs are defined as those for providing enough raw water to meet future industrial and municipal water demand, based on country-level demand projections to 2050. We first estimate costs for a baseline scenario excluding climate change, and then additional climate change adaptation costs. Increased demand is assumed to be met through a combination of increased reservoir yield and alternative backstop measures. Under such controversial measures, we project global adaptation costs of $12 bn p.a., with 83-90% in developing countries; the highest costs are in Sub-Saharan Africa. Globally, adaptation costs are low compared to baseline costs ($73 bn p.a.), which supports the notion of mainstreaming climate change adaptation into broader policy aims. The method provides a tool for estimating broad costs at the global and regional scale; such information is of key importance in international negotiations.

  14. Broad spectrum bioactive sunscreens.

    Science.gov (United States)

    Velasco, Maria Valéria Robles; Sarruf, Fernanda Daud; Salgado-Santos, Idalina Maria Nunes; Haroutiounian-Filho, Carlos Alberto; Kaneko, Telma Mary; Baby, André Rolim

    2008-11-03

    The development of sunscreens containing a reduced concentration of chemical UV filters, while still possessing broad-spectrum effectiveness through the use of natural raw materials that improve and confer UV absorption, is of great interest. Due to the structural similarities between polyphenolic compounds and organic UV filters, they might exert photoprotection activity. The objective of the present research work was to develop bioactive sunscreen delivery systems containing rutin, Passiflora incarnata L. and Plantago lanceolata extracts, associated or not with organic and inorganic UV filters. UV transmission of the sunscreen delivery system films was measured using diffuse transmittance measurements coupled to an integrating sphere. In vitro photoprotection efficacy was evaluated according to the following parameters: estimated sun protection factor (SPF); Boot's Star Rating category; UVA/UVB ratio; and critical wavelength (lambda(c)). Sunscreen delivery systems obtained SPF values ranging from 0.972+/-0.004 to 28.064+/-2.429, and bioactive compounds interacted with the UV filters both positively and negatively. This behavior may be attributed to: the composition of the delivery system; the presence of the inorganic UV filter and the quantitative composition of the organic UV filters; and the phytochemical composition of the P. incarnata L. and P. lanceolata extracts. Among all associations of bioactive compounds and UV filters, we found that a broad spectrum sunscreen was accomplished when 1.68% (w/w) P. incarnata L. dry extract was in the presence of 7.0% (w/w) ethylhexyl methoxycinnamate, 2.0% (w/w) benzophenone-3 and 2.0% (w/w) TiO(2). It was demonstrated that this association generated an estimated SPF of 20.072+/-0.906 and improved the protective defense against UVA radiation, with the UVA/UVB ratio increasing from 0.49 to 0.52 and lambda(c) from 364 to 368.6 nm.

  15. Broad-band beam buncher

    International Nuclear Information System (INIS)

    Goldberg, D.A.; Flood, W.S.; Arthur, A.A.; Voelker, F.

    1986-01-01

    This patent describes a broad-band beam buncher. This beam buncher consists of: a housing adapted to be evacuated, an electron gun in the housing for producing a beam of electrons, buncher means in the housing forming a buncher cavity which has an entrance opening for receiving the electron beam and an exit opening through which the electron beam passes out of the buncher cavity, a drift tube electrode in the buncher cavity and disposed between the entrance opening and the exit opening with first and second gaps between the drift tube electrode and the entrance and exit openings, the drift tube electrode which has a first drift space through which the electron beam passes in traveling between the entrance and exit openings, modulating means for supplying an ultrahigh frequency modulating signal to the drift tube electrode for producing velocity modulation of the electrons in the electron beam as the electrons pass through the buncher cavity and the drift tube electrode between the entrance opening and the exit opening, drift space means in the housing forming a second drift space for receiving the velocity modulated electron beam from the exit opening, the velocity modulated electron beam being bunched as it passes along the second drift space, the drift space means has a discharge opening through which the electron beam is discharged from the second drift space after being bunched therein, the modulating means containing a signal source for producing an ultrahigh frequency signal, a transmission line connected between the signal source and the drift tube electrode, and terminating means connected to the drift tube electrode for terminating the transmission line in approximately its characteristic impedance to afford a broad response band with minimum variations therein

  16. A Methodology for the Development of RESTful Semantic Web Services for Gene Expression Analysis.

    Directory of Open Access Journals (Sweden)

    Gabriela D A Guardia

    Full Text Available Gene expression studies are generally performed through multi-step analysis processes, which require the integrated use of a number of analysis tools. In order to facilitate tool/data integration, an increasing number of analysis tools have been developed as or adapted to semantic web services. In recent years, some approaches have been defined for the development and semantic annotation of web services created from legacy software tools, but these approaches still present many limitations. In addition, to the best of our knowledge, no suitable approach has been defined for the functional genomics domain. Therefore, this paper aims at defining an integrated methodology for the implementation of RESTful semantic web services created from gene expression analysis tools and the semantic annotation of such services. We have applied our methodology to the development of a number of services to support the analysis of different types of gene expression data, including microarray and RNASeq. All developed services are publicly available in the Gene Expression Analysis Services (GEAS) Repository at http://dcm.ffclrp.usp.br/lssb/geas. Additionally, we have used a number of the developed services to create different integrated analysis scenarios to reproduce parts of two gene expression studies documented in the literature. The first study involves the analysis of one-color microarray data obtained from multiple sclerosis patients and healthy donors. The second study comprises the analysis of RNA-Seq data obtained from melanoma cells to investigate the role of the remodeller BRG1 in the proliferation and morphology of these cells. Our methodology provides concrete guidelines and technical details in order to facilitate the systematic development of semantic web services. Moreover, it encourages the development and reuse of these services for the creation of semantically integrated solutions for gene expression analysis.

  17. A Methodology for the Development of RESTful Semantic Web Services for Gene Expression Analysis.

    Science.gov (United States)

    Guardia, Gabriela D A; Pires, Luís Ferreira; Vêncio, Ricardo Z N; Malmegrim, Kelen C R; de Farias, Cléver R G

    2015-01-01

    Gene expression studies are generally performed through multi-step analysis processes, which require the integrated use of a number of analysis tools. In order to facilitate tool/data integration, an increasing number of analysis tools have been developed as or adapted to semantic web services. In recent years, some approaches have been defined for the development and semantic annotation of web services created from legacy software tools, but these approaches still present many limitations. In addition, to the best of our knowledge, no suitable approach has been defined for the functional genomics domain. Therefore, this paper aims at defining an integrated methodology for the implementation of RESTful semantic web services created from gene expression analysis tools and the semantic annotation of such services. We have applied our methodology to the development of a number of services to support the analysis of different types of gene expression data, including microarray and RNASeq. All developed services are publicly available in the Gene Expression Analysis Services (GEAS) Repository at http://dcm.ffclrp.usp.br/lssb/geas. Additionally, we have used a number of the developed services to create different integrated analysis scenarios to reproduce parts of two gene expression studies documented in the literature. The first study involves the analysis of one-color microarray data obtained from multiple sclerosis patients and healthy donors. The second study comprises the analysis of RNA-Seq data obtained from melanoma cells to investigate the role of the remodeller BRG1 in the proliferation and morphology of these cells. Our methodology provides concrete guidelines and technical details in order to facilitate the systematic development of semantic web services. Moreover, it encourages the development and reuse of these services for the creation of semantically integrated solutions for gene expression analysis.

  18. TECHNOLOGY FOR DEVELOPMENT OF ELECTRONIC TEXTBOOK ON HANDICRAFTS METHODOLOGY

    Directory of Open Access Journals (Sweden)

    Iryna V. Androshchuk

    2017-10-01

    Full Text Available The main approaches to defining the concept of an electronic textbook have been analyzed in the article. The main advantages of electronic textbooks in the context of future teachers' training have been outlined. They are interactivity, feedback provision, and the availability of navigation and a search engine. The author has presented and characterized the main stages in the technology of development of an electronic textbook on Handicraft and Technology Training Methodology: determination of its role and significance in the process of mastering the discipline; justification of its structure; and an outline of the stages of its development in accordance with the defined structure. The characteristic feature of the developed electronic textbook is the availability of macro- and microstructure. The macrostructure is viewed as a sequence of components of the electronic textbook that are manifested in its content; the microstructure is considered to be an internal pattern of each component of the macrostructure.

  19. Defining the Ecological Coefficient of Performance for an Aircraft Propulsion System

    Science.gov (United States)

    Şöhret, Yasin

    2018-05-01

    The aircraft industry, along with other industries, is nowadays held responsible regarding environmental issues. Therefore, the performance evaluation of aircraft propulsion systems should be conducted with respect to environmental and ecological considerations. The current paper aims to present the ecological coefficient of performance calculation methodology for aircraft propulsion systems. The ecological coefficient of performance is a widely preferred performance indicator for numerous energy conversion systems. On the basis of thermodynamic laws, the methodology used to determine the ecological coefficient of performance for an aircraft propulsion system is parametrically explained and illustrated in this paper for the first time. For a better understanding, to begin with, the exergy analysis of a turbojet engine is described in detail. Following this, the outputs of the analysis are employed to define the ecological coefficient of performance for a turbojet engine. At the end of the study, the ecological coefficient of performance is evaluated parametrically and discussed depending on selected engine design parameters and performance measures. The author asserts the ecological coefficient of performance to be a beneficial indicator for researchers interested in aircraft propulsion system design and related topics.
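
    For orientation, one common formulation of the ecological coefficient of performance in the energy-conversion literature is the useful power output divided by the rate of exergy destruction; the paper derives its own turbojet-specific expression from the exergy analysis, so the form below is only an assumed generic reference.

```latex
% Generic ECOP form often used in the literature (not necessarily the paper's exact definition):
\mathrm{ECOP} = \frac{\dot{W}_{\mathrm{net}}}{T_{0}\,\dot{\sigma}}
              = \frac{\dot{W}_{\mathrm{net}}}{\dot{E}x_{\mathrm{dest}}}
% \dot{W}_{net}: useful power output; T_0: dead-state (environment) temperature;
% \dot{\sigma}: entropy generation rate; T_0 \dot{\sigma}: exergy destruction rate.
```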

  20. Evaluation of a Broad-Spectrum Partially Automated Adverse Event Surveillance System: A Potential Tool for Patient Safety Improvement in Hospitals With Limited Resources.

    Science.gov (United States)

    Saikali, Melody; Tanios, Alain; Saab, Antoine

    2017-11-21

    The aim of the study was to evaluate the sensitivity and resource efficiency of a partially automated adverse event (AE) surveillance system for routine patient safety efforts in hospitals with limited resources. Twenty-eight automated triggers from the hospital information system's clinical and administrative databases identified cases, which were filtered by trigger-specific exclusion criteria and then reviewed by an interdisciplinary team. The system, developed and implemented using in-house resources, was applied for 45 days of surveillance, for all hospital inpatient admissions (N = 1107). Each trigger was evaluated for its positive predictive value (PPV). Furthermore, the sensitivity of the surveillance system (overall and by AE category) was estimated relative to incidence ranges in the literature. The surveillance system identified a total of 123 AEs among 283 reviewed medical records, yielding an overall PPV of 52%. The tool showed variable levels of sensitivity across and within AE categories when compared with the literature, with a relatively low overall sensitivity estimated between 21% and 44%. Adverse events were detected in 23 of the 36 AE categories defined by an established harm classification system. Furthermore, none of the detected AEs were voluntarily reported. The surveillance system showed variable sensitivity levels across a broad range of AE categories with an acceptable PPV, overcoming certain limitations associated with other harm detection methods. The number of cases captured was substantial, and none had been previously detected or voluntarily reported. For hospitals with limited resources, this methodology provides valuable safety information from which interventions for quality improvement can be formulated.
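
    The per-trigger positive predictive value reported above is simply the share of flagged cases that the interdisciplinary review confirms as true adverse events; a small sketch with invented trigger names and counts follows.

```python
# Invented example of per-trigger and overall PPV computation for a trigger-based
# AE surveillance workflow (flagged = cases remaining after the exclusion criteria).
review_results = {
    # trigger name: (flagged cases, cases confirmed as AEs on chart review)
    "antidote administration": (12, 9),
    "unplanned return to OR":  (20, 8),
    "abrupt medication stop":  (35, 14),
}

total_flagged = total_confirmed = 0
for trigger, (flagged, confirmed) in review_results.items():
    total_flagged += flagged
    total_confirmed += confirmed
    print(f"{trigger}: PPV = {confirmed / flagged:.0%}")

print(f"overall PPV = {total_confirmed / total_flagged:.0%}")
```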

  1. Broad-Application Test Reactor

    International Nuclear Information System (INIS)

    Motloch, C.G.

    1992-05-01

    This report is about a new, safe, and operationally efficient DOE reactor for nuclear research and testing proposed for the early to mid-21st century. Dubbed the Broad-Application Test Reactor (BATR), the proposed facility incorporates a multiple-application, multiple-mission design to support DOE programs such as naval reactors and space power and propulsion, as well as research in the medical, science, isotope, and electronics arenas. DOE research reactors are aging, and implementing major replacement projects requires long lead times. Primary design drivers include safety, low risk, minimum operation cost, mission flexibility, waste minimization, and long life. Scientists and engineers at the Idaho National Engineering Laboratory are evaluating possible fuel forms, structural materials, reactor geometries, coolants, and moderators

  2. Broad line regions in Seyfert-1 galaxies

    International Nuclear Information System (INIS)

    Groningen, E. van.

    1984-01-01

    To reproduce the observed emission profiles of Seyfert galaxies, rotation in an accretion disk has been proposed. In this thesis, the profiles emitted by such an accretion disk are investigated. Detailed comparison with the observed profiles shows that a considerable fraction can be fitted with a power-law function, as predicted by the model. The author analyzes a series of high quality spectra of Seyfert galaxies, obtained with the 2.5m telescope at Las Campanas. He presents detailed analyses of two objects: Mkn335 and Akn120. In both cases, strong evidence is presented for the presence of two separate broad line zones. These zones are identified with an accretion disk and an outflowing wind. The disk contains gas with very high densities and emits predominantly the lower ionization lines. He reports on the discovery of very broad wings beneath the strong forbidden line 5007. (Auth.)

  3. Methodology implementation for multi objective optimisation for nuclear fleet evolution scenarios

    International Nuclear Information System (INIS)

    Freynet, David

    2016-01-01

    The issue of the evolution of the French nuclear fleet can be considered through the study of nuclear transition scenarios. These studies are of paramount importance as their results can greatly affect the decision making process, given that they take into account industrial concerns, investments, time, and nuclear system complexity. Such studies can be performed with the COSI code (developed at the CEA/DEN), which enables the calculation of matter inventories and fluxes across the fuel cycle (nuclear reactors and associated facilities), especially when coupled with the CESAR depletion code. The studies currently performed with COSI require the definition of the various scenarios' input parameters in order to fulfil different objectives, such as minimising natural uranium consumption, waste production and so on. These parameters concern the quantities and the scheduling of spent fuel destined for reprocessing, and the number, the type and the commissioning dates of deployed reactors. This work aims to develop, validate and apply an optimisation methodology coupled with COSI, in order to determine optimal nuclear transition scenarios for a multi-objective platform. Firstly, this methodology is based on the acceleration of scenario evaluation, enabling the use of optimisation methods in a reasonable time-frame. With this goal in mind, artificial neural network irradiation surrogate models are created with the URANIE platform (developed at the CEA/DEN) and are implemented within COSI. The next step in this work is to use, adapt and compare different optimisation methods, such as URANIE's genetic algorithm and particle swarm methods, in order to define a methodology suited to this type of study. This methodology development is based on an incremental approach which progressively adds objectives, constraints and decision variables to the optimisation problem definition. The variables added, which are related to reactor deployment and spent fuel reprocessing strategies, are chosen
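
    Multi-objective scenario comparison of this kind ultimately reduces to keeping the non-dominated (Pareto-optimal) candidates; the sketch below illustrates that filter for two minimised objectives with invented scenario names and values, independently of the COSI/URANIE tool chain.

```python
# Pareto filter for scenarios evaluated on objectives that are all to be minimised.
# Scenario names and objective values are invented for illustration.
scenarios = {
    # scenario: (natural uranium consumption, waste production) in arbitrary units
    "open cycle":      (100.0, 100.0),
    "mono-recycling":  ( 90.0,  80.0),
    "fast reactors A": ( 60.0,  85.0),
    "fast reactors B": ( 65.0,  70.0),
}

def dominates(a, b):
    """a dominates b if it is no worse on every objective and better on at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

pareto = [name for name, obj in scenarios.items()
          if not any(dominates(other, obj)
                     for other_name, other in scenarios.items() if other_name != name)]
print("non-dominated scenarios:", pareto)   # -> fast reactors A and B
```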

  4. Extrusion-cooking to improve the animal feed quality of broad beans

    NARCIS (Netherlands)

    Moscicki, L.; Wojcik, S.; Plaur, K.; Zuilichem, van D.J.

    1984-01-01

    Extrusion-cooking of broad beans with a single-screw extruder has been investigated. Attention was focused on process requirements as well as on the nutritional effects of extrusion-cooked broad beans in a chicken feed formulation. The optimal thermal process conditions required for a product of

  5. Analysis of mammalian gene function through broad-based phenotypic screens across a consortium of mouse clinics.

    Science.gov (United States)

    de Angelis, Martin Hrabě; Nicholson, George; Selloum, Mohammed; White, Jacqui; Morgan, Hugh; Ramirez-Solis, Ramiro; Sorg, Tania; Wells, Sara; Fuchs, Helmut; Fray, Martin; Adams, David J; Adams, Niels C; Adler, Thure; Aguilar-Pimentel, Antonio; Ali-Hadji, Dalila; Amann, Gregory; André, Philippe; Atkins, Sarah; Auburtin, Aurelie; Ayadi, Abdel; Becker, Julien; Becker, Lore; Bedu, Elodie; Bekeredjian, Raffi; Birling, Marie-Christine; Blake, Andrew; Bottomley, Joanna; Bowl, Mike; Brault, Véronique; Busch, Dirk H; Bussell, James N; Calzada-Wack, Julia; Cater, Heather; Champy, Marie-France; Charles, Philippe; Chevalier, Claire; Chiani, Francesco; Codner, Gemma F; Combe, Roy; Cox, Roger; Dalloneau, Emilie; Dierich, André; Di Fenza, Armida; Doe, Brendan; Duchon, Arnaud; Eickelberg, Oliver; Esapa, Chris T; El Fertak, Lahcen; Feigel, Tanja; Emelyanova, Irina; Estabel, Jeanne; Favor, Jack; Flenniken, Ann; Gambadoro, Alessia; Garrett, Lilian; Gates, Hilary; Gerdin, Anna-Karin; Gkoutos, George; Greenaway, Simon; Glasl, Lisa; Goetz, Patrice; Da Cruz, Isabelle Goncalves; Götz, Alexander; Graw, Jochen; Guimond, Alain; Hans, Wolfgang; Hicks, Geoff; Hölter, Sabine M; Höfler, Heinz; Hancock, John M; Hoehndorf, Robert; Hough, Tertius; Houghton, Richard; Hurt, Anja; Ivandic, Boris; Jacobs, Hughes; Jacquot, Sylvie; Jones, Nora; Karp, Natasha A; Katus, Hugo A; Kitchen, Sharon; Klein-Rodewald, Tanja; Klingenspor, Martin; Klopstock, Thomas; Lalanne, Valerie; Leblanc, Sophie; Lengger, Christoph; le Marchand, Elise; Ludwig, Tonia; Lux, Aline; McKerlie, Colin; Maier, Holger; Mandel, Jean-Louis; Marschall, Susan; Mark, Manuel; Melvin, David G; Meziane, Hamid; Micklich, Kateryna; Mittelhauser, Christophe; Monassier, Laurent; Moulaert, David; Muller, Stéphanie; Naton, Beatrix; Neff, Frauke; Nolan, Patrick M; Nutter, Lauryl Mj; Ollert, Markus; Pavlovic, Guillaume; Pellegata, Natalia S; Peter, Emilie; Petit-Demoulière, Benoit; Pickard, Amanda; Podrini, Christine; Potter, Paul; Pouilly, Laurent; Puk, Oliver; Richardson, David; Rousseau, Stephane; Quintanilla-Fend, Leticia; Quwailid, Mohamed M; Racz, Ildiko; Rathkolb, Birgit; Riet, Fabrice; Rossant, Janet; Roux, Michel; Rozman, Jan; Ryder, Ed; Salisbury, Jennifer; Santos, Luis; Schäble, Karl-Heinz; Schiller, Evelyn; Schrewe, Anja; Schulz, Holger; Steinkamp, Ralf; Simon, Michelle; Stewart, Michelle; Stöger, Claudia; Stöger, Tobias; Sun, Minxuan; Sunter, David; Teboul, Lydia; Tilly, Isabelle; Tocchini-Valentini, Glauco P; Tost, Monica; Treise, Irina; Vasseur, Laurent; Velot, Emilie; Vogt-Weisenhorn, Daniela; Wagner, Christelle; Walling, Alison; Weber, Bruno; Wendling, Olivia; Westerberg, Henrik; Willershäuser, Monja; Wolf, Eckhard; Wolter, Anne; Wood, Joe; Wurst, Wolfgang; Yildirim, Ali Önder; Zeh, Ramona; Zimmer, Andreas; Zimprich, Annemarie; Holmes, Chris; Steel, Karen P; Herault, Yann; Gailus-Durner, Valérie; Mallon, Ann-Marie; Brown, Steve Dm

    2015-09-01

    The function of the majority of genes in the mouse and human genomes remains unknown. The mouse embryonic stem cell knockout resource provides a basis for the characterization of relationships between genes and phenotypes. The EUMODIC consortium developed and validated robust methodologies for the broad-based phenotyping of knockouts through a pipeline comprising 20 disease-oriented platforms. We developed new statistical methods for pipeline design and data analysis aimed at detecting reproducible phenotypes with high power. We acquired phenotype data from 449 mutant alleles, representing 320 unique genes, of which half had no previous functional annotation. We captured data from over 27,000 mice, finding that 83% of the mutant lines are phenodeviant, with 65% demonstrating pleiotropy. Surprisingly, we found significant differences in phenotype annotation according to zygosity. New phenotypes were uncovered for many genes with previously unknown function, providing a powerful basis for hypothesis generation and further investigation in diverse systems.

  6. Methodology for diagnosing of skin cancer on images of dermatologic spots by spectral analysis.

    Science.gov (United States)

    Guerra-Rosas, Esperanza; Álvarez-Borrego, Josué

    2015-10-01

    In this paper, a new methodology for diagnosing skin cancer from images of dermatologic spots using image processing is presented. Currently, skin cancer is one of the most frequent diseases in humans. This methodology is based on Fourier spectral analysis using filters such as the classic, inverse and k-law nonlinear filters. The sample images were obtained by a medical specialist, and a new spectral technique is developed to obtain a quantitative measurement of the complex pattern found in cancerous skin spots. Finally, a spectral index is calculated to obtain a range of spectral indices defined for skin cancer. Our results show a confidence level of 95.4%.
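
    To make the Fourier-domain processing concrete, the sketch below applies a k-law nonlinearity to an image spectrum and reduces it to a single scalar; the particular index used here (the high-frequency energy fraction) is an illustrative stand-in, not the spectral index defined in the paper.

```python
import numpy as np

def k_law_spectrum(image: np.ndarray, k: float = 0.3) -> np.ndarray:
    """Apply a k-law nonlinearity to the magnitude of the 2-D Fourier spectrum."""
    spectrum = np.fft.fftshift(np.fft.fft2(image))
    magnitude, phase = np.abs(spectrum), np.angle(spectrum)
    return magnitude ** k * np.exp(1j * phase)       # compress magnitude, keep phase

def spectral_index(image: np.ndarray, k: float = 0.3, cutoff: float = 0.25) -> float:
    """Illustrative scalar: share of filtered spectral energy beyond a radial cutoff."""
    power = np.abs(k_law_spectrum(image, k)) ** 2
    h, w = power.shape
    yy, xx = np.indices(power.shape)
    radius = np.hypot(yy - h / 2, xx - w / 2) / (min(h, w) / 2)
    return float(power[radius > cutoff].sum() / power.sum())

rng = np.random.default_rng(0)
spot = rng.random((128, 128))          # placeholder for a dermatologic-spot image
print(round(spectral_index(spot), 3))
```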

  7. Local and regional low carbon scenarios methodology, challenges and opportunities

    International Nuclear Information System (INIS)

    2010-01-01

    In its first part, this report discusses the emergence of local climate and energy policy in Europe and the implementation of nationally imposed but regionally anchored energy scenarios (e.g. in France, the Climate Air Energy Regional Schemes or SRCAE). Then it addresses the methodological and political aspects of local and regional low emission scenarios: methodologies and typologies of energy scenarios, ways to define an appropriate emission reduction and energy consumption objective, ways to deal with emission or carbon gaps, ways to make local emission inventories, ways to gather local data, ways to deal with special emission sources, ways to assess and develop local energy efficiency and renewable energy potentials, ways to take energy sufficiency into account, and the evolution from energy autonomy to 100% renewable energy territories. The last part addresses the issues of stakeholder and citizen participation in the definition of long term strategies

  8. Low-level waste disposal site performance assessment with the RQ/PQ methodology. Final report

    International Nuclear Information System (INIS)

    Rogers, V.C.; Grant, M.W.; Sutherland, A.A.

    1982-12-01

    A methodology called RQ/PQ (retention quotient/performance quotient) has been developed for relating the potential hazard of radioactive waste to the natural and man-made barriers provided by a disposal facility. The methodology utilizes a systems approach to quantify the safety of low-level waste disposed in a near-surface facility. The main advantages of the RQ/PQ methodology are its simplicity of analysis and clarity of presentation while still allowing a comprehensive set of nuclides and pathways to be treated. Site performance and facility designs for low-level waste disposal can be easily investigated with relatively few parameters needed to define the problem. Application of the methodology has revealed that the key factor affecting the safety of low-level waste disposal in near surface facilities is the potential for intrusion events. Food, inhalation and well water pathways dominate in the analysis of such events. While the food and inhalation pathways are not strongly site-dependent, the well water pathway is. Finally, burial at depths of 5 m or more was shown to reduce the impacts from intrusion events

  9. RAMA Methodology for the Calculation of Neutron Fluence; Metodologia RAMA para el Calculo de la Fluencia Neutronica

    Energy Technology Data Exchange (ETDEWEB)

    Villescas, G.; Corchon, F.

    2013-07-01

    The neutron fluence plays an important role in the study of the structural integrity of the reactor vessel after a certain time of neutron irradiation. The NRC defined, in Regulatory Guide 1.190, the way the neutron fluence must be estimated, including an uncertainty analysis of the validation process (fluence uncertainty must be ≤ 20%). TRANSWARE Enterprises Inc. developed a methodology for calculating the neutron fluence based on Guide 1.190, known as RAMA. Uncertainty values obtained with this methodology, for about 18 vessels, are less than 10%.

  10. The Broad Challenge to Democratic Leadership: The Other Crisis in Education

    Science.gov (United States)

    Miller, Vachel W.

    2012-01-01

    This article interrogates the workings of the Broad Superintendents Academy, as a specific illustration of the influence of venture philanthropy in American public education. It introduces the Broad Foundation's agenda for educational leadership training, foregrounding how it frames the problem of leadership and the implications of such training…

  11. Spatial Development Modeling Methodology Application Possibilities in Vilnius

    Directory of Open Access Journals (Sweden)

    Lina Panavaitė

    2017-05-01

    Full Text Available In order to control the continued development of high-rise buildings and their irreversible visual impact on the overall silhouette of the city, the great cities of the world have introduced new methodological principles into their spatial development models. These methodologies and spatial planning guidelines focus not only on the controlled development of high-rise buildings, but on the spatial modelling of the whole city, defining the main development criteria and estimating possible consequences. Vilnius is no exception; however, the re-establishment of Lithuania's independence triggered an uncontrolled urbanization process, so most of the city's development regulations emerged as a consequence of legalizing investors' expectations rather than of managed planning. The importance of a consistent urban fabric, and of conserving and representing the city's most important objects, gained attention only when an actual threat emerged of overshadowing them with new architecture, together with unmanaged urbanization in the city centre and land-use-driven urban sprawl in the suburbs. Current Vilnius spatial planning documents clearly define the urban structure and key development principles, but the definitions are relatively abstract, leading to uniform building coverage requirements for territories with distinct qualities and to simplified planar designs that do not meet quality standards. The overall quality of urban architecture is not regulated. The article deals with current spatial modelling methods, their individual parts, principles, the criteria for quality assessment and their applicability in Vilnius. The text contains an outline of possible building coverage regulations and impact assessment criteria for new development, together with a compendium of requirements for high-quality spatial planning and building design.

  12. Analysis of Interbrand, BrandZ and BAV brand valuation methodologies

    Directory of Open Access Journals (Sweden)

    Krstić Bojan

    2011-01-01

    Full Text Available Brand valuation is considered one of the most significant challenges not only for the theory and practice of contemporary marketing, but for other disciplines as well. The complex nature of this issue implies the need for a multidisciplinary approach and for the creation of a methodology which goes beyond the borders of marketing as a discipline and includes knowledge derived from accounting, finance and other areas. However, mostly one-sided approaches, oriented towards determining brand value either from research on consumer behavior and attitudes or from the financial success of the brand, are dominant in the marketing and financial literature. In parallel with these theoretical methodologies, consultancy and marketing agencies and other actors have been developing their own brand valuation methods and models. Some of them can be regarded as taking a comprehensive approach to brand valuation, which overcomes the problem of one-sided analysis of brand value. The comprehensive approach presumes brand valuation based on the benefits which the brand provides both to customers and to the enterprise that owns it - in other words, based on qualitative and quantitative measures reflecting, respectively, the behavior and attitudes of consumers and the assumed financial value of the brand, or, more precisely, brand value capitalization. According to the defined research subject, this paper is structured as follows: the importance and problem of brand value are reviewed in the Introduction, and the three most well-known brand valuation methodologies developed by consultancy agencies - the Interbrand methodology and the BrandZ and BAV models - are analyzed in the next section. The results of a comparative analysis of these methodologies are then presented and implications for adequate brand valuation are suggested.

  13. Methodology for safety classification of PWR type nuclear power plants items

    International Nuclear Information System (INIS)

    Oliveira, Patricia Pagetti de

    1995-01-01

    This paper contains the criteria and methodology which define a classification system of structures, systems and components in safety classes according to their importance to nuclear safety. The use of this classification system will provide a set of basic safety requirements associated with each safety class specified. These requirements, when available and applicable, shall be utilized in the design, fabrication and installation of structures, systems and components of Pressurized Water Reactor Nuclear Power Plants. (author). 13 refs, 1 tab

  14. Approaches to veterinary education--tracking versus a final year broad clinical experience. Part one: effects on career outcome.

    Science.gov (United States)

    Klosterman, E S; Kass, P H; Walsh, D A

    2009-08-01

    This is the first of two papers that provide extensive data and analysis on the two major approaches to clinical veterinary education, which either provide students with experience of a broad range of species (often defined as omni/general clinical competence), or just a few species (sometimes just one), usually termed 'tracking'. Together the two papers provide a detailed analysis of these two approaches for the first time. The responsibilities of veterinary medicine and veterinary education are rapidly increasing throughout the globe. It is critical for all in veterinary education to reassess the approaches that have been used, and evaluate on a school-by-school basis which may best meet its expanding and ever-deepening responsibilities.

  15. Weaknesses in Awarding Fees for the Broad Area Maritime Surveillance Contract

    Science.gov (United States)

    2010-11-02

    Front-matter fragments from the report: table of contents (Introduction; Audit Objectives; Background on Broad Area Maritime Surveillance; Management Comments from the Assistant Secretary of the Navy for Research, Development, and Acquisition), followed by the opening of the Introduction: "This is the first in a series of reports on the contract supporting the Broad Area Maritime…"

  16. Methodology for the Study of the Envelope Airtightness of Residential Buildings in Spain: A Case Study

    Directory of Open Access Journals (Sweden)

    Feijó-Muñoz Jesús

    2018-03-01

    Full Text Available Air leakage and its impact on the energy performance of dwellings has been broadly studied in countries with cold climates in Europe, the US, and Canada. However, there is a lack of knowledge in this field in Mediterranean countries. Current Spanish building regulations establish ventilation rates based on ideally airtight envelopes, causing problems of over-ventilation and substantial energy losses. The aim of this paper is to develop a methodology that allows the characterization of the envelope of the housing stock in Spain in order to adjust ventilation rates taking air leakage into consideration. The methodology is easily applicable to other countries interested in studying envelope airtightness and improving its energy performance. A statistical sampling method has been established to determine the dwellings to be tested, considering the relevant variables concerning airtightness: climate zone, year of construction, and typology. The air leakage rate is determined using a standardized building pressurization technique according to European Standard EN 13829. A representative case study is presented as an example of the implementation of the designed methodology, and the results are compared to preliminary values obtained from the database.
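    The record does not give the computational details of the pressurization analysis; purely as an illustration, the sketch below fits the standard fan-pressurization power law Q = C·ΔP^n and derives the air change rate at 50 Pa (n50), the quantity commonly reported from EN 13829-type tests. All readings, the dwelling volume and the variable names are made-up assumptions, not data from the study.

```python
import numpy as np

# Hypothetical fan-pressurization readings (EN 13829-style test), not study data.
delta_p = np.array([20.0, 30.0, 40.0, 50.0, 60.0, 70.0])  # pressure difference [Pa]
q = np.array([310.0, 395.0, 470.0, 540.0, 600.0, 660.0])   # airflow rate [m^3/h]
volume = 250.0                                              # internal dwelling volume [m^3]

# Fit the power law Q = C * dP^n in log-log space (ordinary least squares).
n, ln_c = np.polyfit(np.log(delta_p), np.log(q), 1)
c = np.exp(ln_c)

q50 = c * 50.0 ** n   # airflow rate at 50 Pa [m^3/h]
n50 = q50 / volume    # air change rate at 50 Pa [1/h]

print(f"flow exponent n = {n:.2f}, C = {c:.1f}")
print(f"Q50 = {q50:.0f} m^3/h, n50 = {n50:.2f} 1/h")
```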

  17. Software life cycle methodologies and environments

    Science.gov (United States)

    Fridge, Ernest

    1991-01-01

    Products of this project will significantly improve the quality and productivity of Space Station Freedom Program software processes by improving software reliability and safety and by broadening the range of problems that can be solved with computational solutions. The project brings in Computer Aided Software Engineering (CASE) technology for: environments such as the Engineering Script Language/Parts Composition System (ESL/PCS) application generator, an intelligent user interface for cost avoidance in setting up operational computer runs, the Framework programmable platform for defining process and software development work flow control, a process for bringing CASE technology into an organization's culture, and the CLIPS/CLIPS Ada language for developing expert systems; and methodologies such as a method for developing fault-tolerant, distributed systems and a method for developing systems for common sense reasoning and for solving expert systems problems when only approximate truths are known.

  18. Long-throated flumes and broad-crested weirs

    NARCIS (Netherlands)

    Bos, M.G.

    1985-01-01

    Vital for water management are structures that can measure the flow in a wide variety of channels. Chapter 1 introduces the long-throated flume and the broad-crested weir; it explains why this family of structures can meet the boundary conditions and hydraulic demands of most measuring

  19. System study methodology. Development and potential utilization for fusion

    International Nuclear Information System (INIS)

    Djerassi, H.; Rouillard, J.; Leger, D.; Zappellini, G.; Gambi, G.

    1988-01-01

    The objective of this new methodology is to combine systemics with heuristics for engineering applications. The system method considers as a whole a set of dynamically interacting elements organized for tasks. Heuristics tries to make explicit the rules to be applied in scientific research. This methodology is a powerful tool for evaluating the available options: compared with conventional analytical methods, a higher number of parameters can be taken into account and the possible options can be compared to a higher quality standard. The system method takes into account interacting data and random relationships by means of simulation modelling. Thus, a dynamical approach can be deduced and a sensitivity analysis can be performed for a very high number of options and basic data. Collection of experimental values, analysis of the problem, search for solutions, sizing of the installation from defined functions, cost evaluation (planning and operating) and ranking of the options with regard to all the constraints are the main points considered for the system's application. The method can be limited to a specific objective such as a fusion reactor safety analysis. The possibility of taking into account all the options, possible accidents, quality assurance, the exhaustiveness of the safety analysis, the identification of the residual risk and the modelling of the results are the main advantages of this approach. The sophisticated architecture of a fusion reactor includes a large number of interacting systems. The novel character of the fusion domain and the wide spectrum of possible options strongly increase the advantages of a system study, as a complete safety analysis can be defined before starting with the design

  20. Application of REPAS Methodology to Assess the Reliability of Passive Safety Systems

    Directory of Open Access Journals (Sweden)

    Franco Pierro

    2009-01-01

    Full Text Available The paper deals with the presentation of the Reliability Evaluation of Passive Safety System (REPAS) methodology developed by the University of Pisa. The general objective of REPAS is to characterize in an analytical way the performance of a passive system in order to increase confidence in its operation and to compare the performance of active and passive systems as well as of different passive systems. REPAS can be used in the design of passive safety systems to assess their adequacy and to optimize their costs. It may also provide numerical values that can be used in more complex safety assessment studies, and it can be seen as a support to Probabilistic Safety Analysis studies. In this regard, some examples of the application of the methodology are reported in the paper. A best-estimate thermal-hydraulic code, RELAP5, has been used to support the analyses and to model the selected systems. Probability distributions have been assigned to the uncertain input parameters through engineering judgment. The Monte Carlo method has been used to propagate uncertainties, and Wilks' formula has been taken into account to select the sample size. Failure criteria are defined in terms of non-fulfillment of the defined design targets.
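    The abstract cites Wilks' formula for selecting the Monte Carlo sample size; a minimal sketch of the first-order, one-sided form is shown below. The 95%/95% coverage/confidence values are illustrative assumptions, since the order and levels actually used in the REPAS analyses are not stated in the record.

```python
import math

def wilks_sample_size(coverage: float, confidence: float) -> int:
    """Smallest N such that the largest of N runs bounds the 'coverage' quantile
    with probability at least 'confidence' (first-order, one-sided Wilks formula):
    1 - coverage**N >= confidence."""
    n = 1
    while 1.0 - coverage ** n < confidence:
        n += 1
    return n

# Classical 95%/95% case: 59 code runs are sufficient.
print(wilks_sample_size(0.95, 0.95))  # -> 59
```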

  1. Developments in broad-beam, ion-source technology and applications

    International Nuclear Information System (INIS)

    Kaufman, H.R.; Harper, J.M.E.; Cuomo, J.J.

    1982-01-01

    Recent advances in broad-beam, ion-source technology are summarized, including low-energy ion optics, improved extraction grid fabrication, a compact ion-source design and a gridless ion-source design. Recent applications have emphasized concepts such as stress modification of vapor-deposited films, very low energy ion beams to minimize the physical sputtering portion in reactive etching, and the use of multiple sources and targets to sputter-deposit alloys and compounds. A comprehensive critical review by the same authors appears concurrently, describing in detail the developments in broad-beam, ion-source technology [1] and the applications of these sources [2].

  2. Social Pharmacy Research in Copenhagen—Maintaining a Broad Approach

    Directory of Open Access Journals (Sweden)

    Sofia Kälvemark Sporrong

    2016-02-01

    Full Text Available Social Pharmacy (SP) is a multidisciplinary field that promotes the adequate use of medicines. The field of SP is increasingly important due to a number of new trends, all posing challenges to society. The SP group at the University of Copenhagen has for several years used a broad approach to SP teaching and research, often illustrated by the four levels: individual, group, organizational, and societal. In this paper the relevance of maintaining a broad approach to SP research is argued for, and examples of the importance of this type of research are presented.

  3. Tutorials on emerging methodologies and applications in operations research

    CERN Document Server

    2005-01-01

    Operations Research emerged as a quantitative approach to problem-solving in World War II. Its founders, who were physicists, mathematicians, and engineers, quickly found peace-time uses for this new field. Moreover, we can say that Operations Research (OR) was born in the same incubator as computer science, and through the years, it has spawned many new disciplines, including systems engineering, health care management, and transportation science. Fundamentally, Operations Research crosses discipline domains to seek solutions on a range of problems and benefits diverse disciplines from finance to bioengineering. Many disciplines routinely use OR methods. Many scientific researchers, engineers, and others will find the methodological presentations in this book useful and helpful in their problem-solving efforts. OR’s strengths are modeling, analysis, and algorithm design. It provides a quantitative foundation for a broad spectrum of problems, from economics to medicine, from environmental control to sports,...

  4. RAMS (Risk Analysis - Modular System) methodology

    Energy Technology Data Exchange (ETDEWEB)

    Stenner, R.D.; Strenge, D.L.; Buck, J.W. [and others

    1996-10-01

    The Risk Analysis - Modular System (RAMS) was developed to serve as a broad-scope risk analysis tool for the Risk Assessment of the Hanford Mission (RAHM) studies. The RAHM element provides risk analysis support for Hanford Strategic Analysis and Mission Planning activities. The RAHM also provides risk analysis support for the Hanford 10-Year Plan development activities. The RAMS tool draws from a collection of specifically designed databases and modular risk analysis methodologies and models. RAMS is a flexible modular system that can be focused on targeted risk analysis needs. It is specifically designed to address risks associated with overall strategy, technical alternatives, and 'what if' questions regarding the Hanford cleanup mission. RAMS is set up to address both near-term and long-term risk issues. Consistency is very important for any comparative risk analysis, and RAMS is designed to efficiently and consistently compare risks and produce risk reduction estimates. There is a wide range of output information that can be generated by RAMS. These outputs can be detailed by individual contaminants, waste forms, transport pathways, exposure scenarios, individuals, populations, etc. However, they can also be presented in rolled-up form to support high-level strategy decisions.

  5. Fourier evaluation of broad Moessbauer spectra

    International Nuclear Information System (INIS)

    Vincze, I.

    1981-01-01

    It is shown by the Fourier analysis of broad Moessbauer spectra that the even part of the distribution of the dominant hyperfine interaction (hyperfine field or quadrupole splitting) can be obtained directly without using least-square fitting procedures. Also the odd part of this distribution correlated with other hyperfine parameters (e.g. isomer shift) can be directly determined. Examples for amorphous magnetic and paramagnetic iron-based alloys are presented. (author)
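    The record gives no algorithmic detail; the sketch below only illustrates the general idea that a broad spectrum, modelled as the convolution of a known single-site lineshape with a hyperfine-field distribution, can be inverted directly by division in Fourier space, without least-squares fitting. The velocity grid, Lorentzian width and distribution are synthetic assumptions, not data from the paper.

```python
import numpy as np

# Synthetic illustration only: broad spectrum = single-site lineshape (*) distribution.
v = np.linspace(-10.0, 10.0, 1024)                    # velocity axis [mm/s], assumed
gamma = 0.25                                          # Lorentzian half-width, assumed
line = gamma / np.pi / (v ** 2 + gamma ** 2)          # single-site lineshape L(v)
p_true = np.exp(-0.5 * ((v - 2.0) / 1.5) ** 2)        # assumed broad distribution P
p_true /= p_true.sum()

L = np.fft.fft(np.fft.ifftshift(line))                # kernel centred at index 0
spectrum = np.real(np.fft.ifft(np.fft.fft(p_true) * L))   # "measured" broad spectrum S

# Direct recovery of P without fitting: divide in Fourier space, with mild damping.
eps = 1e-6 * np.max(np.abs(L)) ** 2
p_rec = np.real(np.fft.ifft(np.fft.fft(spectrum) * np.conj(L) / (np.abs(L) ** 2 + eps)))

print("true peak at v =", v[np.argmax(p_true)], "recovered peak at v =", v[np.argmax(p_rec)])
```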

  6. A powerful methodology for reactor vessel pressurized thermal shock analysis

    International Nuclear Information System (INIS)

    Boucau, J.; Mager, T.

    1994-01-01

    The recent operating experience of the Pressurized Water Reactor (PWR) industry has focused increasing attention on the issue of reactor vessel pressurized thermal shock (PTS). More specifically, the review of the old WWER-type reactors (WWER 440/230) has indicated a high sensitivity to neutron embrittlement. This has already led to some remedial actions, including safety injection water preheating and vessel annealing. Such measures are usually taken based on the analysis of a selected number of conservative PTS events. Consideration of all postulated cooldown events would draw attention to the impact of operator action and control system effects on reactor vessel PTS. Westinghouse has developed a methodology which couples event sequence analysis with probabilistic fracture mechanics analyses to identify those events that are of primary concern for reactor vessel integrity. Operating experience is utilized to aid in defining the appropriate event sequences and event frequencies of occurrence for the evaluation. Once the event sequences of concern are identified, detailed deterministic thermal-hydraulic and structural evaluations can be performed to determine the conditions required to minimize the extension of postulated flaws or enhance flaw arrest in the reactor vessel. The results of these analyses can then be used to better define further modifications to vessel and plant system design and to operating procedures. The purpose of the present paper is to describe this methodology and to show its benefits for decision making. (author). 1 ref., 3 figs

  7. Sex Differences and Within-Family Associations in the Broad Autism Phenotype

    Science.gov (United States)

    Klusek, Jessica; Losh, Molly; Martin, Gary E.

    2014-01-01

    While there is a strong sex bias in the presentation of autism, it is unknown whether this bias is also present in subclinical manifestations of autism among relatives, or the broad autism phenotype. This study examined this question and investigated patterns of co-occurrence of broad autism phenotype traits within families of individuals with…

  8. Methodology for probability of failure assessment of offshore pipelines; Metodologia qualitativa de avaliacao da probabilidade de falha de dutos rigidos submarinos estaticos

    Energy Technology Data Exchange (ETDEWEB)

    Pezzi Filho, Mario [PETROBRAS, Rio de Janeiro, RJ (Brazil)

    2005-07-01

    This study presents a methodology for assessing the likelihood of failure for every failure mechanism defined for static carbon steel offshore pipelines. The methodology is intended to comply with the Integrity Management policy established by the Company. Decision trees are used for the development of the methodology and for the evaluation of the extent and significance of these failure mechanisms. Decision trees also enable the visualization of the logical structure of the algorithms which will eventually be used in risk assessment software. The benefits of the proposed methodology are presented, and it is recommended that it be tested on static offshore pipelines installed in different assets for validation. (author)
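    The record mentions decision trees as the structuring device but does not reproduce one; the fragment below only illustrates how such a qualitative tree could be encoded and traversed. The questions, answers and likelihood categories are invented for the example and are not the Company's actual criteria.

```python
# Hypothetical qualitative decision tree for one failure mechanism (illustrative only).
tree = {
    "question": "Is the coating condition known to be degraded?",
    "yes": {
        "question": "Is cathodic protection confirmed effective?",
        "yes": "medium likelihood",
        "no": "high likelihood",
    },
    "no": "low likelihood",
}

def evaluate(tree, answers):
    """Walk the tree using a dict of question -> 'yes'/'no' answers."""
    node = tree
    while isinstance(node, dict):
        node = node[answers[node["question"]]]
    return node

answers = {
    "Is the coating condition known to be degraded?": "yes",
    "Is cathodic protection confirmed effective?": "no",
}
print(evaluate(tree, answers))  # -> "high likelihood"
```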

  9. Extreme Variability in a Broad Absorption Line Quasar

    Energy Technology Data Exchange (ETDEWEB)

    Stern, Daniel; Jun, Hyunsung D. [Jet Propulsion Laboratory, California Institute of Technology, 4800 Oak Grove Drive, Mail Stop 169-221, Pasadena, CA 91109 (United States); Graham, Matthew J.; Djorgovski, S. G.; Donalek, Ciro; Drake, Andrew J.; Mahabal, Ashish A.; Steidel, Charles C. [California Institute of Technology, 1200 E. California Boulevard, Pasadena, CA 91125 (United States); Arav, Nahum; Chamberlain, Carter [Department of Physics, Virginia Tech, Blacksburg, VA 24061 (United States); Barth, Aaron J. [Department of Physics and Astronomy, 4129 Frederick Reines Hall, University of California, Irvine, CA 92697 (United States); Glikman, Eilat, E-mail: daniel.k.stern@jpl.nasa.gov [Department of Physics, Middlebury College, Middlebury, VT 05753 (United States)

    2017-04-20

    CRTS J084133.15+200525.8 is an optically bright quasar at z = 2.345 that has shown extreme spectral variability over the past decade. Photometrically, the source had a visual magnitude of V ∼ 17.3 between 2002 and 2008. Then, over the following five years, the source slowly brightened by approximately one magnitude, to V ∼ 16.2. Only ∼1 in 10,000 quasars show such extreme variability, as quantified by the extreme parameters derived for this quasar assuming a damped random walk model. A combination of archival and newly acquired spectra reveal the source to be an iron low-ionization broad absorption line quasar with extreme changes in its absorption spectrum. Some absorption features completely disappear over the 9 years of optical spectra, while other features remain essentially unchanged. We report the first definitive redshift for this source, based on the detection of broad Hα in a Keck/MOSFIRE spectrum. Absorption systems separated by several 1000 km s⁻¹ in velocity show coordinated weakening in the depths of their troughs as the continuum flux increases. We interpret the broad absorption line variability to be due to changes in photoionization, rather than due to motion of material along our line of sight. This source highlights one sort of rare transition object that astronomy will now be finding through dedicated time-domain surveys.
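    The damped random walk invoked to quantify how rare this variability is corresponds to an Ornstein-Uhlenbeck process; the short sketch below simulates one such light curve. The timescale, scatter and cadence are illustrative assumptions, not the parameters derived for this quasar.

```python
import numpy as np

# Illustrative damped-random-walk (Ornstein-Uhlenbeck) light curve; parameters assumed.
rng = np.random.default_rng(1)
tau = 200.0          # damping timescale [days], assumed
sigma_inf = 0.2      # asymptotic magnitude scatter [mag], assumed
mean_mag = 17.0      # mean magnitude, assumed
dt = 5.0             # observing cadence [days]
n = 1000             # number of epochs

mag = np.empty(n)
mag[0] = mean_mag
a = np.exp(-dt / tau)  # correlation between successive epochs
for i in range(1, n):
    mag[i] = (mean_mag + a * (mag[i - 1] - mean_mag)
              + sigma_inf * np.sqrt(1.0 - a ** 2) * rng.standard_normal())

print("simulated magnitude range:", mag.min(), "to", mag.max())
```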

  10. The Effects of FOCUS-PDCA Methodology on Emergency Department Patient Disposition Index

    Directory of Open Access Journals (Sweden)

    Hossien Jabbari beirami

    2015-05-01

    Full Text Available Introduction: Hospital emergency is an important and unique department and prolonged stay of the patients in this ward leads to a decrease in the ability to serve other patients in need. Therefore, this study aimed to evaluate the ability of FOCUS-PDCA methodology to decrease waiting time of the procedures and improve index of decision-making within 6 hours in emergency department (ED. Methods: In this interventional before-after study, the effect of FOCUS-PDCA methodology on waiting time of the procedures and decision-making was evaluated in the ED of Sina Hospital, Tabriz, Iran in a 5-month period. Initially, a team of procedure definers defined the problematic procedures and suggested practical solutions to relieve them. Then, these solutions were practiced using appropriate programming, and finally the effects of these measures were analyzed using SPSS version 11.5 and independent t-test. Results: 5 months after intervention, mean waiting time for receiving consultation was reduced from 28.1 to 17 minutes (p < 0.001 and mean time for the results of a laboratory test to be ready was reduced from 70.26 to 37.66 minutes (p = 0.006. The number of patients who stayed in the ED for more than 6 hours, which was 101 in April, decreased to 52 in November (p = 0.002. The index of patient disposition in less than 6 hours increased from 94.71% in April to 96.87% in November. Conclusion: Based on the results of this study, it seems that carrying out FOCUS-PDCA methodology can decrease waiting time of the procedures and improve patient disposition index in the ED.

  11. A methodological approach for designing a usable ontology-based GUI in healthcare.

    Science.gov (United States)

    Lasierra, N; Kushniruk, A; Alesanco, A; Borycki, E; García, J

    2013-01-01

    This paper presents a methodological approach to the design and evaluation of an interface for an ontology-based system used for designing care plans for monitoring patients at home. In order to define the care plans, physicians need a tool for creating instances of the ontology and configuring some rules. Our purpose is to develop an interface that allows clinicians to interact with the ontology. Although ontology-driven applications do not necessarily present the ontology in the user interface, it is our hypothesis that showing selected parts of the ontology in a "usable" way could enhance clinicians' understanding and make the definition of the care plans easier. Based on prototyping and iterative testing, this methodology combines visualization techniques and usability methods. Preliminary results obtained after a formative evaluation indicate the effectiveness of the suggested combination.

  12. International Comparisons: Issues of Methodology and Practice

    Directory of Open Access Journals (Sweden)

    Serova Irina A.

    2017-12-01

    Full Text Available The article discusses the methodology and organization of statistical observation of the level of countries' economic development. The theoretical basis of international comparisons is singled out and, on this basis, a comparative evaluation of the inconsistency of theoretical positions and of the reasons for differences in GDP growth is carried out. Given the complexity of forming the homogeneous data sets needed to obtain correct comparison results, a general scheme for the relationship between the theoretical base of international comparisons and PPP constraints is defined. The possibility of obtaining a single measurement of the indicators of national economies is considered in view of the existing sampling errors, measurement uncertainties and classification errors. The emphasis is placed on combining work using the ICP and CPI with the aim of achieving comparability of data in the territorial and temporal cross-section. Using the basic characteristics of sustainable economic growth, long-term prospects for changing the ranking positions of countries with different levels of income are determined. It is shown that the clarity and unambiguity of the theoretical provisions is the defining condition for the further process of data collection and the formation of correct analytical conclusions.

  13. Generation of broad-group neutron/photon cross-section libraries for shielding applications

    International Nuclear Information System (INIS)

    Ingersoll, D.T.; Roussin, R.W.; Fu, C.Y.; White, J.E.

    1989-01-01

    The generation and use of multigroup cross-section libraries with broad energy group structures is primarily for the economy of computer resources. Also, the establishment of reference broad-group libraries is desirable in order to avoid duplication of effort, both in terms of the data generation and verification, and to assure a common data base for all participants in a specific project. Uncertainties are inevitably introduced into the broad-group cross sections due to approximations in the grouping procedure. The dominant uncertainty is generally with regard to the energy weighting function used to average the pointwise or fine-group data within a single broad group. Intelligent choice of the weighting functions can reduce such uncertainties. Also, judicious selection of the energy group structure can help to reduce the sensitivity of the computed responses to the weighting function, at least for a selected set of problems. Two new multigroup cross section libraries have been recently generated from ENDF/B-V data for two specific shielding applications. The first library was prepared for use in sodium-cooled reactor systems and is available in both broad-group structures. The second library, just recently completed, was prepared for use in air-over-ground environments and is available in a broad-group (46-neutron, 23-photon) energy structure. The selection of the specific group structures and weighting functions was an important part of the generation of both libraries
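    The record describes collapsing fine-group data into broad groups with an assumed weighting spectrum; the sketch below shows the standard flux-weighted averaging step on made-up values. The fine-group cross sections, weights and group mapping are illustrative assumptions.

```python
import numpy as np

# Illustrative fine-group cross sections [barns] and weighting flux (assumed values).
sigma_fine = np.array([2.1, 2.3, 2.6, 3.0, 3.8, 4.9, 6.5, 9.0])
phi_fine = np.array([0.9, 1.0, 1.2, 1.1, 0.8, 0.6, 0.4, 0.2])

# Mapping of fine groups into broad groups, e.g. two broad groups of four fine groups.
broad_of_fine = np.array([0, 0, 0, 0, 1, 1, 1, 1])

# Flux-weighted collapse: sigma_G = sum(sigma_g * phi_g) / sum(phi_g) over g in G.
sigma_broad = np.array([
    np.sum(sigma_fine[broad_of_fine == g] * phi_fine[broad_of_fine == g])
    / np.sum(phi_fine[broad_of_fine == g])
    for g in range(broad_of_fine.max() + 1)
])
print(sigma_broad)  # broad-group cross sections
```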

  14. Methodology Development and Applications of Proliferation Resistance and Physical Protection Evaluation

    International Nuclear Information System (INIS)

    Bari, R.A.; Peterson, P.F.; Therios, I.U.; Whitlock, J.J.

    2010-01-01

    We present an overview of the program on the evaluation methodology for proliferation resistance and physical protection (PR and PP) of advanced nuclear energy systems (NESs) sponsored by the Generation IV International Forum (GIF). For a proposed NES design, the methodology defines a set of challenges, analyzes system response to these challenges, and assesses outcomes. The challenges to the NES are the threats posed by potential actors (proliferant States or sub-national adversaries). The characteristics of Generation IV systems, both technical and institutional, are used to evaluate the response of the system and to determine its resistance against proliferation threats and robustness against sabotage and terrorism threats. The outcomes of the system response are expressed in terms of a set of measures, which are the high-level PR and PP characteristics of the NES. The methodology is organized to allow evaluations to be performed at the earliest stages of system design and to become more detailed and more representative as the design progresses. It can thus be used to enable a program in safeguards by design or to enhance the conceptual design process of an NES with regard to intrinsic features for PR and PP.

  15. SLAM-seq defines direct gene-regulatory functions of the BRD4-MYC axis.

    Science.gov (United States)

    Muhar, Matthias; Ebert, Anja; Neumann, Tobias; Umkehrer, Christian; Jude, Julian; Wieshofer, Corinna; Rescheneder, Philipp; Lipp, Jesse J; Herzog, Veronika A; Reichholf, Brian; Cisneros, David A; Hoffmann, Thomas; Schlapansky, Moritz F; Bhat, Pooja; von Haeseler, Arndt; Köcher, Thomas; Obenauf, Anna C; Popow, Johannes; Ameres, Stefan L; Zuber, Johannes

    2018-05-18

    Defining direct targets of transcription factors and regulatory pathways is key to understanding their roles in physiology and disease. We combined SLAM-seq [thiol(SH)-linked alkylation for the metabolic sequencing of RNA], a method for direct quantification of newly synthesized messenger RNAs (mRNAs), with pharmacological and chemical-genetic perturbation in order to define regulatory functions of two transcriptional hubs in cancer, BRD4 and MYC, and to interrogate direct responses to BET bromodomain inhibitors (BETis). We found that BRD4 acts as general coactivator of RNA polymerase II-dependent transcription, which is broadly repressed upon high-dose BETi treatment. At doses triggering selective effects in leukemia, BETis deregulate a small set of hypersensitive targets including MYC. In contrast to BRD4, MYC primarily acts as a selective transcriptional activator controlling metabolic processes such as ribosome biogenesis and de novo purine synthesis. Our study establishes a simple and scalable strategy to identify direct transcriptional targets of any gene or pathway. Copyright © 2018 The Authors, some rights reserved; exclusive licensee American Association for the Advancement of Science. No claim to original U.S. Government Works.

  16. Methodology for predicting the life of waste-package materials, and components using multifactor accelerated life tests

    International Nuclear Information System (INIS)

    Thomas, R.E.; Cote, R.W.

    1983-09-01

    Accelerated life tests are essential for estimating the service life of waste-package materials and components. A recommended methodology for generating accelerated life tests is described in this report. The objective of the methodology is to define an accelerated life test program that is scientifically and statistically defensible. The methodology is carried out using a select team of scientists and usually requires 4 to 12 man-months of effort. Specific agendas for the successive meetings of the team are included in the report for use by the team manager. The agendas include assignments for the team scientists and a different set of assignments for the team statistician. The report also includes descriptions of factorial tables, hierarchical trees, and associated mathematical models that are proposed as technical tools to guide the efforts of the design team
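    The report cites factorial tables among the design tools for multifactor accelerated life tests; the short sketch below enumerates a full-factorial test matrix for hypothetical stress factors. The factor names and levels are assumptions chosen only for illustration.

```python
from itertools import product

# Hypothetical stress factors and levels for an accelerated life test matrix.
factors = {
    "temperature_C": [60, 90, 120],
    "relative_humidity_pct": [40, 85],
    "chloride_ppm": [10, 100],
}

# Full-factorial table: one test run per combination of factor levels.
names = list(factors)
for run, levels in enumerate(product(*factors.values()), start=1):
    print(run, dict(zip(names, levels)))
```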

  17. Broad spectrum antiangiogenic treatment for ocular neovascular diseases.

    Directory of Open Access Journals (Sweden)

    Ofra Benny

    2010-09-01

    Full Text Available Pathological neovascularization is a hallmark of late-stage neovascular (wet) age-related macular degeneration (AMD) and the leading cause of blindness in people over the age of 50 in the western world. Treatments focus on suppression of choroidal neovascularization (CNV), while currently approved therapies are limited to inhibiting vascular endothelial growth factor (VEGF) exclusively. However, this treatment does not address the underlying cause of AMD, and the loss of VEGF's neuroprotective function is a potential side effect. A therapy which targets the key processes in AMD - pathological neovascularization, vessel leakage and inflammation - could bring a major shift in the approach to disease treatment and prevention. In this study we demonstrate the efficacy of such broad-spectrum antiangiogenic therapy in a mouse model of AMD. Lodamin, a polymeric formulation of TNP-470, is a potent broad-spectrum antiangiogenic drug. Lodamin significantly reduced key processes involved in AMD progression, as demonstrated in mice and rats. Its suppressive effects on angiogenesis, vascular leakage and inflammation were studied in a wide array of assays including the Matrigel, delayed-type hypersensitivity (DTH), Miles, laser-induced CNV and corneal micropocket assays. Lodamin significantly suppressed the secretion of various pro-inflammatory cytokines in the CNV lesion, including monocyte chemotactic protein-1 (MCP-1/Ccl2). Importantly, Lodamin was found to regress established CNV lesions, unlike soluble fms-like tyrosine kinase-1 (sFlk-1). The drug was found to be safe in mice and to have little toxicity, as demonstrated by electroretinography (ERG) assessing the retina and by histology. Lodamin, a polymer formulation of TNP-470, was identified as a first-in-class, broad-spectrum antiangiogenic drug that can be administered orally or locally to treat corneal and retinal neovascularization. Several unique properties make Lodamin especially beneficial for ophthalmic

  18. Broadly Applicable Nanowafer Drug Delivery System for Treating Eye Injuries

    Science.gov (United States)

    2015-09-01

    Fragments of the report's reference list and documentation page: a citation on systemic, dermal, transdermal, and ocular drug delivery (Crit. Rev. Ther. Drug 2008, 25, 545-584; Choy, Y. B.; Park, J.-H.; McCarey, B…), the award/grant number W81XWH-13-1-0146, and the title "Broadly Applicable Nanowafer Drug Delivery System for Treating Eye Injuries".

  19. Attribute Based Selection of Thermoplastic Resin for Vacuum Infusion Process: A Decision Making Methodology

    DEFF Research Database (Denmark)

    Raghavalu Thirumalai, Durai Prabhakaran; Lystrup, Aage; Løgstrup Andersen, Tom

    2012-01-01

    The composite industry looks toward a new material system (resins) based on thermoplastic polymers for the vacuum infusion process, similar to the infusion process using thermosetting polymers. A large number of thermoplastics are available in the market with a variety of properties suitable...... be beneficial. In this paper, the authors introduce a new decision-making tool for resin selection based on significant attributes. This article provides a broad overview of suitable thermoplastic material systems for the vacuum infusion process available in today's market. An illustrative example—resin selection...... for vacuum infusion of a wind turbine blade—is shown to demonstrate the intricacies involved in the proposed methodology for resin selection....
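    The record does not give the scoring formula; purely as an illustration of attribute-based ranking, the sketch below applies a simple weighted-sum score to hypothetical resin candidates. The attribute names, weights and scores are assumptions, not the authors' data or their exact method.

```python
# Hypothetical attribute weights (summing to 1) and 0-10 scores per resin candidate.
weights = {"viscosity": 0.35, "toughness": 0.25, "processing_temp": 0.2, "cost": 0.2}

candidates = {
    "thermoplastic_A": {"viscosity": 7, "toughness": 8, "processing_temp": 6, "cost": 5},
    "thermoplastic_B": {"viscosity": 5, "toughness": 9, "processing_temp": 7, "cost": 7},
    "thermoplastic_C": {"viscosity": 8, "toughness": 6, "processing_temp": 5, "cost": 8},
}

# Weighted-sum score per candidate; higher is better under these assumed weights.
scores = {
    name: sum(weights[a] * attrs[a] for a in weights)
    for name, attrs in candidates.items()
}
for name, score in sorted(scores.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{name}: {score:.2f}")
```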

  20. Mechanistic basis for high stereoselectivity and broad substrate scope in the (salen)Co(III)-catalyzed hydrolytic kinetic resolution.

    Science.gov (United States)

    Ford, David D; Nielsen, Lars P C; Zuend, Stephan J; Musgrave, Charles B; Jacobsen, Eric N

    2013-10-16

    In the (salen)Co(III)-catalyzed hydrolytic kinetic resolution (HKR) of terminal epoxides, the rate- and stereoselectivity-determining epoxide ring-opening step occurs by a cooperative bimetallic mechanism with one Co(III) complex acting as a Lewis acid and another serving to deliver the hydroxide nucleophile. In this paper, we analyze the basis for the extraordinarily high stereoselectivity and broad substrate scope observed in the HKR. We demonstrate that the stereochemistry of each of the two (salen)Co(III) complexes in the rate-determining transition structure is important for productive catalysis: a measurable rate of hydrolysis occurs only if the absolute stereochemistry of each of these (salen)Co(III) complexes is the same. Experimental and computational studies provide strong evidence that stereochemical communication in the HKR is mediated by the stepped conformation of the salen ligand, and not the shape of the chiral diamine backbone of the ligand. A detailed computational analysis reveals that the epoxide binds the Lewis acidic Co(III) complex in a well-defined geometry imposed by stereoelectronic rather than steric effects. This insight serves as the basis of a complete stereochemical and transition structure model that sheds light on the reasons for the broad substrate generality of the HKR.

  1. Business analysis methodology in telecommunication industry – the research based on the grounded theory

    Directory of Open Access Journals (Sweden)

    Hana Nenickova

    2013-10-01

    Full Text Available The objective of this article is to present the use of grounded theory in qualitative research as a basis for building a business analysis methodology for the implementation of information systems in telecommunication enterprises in the Czech Republic. In preparing the methodology I have drawn on the current needs of telecommunications companies, which are characterized mainly by a high dependence on information systems. Besides that, this industry is characterized by high flexibility, strong competition and a compression of the corporate strategy timeline. The grounded theory of business analysis captures the specifics of the telecommunications industry, focusing on a very specific description of the procedure for collecting business requirements and following the business strategy.

  2. AEGIS methodology and a perspective from AEGIS methodology demonstrations

    International Nuclear Information System (INIS)

    Dove, F.H.

    1981-03-01

    Objectives of AEGIS (Assessment of Effectiveness of Geologic Isolation Systems) are to develop the capabilities needed to assess the post-closure safety of waste isolation in geologic formations; demonstrate these capabilities on reference sites; apply the assessment methodology to assist the NWTS program in site selection, waste package and repository design; and perform repository site analyses for the licensing needs of NWTS. This paper summarizes the AEGIS methodology and the experience gained from methodology demonstrations, and provides an overview of the following areas: estimation of the response of a repository to perturbing geologic and hydrologic events; estimation of the transport of radionuclides from a repository to man; and assessment of uncertainties

  3. Researching Lean: Methodological Implications of Loose Definitions

    Directory of Open Access Journals (Sweden)

    Mikael Brännmark

    2012-12-01

    Full Text Available Recently, Lean Production (Lean) has become a prevailing management concept in Sweden. However, previous research seems to show that the Lean concept and the impact of Lean vary considerably between organizations. This paper illustrates some key methodological issues that need to be considered when researching loosely defined management concepts such as Lean. The paper is based on a review of the literature and five comparative Swedish case studies. Our study indicates that Lean has changed over time and that operationalization and interpretations of the concept vary considerably. This study concludes that future Lean studies should include a thorough assessment of the Lean interventions, study settings, and in particular non-Lean factors mediating the outcomes of Lean-inspired change programs.

  4. Defining European Wholesale Electricity Markets. An 'And/Or' Approach

    Energy Technology Data Exchange (ETDEWEB)

    Dijkgraaf, E. [Erasmus School of Economics, Erasmus University Rotterdam, Rotterdam (Netherlands); Janssen, M.C.W. [University of Vienna, Vienna (Austria)

    2009-09-15

    An important question in the dynamic European wholesale markets for electricity is whether to define the geographical market at the level of an individual member state or more broadly. We show that if we currently take the traditional approach by considering for each member state whether there is one single other country that provides a substitute for domestic production, the market in each separate member state has still to be considered a separate market. However, if we allow for the possibility that at different moments in time there is another country that provides a substitute for domestic production, then the conclusion should be that certain member states do not constitute a separate geographical market. This is in particular true for Belgium, but also for The Netherlands, France, and to some extent also for Germany and Austria. We call this alternative approach the 'and/or' approach.

  5. Rating the methodological quality in systematic reviews of studies on measurement properties: a scoring system for the COSMIN checklist.

    Science.gov (United States)

    Terwee, Caroline B; Mokkink, Lidwine B; Knol, Dirk L; Ostelo, Raymond W J G; Bouter, Lex M; de Vet, Henrica C W

    2012-05-01

    The COSMIN checklist is a standardized tool for assessing the methodological quality of studies on measurement properties. It contains 9 boxes, each dealing with one measurement property, with 5-18 items per box about design aspects and statistical methods. Our aim was to develop a scoring system for the COSMIN checklist to calculate quality scores per measurement property when using the checklist in systematic reviews of measurement properties. The scoring system was developed based on discussions among experts and testing of the scoring system on 46 articles from a systematic review. Four response options were defined for each COSMIN item (excellent, good, fair, and poor). A quality score per measurement property is obtained by taking the lowest rating of any item in a box ("worst score counts"). Specific criteria for excellent, good, fair, and poor quality for each COSMIN item are described. In defining the criteria, the "worst score counts" algorithm was taken into consideration. This means that only fatal flaws were defined as poor quality. The scores of the 46 articles show how the scoring system can be used to provide an overview of the methodological quality of studies included in a systematic review of measurement properties. Based on experience in testing this scoring system on 46 articles, the COSMIN checklist with the proposed scoring system seems to be a useful tool for assessing the methodological quality of studies included in systematic reviews of measurement properties.
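    The 'worst score counts' rule described in the abstract is straightforward to express; the sketch below applies it to hypothetical item ratings for a single COSMIN box (the ratings themselves are made up for illustration).

```python
# Ordinal ranking of the four COSMIN response options.
RANK = {"poor": 0, "fair": 1, "good": 2, "excellent": 3}

def box_quality(item_ratings):
    """Quality score for one COSMIN box: the lowest rating of any item
    ('worst score counts')."""
    return min(item_ratings, key=lambda r: RANK[r])

# Hypothetical item ratings for one measurement-property box.
print(box_quality(["excellent", "good", "fair", "good"]))  # -> "fair"
```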

  6. New Constraints on Quasar Broad Absorption and Emission Line Regions from Gravitational Microlensing

    Energy Technology Data Exchange (ETDEWEB)

    Hutsemékers, Damien; Braibant, Lorraine; Sluse, Dominique [Institut d' Astrophysique et de Géophysique, Université de Liège, Liège (Belgium); Anguita, Timo [Departamento de Ciencias Fisicas, Universidad Andres Bello, Santiago (Chile); Goosmann, René, E-mail: hutsemekers@astro.ulg.ac.be [Observatoire Astronomique de Strasbourg, Université de Strasbourg, Strasbourg (France)

    2017-09-29

    Gravitational microlensing is a powerful tool allowing one to probe the structure of quasars on sub-parsec scale. We report recent results, focusing on the broad absorption and emission line regions. In particular microlensing reveals the intrinsic absorption hidden in the P Cygni-type line profiles observed in the broad absorption line quasar H1413+117, as well as the existence of an extended continuum source. In addition, polarization microlensing provides constraints on the scattering region. In the quasar Q2237+030, microlensing differently distorts the Hα and CIV broad emission line profiles, indicating that the low- and high-ionization broad emission lines must originate from regions with distinct kinematical properties. We also present simulations of the effect of microlensing on line profiles considering simple but representative models of the broad emission line region. Comparison of observations to simulations allows us to conclude that the Hα emitting region in Q2237+030 is best represented by a Keplerian disk.

  7. New Constraints on Quasar Broad Absorption and Emission Line Regions from Gravitational Microlensing

    Directory of Open Access Journals (Sweden)

    Damien Hutsemékers

    2017-09-01

    Full Text Available Gravitational microlensing is a powerful tool allowing one to probe the structure of quasars on sub-parsec scale. We report recent results, focusing on the broad absorption and emission line regions. In particular microlensing reveals the intrinsic absorption hidden in the P Cygni-type line profiles observed in the broad absorption line quasar H1413+117, as well as the existence of an extended continuum source. In addition, polarization microlensing provides constraints on the scattering region. In the quasar Q2237+030, microlensing differently distorts the Hα and CIV broad emission line profiles, indicating that the low- and high-ionization broad emission lines must originate from regions with distinct kinematical properties. We also present simulations of the effect of microlensing on line profiles considering simple but representative models of the broad emission line region. Comparison of observations to simulations allows us to conclude that the Hα emitting region in Q2237+030 is best represented by a Keplerian disk.

  8. Defining Quality in Cardiovascular Imaging: A Scientific Statement From the American Heart Association.

    Science.gov (United States)

    Shaw, Leslee J; Blankstein, Ron; Jacobs, Jill E; Leipsic, Jonathon A; Kwong, Raymond Y; Taqueti, Viviany R; Beanlands, Rob S B; Mieres, Jennifer H; Flamm, Scott D; Gerber, Thomas C; Spertus, John; Di Carli, Marcelo F

    2017-12-01

    The aims of the current statement are to refine the definition of quality in cardiovascular imaging and to propose novel methodological approaches to inform the demonstration of quality in imaging in future clinical trials and registries. We propose defining quality in cardiovascular imaging using an analytical framework put forth by the Institute of Medicine whereby quality was defined as testing being safe, effective, patient-centered, timely, equitable, and efficient. The implications of each of these components of quality health care are as essential for cardiovascular imaging as they are for other areas within health care. Our proposed statement may serve as the foundation for integrating these quality indicators into establishing designations of quality laboratory practices and developing standards for value-based payment reform for imaging services. We also include recommendations for future clinical research to fulfill quality aims within cardiovascular imaging, including clinical hypotheses of improving patient outcomes, the importance of health status as an end point, and deferred testing options. Future research should evolve to define novel methods optimized for the role of cardiovascular imaging for detecting disease and guiding treatment and to demonstrate the role of cardiovascular imaging in facilitating healthcare quality. © 2017 American Heart Association, Inc.

  9. Fourier evaluation of broad Moessbauer spectra

    International Nuclear Information System (INIS)

    Vincze, I.

    1981-09-01

    It is shown by the Fourier analysis of broad Moessbauer spectra that the even part of the distribution of the dominant hyperfine interaction (hyperfine field or quadrupole splitting) can be obtained directly without using least-square fitting procedures. Also the odd part of this distribution correlated with other hyperfine parameters (e.g. isomer shift) can be directly determined. Examples covering the case of amorphous magnetic and paramagnetic iron-based alloys are presented. (author)

  10. Development of new assessment methodology for locally corroded pipe

    International Nuclear Information System (INIS)

    Lim, Hwan; Shim, Do Jun; Kim, Yun Jae; Kim, Young Jin

    2002-01-01

    In this paper, a unified methodology based on the local stress concept to estimate residual strength of locally thinned pipes is proposed. An underlying idea of the proposed methodology is that the local stress in the minimum section for locally thinned pipe is related to the reference stress, popularly used in creep problems. Then the problem remains how to define the reference stress, that is the reference load. Extensive three-dimensional Finite Element (FE) analyses were performed to simulate full-scale pipe tests conducted for various shapes of wall thinned area under internal pressure and bending moment. Based on these FE results, the reference load is proposed, which is independent of materials. A natural outcome of this method is the maximum load capacity. By comparing with existing test results, it is shown that the reference stress is related to the fracture stress, which in turn can be posed as the fracture criterion of locally thinned pipes. The proposed method is powerful as it can be easily generalised to more complex problems, such as pipe bends and tee-joints

  11. Development of a reference biospheres methodology for radioactive waste disposal. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Dorp, F van [NAGRA (Switzerland); and others

    1996-09-01

    The BIOMOVS II Working Group on Reference Biospheres has focused on the definition and testing of a methodology for developing models to analyse radionuclide behaviour in the biosphere and associated radiological exposure pathways (a Reference Biospheres Methodology). The Working Group limited the scope to the assessment of the long-term implications of solid radioactive waste disposal. Nevertheless, it is considered that many of the basic principles would be equally applicable to other areas of biosphere assessment. The recommended methodology has been chosen to be relevant to different types of radioactive waste and disposal concepts. It includes the justification, arguments and documentation for all the steps in the recommended methodology. The previous experience of members of the Reference Biospheres Working Group was that the underlying premises of a biosphere assessment have often been taken for granted at the early stages of model development, and can therefore fail to be recognized later on when questions of model sufficiency arise, for example, because of changing regulatory requirements. The intention has been to define a generic approach for the formation of an 'audit trail' and hence provide demonstration that a biosphere model is fit for its intended purpose. The starting point for the methodology has three elements. The Assessment Context sets out what the assessment has to achieve, e.g. in terms of assessment purpose and related regulatory criteria, as well as information about the repository system and types of release from the geosphere. The Basic System Description includes the fundamental premises about future climate conditions and human behaviour which, to a significant degree, are beyond prediction. The International FEP List is a generically relevant list of Features, Events and Processes potentially important for biosphere model development. The International FEP List includes FEPs to do with the assessment context. The context examined in detail by

  12. Development of a reference biospheres methodology for radioactive waste disposal. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Dorp, F. van [NAGRA (Switzerland)] [and others

    1996-09-01

    The BIOMOVS II Working Group on Reference Biospheres has focused on the definition and testing of a methodology for developing models to analyse radionuclide behaviour in the biosphere and associated radiological exposure pathways (a Reference Biospheres Methodology). The Working Group limited the scope to the assessment of the long-term implications of solid radioactive waste disposal. Nevertheless, it is considered that many of the basic principles would be equally applicable to other areas of biosphere assessment. The recommended methodology has been chosen to be relevant to different types of radioactive waste and disposal concepts. It includes the justification, arguments and documentation for all the steps in the recommended methodology. The previous experience of members of the Reference Biospheres Working Group was that the underlying premises of a biosphere assessment have often been taken for granted at the early stages of model development, and can therefore fail to be recognized later on when questions of model sufficiency arise, for example, because of changing regulatory requirements. The intention has been to define a generic approach for the formation of an 'audit trail' and hence provide demonstration that a biosphere model is fit for its intended purpose. The starting point for the methodology has three elements. The Assessment Context sets out what the assessment has to achieve, e.g. in terms of assessment purpose and related regulatory criteria, as well as information about the repository system and types of release from the geosphere. The Basic System Description includes the fundamental premises about future climate conditions and human behaviour which, to a significant degree, are beyond prediction. The International FEP List is a generically relevant list of Features, Events and Processes potentially important for biosphere model development. The International FEP List includes FEPs to do with the assessment context. The context examined in

  13. Establishing Normative Reference Values for Standing Broad Jump among Hungarian Youth

    Science.gov (United States)

    Saint-Maurice, Pedro F.; Laurson, Kelly R.; Kaj, Mónika; Csányi, Tamás

    2015-01-01

    Purpose: The purpose of this study was to examine age and sex trends in anaerobic power assessed by a standing broad jump and to determine norm-referenced values for youth in Hungary. Method: A sample of 2,427 Hungarian youth (1,360 boys and 1,067 girls) completed the standing broad jump twice, and the highest distance score was recorded. Quantile…

  14. Mednet: the very broad-band seismic network for the Mediterranean

    International Nuclear Information System (INIS)

    Boschi, E.; Giardini, D.; Morelli, A.

    1991-01-01

    Mednet is the very broad-band seismic network installed by the Istituto Nazionale di Geofisica (ING) in countries of the Mediterranean area, with a final goal of 12-15 stations and a spacing of about 1000 km between stations. The project started in 1987 and will be completed within 1992. Mednet is motivated both by research interest and by seismic hazard monitoring; it will make it possible to define the structure of the Mediterranean region in high detail, to study properties of the seismic source for intermediate and large events, and to apply this knowledge to civil protection procedures. To reach its goals, the network has been designed following the highest technical standards: STS-1/VBB sensors, Quanterra 24-bit A/D converters with 140 dB dynamic range, and real-time telemetry. Five sites are now operational in Italy (L'Aquila, Bardonecchia and Villasalto) and in North African countries (Midelt, Morocco; Gafsa, Tunisia); other sites are under construction in Pakistan (Islamabad), Iraq (Rutba) and Egypt (Kottamya), while locations are being examined for stations in Greece, Yugoslavia and Algeria. The centre of the Mednet network is the data center (MDC) in Rome; its tasks include data collection, verification, quality control, archival and dissemination, monitoring of station performance, event detection, and routine determination of source parameters. Data distribution will follow the guidelines set by FDSN, and will be coordinated with other international network projects

  15. Non-equilibrium dynamics of disordered systems: understanding the broad continuum of relevant time scales via a strong-disorder RG in configuration space

    International Nuclear Information System (INIS)

    Monthus, Cecile; Garel, Thomas

    2008-01-01

    We show that an appropriate description of the non-equilibrium dynamics of disordered systems is obtained through a strong disorder renormalization procedure in configuration space that we define for any master equation with transition rates W(C→C') between configurations. The idea is to eliminate iteratively the configuration with the highest exit rate W_out(C) ≡ Σ_{C'} W(C→C'), to obtain renormalized transition rates between the remaining configurations. The multiplicative structure of the newly generated transition rates suggests that, for a very broad class of disordered systems, the distribution of renormalized exit barriers defined as B_out(C) ≡ -ln W_out(C) will become broader and broader upon iteration, so that the strong disorder renormalization procedure should become asymptotically exact at large time scales. We have checked this scenario numerically for the non-equilibrium dynamics of a directed polymer in a two-dimensional random medium
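
    To make the elimination step concrete, the following minimal Python sketch (not taken from the paper) applies the procedure to a small rate matrix stored as nested dictionaries; the renormalization rule used when a configuration is removed is the standard strong-disorder prescription W'(C→C') = W(C→C') + W(C→C*)·W(C*→C')/W_out(C*), and all rates are invented for illustration.

        import math

        def exit_rate(W, c):
            """Total exit rate W_out(C): sum of W(C -> C') over all C'."""
            return sum(W[c].values())

        def eliminate_fastest(W):
            """One strong-disorder RG step in configuration space: remove the
            configuration with the highest exit rate and renormalize the rates
            between the surviving configurations."""
            cstar = max(W, key=lambda c: exit_rate(W, c))
            wout = exit_rate(W, cstar)
            survivors = [c for c in W if c != cstar]
            newW = {c: {cp: r for cp, r in W[c].items() if cp != cstar}
                    for c in survivors}
            for c in survivors:
                win = W[c].get(cstar, 0.0)   # rate from c into the eliminated configuration
                if win == 0.0:
                    continue
                for cp in survivors:
                    if cp == c:
                        continue
                    # multiplicative structure of the newly generated rates
                    newW[c][cp] = newW[c].get(cp, 0.0) + win * W[cstar].get(cp, 0.0) / wout
            return newW, cstar, -math.log(wout)   # -ln W_out(C*) is the renormalized barrier

        # Toy example with four configurations and arbitrary rates.
        W = {"A": {"B": 2.0, "C": 0.1},
             "B": {"A": 0.5, "D": 1.5},
             "C": {"D": 0.01},
             "D": {"C": 0.3}}
        W, removed, barrier = eliminate_fastest(W)
        print(removed, round(barrier, 3), W)

    Iterating this step and recording the successive barriers -ln W_out would, in the regime the authors describe, yield an increasingly broad barrier distribution, which is the signature of the asymptotic exactness argued for above.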

  16. Clinical trial methodology

    National Research Council Canada - National Science Library

    Peace, Karl E; Chen, Ding-Geng

    2011-01-01

    ... in the pharmaceutical industry, Clinical trial methodology emphasizes the importance of statistical thinking in clinical research and presents the methodology as a key component of clinical research...

  17. PG 1700 + 518 - a low-redshift, broad absorption line QSO

    International Nuclear Information System (INIS)

    Pettini, M.; Boksenberg, A.

    1985-01-01

    The first high-resolution optical spectra and lower resolution UV spectra of PG 1700 + 518, the only known broad-absorption-line (BAL) QSO at low emission redshift (0.288), are presented. The optical data were obtained with the Isaac Newton Telescope on the island of La Palma and the UV data with the International Ultraviolet Explorer satellite. The outstanding feature of the optical spectrum is a strong, broad Mg II absorption trough, detached from the Mg II emission line and indicative of ejection velocities of between 7000 and 18,000 km/s. Also detected were narrow (FWHM = 350 km/s) Mg II absorption lines at z_abs = 0.2698, which are probably related to the mass ejection phenomenon. It is concluded that the emission-line spectrum is similar to that of other low-redshift QSOs although there are some obvious differences from typical BAL QSOs, most notably in the unusually low level of ionization of both emission-line and broad absorption line gas. 21 references

  18. Methodology, Measurement and Analysis of Flow Table Update Characteristics in Hardware OpenFlow Switches

    KAUST Repository

    Kuźniar, Maciej

    2018-02-15

    Software-Defined Networking (SDN) and OpenFlow are actively being standardized and deployed. These deployments rely on switches that come from various vendors and differ in terms of performance and available features. Understanding these differences and performance characteristics is essential for ensuring successful and safe deployments. We propose a systematic methodology for SDN switch performance analysis and devise a series of experiments based on this methodology. The methodology relies on sending a stream of rule updates while observing both the control plane view as reported by the switch and the data plane state obtained by probing, and it determines switch characteristics by comparing these two views. We measure, report and explain the performance characteristics of flow table updates in six hardware OpenFlow switches. Our results describing rule update rates can help SDN designers make their controllers efficient. Further, we also highlight differences between the OpenFlow specification and its implementations that, if ignored, pose a serious threat to network security and correctness.
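
    As a rough illustration of the comparison between the two views (this is not the authors' tooling), the sketch below measures the lag between the switch confirming a rule update on the control plane and the rule actually taking effect in the data plane; both helper functions are hypothetical stand-ins for real OpenFlow flow-mod/barrier exchanges and crafted probe packets.

        import time

        START = time.monotonic()

        # Hypothetical placeholders: a real harness would send OpenFlow flow_mod +
        # barrier messages and inject probe packets matching the new rule.
        def install_rule_and_wait_for_barrier(rule_id):
            time.sleep(0.001)                        # pretend the switch acknowledged the update
            return time.monotonic()

        def rule_matches_in_data_plane(rule_id):
            return time.monotonic() > START + 0.05   # pretend rules only take effect after 50 ms

        lags = []
        for rule_id in range(100):
            confirmed_at = install_rule_and_wait_for_barrier(rule_id)
            # Poll the data plane until the rule actually forwards traffic.
            while not rule_matches_in_data_plane(rule_id):
                time.sleep(0.001)
            lags.append(time.monotonic() - confirmed_at)

        print(f"mean control-vs-data-plane lag: {1000 * sum(lags) / len(lags):.1f} ms")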

  19. Waste Package Design Methodology Report

    Energy Technology Data Exchange (ETDEWEB)

    D.A. Brownson

    2001-09-28

    The objective of this report is to describe the analytical methods and processes used by the Waste Package Design Section to establish the integrity of the various waste package designs, the emplacement pallet, and the drip shield. The scope of this report shall be the methodology used in criticality, risk-informed, shielding, source term, structural, and thermal analyses. The basic features and appropriateness of the methods are illustrated, and the processes are defined whereby input values and assumptions flow through the application of those methods to obtain designs that ensure defense-in-depth as well as satisfy requirements on system performance. Such requirements include those imposed by federal regulation, from both the U.S. Department of Energy (DOE) and U.S. Nuclear Regulatory Commission (NRC), and those imposed by the Yucca Mountain Project to meet repository performance goals. The report is to be used, in part, to describe the waste package design methods and techniques to be used for producing input to the License Application Report.

  20. Waste Package Design Methodology Report

    International Nuclear Information System (INIS)

    D.A. Brownson

    2001-01-01

    The objective of this report is to describe the analytical methods and processes used by the Waste Package Design Section to establish the integrity of the various waste package designs, the emplacement pallet, and the drip shield. The scope of this report shall be the methodology used in criticality, risk-informed, shielding, source term, structural, and thermal analyses. The basic features and appropriateness of the methods are illustrated, and the processes are defined whereby input values and assumptions flow through the application of those methods to obtain designs that ensure defense-in-depth as well as satisfy requirements on system performance. Such requirements include those imposed by federal regulation, from both the U.S. Department of Energy (DOE) and U.S. Nuclear Regulatory Commission (NRC), and those imposed by the Yucca Mountain Project to meet repository performance goals. The report is to be used, in part, to describe the waste package design methods and techniques to be used for producing input to the License Application Report

  1. Applying GRADE-CERQual to qualitative evidence synthesis findings-paper 3: how to assess methodological limitations.

    Science.gov (United States)

    Munthe-Kaas, Heather; Bohren, Meghan A; Glenton, Claire; Lewin, Simon; Noyes, Jane; Tunçalp, Özge; Booth, Andrew; Garside, Ruth; Colvin, Christopher J; Wainwright, Megan; Rashidian, Arash; Flottorp, Signe; Carlsen, Benedicte

    2018-01-25

    The GRADE-CERQual (Confidence in Evidence from Reviews of Qualitative research) approach has been developed by the GRADE (Grading of Recommendations Assessment, Development and Evaluation) Working Group. The approach has been developed to support the use of findings from qualitative evidence syntheses in decision-making, including guideline development and policy formulation. CERQual includes four components for assessing how much confidence to place in findings from reviews of qualitative research (also referred to as qualitative evidence syntheses): (1) methodological limitations, (2) coherence, (3) adequacy of data and (4) relevance. This paper is part of a series providing guidance on how to apply CERQual and focuses on CERQual's methodological limitations component. We developed the methodological limitations component by searching the literature for definitions, gathering feedback from relevant research communities and developing consensus through project group meetings. We tested the CERQual methodological limitations component within several qualitative evidence syntheses before agreeing on the current definition and principles for application. When applying CERQual, we define methodological limitations as the extent to which there are concerns about the design or conduct of the primary studies that contributed evidence to an individual review finding. In this paper, we describe the methodological limitations component and its rationale and offer guidance on how to assess methodological limitations of a review finding as part of the CERQual approach. This guidance outlines the information required to assess the methodological limitations component, the steps that need to be taken to assess methodological limitations of data contributing to a review finding and examples of methodological limitation assessments. This paper provides guidance for review authors and others on undertaking an assessment of methodological limitations in the context of the CERQual

  2. Risk Factors for Emergence of Resistance to Broad-Spectrum Cephalosporins among Enterobacter spp.

    Science.gov (United States)

    Kaye, Keith S.; Cosgrove, Sara; Harris, Anthony; Eliopoulos, George M.; Carmeli, Yehuda

    2001-01-01

    Among 477 patients with susceptible Enterobacter spp., 49 subsequently harbored third-generation cephalosporin-resistant Enterobacter spp. Broad-spectrum cephalosporins were independent risk factors for resistance (relative risk [OR] = 2.3, P = 0.01); quinolone therapy was protective (OR = 0.4, P = 0.03). There were trends toward decreased risk for resistance among patients receiving broad-spectrum cephalosporins and either aminoglycosides or imipenem. Of the patients receiving broad-spectrum cephalosporins, 19% developed resistance. PMID:11502540

  3. Multiphysics Simulation of Welding-Arc and Nozzle-Arc System: Mathematical-Model, Solution-Methodology and Validation

    Science.gov (United States)

    Pawar, Sumedh; Sharma, Atul

    2018-01-01

    This work presents a mathematical model and solution methodology for a multiphysics engineering problem on arc formation during welding and inside a nozzle. A general-purpose commercial CFD solver, ANSYS FLUENT 13.0.0, is used in this work. Arc formation involves strongly coupled gas dynamics and electro-dynamics, simulated by solution of the coupled Navier-Stokes equations, Maxwell's equations and the radiation heat-transfer equation. Validation of the present numerical methodology is demonstrated by excellent agreement with published results. The developed mathematical model and the user defined functions (UDFs) are independent of the geometry and are applicable to any system that involves arc formation, in a 2D axisymmetric coordinate system. The high-pressure flow of SF6 gas in the nozzle-arc system resembles the arc chamber of an SF6 gas circuit breaker; thus, this methodology can be extended to simulate the arcing phenomenon during current interruption.

  4. Soft Systems Methodology and Problem Framing: Development of an Environmental Problem Solving Model Respecting a New Emergent Reflexive Paradigm.

    Science.gov (United States)

    Gauthier, Benoit; And Others

    1997-01-01

    Identifies the more representative problem-solving models in environmental education. Suggests the addition of a strategy for defining a problem situation using Soft Systems Methodology to environmental education activities explicitly designed for the development of critical thinking. Contains 45 references. (JRH)

  5. African Journals Online: General Science (broad subject range)

    African Journals Online (AJOL)

    Items 51 - 86 of 86 ... The Journal of Science and Technology (JUST) aims principally at publishing articles ... basic sciences, public health, social medicine and medical politics. .... The SWJ is a peer review on-line international journal of broad ...

  6. A postmenopausal woman with sciatica from broad ligament leiomyoma: a case report.

    Science.gov (United States)

    Tsai, Ya-Chu May

    2016-10-31

    Unilateral lower abdominal pain and/or sciatic nerve pain is a common presentation in the elderly population. Broad ligament leiomyomas are rare, and it is rarer still for such leiomyomas to be clinically significant. Thus, we highlight a case of symptomatic broad ligament leiomyoma in a postmenopausal woman whose symptoms improved after definitive treatment. A 62-year-old postmenopausal Macedonian woman was referred to our gynecological department with unexplained pain in her left leg and left iliac fossa region on walking. There was minimal relief with increasing analgesia use prescribed by the family physician. Investigations revealed an ipsilateral adnexal mass, and subsequent treatment with laparoscopic broad ligament myomectomy helped to alleviate her symptoms. Our case highlights the importance of staying mindful of alternative diagnoses when faced with a common presentation of iliac fossa pain and pain in the leg. Although broad ligament leiomyomas are benign tumors, the uncommon symptomatic presentation led us to report and focus some attention on this type of tumor.

  7. Broad economic benefits of freight transportation infrastructure improvement.

    Science.gov (United States)

    2012-06-01

    This project strives to introduce a novel way to quantify the broad re-organization benefits associated with an improvement in the freight infrastructure. Using the approach based on 1) the technique known as Field of Influence, and 2) RAS adjust...

  8. Argument for a non-standard broad-line region

    International Nuclear Information System (INIS)

    Collin, S.

    1987-01-01

    The region emitting the broad lines (BLR) in quasars and AGN has a "Standard Status". It is shown that this status raises strong problems concerning the energetic budget and the thermal state of the BLR. A possible solution is proposed [fr

  9. Defining the suitability for nectar production, bee bread and honeydew in managed forests (Trentino, Italy

    Directory of Open Access Journals (Sweden)

    Miori M

    2007-01-01

    Full Text Available The project's aim was to locate the wooded areas suitable for beekeeping activities. This has been possible thanks to the use of a multi-parametric model, which makes it possible to define, for each of the 85 forest types of Trentino, the suitability for the production of nectar, bee bread and honeydew. According to the results, the forest types have been divided into 4 productivity classes. The data have been reprocessed with GIS methodology so that high and medium productivity areas could be mapped. Subsequently, new parameters have been introduced (distance from roads, slope, exposure) in order to highlight in the map the economically most important areas for beekeeping activities. In the next stage the apiaries' positions in the examined areas have been registered with GPS. These registrations have been used to compare the theoretical results with the actual distribution of beekeeping activities. The experimental stage showed that this methodology represents a useful tool to support beekeeping and, more generally, forest planning.

  10. Simplified life cycle assessment models: methodological framework and applications to energy pathways

    International Nuclear Information System (INIS)

    Padey, Pierryves

    2013-01-01

    The energy transition debate is a key issue for today and the coming years. One of the challenges is to limit the environmental impacts of electricity production. Decision support tools that are sufficiently accurate, simple to use, account for environmental aspects and inform future energy choices must be implemented. However, the environmental assessment of energy pathways is complex, and it requires a two-level characterization. The 'energy pathway' is the first level and corresponds to the environmental distribution of the pathway as a whole, so that overall pathways can be compared. The 'system pathway' is the second level and compares the environmental impacts of systems within each pathway. We have devised a generic methodology covering both characterization levels by estimating the environmental profile of an energy pathway while allowing a simple comparison of the environmental impacts of its systems. This methodology is based on the definition of a parameterized Life Cycle Assessment model and considers, through a Global Sensitivity Analysis, the environmental impacts of a large sample of systems representative of an energy pathway. As a second step, the methodology defines simplified models based on a few key parameters identified as inducing the largest variability in the environmental impacts of the energy pathway. These models assess the environmental impacts of the systems in a simple way, avoiding any complex LCA. This reduction methodology has been applied to the onshore wind power energy pathway in Europe and the photovoltaic energy pathway in France. (author)
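
    A toy numerical sketch of the reduction idea (the impact function, parameter ranges and wind-power figures are all invented): sample a parameterized per-kWh impact model over a pathway-representative parameter space and rank the parameters by a crude variance-share proxy; the few top-ranked parameters are the ones a simplified model would keep.

        import numpy as np

        rng = np.random.default_rng(0)

        # Hypothetical parameterized LCA model of a 2 MW wind turbine:
        # life-cycle impact per kWh as a function of a few system parameters.
        def impact(load_factor, lifetime_yr, mass_t):
            embodied = 800.0 * mass_t                                 # kg CO2-eq embodied (illustrative)
            produced = load_factor * 8760.0 * lifetime_yr * 2000.0    # kWh produced over the lifetime
            return 1000.0 * embodied / produced                       # g CO2-eq per kWh

        # Sample a parameter space meant to represent the whole pathway.
        n = 10_000
        samples = {
            "load_factor": rng.uniform(0.20, 0.45, n),
            "lifetime_yr": rng.uniform(15.0, 30.0, n),
            "mass_t":      rng.uniform(150.0, 400.0, n),
        }
        y = impact(samples["load_factor"], samples["lifetime_yr"], samples["mass_t"])

        # Crude sensitivity ranking: squared correlation as a stand-in for a
        # first-order index from a full Global Sensitivity Analysis.
        for name, x in samples.items():
            r = np.corrcoef(x, y)[0, 1]
            print(f"{name:12s} variance share ~ {r ** 2:.2f}")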

  11. Can play be defined?

    DEFF Research Database (Denmark)

    Eichberg, Henning

    2015-01-01

    Can play be defined? There is reason to raise critical questions about the established academic demand that a phenomenon – also in humanist studies – should first of all be defined, i.e. de-lineated and by neat lines limited to a “little box” that can be handled. The following chapter develops....... Human beings can very well understand play – or whatever phenomenon in human life – without defining it.

  12. Systematic development and optimization of chemically defined medium supporting high cell density growth of Bacillus coagulans.

    Science.gov (United States)

    Chen, Yu; Dong, Fengqing; Wang, Yonghong

    2016-09-01

    With their defined composition and experimental reproducibility, the chemically defined medium (CDM) and the minimal chemically defined medium (MCDM) are used in many metabolism and regulation studies. This research aimed to develop a chemically defined medium supporting high cell density growth of Bacillus coagulans, which is a promising producer of lactic acid and other bio-chemicals. In this study, a systematic methodology combining the experimental technique with flux balance analysis (FBA) was proposed to design and simplify a CDM. The single omission technique and single addition technique were employed to determine the essential and stimulatory compounds, before the optimization of their concentrations by statistical methods. In addition, to improve the growth rationally, in silico omission and addition were performed by FBA based on the construction of a medium-size metabolic model of B. coagulans 36D1. Thus, CDMs were developed to obtain considerable biomass production for at least five B. coagulans strains, in which two model strains B. coagulans 36D1 and ATCC 7050 were involved.
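
    The single-omission screen lends itself to a compact sketch; the growth readout below is a placeholder (in practice a plate-reader measurement or an FBA-predicted growth rate), and the component list and thresholds are invented, so only the classification logic is meaningful.

        # Hypothetical components of a candidate chemically defined medium.
        COMPONENTS = ["glucose", "glutamate", "cysteine", "biotin", "Mn2+", "nicotinate"]

        def measure_growth(medium):
            """Placeholder for the experimental or FBA-predicted biomass readout."""
            medium = set(medium)
            if not {"glucose", "glutamate", "biotin"} <= medium:
                return 0.02          # an essential compound is missing
            if "cysteine" not in medium:
                return 0.30          # growth proceeds but is clearly reduced
            return 0.90

        def single_omission(components, stim_threshold=0.5, ess_threshold=0.2):
            """Omit one component at a time and classify it by the growth response."""
            reference = measure_growth(components)
            classes = {}
            for c in components:
                od = measure_growth([x for x in components if x != c])
                if od < ess_threshold * reference:
                    classes[c] = "essential"
                elif od < stim_threshold * reference:
                    classes[c] = "stimulatory"
                else:
                    classes[c] = "dispensable"
            return classes

        print(single_omission(COMPONENTS))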

  13. An experimental methodology for a fuzzy set preference model

    Science.gov (United States)

    Turksen, I. B.; Willson, Ian A.

    1992-01-01

    A flexible fuzzy set preference model first requires approximate methodologies for implementation. Fuzzy sets must be defined for each individual consumer using computer software, requiring a minimum of time and expertise on the part of the consumer. The amount of information needed in defining sets must also be established. The model itself must adapt fully to the subject's choice of attributes (vague or precise), attribute levels, and importance weights. The resulting individual-level model should be fully adapted to each consumer. The methodologies needed to develop this model will be equally useful in a new generation of intelligent systems which interact with ordinary consumers, controlling electronic devices through fuzzy expert systems or making recommendations based on a variety of inputs. The power of personal computers and their acceptance by consumers has yet to be fully utilized to create interactive knowledge systems that fully adapt their function to the user. Understanding individual consumer preferences is critical to the design of new products and the estimation of demand (market share) for existing products, which in turn is an input to management systems concerned with production and distribution. The question of what to make, for whom to make it and how much to make requires an understanding of the customer's preferences and the trade-offs that exist between alternatives. Conjoint analysis is a widely used methodology which decomposes an overall preference for an object into a combination of preferences for its constituent parts (attributes such as taste and price), which are combined using an appropriate combination function. Preferences are often expressed using linguistic terms which cannot be represented in conjoint models. Current models are also not implemented at an individual level, making it difficult to reach meaningful conclusions about the cause of an individual's behavior from an aggregate model. The combination of complex aggregate
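
    A minimal sketch of the kind of individual-level model described here (not the authors' implementation): linguistic attribute judgements are represented by triangular fuzzy sets elicited from a single consumer, and the overall preference uses the simplest additive combination function; every number, attribute name and the choice of operator is an assumption.

        def triangular(x, a, b, c):
            """Membership of x in a triangular fuzzy set with support [a, c] and peak b."""
            if x <= a or x >= c:
                return 0.0
            return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

        # Fuzzy sets one consumer might define for "inexpensive" price and "tasty" flavour.
        inexpensive = lambda price: triangular(price, 0.0, 1.0, 4.0)
        tasty = lambda score: triangular(score, 3.0, 8.0, 10.0)

        # Importance weights elicited from the same consumer (sum to 1 here).
        weights = {"price": 0.6, "taste": 0.4}

        def preference(price, taste_score):
            """Additive conjoint-style combination of the attribute memberships."""
            return weights["price"] * inexpensive(price) + weights["taste"] * tasty(taste_score)

        for product, (p, t) in {"A": (2.5, 9.0), "B": (1.0, 6.0)}.items():
            print(product, round(preference(p, t), 3))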

  14. Observations and Simulations of Formation of Broad Plasma Depletions Through Merging Process

    Science.gov (United States)

    Huang, Chao-Song; Retterer, J. M.; Beaujardiere, O. De La; Roddy, P. A.; Hunton, D.E.; Ballenthin, J. O.; Pfaff, Robert F.

    2012-01-01

    Broad plasma depletions in the equatorial ionosphere near dawn are regions in which the plasma density is reduced by 1-3 orders of magnitude over thousands of kilometers in longitude. This phenomenon is observed repeatedly by the Communication/Navigation Outage Forecasting System (C/NOFS) satellite during deep solar minimum. The plasma flow inside the depletion region can be strongly upward. The possible causal mechanism for the formation of broad plasma depletions is that the broad depletions result from merging of multiple equatorial plasma bubbles. The purpose of this study is to demonstrate the feasibility of the merging mechanism with new observations and simulations. We present C/NOFS observations for two cases. A series of plasma bubbles is first detected by C/NOFS over a longitudinal range of 3300-3800 km around midnight. Each of the individual bubbles has a typical width of approx 100 km in longitude, and the upward ion drift velocity inside the bubbles is 200-400 m/s. The plasma bubbles rotate with the Earth to the dawn sector and become broad plasma depletions. The observations clearly show the evolution from multiple plasma bubbles to broad depletions. Large upward plasma flow occurs inside the depletion region over 3800 km in longitude and exists for approx 5 h. We also present the numerical simulations of bubble merging with the physics-based low-latitude ionospheric model. It is found that two separate plasma bubbles join together and form a single, wider bubble. The simulations show that the merging process of plasma bubbles can indeed occur in incompressible ionospheric plasma. The simulation results support the merging mechanism for the formation of broad plasma depletions.

  15. Software Defined Cyberinfrastructure

    Energy Technology Data Exchange (ETDEWEB)

    Foster, Ian; Blaiszik, Ben; Chard, Kyle; Chard, Ryan

    2017-07-17

    Within and across thousands of science labs, researchers and students struggle to manage data produced in experiments, simulations, and analyses. Largely manual research data lifecycle management processes mean that much time is wasted, research results are often irreproducible, and data sharing and reuse remain rare. In response, we propose a new approach to data lifecycle management in which researchers are empowered to define the actions to be performed at individual storage systems when data are created or modified: actions such as analysis, transformation, copying, and publication. We term this approach software-defined cyberinfrastructure because users can implement powerful data management policies by deploying rules to local storage systems, much as software-defined networking allows users to configure networks by deploying rules to switches. We argue that this approach can enable a new class of responsive distributed storage infrastructure that will accelerate research innovation by allowing any researcher to associate data workflows with data sources, whether local or remote, for such purposes as data ingest, characterization, indexing, and sharing. We report on early experiments with this approach in the context of experimental science, in which a simple if-trigger-then-action (IFTA) notation is used to define rules.
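
    The if-trigger-then-action idea can be sketched in a few lines; the rule notation below is a hypothetical Python rendering rather than the project's actual IFTA syntax, and the event fields and actions are invented.

        from dataclasses import dataclass
        from typing import Callable

        @dataclass
        class Rule:
            """A toy if-trigger-then-action rule attached to a storage system."""
            trigger: Callable[[dict], bool]
            action: Callable[[dict], None]

        rules = [
            # When a new .h5 file lands in the instrument directory, extract metadata.
            Rule(trigger=lambda e: e["type"] == "create" and e["path"].endswith(".h5"),
                 action=lambda e: print("index and extract metadata for", e["path"])),
            # When a dataset is tagged as final, publish it to a shared catalogue.
            Rule(trigger=lambda e: e["type"] == "tag" and e.get("tag") == "final",
                 action=lambda e: print("publish", e["path"], "to the catalogue")),
        ]

        def on_storage_event(event):
            """Evaluate every rule against an event emitted by the storage system."""
            for rule in rules:
                if rule.trigger(event):
                    rule.action(event)

        on_storage_event({"type": "create", "path": "/data/run42/scan.h5"})
        on_storage_event({"type": "tag", "path": "/data/run42", "tag": "final"})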

  16. Defining Tobacco Regulatory Science Competencies.

    Science.gov (United States)

    Wipfli, Heather L; Berman, Micah; Hanson, Kacey; Kelder, Steven; Solis, Amy; Villanti, Andrea C; Ribeiro, Carla M P; Meissner, Helen I; Anderson, Roger

    2017-02-01

    In 2013, the National Institutes of Health and the Food and Drug Administration funded a network of 14 Tobacco Centers of Regulatory Science (TCORS) with a mission that included research and training. A cross-TCORS Panel was established to define tobacco regulatory science (TRS) competencies to help harmonize and guide their emerging educational programs. The purpose of this paper is to describe the Panel's work to develop core TRS domains and competencies. The Panel developed the list of domains and competencies using a semistructured Delphi method divided into four phases occurring between November 2013 and August 2015. The final proposed list included a total of 51 competencies across six core domains and 28 competencies across five specialized domains. There is a need for continued discussion to establish the utility of the proposed set of competencies for emerging TRS curricula and to identify the best strategies for incorporating these competencies into TRS training programs. Given the field's broad multidisciplinary nature, further experience is needed to refine the core domains that should be covered in TRS training programs versus knowledge obtained in more specialized programs. Regulatory science to inform the regulation of tobacco products is an emerging field. The paper provides an initial list of core and specialized domains and competencies to be used in developing curricula for new and emerging training programs aimed at preparing a new cohort of scientists to conduct critical TRS research. © The Author 2016. Published by Oxford University Press on behalf of the Society for Research on Nicotine and Tobacco. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  17. The benefits of defining "snacks".

    Science.gov (United States)

    Hess, Julie M; Slavin, Joanne L

    2018-04-18

    Whether eating a "snack" is considered a beneficial or detrimental behavior is largely based on how "snack" is defined. The term "snack food" tends to connote energy-dense, nutrient-poor foods high in nutrients to limit (sugar, sodium, and/or saturated fat) like cakes, cookies, chips and other salty snacks, and sugar-sweetened beverages. Eating a "snack food" is often conflated with eating a "snack," however, leading to an overall perception of snacks as a dietary negative. Yet the term "snack" can also refer simply to an eating occasion outside of breakfast, lunch, or dinner. With this definition, the evidence to support health benefits or detriments to eating a "snack" remains unclear, in part because relatively few well-designed studies that specifically focus on the impact of eating frequency on health have been conducted. Despite these inconsistencies and research gaps, in much of the nutrition literature, "snacking" is still referred to as detrimental to health. As discussed in this review, however, there are multiple factors that influence the health impacts of snacking, including the definition of "snack" itself, the motivation to snack, body mass index of snack eaters, and the food selected as a snack. Without a definition of "snack" and a body of research using methodologically rigorous protocols, determining the health impact of eating a "snack" will continue to elude the nutrition research community and prevent the development of evidence-based policies about snacking that support public health. Copyright © 2018 Elsevier Inc. All rights reserved.

  18. Defining Documentary Film

    DEFF Research Database (Denmark)

    Juel, Henrik

    2006-01-01

    A discussion of various attempts at defining documentary film regarding form, content, truth, style, genre or reception - and a proposal of a positive list of essential, but non-exclusive characteristics of documentary film...

  19. Framework for applying RI-ISI methodology for Indian PHWRs

    International Nuclear Information System (INIS)

    Vinod, Gopika; Saraf, R.K.; Ghosh, A.K.; Kushwaha, H.S.

    2006-01-01

    Risk Informed In-Service Inspection (RI-ISI) aims at categorizing components for in-service inspection based on their contribution to risk. For defining the risk contribution of components, their failure probabilities and the subsequent effect on Core Damage Frequency (CDF) need to be evaluated using Probabilistic Safety Assessment methodology. During the last several years, both the U.S. Nuclear Regulatory Commission (NRC) and the nuclear industry have recognized that Probabilistic Safety Assessment (PSA) has evolved to be more useful in supplementing traditional engineering approaches in reactor regulation. The paper highlights the various stages involved in applying RI-ISI and then compares the findings with existing ISI practices. (author)
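
    A toy ranking of the categorization step (component names, probabilities, CDF increments and the screening threshold are all invented): each component's risk contribution is approximated as its failure probability times the conditional increase in core damage frequency, and components are binned into inspection categories accordingly.

        # Hypothetical components: (failure probability per year, conditional CDF increase given failure)
        components = {
            "feeder-weld-A": (1e-4, 5e-4),
            "header-weld-B": (5e-6, 2e-3),
            "drain-line-C":  (1e-5, 1e-6),
        }

        def risk_contribution(p_fail, delta_cdf):
            """Very simplified risk measure used only for ranking."""
            return p_fail * delta_cdf

        ranked = sorted(components.items(), key=lambda kv: risk_contribution(*kv[1]), reverse=True)
        for name, (p, d) in ranked:
            r = risk_contribution(p, d)
            category = "high safety significance (inspect)" if r > 1e-8 else "low safety significance"
            print(f"{name:15s} risk ~ {r:.1e} -> {category}")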

  20. Flow characteristics at trapezoidal broad-crested side weir

    Directory of Open Access Journals (Sweden)

    Říha Jaromír

    2015-06-01

    Full Text Available Broad-crested side weirs have been the subject of numerous hydraulic studies; however, the flow field at the weir crest and in front of the weir in the approach channel still has not been fully described. Also, the discharge coefficient of broad-crested side weirs, whether slightly inclined towards the stream or lateral, has yet to be clearly determined. Experimental research was carried out to describe the flow characteristics at low Froude numbers in the approach flow channel for various combinations of in- and overflow discharges. Three side weir types with different oblique angles were studied. Their flow characteristics and discharge coefficients were analyzed and assessed based on the results obtained from extensive measurements performed on a hydraulic model. The empirical relation between the angle of side weir obliqueness, Froude numbers in the up- and downstream channels, and the coefficient of obliqueness was derived.

  1. Visual attention spreads broadly but selects information locally.

    Science.gov (United States)

    Shioiri, Satoshi; Honjyo, Hajime; Kashiwase, Yoshiyuki; Matsumiya, Kazumichi; Kuriki, Ichiro

    2016-10-19

    Visual attention spreads over a range around the focus, as the spotlight metaphor describes. Spatial spread of attentional enhancement and local selection/inhibition are crucial factors determining the profile of spatial attention. Enhancement and ignorance/suppression are opposite effects of attention, and appear to be mutually exclusive. Yet, no unified view of the factors has been provided despite their necessity for understanding the functions of spatial attention. This report provides electroencephalographic and behavioral evidence for attentional spread at an early stage and selection/inhibition at a later stage of visual processing. The steady-state visual evoked potential showed broad spatial tuning, whereas the P3 component of the event-related potential showed local selection or inhibition of the adjacent areas. Based on these results, we propose a two-stage model of spatial attention with broad spread at an early stage and local selection at a later stage.

  2. Methodology of a systematic review.

    Science.gov (United States)

    Linares-Espinós, E; Hernández, V; Domínguez-Escrig, J L; Fernández-Pello, S; Hevia, V; Mayor, J; Padilla-Fernández, B; Ribal, M J

    2018-05-03

    The objective of evidence-based medicine is to employ the best scientific information available to apply to clinical practice. Understanding and interpreting the scientific evidence involves understanding the available levels of evidence, where systematic reviews and meta-analyses of clinical trials are at the top of the levels-of-evidence pyramid. The review process should be well developed and planned to reduce biases and eliminate irrelevant and low-quality studies. The steps for implementing a systematic review include (i) correctly formulating the clinical question to answer (PICO), (ii) developing a protocol (inclusion and exclusion criteria), (iii) performing a detailed and broad literature search and (iv) screening the abstracts of the studies identified in the search and subsequently of the selected complete texts (PRISMA). Once the studies have been selected, we need to (v) extract the necessary data into a form designed in the protocol to summarise the included studies, (vi) assess the biases of each study, identifying the quality of the available evidence, and (vii) develop tables and text that synthesise the evidence. A systematic review involves a critical and reproducible summary of the results of the available publications on a particular topic or clinical question. To improve scientific writing, the methodology is shown in a structured manner to implement a systematic review. Copyright © 2018 AEU. Published by Elsevier España, S.L.U. All rights reserved.

  3. A proposed methodology for computational fluid dynamics code verification, calibration, and validation

    Science.gov (United States)

    Aeschliman, D. P.; Oberkampf, W. L.; Blottner, F. G.

    Verification, calibration, and validation (VCV) of Computational Fluid Dynamics (CFD) codes is an essential element of the code development process. The exact manner in which code VCV activities are planned and conducted, however, is critically important. It is suggested that the way in which code validation, in particular, is often conducted--by comparison to published experimental data obtained for other purposes--is in general difficult and unsatisfactory, and that a different approach is required. This paper describes a proposed methodology for CFD code VCV that meets the technical requirements and is philosophically consistent with code development needs. The proposed methodology stresses teamwork and cooperation between code developers and experimentalists throughout the VCV process, and takes advantage of certain synergisms between CFD and experiment. A novel approach to uncertainty analysis is described which can both distinguish between and quantify various types of experimental error, and whose attributes are used to help define an appropriate experimental design for code VCV experiments. The methodology is demonstrated with an example of laminar, hypersonic, near perfect gas, 3-dimensional flow over a sliced sphere/cone of varying geometrical complexity.

  4. Defining palliative care in cystic fibrosis: A Delphi study.

    Science.gov (United States)

    Dellon, E P; Goggin, J; Chen, E; Sabadosa, K; Hempstead, S E; Faro, A; Homa, K

    2018-05-01

    The goal of palliative care is to improve quality of life for people with serious illness. We aimed to create a cystic fibrosis (CF)-specific definition of palliative care. A working group of 36 CF care providers, researchers, palliative care providers, quality improvement experts, individuals with CF, and CF caregivers completed a series of questionnaires to rate the value of each of 22 attributes of palliative care, rank top attributes to construct definitions of palliative care, and then rate proposed definitions. An average of 28 participants completed each of four questionnaires, with consistent distribution of stakeholder roles across questionnaires. Many identified overlaps in routine CF care and palliative care and highlighted the importance of a definition that feels relevant across the lifespan. Modified Delphi methodology was used to define palliative care in CF. The definition will be used as the foundation for development of CF-specific palliative care guidelines. Copyright © 2017 European Cystic Fibrosis Society. Published by Elsevier B.V. All rights reserved.

  5. Methodologies for evaluation of AECB regulatory program

    International Nuclear Information System (INIS)

    Yarranton, G.A.; Gray, B.J.; Yarranton, M.

    1986-05-01

    AECB (Atomic Energy Control Board) commissioned this report to obtain information about methods of planning and conducting evaluation of its regulatory program. The report begins with a bibliography consisting of 280 abstracts assembled from an extensive search of international literature. Each cited publication describes or uses methods applicable to the evaluation of regulatory programs. The report continues with a review of the methodologies found in the literature. It identifies the most relevant references for each step in program evaluation: the commissioning of evaluation; the identification of evaluation issues; the defining of questions; the answering of questions; the reporting of results; and the implementation of recommendations. Finally, the report examines the applicability, advantages and disadvantages of the different evaluation methods and makes recommendations about the selection of methods and their application to the AECB program

  6. Development of a methodology for defining whole-building energy design targets for commercial buildings: Phase 2, Development concept stage report

    Energy Technology Data Exchange (ETDEWEB)

    Jones, J.W. (American Society of Heating, Refrigerating and Air-Conditioning Engineers, Inc., Atlanta, GA (USA)); Deringer, J.J. (Deringer Group, Riva, MD (USA)); Hall, J.D. (American Inst. of Architects, Washington, DC (USA)) (comps.)

    1990-09-01

    The Whole-Building Energy Design Targets project is being conducted for the US Department of Energy (DOE) by the Pacific Northwest Laboratory (PNL). The objective of the project is to develop a flexible methodology for setting energy performance guidelines with which architects, engineers, planners, and owners can assess energy efficiency in commercial building design. This volume, the third in the four-volume report on the Targets project concept stage, contains the minutes of the workshops as well as summaries of the expert's written comments prepared at the close of each workshop. In Section 2, the building energy simulation workshop is summarized. Section 3 provides a summary of the building cost workshop.

  7. Defining Quantum Control Flow

    OpenAIRE

    Ying, Mingsheng; Yu, Nengkun; Feng, Yuan

    2012-01-01

    A remarkable difference between quantum and classical programs is that the control flow of the former can be either classical or quantum. One of the key issues in the theory of quantum programming languages is defining and understanding quantum control flow. A functional language with quantum control flow was defined by Altenkirch and Grattage [Proc. LICS'05, pp. 249-258]. This paper extends their work, and we introduce a general quantum control structure by defining three new quantu...

  8. Broad-line high-excitation gas in the elliptical galaxy NGC5128

    International Nuclear Information System (INIS)

    Phillips, M.M.; Taylor, K.; Axon, D.J.; Atherton, P.D.; Hook, R.N.

    1984-01-01

    A faint, but extensive component of broad-line ionized gas has been discovered in the peculiar giant elliptical galaxy NGC5128. This component has a radically different spatial distribution from the well-studied rotating photoionized gas associated with the dust lane, although the velocity fields of the two components are similar. The origin of the broad-line gas is considered and its possible relation to the active nucleus and the X-ray jet discussed. (author)

  9. Methodological proposal for evaluating and selecting environment al indicators, Case: Chachafruto Stream Basin, Rionegro, Antioquia, Colombia

    International Nuclear Information System (INIS)

    Osorio Velez, Luis Fernando; Pineda Correa, Marcelena

    2000-01-01

    The investigation responds mainly to the need to develop a methodology for obtaining indicators that facilitate decision-making in environmental administration. The work relied on the support of the Corporacion Autonoma Regional Rionegro - Nare (CORNARE) in Antioquia - Colombia, through an agreement subscribed between this entity and the authors. The investigation was carried out in two parts: the first involves the search for methodological paths for the evaluation and selection of indicators applicable to basins, through the creation of parameters that allow these indicators to be rated directly; this is followed by a statistical treatment that allows indicators to be selected in a rational way. For application purposes, criteria are defined in such a way that the selected indicators can be evaluated; with these criteria a matrix is constructed, which makes it possible to view and rate the indicators on the basis of previously defined ranges and scales; once the evaluation and qualification matrix is completed, the selection process proceeds by multivariate analysis. The second part consists of applying the methodology to a previously defined set of indicators and a given territory (the basin of the Chachafruto Stream, Rionegro - Antioquia - Colombia); starting from the list of all indicators reported in the consulted literature, their ability to reflect the basin's problems is analyzed, based on parameters such as local contextualization and thematic focus. Once the first selection is done, the indicators are submitted to the qualification and evaluation matrix and its results are analyzed by multivariate analysis, in order to condense the environmental indicators to be used. Subsequently, for the evaluation of the selected indicators, it is necessary to compile diverse thematic information and to identify the existing databases for the selected territory

  10. The sun protection factor (SPF) inadequately defines broad spectrum photoprotection: demonstration using skin reconstructed in vitro exposed to UVA, UVBor UV-solar simulated radiation.

    Science.gov (United States)

    Bernerd, Françoise; Vioux, Corinne; Lejeune, François; Asselineau, Daniel

    2003-01-01

    Wavelength-specific biological damage has previously been identified in human skin reconstructed in vitro. Sunburn cells and pyrimidine dimers were found after UVB exposure, and alterations of dermal fibroblasts after UVA exposure. These types of damage made it possible to discriminate between UVB and UVA single absorbers. The present study shows that these biological effects can be obtained simultaneously by a combined UVB + UVA exposure using ultraviolet solar simulated light (UV-SSR), which represents a relevant UV source. In addition, the protection afforded by two broad spectrum sunscreen complex formulations was assessed after topical application. These two formulations displayed the same sun protection factor but different UVA protection factors determined by the persistent pigment darkening (PPD) method. Dose-response experiments with UVA or UV-SSR showed that the preparation with the highest PF-UVA provided better protection against dermal damage than the other formulation. Using an original UVB source to obtain the UVB portion of the SSR spectrum, the preparations provided the same protection. This study strikingly illustrates the fact that the photoprotection afforded by two sunscreen formulations having similar SPF values is not equal with regard to dermal damage related to photoaging.

  11. Instrumentation and control upgrade evaluation methodology: Final report. Volume 2: Workbook

    Energy Technology Data Exchange (ETDEWEB)

    Bliss, M.; Brown, E.; Florio, F.; Stofko, M.

    1996-07-01

    This workbook accompanies the methodology manual (EPRI TR-104963-V1) which describes how to develop an Upgrade Evaluation Report (UER). A UER is an evaluation that is performed by a nuclear power plant to decide the most cost-effective upgrade to perform (if any) for a previously identified Upgrade Candidate System. A UER defines the utility's mission and objectives with regard to upgrade candidates, as well as each system's initial costs, the benefits of each upgrade, and an initial upgrade schedule to cost-effectively implement system upgrades.

  12. Baseline methodologies for clean development mechanism projects

    Energy Technology Data Exchange (ETDEWEB)

    Lee, M.K. (ed.); Shrestha, R.M.; Sharma, S.; Timilsina, G.R.; Kumar, S.

    2005-11-15

    The Kyoto Protocol and the Clean Development Mechanism (CDM) came into force on 16th February 2005 with its ratification by Russia. The increasing momentum of this process is reflected in more than 100 projects having been submitted to the CDM Executive Board (CDM-EB) for approval of the baselines and monitoring methodologies, which is the first step in developing and implementing CDM projects. A CDM project should result in a net decrease of GHG emissions below any level that would have resulted from other activities implemented in the absence of that CDM project. The 'baseline' defines the GHG emissions of activities that would have been implemented in the absence of a CDM project. The baseline methodology is the process/algorithm for establishing that baseline. The baseline, along with the baseline methodology, is thus the most critical element of any CDM project towards meeting the important criterion of the CDM, which is that a CDM project should result in 'real, measurable, and long term benefits related to the mitigation of climate change'. This guidebook is produced within the framework of the United Nations Environment Programme (UNEP) facilitated 'Capacity Development for the Clean Development Mechanism (CD4CDM)' Project. This document is published as part of the project's effort to develop guidebooks that cover important issues such as project finance, sustainability impacts, legal framework and institutional framework. These materials are aimed at helping stakeholders better understand the CDM and are believed to eventually contribute to maximizing the effect of the CDM in achieving the ultimate goal of the UNFCCC and its Kyoto Protocol. This Guidebook should be read in conjunction with the information provided in the two other guidebooks entitled, 'Clean Development Mechanism: Introduction to the CDM' and 'CDM Information and Guidebook' developed under the CD4CDM project. (BA)

  13. Baseline methodologies for clean development mechanism projects

    International Nuclear Information System (INIS)

    Lee, M.K.; Shrestha, R.M.; Sharma, S.; Timilsina, G.R.; Kumar, S.

    2005-11-01

    The Kyoto Protocol and the Clean Development Mechanism (CDM) came into force on 16th February 2005 with its ratification by Russia. The increasing momentum of this process is reflected in more than 100 projects having been submitted to the CDM Executive Board (CDM-EB) for approval of the baselines and monitoring methodologies, which is the first step in developing and implementing CDM projects. A CDM project should result in a net decrease of GHG emissions below any level that would have resulted from other activities implemented in the absence of that CDM project. The 'baseline' defines the GHG emissions of activities that would have been implemented in the absence of a CDM project. The baseline methodology is the process/algorithm for establishing that baseline. The baseline, along with the baseline methodology, is thus the most critical element of any CDM project towards meeting the important criterion of the CDM, which is that a CDM project should result in 'real, measurable, and long term benefits related to the mitigation of climate change'. This guidebook is produced within the framework of the United Nations Environment Programme (UNEP) facilitated 'Capacity Development for the Clean Development Mechanism (CD4CDM)' Project. This document is published as part of the project's effort to develop guidebooks that cover important issues such as project finance, sustainability impacts, legal framework and institutional framework. These materials are aimed at helping stakeholders better understand the CDM and are believed to eventually contribute to maximizing the effect of the CDM in achieving the ultimate goal of the UNFCCC and its Kyoto Protocol. This Guidebook should be read in conjunction with the information provided in the two other guidebooks entitled, 'Clean Development Mechanism: Introduction to the CDM' and 'CDM Information and Guidebook' developed under the CD4CDM project. (BA)
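
    A purely numerical illustration of the baseline concept (all figures invented): the emission reductions credited to a project are the baseline emissions that would have occurred in its absence minus the emissions monitored for the project itself.

        # Illustrative renewable-electricity project displacing grid power.
        grid_emission_factor = 0.8     # t CO2 per MWh for the baseline (displaced) generation
        project_generation = 25_000    # MWh delivered by the CDM project during the year
        project_emissions = 1_500      # t CO2 actually emitted by the project (e.g. auxiliary fuel)

        baseline_emissions = grid_emission_factor * project_generation   # 20,000 t CO2
        emission_reductions = baseline_emissions - project_emissions     # 18,500 t CO2 creditable
        print(emission_reductions)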

  14. Estimating carbon dioxide fluxes from temperate mountain grasslands using broad-band vegetation indices

    Directory of Open Access Journals (Sweden)

    G. Wohlfahrt

    2010-02-01

    Full Text Available The broad-band normalised difference vegetation index (NDVI) and the simple ratio (SR) were calculated from measurements of reflectance of photosynthetically active and short-wave radiation at two temperate mountain grasslands in Austria and related to the net ecosystem CO2 exchange (NEE) measured concurrently by means of the eddy covariance method. There was no significant statistical difference between the relationships of midday mean NEE with narrow- and broad-band NDVI and SR, measured during and calculated for that same time window, respectively. The skill of broad-band NDVI and SR in predicting CO2 fluxes was higher for metrics dominated by gross photosynthesis and lowest for ecosystem respiration, with NEE in between. A method based on a simple light response model whose parameters were parameterised based on broad-band NDVI allowed improved predictions of daily NEE and is suggested to hold promise for filling gaps in the NEE time series. Relationships of CO2 flux metrics with broad-band NDVI and SR however generally differed between the two studied grassland sites, indicating an influence of additional factors not yet accounted for.
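
    A schematic sketch of the two ingredients mentioned above, using synthetic numbers rather than the sites' data: a broad-band NDVI computed from incoming and reflected photosynthetically active and short-wave radiation (the NIR band approximated as their difference), and a rectangular-hyperbola light response for NEE whose fitted parameters could be tied to such an index for gap-filling; the functional form and every value are assumptions.

        import numpy as np
        from scipy.optimize import curve_fit

        def broadband_ndvi(par_in, par_out, sw_in, sw_out):
            """Broad-band NDVI from incoming/reflected PAR and short-wave radiation (W m-2)."""
            rho_par = par_out / par_in
            rho_nir = (sw_out - par_out) / (sw_in - par_in)   # NIR approximated as SW minus PAR
            return (rho_nir - rho_par) / (rho_nir + rho_par)

        def light_response(ppfd, alpha, gpp_max, reco):
            """Rectangular hyperbola for NEE (negative = uptake), a common gap-filling form."""
            return -(alpha * ppfd * gpp_max) / (alpha * ppfd + gpp_max) + reco

        print("broad-band NDVI:", round(broadband_ndvi(400.0, 20.0, 800.0, 120.0), 2))

        # Synthetic half-hourly data standing in for eddy-covariance measurements.
        ppfd = np.linspace(0.0, 2000.0, 50)
        nee_obs = light_response(ppfd, 0.03, 25.0, 3.0) + np.random.default_rng(1).normal(0.0, 0.5, 50)
        params, _ = curve_fit(light_response, ppfd, nee_obs, p0=(0.02, 20.0, 2.0))
        print("alpha, GPPmax, Reco =", np.round(params, 3))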

  15. Clues to Quasar Broad Line Region Geometry and Kinematics

    DEFF Research Database (Denmark)

    Vestergaard, Marianne; Wilkes, B. J.; Barthel, P. D.

    2000-01-01

    width to show significant inverse correlations with the fractional radio core-flux density, R, the radio axis inclination indicator. Highly inclined systems have broader line wings, consistent with a high-velocity field perpendicular to the radio axis. By contrast, the narrow line-core shows...... no such relation with R, so the lowest velocity CIV-emitting gas has an inclination independent velocity field. We propose that this low-velocity gas is located at higher disk-altitudes than the high-velocity gas. A planar origin of the high-velocity CIV-emission is consistent with the current results...... and with an accretion disk-wind emitting the broad lines. A spherical distribution of randomly orbiting broad-line clouds and a polar high-ionization outflow are ruled out....

  16. The Desired Image of the Future Economy of the Industrial Region: Development Trends and Evaluation Methodology

    Directory of Open Access Journals (Sweden)

    Olga Aleksandrovna Romanova

    2017-09-01

    Full Text Available In the article, the authors emphasize that industrial regions play an important role in increasing the technological independence of Russia. We show that the decline in the share of processing industries in the gross regional product cannot be treated as a negative de-industrialization of the economy. The article argues that the increasing speed of change, the instability of socio-economic systems and diverse risks predetermine the need to develop new methodological approaches to predictive research. Studies aimed at developing a technology for designing the desired image of the future, and a methodology for its evaluation, are therefore of high importance. For the initial stage of the research, the authors propose a methodological approach for assessing the desired image of the future of metallurgy as one of the most important industries of the region. We propose the term «technological image of the regional metallurgy». We show that repositioning the image of the regional metallurgical complex is quite a long process, which has determined the need to define the stages of repositioning. The proposed methodology for evaluating the desired future includes methodological provisions to quantify the characteristics of the goals achieved at the respective stages of the repositioning of the metallurgy. The methodological approach to designing the desired image of the future involves the following stages: the identification of the priority areas of technological development of regional metallurgy on the basis of bibliometric and patent analysis; the evaluation and forecasting of the dynamics of the structure of domestic consumption of metal products, based on comparative analysis and relevant analytical methods; the design of a factor model, based on the principal components method, allowing the parameters quantifying the technological image of the regional metallurgy to be identified; systematization of

  17. Defining urban and rural areas: a new approach

    Science.gov (United States)

    Arellano, Blanca; Roca, Josep

    2017-10-01

    The separation between the countryside and the city, between rural and urban areas, has been one of the central themes of the literature on urban and territorial studies. The seminal work of Kingsley Davis [10] in the 1950s introduced a wide and fruitful debate which, however, has not yet concluded in a rigorous definition that allows for scientific comparative studies at the national and subnational levels. In particular, the United Nations (UN) definition of urban and rural population is overly linked to political and administrative factors that make it difficult to use data adequately to understand the human settlement structure of different countries. The present paper seeks to define a more rigorous methodology for the identification of rural and urban areas. For this purpose it uses the night lights supplied by the SNPP satellite, and more specifically by the VIIRS sensor, to determine the urbanization gradient and to construct a more realistic indicator than the statistics provided by the UN. The arrival of electrification to nearly every corner of the planet is certainly the first and most meaningful indicator of artificialization of land. In this sense, this paper proposes a new methodology designed to identify highly impacted (urbanized) landscapes worldwide based on the analysis of satellite imagery of night-time lights. The application of this methodology on a global scale identifies the land highly impacted by light, the urbanization process, and allows an index of Land Impacted by Light per capita (LILpc) to be drawn up as an indicator of the level of urbanization. The methodology used in this paper can be summarized in the following steps: a) a logistic regression between US Urban Areas (UA), as a dependent variable, and night-time light intensity, as an explanatory variable, allows us to establish a nightlight intensity level for the determination of Areas Highly Impacted by Light (AHIL); b) the delimitation of
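
    Step a) can be illustrated with synthetic data (the radiance distributions, the log transform and the resulting threshold are all invented for the sketch): fit a logistic regression of urban/non-urban labels on night-time light intensity and read off the intensity at which the predicted probability crosses 0.5, which then delimits the Areas Highly Impacted by Light.

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(0)

        # Synthetic stand-in for the calibration step: night-light radiance of pixels
        # inside US Urban Areas (label 1) and outside them (label 0).
        radiance_urban = rng.lognormal(mean=3.0, sigma=0.5, size=2000)
        radiance_rural = rng.lognormal(mean=1.0, sigma=0.7, size=2000)
        x = np.concatenate([radiance_urban, radiance_rural]).reshape(-1, 1)
        y = np.concatenate([np.ones(2000), np.zeros(2000)])

        model = LogisticRegression().fit(np.log(x), y)

        # Radiance at which P(urban) = 0.5 defines the cut-off for "highly impacted" pixels.
        threshold = np.exp(-model.intercept_[0] / model.coef_[0, 0])
        print(f"night-light threshold ~ {threshold:.1f} radiance units")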

  18. In Their Own Words? Methodological Considerations in the Analysis of Terrorist Autobiographies

    Directory of Open Access Journals (Sweden)

    Mary Beth Altier

    2012-01-01

    Full Text Available Despite the growth of terrorism literature in the aftermath of the 9/11 attacks, there remain several methodological challenges to studying certain aspects of terrorism. This is perhaps most evident in attempts to uncover the attitudes, motivations, and intentions of individuals engaged in violent extremism and how they are sometimes expressed in problematic behavior. Such challenges invariably stem from the fact that terrorists and the organizations to which they belong represent clandestine populations engaged in illegal activity. Unsurprisingly, these qualities make it difficult for the researcher to identify and locate willing subjects of study—let alone a representative sample. In this research note, we suggest the systematic analysis of terrorist autobiographies offers a promising means of investigating difficult-to-study areas of terrorism-related phenomena. Investigation of autobiographical accounts not only offers additional data points for the study of individual psychological issues, but also provides valuable perspectives on the internal structures, processes, and dynamics of terrorist organizations more broadly. Moreover, given most autobiographies cover critical events and personal experiences across the life course, they provide a unique lens into how terrorists perceive their world and insight into their decision-making processes. We support our advocacy of this approach by highlighting its methodological strengths and shortcomings.

  19. A PROPOSED METHODOLOGY FOR ESTIMATING ECOREGIONAL VALUES FOR OUTDOOR RECREATION IN THE UNITED STATES

    OpenAIRE

    Bhat, Gajanan; Bergstrom, John C.; Bowker, James Michael; Cordell, H. Ken

    1996-01-01

    This paper provides a methodology for the estimation of recreational demand functions and values using an ecoregional approach. Ten ecoregions in the continental US were defined based on similarly functioning ecosystem characteristics. The individual travel cost method was employed to estimate the recreational demand functions for activities such as motorboating and waterskiing, developed and primitive camping, coldwater fishing, sightseeing and pleasure driving, and big game hunting for each ecor...

  20. Methodology of Comparative Analysis of Public School Teachers' Continuing Professional Development in Great Britain, Canada and the USA

    Science.gov (United States)

    Mukan, Nataliya; Kravets, Svitlana

    2015-01-01

    In the article the methodology of comparative analysis of public school teachers' continuing professional development (CPD) in Great Britain, Canada and the USA has been presented. The main objectives are defined as theoretical analysis of scientific and pedagogical literature, which highlights different aspects of the problem under research;…

  1. Broad Spectrum Sanitizing Wipes with Food Additives, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Microcide proposes to develop novel multipurpose non-toxic sanitizing wipes that are aqueous based, have a shelf life of 3-5 years, have broad-spectrum microbicidal...

  2. Defining nuclear security in the 21st century

    Energy Technology Data Exchange (ETDEWEB)

    Doyle, James E [Los Alamos National Laboratory

    2009-01-01

    A conference devoted to Reducing the Risks from Radioactive and Nuclear Materials presupposes that such risks exist. Few would disagree, but what are they? While debate on the nature and severity of risks associated with nuclear energy will always remain, it is easy to define a set of risks that are almost universally acknowledged. These include: (1) Nuclear warfare between states; (2) Continued proliferation of nuclear weapons and weapons-grade nuclear materials to states and non-state actors; (3) Acquisition or use of nuclear weapons or nuclear materials by terrorists or non-state actors; (4) An attack by terrorists or non-state actors on a nuclear facility; and (5) Loss or diversion of nuclear weapons or materials by a state to unauthorized uses. These are listed in no particular order of likelihood or potential consequence. They are also very broadly stated; each one could be broken down into a more detailed set of discrete risks or threats. The fact that there is a strong consensus on the existence of these risks is evidence that we remain in an era of nuclear insecurity. This becomes even clearer when we note that most major trends influencing the probability of these risks continue to run in a negative direction.

  3. Optimizing Taq polymerase concentration for improved signal-to-noise in the broad range detection of low abundance bacteria.

    Directory of Open Access Journals (Sweden)

    Rudolph Spangler

    Full Text Available BACKGROUND: PCR can in principle detect a single target molecule in a reaction mixture. Contaminating bacterial DNA in reagents creates a practical limit on the use of PCR to detect dilute bacterial DNA in environmental or public health samples. The most pernicious source of contamination is microbial DNA in DNA polymerase preparations. Importantly, all commercial Taq polymerase preparations inevitably contain contaminating microbial DNA. Removal of DNA from an enzyme preparation is problematical. METHODOLOGY/PRINCIPAL FINDINGS: This report demonstrates that the background of contaminating DNA detected by quantitative PCR with broad host range primers can be decreased more than 10-fold through the simple expedient of Taq enzyme dilution, without altering detection of target microbes in samples. The general method is: for any thermostable polymerase used for high-sensitivity detection, run a dilution series of the polymerase crossed with a dilution series of DNA or bacteria that work well with the test primers. For further work, use the concentration of polymerase that gave the least signal in its negative control (H2O) while also not changing the threshold cycle for dilutions of spiked DNA or bacteria compared to higher concentrations of Taq polymerase. CONCLUSIONS/SIGNIFICANCE: It is clear from the studies shown in this report that a straightforward procedure of optimizing the Taq polymerase concentration achieved "treatment-free" attenuation of interference by contaminating bacterial DNA in Taq polymerase preparations. This procedure should facilitate detection and quantification with broad host range primers of a small number of bona fide bacteria (as few as one) in a sample.
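
    The selection rule described above can be captured in a few lines. A minimal sketch under assumptions: the Ct values, the concentration grid and the 0.5-cycle tolerance are hypothetical, and the rule simply keeps the most dilute polymerase whose spiked-control Ct is unchanged while its no-template control gives the least signal (highest Ct).

```python
# Hypothetical qPCR results for a Taq dilution series:
# relative Taq concentration -> (Ct of H2O negative control, Ct of spiked control).
results = {
    1.00: (33.1, 24.9),
    0.50: (35.4, 25.0),
    0.25: (37.8, 25.1),
    0.10: (39.5, 27.6),   # spiked control delayed -> too little polymerase
}

CT_TOLERANCE = 0.5                       # assumed acceptable shift of the spiked Ct
reference_positive_ct = results[1.00][1]

def acceptable(conc):
    """Spiked-control Ct must stay within tolerance of the full-strength value."""
    _, pos_ct = results[conc]
    return abs(pos_ct - reference_positive_ct) <= CT_TOLERANCE

# Among acceptable concentrations, pick the one with the weakest (latest) negative control.
best = max((c for c in results if acceptable(c)), key=lambda c: results[c][0])
print(f"Use {best:g}x Taq: negative-control Ct {results[best][0]}, "
      f"spiked-control Ct {results[best][1]}")
```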

  4. Document understanding for a broad class of documents

    NARCIS (Netherlands)

    Aiello, Marco; Monz, Christof; Todoran, Leon; Worring, Marcel

    2002-01-01

    We present a document analysis system able to assign logical labels and extract the reading order in a broad set of documents. All information sources, from geometric features and spatial relations to the textual features and content are employed in the analysis. To deal effectively with these

  5. Methodology of sustainability accounting

    Directory of Open Access Journals (Sweden)

    O.H. Sokil

    2017-03-01

    Full Text Available Modern challenges to the theory and methodology of accounting are addressed through the formation and implementation of new concepts intended to meet users' needs for both standard and unique information. The development of a methodology for sustainability accounting is a key aspect of managing an economic entity. The purpose of the article is to establish the methodological bases of accounting for sustainable development and to determine its goals, objectives, object, subject, methods, functions and key aspects. The author analyzes the theoretical bases of the definition and considers the components of the traditional accounting methodology. A generalized structural diagram of the methodology for sustainable development accounting is offered in the article, and its methods and principles, both established and non-standard, are systematized. The resulting system of theoretical and methodological provisions of accounting for sustainable development is justified in the context of its purpose, objectives, subject, object, methods, functions and key aspects.

  6. Comparison Of Irms Delhi Methodology With Who Methodology On Immunization Coverage

    Directory of Open Access Journals (Sweden)

    Singh Padam

    1996-01-01

    Full Text Available Research question: What are the merits of the IRMS model over the WHO model for coverage evaluation surveys? Which method is superior and appropriate for coverage evaluation surveys of immunization in our setting? Objective: To compare the IRMS Delhi methodology with the WHO methodology on immunization coverage. Study Design: Cross-sectional. Setting: Both urban and rural. Participants: Mothers & children. Sample Size: 300 children between 1-2 years and 300 mothers in rural areas, and 75 children and 75 mothers in urban areas. Study Variables: Rural, urban, caste group, size of the stratum, literacy, sex and cost effectiveness. Outcome Variables: Coverage level of immunization. Analysis: Routine statistical analysis. Results: The IRMS-developed methodology scores a better rating than the WHO methodology, especially when coverage evaluation is attempted in medium-sized villages with socio-economic segregation, which remains the main characteristic of Indian villages.

  7. Development of a freshwater lens in the inverted Broad Fourteens Basin, Netherlands offshore

    NARCIS (Netherlands)

    Bouw, Laurien; Oude Essink, Gualbert

    2003-01-01

    The Mesozoic Broad Fourteens Basin is a northwest-southeast trending structural element, situated in the southern North Sea, Netherlands offshore. Biodegraded and water-washed oils in the southern Broad Fourteens Basin indicate topography-driven meteoric water flow during Late Cretaceous inversion.

  8. Design of a Tank Cleaning Blend through a Systematic Emulsified Product Design Methodology

    DEFF Research Database (Denmark)

    Mattei, Michele; Krogh, Peter; Depner, Bo

    Commercial and industrial detergents, formulated liquid blends, have recently become extremely sophisticated, in order to address a broad range of cleaning tasks and to deliver superior performance with a minimum of effort and time. These products, by definition, consist of different chemicals, …, the whole design procedure speeds up, saving time and money, and the optimum formulation is identified, since a broad range of alternatives are investigated. The approach adopted for the design of emulsion-based chemical products consists of a systematic model-based methodology… for consideration in a product design methodology, rule-based selection criteria are applied. These are centered on structured databases, where some relevant properties (e.g. safety or toxicity-related), if not available, are predicted through dedicated pure component property models. Once…, classified according to their function and associated properties, has been developed. Also, a model library consisting of pure component and mixture property models has been developed so that the needed functional properties can be reliably predicted when their data cannot be found in the database. The abovementioned methodology and related tools are generic, in the sense that many different emulsified…

  9. Scalability of a Methodology for Generating Technical Trading Rules with GAPs Based on Risk-Return Adjustment and Incremental Training

    Science.gov (United States)

    de La Cal, E. A.; Fernández, E. M.; Quiroga, R.; Villar, J. R.; Sedano, J.

    In previous works a methodology was defined, based on the design of a GAP genetic algorithm and an incremental training technique adapted to learning series of stock market values. The GAP technique is a fusion of genetic programming (GP) and genetic algorithms (GA). The GAP algorithm implements an automatic search for crisp trading rules, taking as training objectives both the optimization of the return obtained and the minimization of the assumed risk. Applying the proposed methodology, rules have been obtained for an eight-year period of the S&P500 index. The achieved adjustment of the return-risk relation has generated rules whose returns in the testing period are far superior to those obtained with conventional methodologies, and even clearly superior to Buy&Hold. This work shows that the proposed methodology is valid for different assets in a different market than in previous work.
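
    As a hedged illustration of the training objective described above, the sketch below scores a stand-in trading rule (a simple moving-average crossover rather than an evolved GAP rule) on both total return and downside risk; the risk measure, the trade-off weight and the synthetic price series are assumptions, not the authors' exact setup.

```python
import numpy as np

def rule_returns(prices, short_win, long_win):
    """Daily returns of a moving-average crossover rule (stand-in for an evolved
    crisp rule): long when the short MA is above the long MA, otherwise flat."""
    short_ma = np.convolve(prices, np.ones(short_win) / short_win, mode="valid")
    long_ma = np.convolve(prices, np.ones(long_win) / long_win, mode="valid")
    short_ma = short_ma[-len(long_ma):]                        # align the two series
    position = (short_ma[:-1] > long_ma[:-1]).astype(float)    # yesterday's signal
    daily_ret = np.diff(prices[-len(long_ma):]) / prices[-len(long_ma):-1]
    return position * daily_ret

def fitness(prices, short_win, long_win, risk_weight=2.0):
    """Reward total return, penalise downside deviation (assumed trade-off weight)."""
    r = rule_returns(prices, short_win, long_win)
    total_return = r.sum()
    downside_risk = np.sqrt(np.mean(np.minimum(r, 0.0) ** 2))
    return total_return - risk_weight * downside_risk

rng = np.random.default_rng(0)
prices = 100 * np.cumprod(1 + rng.normal(0.0003, 0.01, 2000))  # synthetic index
print(f"fitness(10, 50)  = {fitness(prices, 10, 50):.4f}")
print(f"fitness(5, 200)  = {fitness(prices, 5, 200):.4f}")
```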

  10. Selection methodology for LWR safety programs and proposals. Volume 2. Methodology application

    International Nuclear Information System (INIS)

    Ritzman, R.L.; Husseiny, A.A.

    1980-08-01

    The results of work done to update and apply a methodology for selecting (prioritizing) LWR safety technology R and D programs are described. The methodology is based on multiattribute utility (MAU) theory. Application of the methodology to rank-order a group of specific R and D programs included development of a complete set of attribute utility functions, specification of individual attribute scaling constants, and refinement and use of an interactive computer program (MAUP) to process decision-maker inputs and generate overall (multiattribute) program utility values. The output results from several decision-makers are examined for consistency and conclusions and recommendations regarding general use of the methodology are presented. 3 figures, 18 tables
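
    A minimal sketch of the additive multiattribute utility form underlying such a ranking, with invented attributes, single-attribute utilities and scaling constants; the report's actual attribute set, elicited utility functions and weights are not reproduced here.

```python
# Candidate R&D programs scored on 0-1 single-attribute utilities (hypothetical values).
programs = {
    "Program A": dict(safety=0.9, cost=0.4, schedule=0.6, feasibility=0.7),
    "Program B": dict(safety=0.6, cost=0.8, schedule=0.7, feasibility=0.9),
    "Program C": dict(safety=0.8, cost=0.5, schedule=0.9, feasibility=0.5),
}

# Additive MAU form: U(x) = sum_i k_i * u_i(x_i), with scaling constants summing to 1.
scaling_constants = {"safety": 0.45, "cost": 0.25, "schedule": 0.15, "feasibility": 0.15}

def overall_utility(utilities):
    return sum(scaling_constants[a] * u for a, u in utilities.items())

# Rank-order the programs by overall (multiattribute) utility.
for p in sorted(programs, key=lambda p: overall_utility(programs[p]), reverse=True):
    print(f"{p}: U = {overall_utility(programs[p]):.3f}")
```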

  11. The IDEAL (Integrated Design and Engineering Analysis Languages) modeling methodology: Capabilities and Applications

    Science.gov (United States)

    Evers, Ken H.; Bachert, Robert F.

    1987-01-01

    The IDEAL (Integrated Design and Engineering Analysis Languages) modeling methodology has been formulated and applied over a five-year period. It has proven to be a unique, integrated approach utilizing a top-down, structured technique to define and document the system of interest; a knowledge engineering technique to collect and organize system descriptive information; a rapid prototyping technique to perform preliminary system performance analysis; and a sophisticated simulation technique to perform in-depth system performance analysis.

  12. Extra Ovarian Serous Cystadenocarcinoma in the Broad Ligament ...

    African Journals Online (AJOL)

    The embryonic remnants of the gonadal ridge and the genital duct apparatus, the Mullerian apparatus, remain atretic throughout the life of a woman. The definitive organs arising from these, the Ovary, Fallopian tubes, Uterus, Cervix and the Broad ligaments share common coelomic origin. Epithelial metaplasia in any of ...

  13. The relation of the broad band with the E2g phonon and superconductivity in the Mg(B1-xCx)2 compound

    International Nuclear Information System (INIS)

    Parisiades, P.; Lampakis, D.; Palles, D.; Liarokapis, E.; Karpinski, J.

    2007-01-01

    We have carried out an extensive micro-Raman study on Mg(B₁₋ₓCₓ)₂ single crystals, for carbon concentrations up to x=0.15. The E₂g symmetry broad band for pure MgB₂ at ∼600 cm⁻¹ disappears even for small doping levels (x=0.027) and two well-defined peaks in the high-energy side of this band play a major role in the Raman spectra of the substituted compounds. We propose that a two-mode behavior of the compound might be present, induced by the coupling of the observed phonons with the electronic bands

  14. System study methodology development and potential utilization for fusion

    International Nuclear Information System (INIS)

    Djerassi, H.; Rouillard, J.; Leger, D.; Sarto, S.; Zappellini, G.; Gambi, G.

    1989-01-01

    The objective of this new methodology is to combine systemics with heuristics for engineering applications. The system method considers as a whole a set of dynamically interacting elements, organized for tasks. Heuristics tries to describe the rules to apply in scientific research. This methodology is a powerful tool for evaluating the options, compared with conventional analytical methods as a higher number of parameters can be taken into account, with a higher quality standard while comparing the possible options. The system method takes into account interacting data or random relationships by means of simulation modelling. Thus, a dynamical approach can be deduced and a sensitivity analysis can be performed for a very high number of options and basic data. This method can be limited to a specific objective such as a fusion reactor safety analysis, taking into account other major constraints such as the economical environment. The sophisticated architecture of a fusion reactor includes a large number of interacting systems. The new character of the fusion domain and the wide spectrum of the possible options strongly increase the advantages of a system study as a complete safety analysis can be defined before starting with the design. (orig.)

  15. Summarizing systematic reviews: methodological development, conduct and reporting of an umbrella review approach.

    Science.gov (United States)

    Aromataris, Edoardo; Fernandez, Ritin; Godfrey, Christina M; Holly, Cheryl; Khalil, Hanan; Tungpunkom, Patraporn

    2015-09-01

    help assess risk of bias in systematic reviews and meta-analyses was also developed and tested. Relevant details to extract from included reviews and how to best present the findings of both quantitative and qualitative systematic reviews in a reader friendly format are provided. Umbrella reviews provide a ready means for decision makers in healthcare to gain a clear understanding of a broad topic area. The umbrella review methodology described here is the first to consider reviews that report other than quantitative evidence derived from randomized controlled trials. The methodology includes an easy to use and informative summary of evidence table to readily provide decision makers with the available, highest level of evidence relevant to the question posed.

  16. Kaupapa Maori Methodology: Trusting the Methodology through Thick and Thin

    Science.gov (United States)

    Hiha, Anne Aroha

    2016-01-01

    Kaupapa Maori is thoroughly theorised in academia in Aotearoa and those wishing to use it as their research methodology can find support through the writing of a number of Maori academics. What is not so well articulated, is the experiential voice of those who have used Kaupapa Maori as research methodology. My identity as a Maori woman…

  17. Methodological approach to participatory formulation of agricultural technical assistance plans with territorial approach

    Directory of Open Access Journals (Sweden)

    Holmes Rodríguez Espinosa

    2015-10-01

    Full Text Available The collective identification of needs and shared decision-making in the formulation of agricultural development projects is a process that requires participatory methodologies that promote the active and reflective engagement of producers. The aim of this study was to evaluate a methodological approach for the participatory formulation of a technical assistance plan with a territorial approach. Matrix analysis for the identification and prioritization of the factors most limiting technical assistance for milk production was performed, and alternative solutions were defined through participatory workshops with farmers. The results show the advantages of collective reflection with stakeholders and of quantitative tools that reduce subjectivity in decision-making, improve producers' participation in their own development, and identify alternatives acceptable to farmers and viable for the municipality in order to address shortcomings in pasture and forage management, the implementation of good agricultural practices (GAP) and the rational use of agrochemicals.

  18. A methodology for assessing the environmental and health impact of options for the back-end of the nuclear fuel cycle

    International Nuclear Information System (INIS)

    Ouzounian, G.H.; Devezeaux de Lavergne, J.G.; Devin, P.; Lioure, A.; Mouney, H.; Le Boulch, D.

    2001-01-01

    Research programs conducted in France in the framework of the 1991 act offer various options for management of the back end of the fuel cycle. Proposals to be debated in 2006 will rely not only on broad scientific and technical knowledge, but also on the compilation and integration of results, with syntheses and analyses intended to highlight the advantages and the limitations of each of the waste management paths. This presentation introduces a methodology derived from life cycle analysis as well as some preliminary results. (author)

  19. A regressive methodology for estimating missing data in rainfall daily time series

    Science.gov (United States)

    Barca, E.; Passarella, G.

    2009-04-01

    The "presence" of gaps in environmental data time series represents a very common but extremely critical problem, since it can produce biased results (Rubin, 1976). Missing data plague almost all surveys. The problem is how to deal with missing data once it has been deemed impossible to recover the actual missing values. Apart from the amount of missing data, another issue which plays an important role in the choice of any recovery approach is the evaluation of the "missingness" mechanism. When missingness is conditioned by some other variable observed in the data set (Schafer, 1997), the mechanism is called MAR (Missing At Random). Otherwise, when the missingness mechanism depends on the actual value of the missing data, it is called NMAR (Not Missing At Random). The latter is the most difficult condition to model. In the last decade, interest arose in the estimation of missing data by regression (single imputation). More recently, multiple imputation has also become available, which returns a distribution of estimated values (Scheffer, 2002). In this paper an automatic methodology for estimating missing data is presented. In practice, given a gauging station affected by missing data (the target station), the methodology checks the randomness of the missing data and quantifies the "similarity" between the target station and the other gauging stations spread over the study area. Among the methods available for defining the degree of similarity, whose effectiveness strongly depends on the data distribution, the Spearman correlation coefficient was chosen. Once the similarity matrix is defined, a suitable nonparametric, univariate, regressive method is applied to estimate the missing data in the target station: the Theil method (Theil, 1950). Even though the methodology proved to be rather reliable, the missing data estimation can be improved by generalization. A first possible improvement consists in extending the univariate technique to
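
    A short sketch of the two-step recovery described above, under assumptions: synthetic daily rainfall series, the Spearman coefficient as the similarity measure, and a Theil(-Sen) line fitted on the most similar neighbour to estimate the target's missing days (SciPy's theilslopes is used as a stand-in for the original Theil estimator).

```python
import numpy as np
from scipy.stats import spearmanr, theilslopes

rng = np.random.default_rng(1)
base = rng.gamma(shape=0.4, scale=8.0, size=365)            # synthetic daily rainfall (mm)
stations = {
    "S1": np.clip(base + rng.normal(0, 2, 365), 0, None),
    "S2": np.clip(0.8 * base + rng.normal(0, 4, 365), 0, None),
    "S3": rng.gamma(0.4, 8.0, 365),                          # unrelated gauge
}
target = np.clip(1.1 * base + rng.normal(0, 2, 365), 0, None)
target[100:130] = np.nan                                     # the gap to be filled

observed = ~np.isnan(target)

# Similarity of each neighbour to the target, on jointly observed days.
similarity = {}
for name, series in stations.items():
    rho, _ = spearmanr(target[observed], series[observed])
    similarity[name] = rho
donor_name = max(similarity, key=similarity.get)
donor = stations[donor_name]

# Theil-Sen line fitted on the observed days, then used to estimate the gap.
slope, intercept, _, _ = theilslopes(target[observed], donor[observed])
filled = target.copy()
filled[~observed] = np.clip(intercept + slope * donor[~observed], 0, None)
print(f"donor: {donor_name} (rho = {similarity[donor_name]:.2f}), "
      f"estimated gap mean = {filled[100:130].mean():.1f} mm")
```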

  20. Detection of broad ultraviolet Fe II lines in the spectrum of NGC 1068

    International Nuclear Information System (INIS)

    Snijders, M.A.J.; Netzer, Hagai; Boksenberg, A.

    1986-01-01

    Ultraviolet observations of the nucleus of NGC 1068, obtained by the IUE over a period of 5 yr, are combined to give a high signal-to-noise spectrum of this source. The ultraviolet stellar continuum, obtained by comparison with ground-based data, is subtracted to show the nuclear non-stellar component. The resulting spectrum shows clearly the presence of strong broad FeII emission bands similar to those observed in many broad-line objects. Broad profiles are also seen in other strong emission lines. These observations confirm the recent discovery of an optical Seyfert type 1 spectrum in NGC 1068. (author)

  1. Improving process methodology for measuring plutonium burden in human urine using fission track analysis

    International Nuclear Information System (INIS)

    Krahenbuhl, M.P.; Slaughter, D.M.

    1998-01-01

    The aim of this paper is to clearly define the chemical and nuclear principles governing Fission Track Analysis (FTA) to determine environmental levels of ²³⁹Pu in urine. The paper also addresses deficiencies in FTA methodology and introduces improvements to make FTA a more reliable research tool. Our refined methodology, described herein, includes a chemically induced precipitation phase followed by anion exchange chromatography, and employs a chemical tracer, ²³⁶Pu. We have been able to establish an inverse correlation between Pu recovery and sample volume, and our data confirm that increases in sample volume do not result in higher accuracy or lower detection limits. We conclude that in subsequent studies, samples should be limited to approximately two liters. The Pu detection limit for a sample of this volume is 2.8 μBq/l. (author)

  2. On methodology

    DEFF Research Database (Denmark)

    Cheesman, Robin; Faraone, Roque

    2002-01-01

    This is an English version of the methodology chapter in the authors' book "El caso Berríos: Estudio sobre información errónea, desinformación y manipulación de la opinión pública".

  3. Methodology for Scaling Fusion Power Plant Availability

    International Nuclear Information System (INIS)

    Waganer, Lester M.

    2011-01-01

    Normally in U.S. fusion power plant conceptual design studies, the development of plant availability and of plant capital and operating costs makes the implicit assumption that the plant is a 10th-of-a-kind fusion power plant. This is in keeping with the DOE guidelines published in the 1970s in the PNL report 'Fusion Reactor Design Studies - Standard Accounts for Cost Estimates'. This assumption specifically defines the level of industry and technology maturity and eliminates the need to define the necessary research and development efforts and costs to construct a one-of-a-kind or first-of-a-kind power plant. It also assumes all the 'teething' problems have been solved and the plant can operate in the manner intended. The plant availability analysis assumes all maintenance actions have been refined and optimized by the operation of the prior nine or so plants. The actions are defined to be as quick and efficient as possible. This study will present a methodology to enable estimation of the availability of the one-of-a-kind (one OAK) plant or first-of-a-kind (1st OAK) plant. To clarify, one of the OAK facilities might be the pilot plant or the demo plant that is prototypical of the next generation power plant, but it is not a full-scale fusion power plant with all fully validated 'mature' subsystems. The first OAK facility is truly the first commercial plant of a common design that represents the next generation plant design. However, its subsystems, maintenance equipment and procedures will continue to be refined to achieve the goals for the 10th OAK power plant.

  4. Why might you use narrative methodology? A story about narrative

    Directory of Open Access Journals (Sweden)

    Lynn McAlpine

    2016-04-01

    Full Text Available Narrative is one of many qualitative methodologies that can be brought to bear in collecting and analysing data and reporting results, though it is not as frequently used as, say, case studies. This article provides a window into its use, from the perspective of a researcher who has used it consistently over the past decade to examine early career researcher experience – doctoral students, and those who have completed their degrees and are advancing their careers. This experience has contributed to a robust understanding of the potential of narrative, as well as its limitations. The paper first lays out the broad landscape of narrative research and then makes transparent the thinking, processes and procedures involved in the ten-year narrative study, including the potential for creativity that narrative invites. The goal is to engage other researchers to consider exploring the use of narrative – if it aligns with their epistemological stance.

  5. A novel methodology for in-process monitoring of flow forming

    Science.gov (United States)

    Appleby, Andrew; Conway, Alastair; Ion, William

    2017-10-01

    Flow forming (FF) is an incremental cold working process with near-net-shape forming capability. Failures by fracture due to high deformation can be unexpected and sometimes catastrophic, causing tool damage. If process failures can be identified in real time, an automatic cut-out could prevent costly tool damage. Sound and vibration monitoring is well established and commercially viable in the machining sector to detect current and incipient process failures, but not for FF. A broad-frequency microphone was used to record the sound signature of the manufacturing cycle for a series of FF parts. Parts were flow formed using single and multiple passes, and flaws were introduced into some of the parts to simulate the presence of spontaneously initiated cracks. The results show that this methodology is capable of identifying both introduced defects and spontaneous failures during flow forming. Further investigation is needed to categorise and identify different modes of failure and identify further potential applications in rotary forming.
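
    Purely as an illustration of how such a cut-out signal might be derived from the recorded sound signature (the paper does not prescribe this detector), the sketch below compares a short-window RMS level against a baseline learned from healthy cycles; the window size, the 6-sigma alarm level and the synthetic signals are assumptions.

```python
import numpy as np

FS = 44_100                      # assumed sampling rate of the broad-frequency microphone
WINDOW = 2048                    # ~46 ms analysis window

def rms_envelope(signal, window=WINDOW):
    """Root-mean-square level per non-overlapping analysis window."""
    n = len(signal) // window
    frames = signal[: n * window].reshape(n, window)
    return np.sqrt(np.mean(frames ** 2, axis=1))

rng = np.random.default_rng(2)
healthy = 0.05 * rng.normal(size=FS * 5)                        # 5 s of normal forming noise
test = 0.05 * rng.normal(size=FS * 5)
test[FS * 3 : FS * 3 + 4096] += 0.8 * rng.normal(size=4096)     # simulated crack burst

baseline = rms_envelope(healthy)
threshold = baseline.mean() + 6 * baseline.std()                # assumed alarm level

alarm_frames = np.flatnonzero(rms_envelope(test) > threshold)
if alarm_frames.size:
    t = alarm_frames[0] * WINDOW / FS
    print(f"cut-out triggered at t = {t:.2f} s")
else:
    print("no failure detected")
```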

  6. ANALYSIS OF THEORETICAL AND METHODOLOGICAL APPROACHES TO DESIGN OF ELECTRONIC TEXTBOOKS FOR STUDENTS OF HIGHER AGRICULTURAL EDUCATIONAL INSTITUTIONS

    Directory of Open Access Journals (Sweden)

    Olena Yu. Balalaieva

    2017-06-01

    Full Text Available The article deals with theoretical and methodological approaches to the design of electronic textbooks, in particular the systemic, competence-based, activity-based, personality-oriented and technological approaches, which together reflect the general trends in the formation of a new educational paradigm. Its distinctive features lie in constructing a heuristic, search-oriented model of the learning process: a focus on developmental teaching, knowledge integration, the development of skills for independent information search and processing, and the technification of the learning process. The term approach is used here in a broad sense, as a synthesis of the basic ideas, views and principles that determine the overall research strategy. The main provisions of modern approaches to design are not antagonistic; they should be applied in combination, taking into account the advantages of each and compensating for their shortcomings, in order to develop an optimal concept of the electronic textbook. A model of electronic textbook design and the components of a methodology for its use, based on these approaches, are described.

  7. Broad phenotypic spectrum in familial adenomatous polyposis; from early onset and severe phenotypes to late onset of attenuated polyposis with the first manifestation at age 72

    DEFF Research Database (Denmark)

    Nilbert, Mef; Kristoffersson, Ulf; Ericsson, Mats

    2008-01-01

    ABSTRACT: Background Familial adenomatous polyposis (FAP) is typically characterized by multiple colonic polyps and frequent extracolonic features. Whereas the number of colonic polyps has been linked to the APC gene mutation, possible genotype-phenotype correlations largely remain to be defined...... cancer at age 72 as the first manifestation of attenuated FAP. Conclusion With an increasing number of FAP families diagnosed, a broad and variable tumor spectrum and a high frequency of extracolonic manifestations are gradually recognized. We report novel APC mutations and present two FAP cases...

  8. A methodology for the preliminary scoping of future changes in ecosystem services, with an illustration from the future midwestern landscapes study

    Science.gov (United States)

    The product is a white paper defining a methodology for the preliminary scoping of future changes in ecosystem services, with an Illustration from the Future Midwestern Landscapes Study. The scoping method develops a hierarchy of relevant societal values, identifies the ecosyste...

  9. Shuttle TPS thermal performance and analysis methodology

    Science.gov (United States)

    Neuenschwander, W. E.; Mcbride, D. U.; Armour, G. A.

    1983-01-01

    Thermal performance of the thermal protection system was approximately as predicted. The only extensive anomalies were filler bar scorching and over-predictions in the high-Δp gap heating regions of the orbiter. A technique to predict filler bar scorching has been developed that can aid in defining a solution. Improvement in high-Δp gap heating methodology is still under study. Minor anomalies were also examined for improvements in modeling techniques and prediction capabilities. These include improved definition of low-Δp gap heating, an analytical model for inner mold line convection heat transfer, better modeling of structure, and inclusion of sneak heating. The limited number of problems related to penetration items that presented themselves during orbital flight tests were resolved expeditiously, and designs were changed and proved successful within the time frame of that program.

  10. Ruling the Commons. Introducing a new methodology for the analysis of historical commons

    Directory of Open Access Journals (Sweden)

    Tine de Moor

    2016-10-01

    Full Text Available Despite significant progress in recent years, the evolution of commons over the long run remains an under-explored area within commons studies. During the last years an international team of historians have worked under the umbrella of the Common Rules Project in order to design and test a new methodology aimed at advancing our knowledge on the dynamics of institutions for collective action – in particular commons. This project aims to contribute to the current debate on commons on three different fronts. Theoretically, it explicitly draws our attention to issues of change and adaptation in the commons – contrasting with more static analyses. Empirically, it highlights the value of historical records as a rich source of information for longitudinal analysis of the functioning of commons. Methodologically, it develops a systematic way of analyzing and comparing commons’ regulations across regions and time, setting a number of variables that have been defined on the basis of the “most common denominators” in commons regulation across countries and time periods. In this paper we introduce the project, describe our sources and methodology, and present the preliminary results of our analysis.

  11. Introducing an ILS methodology into research reactors

    International Nuclear Information System (INIS)

    Lorenzo, N. de; Borsani, R.C.

    2003-01-01

    subsequent design stages. Staff should be allocated to operate the system after assessments based on the workload and safety issues. A methodology for a Plant Tasks Analysis (used as the input for a Manning Analysis Assessment) to define a cost-effective organisational structure is presented. Training is a key issue to support a well-designed plant. This paper describes general training aspects to be considered in the ILS approach. General considerations for tailoring a Training Plan are presented, as well as for developing training tools such as Plant Simulators and 3D Electronic Models. Manuals, procedures and instructions (relevant for system operation and maintenance) are generally developed by designers or operators focusing on technical characteristics rather than considering the documentation framework and training needs. A methodology and general recommendations regarding document structure and scope to achieve world-class plant documents are also presented. Plant maintenance should be consistent with in-house capabilities regarding the appropriate Level of Repair of each plant item. A Reliability, Availability, Maintainability and Supportability assessment methodology is presented in order to focus maintenance activities on relevant issues. Spare parts management is a critical issue and hence is also included in this logistical approach. References regarding optimisation of these and related issues are included. All the mentioned factors are optimally integrated from the beginning of the process application in order to achieve the major outcomes with the available resources. (author)

  12. Current status and future directions of development of PR/PP evaluation methodology

    International Nuclear Information System (INIS)

    Kim, D. Y.; Kwon, E. H.; Kim, H. D.

    2012-01-01

    Proliferation resistance (PR), a mandatory design requirement for the introduction of Generation IV nuclear energy systems (NESs), is defined as the characteristic of a nuclear energy system that impedes the diversion or undeclared production of nuclear material, or misuse of technology, by a state in order to acquire nuclear weapons or other nuclear explosive devices. The same report also defines physical protection (PP) as the use of technical, administrative, and operational measures to prevent the theft of nuclear/radioactive material for the purpose of producing nuclear weapons, producing nuclear devices for nuclear terrorism, or using the facility or transportation system for radiological sabotage. Since the early 1970s, right after the Indian nuclear test, the international community has recognized the limits of political and diplomatic means to prevent overt proliferation by states and has looked for ways to incorporate technical features that are inherent in NESs. As a first step, active research has been conducted to develop a methodology to evaluate the PR and PP components of NESs and has now converged into two main R and D streams: the Generation IV International Forum (GIF) and the International Project on Innovative Nuclear Reactors and Fuel Cycles (INPRO). (Currently, GIF and INPRO are leading the debate as major projects for PR and PP evaluation methods.) This paper presents an overview of the R and D accomplishments during the development of PR and PP evaluation methodology. It also suggests some directions for future research.

  13. Current status and future directions of development of PR/PP evaluation methodology

    Energy Technology Data Exchange (ETDEWEB)

    Kim, D. Y.; Kwon, E. H.; Kim, H. D. [KAERI, Daejeon (Korea, Republic of)

    2012-10-15

    Proliferation resistance (PR), a mandatory design requirement for the introduction of Generation IV nuclear energy systems (NESs), is defined as the characteristic of a nuclear energy system that impedes the diversion or undeclared production of nuclear material, or misuse of technology, by a state in order to acquire nuclear weapons or other nuclear explosive devices. The same report also defines physical protection (PP) as the use of technical, administrative, and operational measures to prevent the theft of nuclear/radioactive material for the purpose of producing nuclear weapons, producing nuclear devices for nuclear terrorism, or using the facility or transportation system for radiological sabotage. Since the early 1970s, right after the Indian nuclear test, the international community has recognized the limits of political and diplomatic means to prevent overt proliferation by states and has looked for ways to incorporate technical features that are inherent in NESs. As a first step, active research has been conducted to develop a methodology to evaluate the PR and PP components of NESs and has now converged into two main R and D streams: the Generation IV International Forum (GIF) and the International Project on Innovative Nuclear Reactors and Fuel Cycles (INPRO). (Currently, GIF and INPRO are leading the debate as major projects for PR and PP evaluation methods.) This paper presents an overview of the R and D accomplishments during the development of PR and PP evaluation methodology. It also suggests some directions for future research.

  14. RainMan - A methodology for the evaluation of decommissioning waste

    International Nuclear Information System (INIS)

    Bitetti, B.; Mantero, G.; Orlandi, S.; Scarsi, G.; Brusa, L.; Ruggeri, G.; Dionisi, M.; Farina, A.; Grossi, G.

    2002-01-01

    The main objective of this study, promoted by ANPA, the Italian Nuclear Regulatory Body, carried out with ANSALDO and in close co-operation with SOGIN, was to define a methodology for the evaluation of the inventory of the amount of radioactive waste produced during the NPPs decommissioning activities, in terms of both volume and radioactivity content, and estimate the solid materials suitable for release from the regulatory control. The simulation code RainMan, developed within this project, allows, according to a selected scenario, for the evaluation of the solid materials that could be cleared and the volumes of the L-MLW that should be sent to a disposal facility. (author)

  15. The precipitation synthesis of broad-spectrum UV absorber nanoceria

    International Nuclear Information System (INIS)

    Nurhasanah, Iis; Sutanto, Heri; Puspaningrum, Nurul Wahyu

    2013-01-01

    In this paper the possibility of using nanoceria as a broad-spectrum UV absorber was evaluated. Nanoceria were synthesized by a precipitation process from cerium nitrate solution with ammonium hydroxide as the precipitating agent. Isopropanol was mixed with water as the solvent to prevent hard agglomeration. The structure of the resulting nanoceria was characterized by X-ray diffraction (XRD). The transparency in visible light and the efficiency of protection in the UV-A region were studied using an ultraviolet-visible (UV-Vis) spectrophotometer. The results show that nanoceria possess good transparency in visible light and high UV absorption. A critical absorption wavelength of 368 nm was obtained, which is desirable for excellent broad-spectrum protection absorbers. Moreover, analysis of the photodegradation of methylene blue solution by nanoceria shows poor photocatalytic activity. This indicates that nanoceria are suitable for use as a UV absorber in personal care products

  16. Silver Nanoparticles with Broad Multiband Linear Optical Absorption

    KAUST Repository

    Bakr, Osman M.

    2009-07-06

    A simple one-pot method produces silver nanoparticles coated with aryl thiols that show intense, broad nonplasmonic optical properties. The synthesis works with many aryl-thiol capping ligands, including water-soluble 4-mercaptobenzoic acid. The nanoparticles produced show linear absorption that is broader, stronger, and more structured than most conventional organic and inorganic dyes.

  17. Socioeconomic evaluation of broad-scale land management strategies.

    Science.gov (United States)

    Lisa K. Crone; Richard W. Haynes

    2001-01-01

    This paper examines the socioeconomic effects of alternative management strategies for Forest Service and Bureau of Land Management lands in the interior Columbia basin. From a broad-scale perspective, there is little impact or variation between alternatives in terms of changes in total economic activity or social conditions in the region. However, adopting a finer...

  18. Silver Nanoparticles with Broad Multiband Linear Optical Absorption

    KAUST Repository

    Bakr, Osman M.; Amendola, Vincenzo; Aikens, Christine M.; Wenseleers, Wim; Li, Rui; Dal Negro, Luca; Schatz, George C.; Stellacci, Francesco

    2009-01-01

    A simple one-pot method produces silver nanoparticles coated with aryl thiols that show intense, broad nonplasmonic optical properties. The synthesis works with many aryl-thiol capping ligands, including water-soluble 4-mercaptobenzoic acid. The nanoparticles produced show linear absorption that is broader, stronger, and more structured than most conventional organic and inorganic dyes.

  19. Languages for Software-Defined Networks

    Science.gov (United States)

    2013-02-01

    switches, firewalls, and middleboxes) with closed and proprietary configuration interfaces. Software-Defined Networks (SDN) are poised to change this… however, have seen growing interest in software-defined networks (SDNs), in which a logically-centralized controller manages the packet-processing…

  20. Stakeholder participation in comparative effectiveness research: defining a framework for effective engagement.

    Science.gov (United States)

    Deverka, Patricia A; Lavallee, Danielle C; Desai, Priyanka J; Esmail, Laura C; Ramsey, Scott D; Veenstra, David L; Tunis, Sean R

    2012-03-01

    AIMS: Stakeholder engagement is fundamental to comparative effectiveness research (CER), but lacks consistent terminology. This paper aims to define stakeholder engagement and present a conceptual model for involving stakeholders in CER. MATERIALS & METHODS: The definitions and model were developed from a literature search, expert input and experience with the Center for Comparative Effectiveness Research in Cancer Genomics, a proof-of-concept platform for stakeholder involvement in priority setting and CER study design. RESULTS: Definitions for stakeholder and stakeholder engagement reflect the target constituencies and their role in CER. The 'analytic-deliberative' conceptual model for stakeholder engagement illustrates the inputs, methods and outputs relevant to CER. The model differentiates methods at each stage of the project; depicts the relationship between components; and identifies outcome measures for evaluation of the process. CONCLUSION: While the definitions and model require testing before being broadly adopted, they are an important foundational step and will be useful for investigators, funders and stakeholder groups interested in contributing to CER.

  1. Broad ion beam serial section tomography

    Energy Technology Data Exchange (ETDEWEB)

    Winiarski, B., E-mail: b.winiarski@manchester.ac.uk [School of Materials, University of Manchester, Manchester M13 9PL (United Kingdom); Materials Division, National Physical Laboratory, Teddington TW11 0LW (United Kingdom); Gholinia, A. [School of Materials, University of Manchester, Manchester M13 9PL (United Kingdom); Mingard, K.; Gee, M. [Materials Division, National Physical Laboratory, Teddington TW11 0LW (United Kingdom); Thompson, G.E.; Withers, P.J. [School of Materials, University of Manchester, Manchester M13 9PL (United Kingdom)

    2017-01-15

    Here we examine the potential of serial Broad Ion Beam (BIB) Ar⁺ ion polishing as an advanced serial section tomography (SST) technique for destructive 3D material characterisation, collecting data from volumes with lateral dimensions significantly greater than 100 µm and potentially over millimetre-sized areas. Further, the associated low level of damage introduced makes BIB milling very well suited to 3D EBSD acquisition with very high indexing rates. Block-face serial sectioning data registration schemes usually assume that the data comprise a series of parallel, planar slices. We quantify the variations in slice thickness and parallelity which can arise when using BIB systems, comparing Gatan PECS and Ilion BIB systems for large-volume serial sectioning and 3D-EBSD data acquisition. As a test case we obtain 3D morphologies and grain orientations for both phases of a WC–11 wt.% Co hardmetal. In our case we have carried out the data acquisition through manual transfer of the sample between SEM and BIB, which is a very slow process (1–2 slices per day); however, forthcoming automated procedures will markedly speed up the process. We show that, irrespective of the sectioning method, raw large-area 2D-EBSD maps are affected by distortions and artefacts which affect 3D-EBSD such that quantitative analyses and visualisation can give misleading and erroneous results. Addressing and correcting these issues will offer real benefits when large-area (millimetre-sized) automated serial section BIBS is developed. - Highlights: • In this work we examine how microstructures can be reconstructed in three dimensions (3D) by serial argon broad ion beam (BIB) milling, enabling much larger volumes (>250×250×100 µm³) to be acquired than by serial section focused ion beam-scanning electron microscopy (FIB-SEM). • The associated low level of damage introduced makes BIB milling very well suited to 3D-EBSD acquisition with very high indexing rates. • We explore

  2. Scrum methodology in banking environment

    OpenAIRE

    Strihová, Barbora

    2015-01-01

    The bachelor thesis "Scrum methodology in banking environment" focuses on one of the agile methodologies, Scrum, and describes its use in a banking environment. Its main goal is to introduce the Scrum methodology and, through a case study, outline a real software development project placed in a bank, address the problems of the project, propose solutions to those problems and identify anomalies of Scrum in software development constrained by the banking environmen...

  3. FUTURE BIOLOGY TEACHERS’ METHODOLOGICAL TRAINING FOR THE STUDENTS’ ENVIRONMENTAL EDUCATION IN UKRAINE AND ABROAD

    Directory of Open Access Journals (Sweden)

    Nataliia Hrytsai

    2017-04-01

    Full Text Available Environmental education is an important element of general education related to mastering the scientific principles of interaction between nature and society. The Biology teacher should be prepared to implement environmental education in Biology lessons at school and to organize students' learning activities methodically. Different aspects of environmental education in the secondary schools of Ukraine and abroad have been studied by scientists (N. Andreeva, L. Rybalko, M. Skiba, O. Tsurul, T. Chistiakova). However, the content of Biology students' methodological training for schoolchildren's environmental education has not yet been studied. The purpose of the article is to reveal the content and features of the methodological training of future Biology teachers for schoolchildren's environmental education at Ukrainian and foreign universities. The research methods are theoretical analysis of the scientific literature on the issue, study of future Biology teachers' methodological training in Ukraine and abroad, comparison, generalization and drawing conclusions. The article reveals the nature of environmental education and defines its mission and place in future Biology teachers' training. The author has analysed the curricula for training future Biology teachers at universities in Ukraine and abroad, and the content of teaching courses that include issues of environmental education. The importance of implementing an ecological approach in future Biology teachers' methodological training is emphasized. The author suggests methodologically oriented subjects that raise future Biology teachers' readiness to implement environmental education in secondary schools. It is established that proper training of Biology teachers for students' environmental education, as a basic element of high school curricula, is necessary for specialty 014 Secondary education (Biology at pedagogical

  4. Defining care products to finance health care in the Netherlands.

    Science.gov (United States)

    Westerdijk, Machiel; Zuurbier, Joost; Ludwig, Martijn; Prins, Sarah

    2012-04-01

    A case-mix project started in the Netherlands with the primary goal of defining a complete set of health care products for hospitals. The definition of the product structure was completed 4 years later. The results are currently being used for billing purposes. This paper focuses on the methodology and techniques that were developed and applied in order to define the case-mix product structure. The central research question was how to develop a manageable product structure, i.e., a limited set of hospital products, with acceptable cost homogeneity. For this purpose, a data warehouse with approximately 1.5 million patient records from 27 hospitals was built up over a period of 3 years. The data associated with each patient consist of a large number of a priori independent parameters describing the resource utilization in different stages of the treatment process, e.g., activities in the operating theatre, the lab and the radiology department. Because of the complexity of the database, it was necessary to apply advanced data analysis techniques. The full analysis process that starts from the database and ends with a product definition consists of four basic analysis steps. Each of these steps has revealed interesting insights. This paper describes each step in some detail and presents the major results of each step. The result consists of 687 product groups for 24 medical specialties used for billing purposes.
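
    A small sketch of the kind of cost-homogeneity check implied above: for each candidate product group, the spread of recorded costs is summarised (here with the coefficient of variation) and groups exceeding an acceptability limit are flagged for further splitting. The group labels, costs and the 40% limit are illustrative assumptions, not the project's actual criteria.

```python
import statistics

# Hypothetical per-patient costs (in euros) for three candidate product groups.
product_groups = {
    "hip replacement, no complications":   [7200, 6900, 7600, 7100, 7350],
    "hip replacement, with complications": [9800, 15400, 8900, 21000, 12500],
    "cataract surgery, day case":          [1150, 1200, 1080, 1170, 1130],
}

CV_LIMIT = 0.40   # assumed acceptability threshold for cost homogeneity

for name, costs in product_groups.items():
    cv = statistics.stdev(costs) / statistics.mean(costs)   # coefficient of variation
    verdict = "OK" if cv <= CV_LIMIT else "split further"
    print(f"{name}: CV = {cv:.2f} -> {verdict}")
```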

  5. System Anthropological Psychology: Methodological Foundations

    Directory of Open Access Journals (Sweden)

    Vitaliy Y. Klochko

    2012-01-01

    Full Text Available The article considers methodological foundations of system anthropological psychology (SAP) as a scientific branch developed by a well-represented group of Siberian scientists. SAP is a theory based on the axiomatics of the cultural-historical psychology of L.S. Vygotsky and on transspective analysis as a specially developed means to define the tendencies of science developing as a self-organizing system. Transspective analysis has revealed regularities in the constantly growing complexity of professional-psychological thinking along the course of emergence of scientific cognition. It has proved that the field of modern psychology is shaped by theories constructed with ideation of different grades of complexity. The concept "dynamics of the paradigm of science" is introduced; it allows transitions to be acknowledged from the ordinary-binary logic characteristic of classical science to a binary-ternary logic, adequate to non-classical science, and then to a ternary-multidimensional logic, which is now at the stage of emergence. The latter is employed in SAP construction. It involves the following basic methodological principles: the principle of directed (selective) interaction and the principle of the generative effect of selective interaction. The concept of "complementary interaction" applied in the natural as well as the humanitarian sciences is reconsidered in the context of psychology. The conclusion is made that the principle of selectivity and directedness of interaction is relevant to the whole Universe, embracing all kinds of systems including living ones. Different levels of matter organization, representing semantic structures of various complexity, use one and the same principle of meaning making, through which the Universe ensures its sustainability as a self-developing phenomenon. This methodology provides an explanation for the nature and stages of emergence of the multidimensional life space of an individual, which comes as a foundation for the generation of such features of

  6. Definition of 1992 Technology Aircraft Noise Levels and the Methodology for Assessing Airplane Noise Impact of Component Noise Reduction Concepts

    Science.gov (United States)

    Kumasaka, Henry A.; Martinez, Michael M.; Weir, Donald S.

    1996-01-01

    This report describes the methodology for assessing the impact of component noise reduction on total airplane system noise. The methodology is intended to be applied to the results of individual study elements of the NASA-Advanced Subsonic Technology (AST) Noise Reduction Program, which will address the development of noise reduction concepts for specific components. Program progress will be assessed in terms of noise reduction achieved, relative to baseline levels representative of 1992 technology airplane/engine design and performance. In this report, the 1992 technology reference levels are defined for assessment models based on four airplane sizes - an average business jet and three commercial transports: a small twin, a medium sized twin, and a large quad. Study results indicate that component changes defined as program final goals for nacelle treatment and engine/airframe source noise reduction would achieve from 6-7 EPNdB reduction of total airplane noise at FAR 36 Stage 3 noise certification conditions for all of the airplane noise assessment models.

  7. Broad Hα Wing Formation in the Planetary Nebula IC 4997.

    Science.gov (United States)

    Lee; Hyung

    2000-02-10

    The young and compact planetary nebula IC 4997 is known to exhibit very broad wings with a width exceeding 5000 km s⁻¹ around Hα. We propose that the broad wings are formed through Rayleigh-Raman scattering that involves atomic hydrogen, by which Lyβ photons with a velocity width of a few 10² km s⁻¹ are converted to optical photons and fill the Hα broad wing region. The conversion efficiency reaches 0.6 near the line center, where the scattering optical depth is much larger than 1, and rapidly decreases in the far wings. Assuming that close to the central star there exists an unresolved inner compact core of high density, n_H ≈ 10⁹-10¹⁰ cm⁻³, we use the photoionization code "CLOUDY" to show that sufficient Lyβ photons for scattering are produced. Using a top-hat incident profile for the Lyβ flux and a scattering region with an H I column density N(H I) = 2×10²⁰ cm⁻² and a substantial covering factor, we perform a profile-fitting analysis in order to obtain a satisfactory fit to the observed flux. We briefly discuss the astrophysical implications of the Rayleigh-Raman processes in planetary nebulae and other emission objects.

  8. Characterization of non-host resistance in broad bean to the wheat stripe rust pathogen

    Directory of Open Access Journals (Sweden)

    Cheng Yulin

    2012-06-01

    Full Text Available Abstract Background Non-host resistance (NHR confers plant species immunity against the majority of microbial pathogens and represents the most robust and durable form of plant resistance in nature. As one of the main genera of rust fungi with economic and biological importance, Puccinia infects almost all cereals but is unable to cause diseases on legumes. Little is known about the mechanism of this kind of effective defense in legumes to these non-host pathogens. Results In this study, the basis of NHR in broad bean (Vicia faba L. against the wheat stripe rust pathogen, Puccinia striiformis f. sp. tritici (Pst, was characterized. No visible symptoms were observed on broad bean leaves inoculated with Pst. Microscopic observations showed that successful location of stomata and haustoria formation were significantly reduced in Pst infection of broad bean. Attempted infection induced the formation of papillae, cell wall thickening, production of reactive oxygen species, callose deposition and accumulation of phenolic compounds in plant cell walls. The few Pst haustoria that did form in broad bean cells were encased in reactive oxygen and callose materials and those cells elicited cell death. Furthermore, a total of seven defense-related genes were identified and found to be up-regulated during the Pst infection. Conclusions The results indicate that NHR in broad bean against Pst results from a continuum of layered defenses, including basic incompatibility, structural and chemical strengthening of cell wall, posthaustorial hypersensitive response and induction of several defense-related genes, demonstrating the multi-layered feature of NHR. This work also provides useful information for further determination of resistance mechanisms in broad bean to rust fungi, especially the adapted important broad bean rust pathogen, Uromyces viciae-fabae, because of strong similarity and association between NHR of plants to unadapted pathogens and basal

  9. Development of a methodology for classifying software errors

    Science.gov (United States)

    Gerhart, S. L.

    1976-01-01

    A mathematical formalization of the intuition behind classification of software errors is devised and then extended to a classification discipline: every classification scheme should have an easily discernible mathematical structure, and certain properties of the scheme should be decidable (although whether or not these properties hold is relative to the intended use of the scheme). Classification of errors then becomes an iterative process of generalization from actual errors to terms defining the errors, together with adjustment of the definitions according to the classification discipline. Alternatively, whenever possible, small-scale models may be built to give more substance to the definitions. The classification discipline and the difficulties of definition are illustrated by examples of classification schemes from the literature and a new study of observed errors in published papers on programming methodologies.

  10. Study of the hydration of globular proteins by broad NMR lines method

    Energy Technology Data Exchange (ETDEWEB)

    Blicharska, B [Uniwersytet Jagiellonski, Krakow (Poland). Instytut Fizyki

    1973-01-01

    Spectra of proteins and polypeptides obtained by means of a broad-line NMR spectrometer consist of broad and narrow lines, attributed respectively to the proteins and to water adsorbed on the protein surfaces. The behaviour of the narrow line in the spectra of lyophilized egg-white albumin has been studied in the temperature range from -42 to 20 °C. The amount of water, found by simple weighing, was equal to about 7% of the total weight. It was found that water adsorbed on the surface of the lyophilized proteins gives a narrower line than water adsorbed on protein molecules in aqueous solution, and that its correlation time is about 10³ times greater.

  11. Experimental Economics: Some Methodological Notes

    OpenAIRE

    Fiore, Annamaria

    2009-01-01

    The aim of this work is to present, in a self-contained paper, some methodological aspects as they are received in the current experimental literature. The purpose has been to make a critical review of some very influential papers dealing with methodological issues. In other words, the idea is to have a single paper in which people first approaching experimental economics can find summarised (some of) the most important methodological issues. In particular, the focus is on some methodological prac...

  12. Proposal of methodology of tsunami accident sequence analysis induced by earthquake using DQFM methodology

    International Nuclear Information System (INIS)

    Muta, Hitoshi; Muramatsu, Ken

    2017-01-01

    Since the Fukushima-Daiichi nuclear power station accident, the Japanese regulatory body has improved and upgraded the regulation of nuclear power plants, and continuous effort is required to enhance risk management in the mid to long term. Earthquakes and tsunamis are considered the most important risks, and the establishment of probabilistic risk assessment (PRA) methodologies for these events is a major issue in current PRA. The Nuclear Regulation Authority (NRA) addressed the PRA methodology for tsunamis induced by earthquakes, one of the methodologies that should be enhanced step by step as PRA techniques improve and mature. The AESJ standard for the procedure of seismic PRA for nuclear power plants, issued in 2015, provides the basic concept of the methodology; however, details of its application to an actual plant PRA model have not been sufficiently provided. This study proposes a detailed PRA methodology for tsunamis induced by earthquakes using the DQFM methodology, which contributes to improving the safety of nuclear power plants. The study also identifies issues that require further research. (author)
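
    DQFM is commonly expanded as "Direct Quantification of Fault tree using Monte Carlo simulation": in each trial, loads and component capacities are sampled and the fault-tree logic is evaluated directly rather than through minimal cut sets. The sketch below illustrates that general idea for a hypothetical two-train configuration; the fragility parameters, logic and figures are invented for illustration and are not taken from the AESJ standard or this study.

      import math, random

      def fails(load, median_capacity, beta):
          """Lognormal fragility: the component fails if the load exceeds its sampled capacity."""
          capacity = median_capacity * math.exp(random.gauss(0.0, beta))
          return load > capacity

      def top_event(train_a, train_b, offsite_power):
          # Hypothetical logic: top event occurs if offsite power is lost AND both trains fail.
          return offsite_power and train_a and train_b

      def conditional_top_event_probability(inundation_height_m, trials=100_000):
          hits = 0
          for _ in range(trials):
              a  = fails(inundation_height_m, median_capacity=6.0, beta=0.4)  # train A
              b  = fails(inundation_height_m, median_capacity=6.5, beta=0.4)  # train B
              op = fails(inundation_height_m, median_capacity=4.0, beta=0.3)  # offsite power
              hits += top_event(a, b, op)
          return hits / trials

      print(conditional_top_event_probability(7.0))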

  13. [Methodology for Identification of Inverse Drug Distribution, Spain].

    Science.gov (United States)

    López Pérez, M Arantzazu; Muñoz Arias, Mariano; Vázquez Mourelle, Raquel

    2016-04-04

    The phenomenon of reverse drug trafficking within the legal supply chain is an unlawful practice that poses serious risks to public health. The aim was to proactively identify pharmacies that carry out these illegal activities. An analysis was performed by cross-matching the billing data submitted to the SAS for 52 million packs of medicines dispensed by the 496 pharmacies in the province over a period of 29 months against the drug packaging data supplied by the distribution entities of the province, applying a specifically defined indicator called 'percentage overbought' that allows detection of pharmacies at high risk of being involved in this illicit trade. Testing in two pharmacies, one rural and one urban, revealed a diversion of 5,130 medicine packs with an illicit profit of €9,591.78 for the first, and 9,982 packs with €26,885.11 for the second; both had gone unnoticed in previous inspections. The methodology implemented makes it possible to define a profile of pharmacies at high risk of these illicit practices, to identify new offenders that had not been sanctioned, to weigh the drugs involved in the illegal trade and to identify new drugs subject to diversion; it also helps to calculate the illicit profit obtained accurately and effectively.
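
    The record does not give the exact definition of the 'percentage overbought' indicator, so the sketch below is only an assumption of how such a screening indicator might work: the excess of packs purchased from wholesalers over packs billed as dispensed, expressed as a percentage of purchases, with a threshold used to flag high-risk pharmacies.

      def percentage_overbought(packs_purchased, packs_billed):
          """Hypothetical indicator: share of purchased packs never billed as dispensed.

          The exact definition used by the authors is not given in the record; this
          formula is an illustrative assumption only.
          """
          if packs_purchased == 0:
              return 0.0
          return 100.0 * (packs_purchased - packs_billed) / packs_purchased

      # Screening sketch: flag pharmacies whose indicator exceeds a chosen threshold.
      pharmacies = {
          "pharmacy_A": (12_000, 11_800),   # (purchased, billed) - made-up figures
          "pharmacy_B": (15_000, 9_500),
      }
      flagged = {name: percentage_overbought(p, b)
                 for name, (p, b) in pharmacies.items()
                 if percentage_overbought(p, b) > 10.0}
      print(flagged)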

  14. LOFT uncertainty-analysis methodology

    International Nuclear Information System (INIS)

    Lassahn, G.D.

    1983-01-01

    The methodology used for uncertainty analyses of measurements in the Loss-of-Fluid Test (LOFT) nuclear-reactor-safety research program is described and compared with other methodologies established for performing uncertainty analyses.

  16. Network meta-analysis-highly attractive but more methodological research is needed

    Directory of Open Access Journals (Sweden)

    Singh Sonal

    2011-06-01

    Network meta-analysis, in the context of a systematic review, is a meta-analysis in which multiple treatments (that is, three or more) are compared using both direct comparisons of interventions within randomized controlled trials and indirect comparisons across trials based on a common comparator. To ensure the validity of findings from network meta-analyses, the systematic review must be designed rigorously and conducted carefully. Aspects of designing and conducting a systematic review for network meta-analysis include defining the review question, specifying eligibility criteria, searching for and selecting studies, assessing risk of bias and quality of evidence, conducting the network meta-analysis, and interpreting and reporting findings. This commentary summarizes the methodologic challenges and research opportunities for network meta-analysis relevant to each aspect of the systematic review process, based on discussions at a network meta-analysis methodology meeting we hosted in May 2010 at the Johns Hopkins Bloomberg School of Public Health. Since this commentary reflects the discussion at that meeting, it is not intended to provide an overview of the field.
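
    An indirect comparison through a common comparator is conventionally done with the Bucher adjustment: the indirect effect estimate is the difference of the two direct estimates against the shared comparator, and their variances add. The sketch below shows that arithmetic on made-up effect estimates; it illustrates the general technique, not any analysis discussed at the meeting.

      import math

      def indirect_comparison(d_ab, se_ab, d_cb, se_cb):
          """Bucher-style indirect comparison of A vs C through common comparator B.

          d_ab and d_cb are effect estimates (e.g. log odds ratios) of A vs B and C vs B
          from separate randomized trials; the indirect A-vs-C estimate is their
          difference, and the variances add.
          """
          d_ac = d_ab - d_cb
          se_ac = math.sqrt(se_ab ** 2 + se_cb ** 2)
          return d_ac, se_ac

      # Made-up log odds ratios for illustration only.
      d_ac, se_ac = indirect_comparison(d_ab=-0.40, se_ab=0.15, d_cb=-0.10, se_cb=0.20)
      lo, hi = d_ac - 1.96 * se_ac, d_ac + 1.96 * se_ac
      print(f"Indirect A vs C log-OR: {d_ac:.2f} (95% CI {lo:.2f} to {hi:.2f})")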

  17. A new methodology for the computer-aided construction of fault trees

    International Nuclear Information System (INIS)

    Salem, S.L.; Apostolakis, G.E.; Okrent, D.

    1977-01-01

    A methodology for systematically constructing fault trees for general complex systems is developed. A means of modeling component behaviour via decision tables is presented, and a procedure for constructing and editing fault trees, either manually or by computer, is developed. The techniques employed result in a complete fault tree in standard form. In order to demonstrate the methodology, the computer program CAT was developed and is used to construct trees for a nuclear system. By analyzing and comparing these fault trees, several conclusions are reached. First, such an approach can be used to produce fault trees that accurately describe system behaviour. Second, multiple trees can be rapidly produced by defining various TOP events, including system success. Finally, the accuracy and utility of such trees are shown to depend upon the careful development of the decision table models by the analyst, and of the overall system definition itself. Thus the method is seen to be a tool for assisting in the work of fault tree construction rather than a replacement for the careful work of the fault tree analyst. (author)
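
    The record does not reproduce the decision-table format used by CAT, so the sketch below only illustrates the general idea under assumed component names and states: a component's table maps input and internal states to output states, and the rows that yield a failed output become AND combinations collected under an OR gate in the generated fault tree.

      # A decision table maps (input state, internal failure mode) -> output state.
      # Rows with a failed output become the AND inputs of an OR gate in the fault tree.
      VALVE_TABLE = [
          # (flow_in_ok, valve_mode)     -> flow_out_ok
          ((True,  "normal"),               True),
          ((True,  "stuck_closed"),         False),
          ((False, "normal"),               False),
          ((False, "stuck_closed"),         False),
      ]

      def failure_causes(table):
          """Collect the condition combinations that lead to a failed output."""
          return [cond for cond, out_ok in table if not out_ok]

      for flow_in_ok, mode in failure_causes(VALVE_TABLE):
          print(f"AND(flow_in_ok={flow_in_ok}, valve_mode='{mode}') -> no flow out")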

  18. Interest and limits of the six sigma methodology in medical laboratory.

    Science.gov (United States)

    Scherrer, Florian; Bouilloux, Jean-Pierre; Calendini, Ors'Anton; Chamard, Didier; Cornu, François

    2017-02-01

    The mandatory accreditation of clinical laboratories in France provides an incentive to develop real tools for measuring the performance of management methods and for optimizing the management of internal quality controls. Six sigma methodology is an approach commonly applied to software quality management and discussed in numerous publications. This paper discusses the primary factors that influence the sigma index (the choice of the total allowable error and the approach used to address bias) and compares the performance of different analyzers on the basis of the sigma index. The six sigma strategy can be applied to the policy management of internal quality control in a laboratory; a comparison of four analyzers demonstrates that there is no single superior analyzer in clinical chemistry. Similar sigma results are obtained using approaches to bias based on either the EQAS or the IQC. The main difficulty in using the six sigma methodology lies in the absence of official guidelines for the definition of the total allowable error. Despite this drawback, our comparison study suggests that difficulties with defined analytes do not vary with the analyzer used.
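
    The sigma index referred to here is conventionally computed from the total allowable error (TEa), the bias and the imprecision (CV), all expressed in percent. The snippet below shows that standard calculation with made-up figures, not data from this comparison.

      def sigma_metric(tea_pct, bias_pct, cv_pct):
          """Conventional sigma metric: (total allowable error - |bias|) / imprecision, all in %."""
          return (tea_pct - abs(bias_pct)) / cv_pct

      # Made-up example: an analyte with a 10% total allowable error.
      print(sigma_metric(tea_pct=10.0, bias_pct=1.5, cv_pct=2.0))  # 4.25 -> acceptable, not world class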

  19. Active teaching-learning methodologies: medical students' views of problem-based learning

    Directory of Open Access Journals (Sweden)

    José Roberto Bittencourt Costa

    The prevailing undergraduate medical training process still favors disconnection and professional distancing from social needs. The Brazilian Ministries of Education and Health, through the National Curriculum Guidelines, the Incentives Program for Changes in the Medical Curriculum (PROMED), and the National Program for Reorientation of Professional Training in Health (PRO-SAÚDE), promoted the stimulus for an effective connection between medical institutions and the Unified National Health System (SUS). In accordance with the new paradigm for medical training, the Centro Universitário Serra dos Órgãos (UNIFESO) established a teaching plan in 2005 using active methodologies, specifically problem-based learning (PBL). Research was conducted through semi-structured interviews with third-year undergraduate students at the UNIFESO Medical School. The results were categorized as proposed by Bardin's thematic analysis, with the purpose of verifying the students' impressions of the new curriculum. Active methodologies proved to be well accepted by students, who defined them as exciting and as integrating theory and practice in medical education.

  20. Development of a methodology for the detection of hospital financial outliers using information systems.

    Science.gov (United States)

    Okada, Sachiko; Nagase, Keisuke; Ito, Ayako; Ando, Fumihiko; Nakagawa, Yoshiaki; Okamoto, Kazuya; Kume, Naoto; Takemura, Tadamasa; Kuroda, Tomohiro; Yoshihara, Hiroyuki

    2014-01-01

    Comparison of financial indices helps to illustrate differences in operations and efficiency among similar hospitals. Outlier data tend to influence statistical indices, and so detection of outliers is desirable. Development of a methodology for financial outlier detection using information systems will help to reduce the time and effort required, eliminate the subjective elements in detection of outlier data, and improve the efficiency and quality of analysis. The purpose of this research was to develop such a methodology. Financial outliers were defined based on a case model. An outlier-detection method using the distances between cases in multi-dimensional space is proposed. Experiments using three diagnosis groups indicated successful detection of cases for which the profitability and income structure differed from other cases. Therefore, the method proposed here can be used to detect outliers. Copyright © 2013 John Wiley & Sons, Ltd.
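
    The record states that outliers are scored by distances between cases in multi-dimensional space but gives no formula, so the sketch below shows a generic distance-based scheme (mean Euclidean distance to the k nearest other cases after standardizing each financial index); it is an assumption for illustration, not the authors' exact method.

      import math

      def standardize(rows):
          """Scale each financial index to zero mean and unit variance."""
          cols = list(zip(*rows))
          means = [sum(c) / len(c) for c in cols]
          sds = [math.sqrt(sum((x - m) ** 2 for x in c) / len(c)) or 1.0
                 for c, m in zip(cols, means)]
          return [[(x - m) / s for x, m, s in zip(row, means, sds)] for row in rows]

      def knn_outlier_scores(rows, k=3):
          """Score each case by its mean distance to the k nearest other cases."""
          z = standardize(rows)
          scores = []
          for i, a in enumerate(z):
              dists = sorted(math.dist(a, b) for j, b in enumerate(z) if j != i)
              scores.append(sum(dists[:k]) / k)
          return scores

      # Hypothetical per-hospital indices for one diagnosis group: [profit margin %, material cost ratio %].
      cases = [[5.1, 22.0], [4.8, 23.5], [5.5, 21.0], [4.9, 22.8], [-6.0, 41.0]]
      print(knn_outlier_scores(cases))  # the last case stands out with the largest score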