WorldWideScience

Sample records for dependability assessment methods

  1. Building Envelope Performance Assessments in Harsh Climates: Methods for Geographically Dependent Design

    OpenAIRE

    2006-01-01

    The lifetime of the built environment depends strongly on the severity of local climatic conditions. A well-functioning and reliable infrastructure is a precondition for economic growth and social development. The climate and topography of Norway put great demands on the design and siting of buildings. The relationship between materials, structures and climatic impact is highly complex, illustrating the need for new and improved methods for vulnerability assessment of building envelope...

  2. Texture-depending performance of an in situ method assessing deep seepage

    Science.gov (United States)

    Hohenbrink, Tobias L.; Lischeid, Gunnar

    2014-04-01

    Deep seepage estimation is important for water balance investigations of groundwater and the vadose zone. A simplified Buckingham-Darcy method to assess time series of deep seepage fluxes was proposed by Schindler and Müller (1998). In this method, the dynamics of water fluxes are calculated from a soil hydraulic conductivity function, with measured soil moistures and matric heads used as input data. The resulting time series of flux dynamics are scaled to realistic absolute levels by calibrating the method against the areal water balance. The method assumes that water fluxes at different positions exhibit identical dynamics, although their absolute values can differ. The aim of this study was to investigate the uncertainties of the method depending on the particle size distribution and textural heterogeneity in non-layered soils. We performed a numerical experiment using the two-dimensional Richards equation. A basic model of transient water fluxes beneath the root and capillary zone was set up and used to simulate time series of soil moisture, matric head, and seepage fluxes for 4221 different cases of particle size distribution and intensities of textural heterogeneity. Soil hydraulic parameters were predicted by the pedotransfer function Rosetta. Textural heterogeneity was modeled with Miller and Miller scaling factors arranged in spatial random fields. Seepage fluxes were calculated with the Buckingham-Darcy method from the simulated soil moisture and matric head time series and compared with simulated reference fluxes. The median root mean square error was about 0.026 cm d⁻¹ and the median maximum cross-correlation was 0.96 when the method was calibrated adequately. The method's performance was mainly influenced by (i) the soil textural class and (ii) the time period used for flux calibration. It performed best in sandy loam, while hotspots of errors occurred in sand and silty textures. Calibrating the method with time periods that exhibit high variance of seepage
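
    The flux calculation at the core of the method can be sketched as follows. The van Genuchten-Mualem parameters, sensor spacing, and water-balance total below are illustrative assumptions, not values from the study.

```python
import numpy as np

def van_genuchten_k(h, k_s=106.1, alpha=0.0145, n=2.68):
    """Unsaturated conductivity K(h) [cm/d] via van Genuchten-Mualem.
    Defaults approximate Rosetta's 'sand' class (illustrative choice)."""
    m = 1.0 - 1.0 / n
    se = (1.0 + (alpha * np.abs(h)) ** n) ** (-m)      # effective saturation
    return k_s * np.sqrt(se) * (1.0 - (1.0 - se ** (1.0 / m)) ** m) ** 2

def buckingham_darcy_flux(h_upper, h_lower, dz):
    """Darcy-Buckingham flux with z positive downward: q = -K(h)*(dh/dz - 1).
    Positive q is downward (deep seepage)."""
    h_mid = 0.5 * (h_upper + h_lower)
    grad = (h_lower - h_upper) / dz                    # matric head gradient
    return -van_genuchten_k(h_mid) * (grad - 1.0)

# Matric head time series [cm] from two sensors 20 cm apart (made-up data).
h_up = np.array([-80.0, -75.0, -70.0, -85.0])
h_lo = np.array([-78.0, -74.0, -72.0, -83.0])
q_raw = buckingham_darcy_flux(h_up, h_lo, dz=20.0)

# Calibration step: rescale the flux *dynamics* so their cumulative sum
# matches an independent water-balance estimate of total seepage (here 5 cm).
q_scaled = q_raw * (5.0 / q_raw.sum())
```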

  3. Systems dependability assessment

    CERN Document Server

    Aubry, Jean-François

    2015-01-01

    Presents recent developments of probabilistic assessment of systems dependability based on stochastic models, including graph theory, finite state automaton and language theory, for both dynamic and hybrid contexts.

  4. An Emergy-Based Hybrid Method for Assessing Sustainability of the Resource-Dependent Region

    Directory of Open Access Journals (Sweden)

    Lulu Qu

    2017-01-01

    As natural resources are getting exhausted, the concept of sustainable development of regions has received increasing attention, especially for resource-dependent cities. In this paper, an innovative method based on emergy analysis and the Human Impact Population Affluence Technology (IPAT) model is developed to analyze the quantitative relationship between economic growth, energy consumption and the overall sustainability level. Taiyuan, a traditional, resource-dependent city in China, is selected as the case study region. The main results show that the total emergy of Taiyuan increased from 9.023 × 10²³ sej in 2007 to 9.116 × 10²³ sej in 2014, with a 38% decline in non-renewable emergy and an increase of imported emergy of up to 125%. The regional emergy money ratio (EMB) was reduced by 48%, from 5.31 × 10¹³ sej/$ in 2007 to 2.74 × 10¹³ sej/$ in 2014, indicating that resources and energy were being consumed faster than GDP was growing, and that Taiyuan's money purchasing power declined. The low emergy sustainability index (ESI) indicates that Taiyuan extracted and produced large quantities of mineral resources, which puts more stress on the environment and is not sustainable in the long run. The IPAT analysis demonstrates that Taiyuan has maintained efforts in energy conservation and environmental protection. Promoting regional sustainable development requires an integrated effort. Policy insights suggest that resource-rich regions should improve energy and resource efficiency, optimize the energy and resource structure and foster broad public participation.
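
    The emergy money ratio used above is simply total emergy divided by GDP. A minimal sketch, with the GDP back-calculated from the reported 2014 figures (not a value stated in the paper), plus the IPAT identity:

```python
# Emergy money ratio: EMB = total emergy / GDP (sej per dollar).
total_emergy_2014 = 9.116e23      # sej, as reported for Taiyuan
emb_2014 = 2.74e13                # sej/$, as reported
gdp_2014 = total_emergy_2014 / emb_2014   # implied GDP, roughly 3.3e10 $

# IPAT identity: impact = population x affluence x technology,
# where affluence is GDP per capita and technology is impact per unit GDP.
def ipat(population, affluence, technology):
    return population * affluence * technology
```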

  5. Comparative assessment of performance and genome dependence among phylogenetic profiling methods

    Directory of Open Access Journals (Sweden)

    Wu Jie

    2006-09-01

    Background The rapidly increasing speed with which genome sequence data can be generated will be accompanied by an exponential increase in the number of sequenced eukaryotes. With the increasing number of sequenced eukaryotic genomes comes a need for bioinformatic techniques to aid in functional annotation. Ideally, genome context based techniques such as proximity, fusion, and phylogenetic profiling, which have been so successful in prokaryotes, could be utilized in eukaryotes. Here we explore the application of phylogenetic profiling, a method that exploits the evolutionary co-occurrence of genes in the assignment of functional linkages, to eukaryotic genomes. Results To evaluate the performance of phylogenetic profiling in eukaryotes, we assessed the relative performance of commonly used profile construction techniques and genome compositions in predicting functional linkages in both prokaryotic and eukaryotic organisms. When predicting linkages in E. coli with a prokaryotic profile, continuous values constructed from transformed BLAST bit-scores performed better than profiles composed of discretized E-values; discretized E-values resulted in more accurate linkages when S. cerevisiae was the query organism. Extending this analysis by incorporating several eukaryotic genomes in profiles containing a majority of prokaryotes resulted in similar overall accuracy, but with a surprising reduction in pathway diversity among the most significant linkages. Furthermore, the application of phylogenetic profiling using profiles composed of only eukaryotes resulted in the loss of the strong correlation between common KEGG pathway membership and profile similarity score. Profile construction methods, orthology definitions, ontology and domain complexity were explored as possible sources of the poor performance of eukaryotic profiles, but with no improvement in results. Conclusion Given the current set of
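
    Phylogenetic profiling in the sense used here reduces to comparing per-genome presence vectors. A toy sketch with made-up profiles, using Pearson correlation as the similarity measure (one common choice alongside mutual information and Hamming distance):

```python
import numpy as np

# Toy phylogenetic profiles: rows = genes, columns = genomes.
# Continuous profiles hold transformed BLAST bit-scores in [0, 1];
# binary profiles would discretize these to presence/absence.
profiles = np.array([
    [0.9, 0.8, 0.0, 0.1, 0.7],   # gene A
    [0.8, 0.9, 0.1, 0.0, 0.6],   # gene B: co-occurs with A
    [0.0, 0.1, 0.9, 0.8, 0.0],   # gene C: complementary pattern
])

def profile_similarity(p, q):
    """Pearson correlation between two profiles."""
    return np.corrcoef(p, q)[0, 1]

# Genes with similar profiles are predicted to be functionally linked.
sim_ab = profile_similarity(profiles[0], profiles[1])
sim_ac = profile_similarity(profiles[0], profiles[2])
assert sim_ab > sim_ac
```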

  6. IC50-based approaches as an alternative method for assessment of time-dependent inhibition of CYP3A4.

    Science.gov (United States)

    Burt, Howard J; Galetin, Aleksandra; Houston, J Brian

    2010-05-01

    The predictive utility of two in vitro methods (empirical IC50-based and mechanistic kinact/KI) for the assessment of time-dependent cytochrome P450 3A4 (CYP3A4) inhibition has been compared. IC50 values were determined at multiple pre-incubation time points over 30 min for five CYP3A4 time-dependent inhibitors (verapamil, diltiazem, erythromycin, clarithromycin, and azithromycin). The ability of IC50 data obtained following pre-incubation to predict kinact/KI parameters was investigated, and its utility was assessed relative to the conventional kinact/KI model using 50 reported clinical drug-drug interactions (DDIs). Models with either hepatic or combined hepatic and intestinal components were explored. For low/medium potency time-dependent inhibitors, 81% of the kinact/KI,unbound values predicted from IC50 data were within an order of magnitude of the actual values, in contrast to 50% for potent inhibitors. An underprediction trend and > 50% false-negatives were observed when IC50 data were used in the hepatic DDI prediction model; incorporation of the intestine improved the prediction accuracy. In contrast, 86% of the DDI studies were predicted within twofold using the mechanistic kinact/KI approach and the combined hepatic and intestinal model. Use of the empirical IC50 approach as an alternative to the mechanistic kinact/KI model for in vivo DDI prediction is limited and is best restricted to preliminary investigations.
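
    The mechanistic model referred to describes enzyme inactivation with kobs = kinact·[I]/(KI + [I]). A sketch with illustrative (not literature) parameters:

```python
import numpy as np

def k_obs(inhibitor_conc, k_inact, K_I):
    """Observed inactivation rate for mechanism-based CYP inhibition:
    k_obs = k_inact * [I] / (K_I + [I])."""
    return k_inact * inhibitor_conc / (K_I + inhibitor_conc)

# Illustrative parameters for a hypothetical time-dependent inhibitor.
k_inact, K_I = 0.1, 5.0            # 1/min, uM
conc = np.array([0.5, 5.0, 50.0])  # pre-incubation concentrations, uM
rates = k_obs(conc, k_inact, K_I)

# Fraction of active enzyme remaining after a 30-min pre-incubation;
# the IC50 shift approach infers this loss from how IC50 drops over time.
remaining = np.exp(-rates * 30.0)
```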

  7. Assessment of endothelium-dependent vasodilation with a non-invasive method in patients with preeclampsia compared to normotensive pregnant women

    Directory of Open Access Journals (Sweden)

    Seyedeh Zahra Allameh

    2014-01-01

    Background: To assess endothelial function via a noninvasive method in pregnant women with preeclampsia compared to normotensive pregnant women. Materials and Methods: Brachial artery diameter was measured via ultrasound in 28 women with preeclampsia (case group) and normotensive pregnant women (control group): at rest, after inflation of a sphygmomanometer cuff up to 250-300 mmHg, immediately after deflation of the cuff, 60-90 seconds later, and 5 min after administration of sublingual trinitroglycerin (TNG). The results of these measurements as well as the demographic characteristics of participants in both groups were recorded in special forms. Data were analyzed with the Statistical Package for the Social Sciences (SPSS, version 16), using the t-test and repeated measures analysis of variance (ANOVA). A P-value < 0.05 was considered statistically significant. Results are presented as mean ± standard deviation (SD). Results: The mean brachial artery diameter at rest in the case and control groups was 4.49 ± 0.39 and 4.08 ± 0.38 mm, respectively (P = 0.1). The brachial artery diameter immediately after deflation of the cuff was 4.84 ± 0.4 and 4.37 ± 0.30 mm in the case and control groups, respectively (P < 0.001). The mean brachial artery diameter 60-90 s after deflation of the cuff was 4.82 ± 0.41 and 4.42 ± 0.38 mm in the case and control groups, respectively (P < 0.001). The brachial artery diameter 5 min after sublingual TNG administration was 4.95 ± 0.6 and 4.40 ± 0.45 mm in the case and control groups, respectively (P < 0.001). Repeated measures ANOVA showed that the mean difference between the case and control groups was statistically significant (P < 0.001). Conclusion: The current study concluded that there is no difference in endothelium-dependent vasodilation between women with preeclampsia and pregnant women with normal blood pressure.
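
    Flow-mediated dilation of the kind measured here is usually summarized as percent change from the baseline diameter. A sketch applied to the group means reported above (the study itself analyzed individual measurements, not these summary values):

```python
def fmd_percent(baseline_mm, post_occlusion_mm):
    """Flow-mediated dilation as percent increase over baseline diameter."""
    return 100.0 * (post_occlusion_mm - baseline_mm) / baseline_mm

fmd_case = fmd_percent(4.49, 4.84)       # preeclampsia group means
fmd_control = fmd_percent(4.08, 4.37)    # normotensive group means
# Both groups dilate by roughly 7-8%, consistent with the conclusion of no
# between-group difference in endothelium-dependent vasodilation.
```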

  8. Using Educational Data Mining Methods to Assess Field-Dependent and Field-Independent Learners' Complex Problem Solving

    Science.gov (United States)

    Angeli, Charoula; Valanides, Nicos

    2013-01-01

    The present study investigated the problem-solving performance of 101 university students and their interactions with a computer modeling tool in order to solve a complex problem. Based on their performance on the hidden figures test, students were assigned to three groups of field-dependent (FD), field-mixed (FM), and field-independent (FI)…

  9. Methods for assessing geodiversity

    Science.gov (United States)

    Zwoliński, Zbigniew; Najwer, Alicja; Giardino, Marco

    2017-04-01

    The accepted systematics of geodiversity assessment methods will be presented in three categories: qualitative, quantitative and qualitative-quantitative. Qualitative methods are usually descriptive methods that are suited to nominal and ordinal data. Quantitative methods use different sets of parameters and indicators to determine the characteristics of geodiversity in the area being researched. Qualitative-quantitative methods are a good combination of the collection of quantitative data (i.e. digital) and cause-effect data (i.e. relational and explanatory). It seems that at the current stage of the development of geodiversity research methods, qualitative-quantitative methods are the most advanced and best assess the geodiversity of the study area. Their particular advantage is the integration of data from different sources and with different substantive content. Among the distinguishing features of the quantitative and qualitative-quantitative methods for assessing geodiversity are their wide use within geographic information systems, both at the stage of data collection and data integration, as well as numerical processing and their presentation. The unresolved problem for these methods, however, is the possibility of their validation. It seems that currently the best method of validation is direct field confrontation. Looking ahead to the next few years, the development of qualitative-quantitative methods connected with cognitive issues should be expected, oriented towards ontology and the Semantic Web.
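
    A quantitative geodiversity index of the kind described is often computed per grid cell as a sum of ranked component scores. A minimal sketch with illustrative component names, ranks, and threshold (not a specific published index):

```python
import numpy as np

# Each abiotic component is a 4x4 raster of ordinal ranks 1..5 per grid cell
# (random here; in practice derived from GIS layers such as DEM-based
# landform maps, geological maps, soil maps and hydrological networks).
rng = np.random.default_rng(0)
components = {
    name: rng.integers(1, 6, size=(4, 4))
    for name in ("landforms", "lithology", "soils", "hydrology")
}

# Cell-wise sum of ranks: with four components, values range from 4 to 20.
geodiversity_index = sum(components.values())

# Cells above an arbitrary threshold are flagged as geodiversity hotspots.
hotspots = geodiversity_index >= 15
```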

  10. III. Sleep assessment methods.

    Science.gov (United States)

    Sadeh, Avi

    2015-03-01

    Sleep is a complex phenomenon that could be understood and assessed at many levels. Sleep could be described at the behavioral level (relative lack of movements and awareness and responsiveness) and at the brain level (based on EEG activity). Sleep could be characterized by its duration, by its distribution during the 24-hr day period, and by its quality (e.g., consolidated versus fragmented). Different methods have been developed to assess various aspects of sleep. This chapter covers the most established and common methods used to assess sleep in infants and children. These methods include polysomnography, videosomnography, actigraphy, direct observations, sleep diaries, and questionnaires. The advantages and disadvantages of each method are highlighted.

  12. Guidance on Dependence Assessment in SPAR-H

    Energy Technology Data Exchange (ETDEWEB)

    April M. Whaley

    2012-06-01

    As part of the effort to develop the SPAR-H user guidance, particular attention was paid to the assessment of dependence in order to address user questions about proper application of dependence. This paper presents a discussion of dependence from a psychological perspective and provides guidance on applying this information during the qualitative analysis of dependence to ensure more realistic and appropriate dependence assessments with the SPAR-H method. While this guidance was developed with SPAR-H in mind, it may be informative to other human reliability analysis methods that also use a THERP-based dependence approach, particularly if applied at the human failure event level.
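
    The THERP-based dependence approach mentioned adjusts a nominal human error probability (HEP) with a fixed conditional formula for each dependence level. A sketch using the standard THERP expressions:

```python
# THERP-style conditional HEP adjustment, as used by SPAR-H, for dependence
# between successive human actions (formulas from the THERP handbook).
def conditional_hep(hep, level):
    formulas = {
        "zero": lambda p: p,
        "low": lambda p: (1 + 19 * p) / 20,
        "moderate": lambda p: (1 + 6 * p) / 7,
        "high": lambda p: (1 + p) / 2,
        "complete": lambda p: 1.0,
    }
    return formulas[level](hep)

# A nominal HEP of 1e-3 rises sharply once dependence is credited,
# e.g. to about 0.051 at low and 0.5 at high dependence.
for level in ("zero", "low", "moderate", "high", "complete"):
    print(f"{level:9s} {conditional_hep(1e-3, level):.3f}")
```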

  13. Remanufacturability and Assessment Method

    Institute of Scientific and Technical Information of China (English)

    ZHU Sheng; CUI Pei-zhi; YAO Ju-kun

    2004-01-01

    Remanufacturing combines the "three Rs" (reduce, reuse, recycle) into a single activity that restores a wide variety of worn-out, discarded durable products to like-new condition, prolonging their useful life and protecting the environment. The application of surface engineering has promoted the development of remanufacturing. Remanufacturing is an environmentally and economically sound way to achieve many of the goals of sustainable development. Developing the study of product design for remanufacturing can promote the application of surface technologies in remanufacturing and markedly enhance its efficiency. This paper presents some concepts related to product design for remanufacturing, such as remanufacturability, and proposes the research content and an assessment method for the remanufacturability of used products.

  15. Assessment of Dependency, Agreeableness, and Their Relationship

    Science.gov (United States)

    Lowe, Jennifer Ruth; Edmundson, Maryanne; Widiger, Thomas A.

    2009-01-01

    Agreeableness is central to the 5-factor model conceptualization of dependency. However, 4 meta-analyses of the relationship of agreeableness with dependency have failed to identify a consistent relationship. It was the hypothesis of the current study that these findings might be due in part to an emphasis on the assessment of adaptive, rather…

  16. Assessment of the time-dependent inhibition (TDI) potential of test compounds with human liver microsomes by IC50 shift method using a nondilution approach.

    Science.gov (United States)

    de Ron, Lian; Rajaraman, Ganesh

    2012-09-01

    Time-dependent inhibition (TDI) of hepatic cytochrome P450 (CYP) enzymes is increasingly implicated in the majority of clinically relevant drug-drug interactions (DDIs). A time-dependent inhibitor or its reactive metabolite irreversibly inactivates CYP enzymes, thereby inhibiting the metabolism of other drugs. As the majority of marketed drugs are metabolized by CYP enzymes, their inhibition has important clinical consequences, such as in decreasing the metabolic clearance of a co-administered drug (victim drug). This could be life threatening, as such an effect narrows the therapeutic index for drugs such as warfarin and other potentially toxic agents. Therefore, it is essential to examine new chemical entities for their potential to cause TDI to minimize adverse drug reactions during human studies and use. This unit presents an in vitro procedure utilizing a nondilution method in human liver microsomes for determining the TDI potential of test compounds.

  17. Methods & Strategies: Deep Assessment

    Science.gov (United States)

    Haas, Alison; Hollimon, Shameka; Lee, Okhee

    2015-01-01

    The "Next Generation Science Standards" ("NGSS") push students to have "a deeper understanding of content" (NGSS Lead States 2013, Appendix A, p. 4). However, with the reality of high-stakes assessments that rely primarily on multiple-choice questions, how can a science teacher analyze students' written responses…

  18. Information System Quality Assessment Methods

    OpenAIRE

    Korn, Alexandra

    2014-01-01

    This thesis explores the challenging topic of information system quality assessment, mainly process assessment. The term Information System Quality is defined, and different approaches to defining quality for different domains of information systems are outlined. The main methods of process assessment are reviewed and their relationships are described. Process assessment methods are divided into two categories: ISO standards and best practices. The main objective of this w...

  20. Assessment of nicotine dependence in subjects with vascular dementia

    Directory of Open Access Journals (Sweden)

    Mina Chandra

    2015-06-01

    Background: Nicotine dependence is an important public health issue. Nicotine dependence is a risk factor for vascular diseases like myocardial infarction and vascular dementia. The rate of nicotine dependence in Indian subjects with vascular dementia is not known. Hence we decided to assess nicotine dependence in subjects with vascular dementia. Methods: Nicotine dependence was assessed among subjects with vascular dementia presenting to the memory clinic of a tertiary care hospital over a period of 16 months. Data regarding sociodemographic profile and severity of nicotine dependence, as per the Fagerstrom nicotine dependence scale for smoking and smokeless tobacco, were analysed using SPSS version 17. Results: Our study shows that among 159 subjects with vascular dementia, continuing nicotine dependence was seen in nearly 12%. Though this rate is below the population prevalence for India, it is still relevant, as nicotine is not just a risk factor for the development of vascular dementia: severe nicotine dependence and longer duration of nicotine use were found to be poor prognostic factors associated with severe dementia. Furthermore, all subjects continued to be nicotine dependent despite having been advised to quit tobacco, suggesting that a more comprehensive tobacco cessation intervention should be offered to subjects with vascular dementia to improve outcomes. Conclusion: In subjects with vascular dementia, continuing nicotine dependence is an important risk factor which must be addressed. [Int J Res Med Sci 2015; 3(3): 711-714]

  1. Time-dependent problems and difference methods

    CERN Document Server

    Gustafsson, Bertil; Oliger, Joseph

    2013-01-01

    Praise for the First Edition: ". . . fills a considerable gap in the numerical analysis literature by providing a self-contained treatment . . . this is an important work written in a clear style . . . warmly recommended to any graduate student or researcher in the field of the numerical solution of partial differential equations." -SIAM Review. Time-Dependent Problems and Difference Methods, Second Edition continues to provide guidance for the analysis of difference methods for computing approximate solutions to partial differential equations for time-de

  2. Assessment of seismic loss dependence using copula.

    Science.gov (United States)

    Goda, Katsuichiro; Ren, Jiandong

    2010-07-01

    The catastrophic nature of seismic risk is attributed to spatiotemporal correlation of seismic losses of buildings and infrastructure. For seismic risk management, such correlated seismic effects must be adequately taken into account, since they affect the probability distribution of aggregate seismic losses of spatially distributed structures significantly, and its upper tail behavior can be of particular importance. To investigate seismic loss dependence for two closely located portfolios of buildings, simulated seismic loss samples, which are obtained from a seismic risk model of spatially distributed buildings by taking spatiotemporally correlated ground motions into account, are employed. The characterization considers a loss frequency model that incorporates one dependent random component acting as a common shock to all buildings, and a copula-based loss severity model, which facilitates the separate construction of marginal loss distribution functions and nonlinear copula function with upper tail dependence. The proposed method is applied to groups of wood-frame buildings located in southwestern British Columbia. Analysis results indicate that the dependence structure of aggregate seismic losses can be adequately modeled by the right heavy tail copula or Gumbel copula, and that for the considered example, overall accuracy of the proposed method is satisfactory at probability levels of practical interest (at most 10% estimation error of fractiles of aggregate seismic loss). The developed statistical seismic loss model may be adopted in dynamic financial analysis for achieving faster evaluation with reasonable accuracy.
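
    The Gumbel copula named above has a closed form and a simple upper tail dependence coefficient. A sketch (the theta value is illustrative, not fitted to the British Columbia portfolios):

```python
import math

def gumbel_copula(u, v, theta):
    """Gumbel copula CDF: C(u, v) = exp(-[(-ln u)^theta + (-ln v)^theta]^(1/theta)).
    Requires theta >= 1; theta = 1 gives independence, C(u, v) = u*v."""
    s = (-math.log(u)) ** theta + (-math.log(v)) ** theta
    return math.exp(-s ** (1.0 / theta))

def upper_tail_dependence(theta):
    """lambda_U = 2 - 2**(1/theta): the probability that one portfolio's loss
    is extreme given the other's is, in the limit of extreme quantiles."""
    return 2.0 - 2.0 ** (1.0 / theta)

# theta = 2 (hypothetical) gives lambda_U ~ 0.586: large losses in the two
# portfolios tend to occur together, fattening the tail of aggregate loss.
lam = upper_tail_dependence(2.0)
c = gumbel_copula(0.99, 0.99, 2.0)
```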

  3. Assessment methods of body composition

    Directory of Open Access Journals (Sweden)

    Karaba-Jakovljević Dea

    2016-01-01

    Body composition assessment has an important role in many fields of medicine, in the evaluation of the health status of the individual, as well as in sports sciences as part of the physiological profile of athletes. There are several methods for body composition assessment, which provide indirect data on body structure. For instance, in anthropometry, simple techniques such as skinfold measurements provide a simple, quick and inexpensive assessment of body fat mass. Bioelectric impedance analysis (BIA) is described as a method of growing validity, especially for the measurement of regional body composition. The value of BIA in routine clinical terms is still limited, while DXA has the potential of becoming the new gold standard for body composition assessment. More sophisticated methods such as MRI have an advantage over other techniques for the estimation of regional body composition, since MRI provides the only accurate and viable approach for the estimation of intra-abdominal adipose tissue. This method is limited to experimental studies on smaller groups of individuals, since it is expensive and not available for routine assessment. A combination of methods may be the best approach for obtaining accurate results and information about the health status of the individual.

  4. CHIP-MYTH: a novel interactive proteomics method for the assessment of agonist-dependent interactions of the human β₂-adrenergic receptor.

    Science.gov (United States)

    Kittanakom, Saranya; Barrios-Rodiles, Miriam; Petschnigg, Julia; Arnoldo, Anthony; Wong, Victoria; Kotlyar, Max; Heisler, Lawrence E; Jurisica, Igor; Wrana, Jeffrey L; Nislow, Corey; Stagljar, Igor

    2014-03-21

    G-protein coupled receptors (GPCRs) are involved in a variety of disease processes and comprise major drug targets. However, the complexity of integral membrane proteins such as GPCRs makes the identification of their interacting partners and subsequent drug development challenging. A comprehensive understanding of GPCR protein interaction networks is needed to design effective therapeutic strategies to inhibit these drug targets. Here, we developed a novel split-ubiquitin membrane yeast two-hybrid (MYTH) technology called CHIP-MYTH, which allows the unbiased characterization of interaction partners of full-length GPCRs in a drug-dependent manner. This was achieved by coupling DNA microarray technology to the MYTH approach, which allows a quantitative evaluation of interacting partners of a given integral membrane protein in the presence or absence of drug. As a proof of principle, we applied the CHIP-MYTH approach to the human β2-adrenergic receptor (β2AR), a target of interest in the treatment of asthma, chronic obstructive pulmonary disease (COPD), neurological disease, cardiovascular disease, and obesity. A CHIP-MYTH screen was performed in the presence or absence of salmeterol, a long-acting β2AR-agonist. Our results suggest that β2AR activation with salmeterol can induce the dissociation of heterotrimeric G-proteins, Gαβγ, into Gα and Gβγ subunits, which in turn activates downstream signaling cascades. Using CHIP-MYTH, we confirmed previously known and identified novel β2AR interactors involved in GPCR-mediated signaling cascades. Several of these interactions were confirmed in mammalian cells using LUminescence-based Mammalian IntERactome (LUMIER) and co-immunoprecipitation assays. In summary, the CHIP-MYTH approach is ideal for conducting comprehensive protein-protein interactions (PPI) screenings of full-length GPCRs in the presence or absence of drugs, thus providing a valuable tool to further our understanding of GPCR-mediated signaling

  5. Qualitative methods for assessing risk

    Energy Technology Data Exchange (ETDEWEB)

    Mahn, J.A. [Sandia National Labs., Albuquerque, NM (United States); Hannaman, G.W. [Science Applications International Corp., San Diego, CA (United States); Kryska, P. [Science Applications International Corp., Albuquerque, NM (United States)

    1995-04-01

    The Department of Energy's (DOE) non-nuclear facilities generally require only a qualitative accident analysis to assess facility risks in accordance with DOE Order 5481.1B, Safety Analysis and Review System. Achieving a meaningful qualitative assessment of risk necessarily requires the use of suitable non-numerical assessment criteria. Typically, the methods and criteria for assigning facility-specific accident scenarios to the qualitative severity and likelihood classification system in the DOE order require significant judgment in many applications. Systematic methods for more consistently assigning the total accident scenario frequency and associated consequences are required to substantiate and enhance future risk ranking between various activities at Sandia National Laboratories (SNL). SNL's Risk Management and National Environmental Policy Act (NEPA) Department has developed an improved methodology for performing qualitative risk assessments in accordance with the DOE order requirements. Products of this effort are an improved set of qualitative descriptions that permit (1) definition of the severity of both technical and programmatic consequences that may result from a variety of accident scenarios, and (2) qualitative representation of the likelihood of occurrence. These sets of descriptions are intended to facilitate proper application of DOE criteria for assessing facility risks.
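
    A qualitative severity-likelihood binning of the kind this methodology formalizes can be sketched as follows. The category names and the combination rule are illustrative, not those of DOE Order 5481.1B:

```python
# Ordinal categories, lowest to highest (illustrative labels).
SEVERITY = ["negligible", "marginal", "critical", "catastrophic"]
LIKELIHOOD = ["extremely unlikely", "unlikely", "likely"]

def risk_bin(severity, likelihood):
    """Map a scenario's qualitative categories to a risk bin without
    any numerical frequency estimate, using a simple ordinal sum."""
    s = SEVERITY.index(severity)
    l = LIKELIHOOD.index(likelihood)
    score = s + l
    return ["low", "medium", "high"][min(score // 2, 2)]

assert risk_bin("catastrophic", "likely") == "high"
assert risk_bin("negligible", "extremely unlikely") == "low"
```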

  7. Advanced methods of fatigue assessment

    CERN Document Server

    Radaj, Dieter

    2013-01-01

    This book presents advanced methods of brittle fracture and fatigue assessment. The Neuber concept of fictitious notch rounding is enhanced with regard to theory and application. The stress intensity factor concept for cracks is extended to pointed and rounded corner notches as well as to locally elastic-plastic material behaviour. The averaged strain energy density within a circular sector volume around the notch tip is shown to be suitable for strength assessments. Finally, the various implications of cyclic plasticity on fatigue crack growth are explained, with emphasis laid on the ΔJ-integral approach. This book continues the expositions of the authors’ well-known German-language reference work ‘Ermüdungsfestigkeit – Grundlagen für Ingenieure’ (Fatigue strength – fundamentals for engineers).

  8. LNG Safety Assessment Evaluation Methods

    Energy Technology Data Exchange (ETDEWEB)

    Muna, Alice Baca [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); LaFleur, Angela Christine [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-05-01

    Sandia National Laboratories evaluated published safety assessment methods across a variety of industries, including liquefied natural gas (LNG), hydrogen, land and marine transportation, and the US Department of Defense (DOD). All the methods were evaluated for their potential applicability to the LNG railroad application. After reviewing the documents included in this report, as well as others not included because of repetition, the Department of Energy (DOE) Hydrogen Safety Plan Checklist is the most suitable to be adapted to the LNG railroad application. This report was developed to survey industries related to rail transportation for methodologies and tools that can be used by the Federal Railroad Administration (FRA) to review and evaluate safety assessments submitted by the railroad industry as part of their implementation plans for liquefied or compressed natural gas storage (on-board or tender) and engine fueling delivery systems. The main sections of this report provide an overview of the various methods found during this survey; in most cases, the reference document is quoted directly. The final section provides discussion and a recommendation for the most appropriate methodology to allow efficient and consistent evaluations. The DOE Hydrogen Safety Plan Checklist was then revised to adapt it as a methodology for the FRA’s use in evaluating safety plans submitted by the railroad industry.

  9. Improved power performance assessment methods

    Energy Technology Data Exchange (ETDEWEB)

    Frandsen, S.; Antoniou, I.; Dahlberg, J.A. [and others]

    1999-03-01

    The uncertainty of presently-used methods for retrospective assessment of the productive capacity of wind farms is unacceptably large. The possibilities of improving the accuracy have been investigated and are reported. A method is presented that includes an extended power curve and site calibration. In addition, blockage effects with respect to reference wind speed measurements are analysed. It is found that significant accuracy improvements are possible by the introduction of more input variables such as turbulence and wind shear, in addition to mean wind speed and air density. Also, the testing of several or all machines in the wind farm - instead of only one or two - may provide a better estimate of the average performance. (au)
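The extended power-curve idea above still rests on a basic measured power curve. As a rough illustration only (not the report's method), the conventional "method of bins" averages measured power within wind-speed bins; the bin width and variable names below are illustrative:

```python
import numpy as np

def power_curve_bins(wind_speed, power, bin_width=0.5):
    """Sketch of the 'method of bins': mean electrical power per
    wind-speed bin, the core of a measured power curve."""
    v = np.asarray(wind_speed, float)
    p = np.asarray(power, float)
    edges = np.arange(0.0, v.max() + bin_width, bin_width)
    idx = np.digitize(v, edges)          # bin index for each sample
    centers, means = [], []
    for b in range(1, len(edges) + 1):
        m = idx == b
        if m.any():
            centers.append(edges[b - 1] + bin_width / 2)  # approximate bin center
            means.append(p[m].mean())
    return np.array(centers), np.array(means)
```

Extending the assessment as the report proposes would amount to binning on additional inputs (turbulence, shear, air density) rather than wind speed alone.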

  10. Clinical Considerations in the Assessment of Adolescent Chemical Dependency.

    Science.gov (United States)

    Winters, Ken

    1990-01-01

    Discusses relevant research findings of clinical assessment of adolescent chemical dependency so that service providers can better address these concerns. Three major issues are discussed: the definition of adolescent chemical dependency, clinical domains of assessment (chemical use problem severity, precipitating and perpetuating risk factors,…

  11. Time-dependent reliability analysis and condition assessment of structures

    Energy Technology Data Exchange (ETDEWEB)

    Ellingwood, B.R. [Johns Hopkins Univ., Baltimore, MD (United States)

    1997-01-01

    Structures generally play a passive role in assurance of safety in nuclear plant operation, but are important if the plant is to withstand the effect of extreme environmental or abnormal events. Relative to mechanical and electrical components, structural systems and components would be difficult and costly to replace. While the performance of steel or reinforced concrete structures in service generally has been very good, their strengths may deteriorate during an extended service life as a result of changes brought on by an aggressive environment, excessive loading, or accidental loading. Quantitative tools for condition assessment of aging structures can be developed using time-dependent structural reliability analysis methods. Such methods provide a framework for addressing the uncertainties attendant to aging in the decision process.
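The time-dependent reliability analysis described above can be illustrated with a toy Monte Carlo sketch: resistance degrades with age while annual peak loads are random, and the cumulative failure probability grows over the service life. The distributions and degradation model below are purely illustrative, not taken from the paper:

```python
import numpy as np

def time_dependent_failure_prob(n_samples=100_000, years=40, seed=0):
    """Crude Monte Carlo sketch of time-dependent reliability: resistance R
    degrades linearly with age, annual peak loads S are lognormal; a sample
    fails the first year S exceeds the degraded resistance."""
    rng = np.random.default_rng(seed)
    R0 = rng.normal(100.0, 10.0, n_samples)     # illustrative initial resistance
    rate = rng.uniform(0.2, 0.6, n_samples)     # illustrative degradation per year
    failed = np.zeros(n_samples, dtype=bool)
    pf = np.empty(years)
    for t in range(years):
        S = rng.lognormal(mean=3.0, sigma=0.4, size=n_samples)  # annual peak load
        failed |= S > (R0 - rate * t)
        pf[t] = failed.mean()                   # cumulative failure probability
    return pf
```

The monotone growth of `pf` with service time is exactly the aging effect such condition assessments try to quantify.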

  12. A New Method for Local Dependence Map and Its Applications

    Directory of Open Access Journals (Sweden)

    Burcu H. ÜÇER

    2009-01-01

    Objective: This work introduces a new method to construct a local dependence map based on an estimate of the linear local dependence function H(x,y), which is a generalization of the Pearson correlation coefficient. The new local dependence map provides a practical tool for examining the local dependence structure between two random variables. The theoretical concepts are verified by an application to real datasets from endocrinology. Material and Methods: The method requires estimation of a new local dependence function based on regression concepts. This local dependence function is then converted, using local permutation tests, into a local dependence map, which makes the function more interpretable by identifying regions of positive, negative and zero local dependence. Results: Based on the proposed method, we give two examples using real endocrinological data (C-peptide and insulin; TSH, FT3 and FT4) to show the advantages of the resulting dependence maps: they reveal interesting local dependence features even where the overall correlation coefficient is not very informative. Conclusion: Scalar dependence measures such as the correlation coefficient are often used to measure dependence in medical and biological data. However, they cannot reflect the complex dependence structure of two variables. Hence we are concerned here exclusively with the statistical aspects of the dependence structure captured by the dependence maps constructed for the dataset. In this work, a new method to construct a local dependence map, based on the regression concept for the linear local dependence function H(x,y), a generalization of the Pearson correlation coefficient, is established.
    The proposed local dependence map is applied to two examples based on the real C-peptide, insulin and TSH, FT3, FT4 data from endocrinology in order to illustrate the usefulness of the resulting dependence maps.
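A crude way to picture what a local dependence map conveys (this is not the authors' H(x,y) estimator or their permutation tests) is to compute an ordinary Pearson correlation within cells of a grid laid over the data:

```python
import numpy as np

def local_dependence_map(x, y, n_bins=4, min_pts=10):
    """Toy local dependence picture: partition the (x, y) plane by marginal
    quantiles and report the Pearson correlation of the points in each cell.
    Cells with too few points (or no variation) are left as NaN."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    xq = np.quantile(x, np.linspace(0, 1, n_bins + 1))
    yq = np.quantile(y, np.linspace(0, 1, n_bins + 1))
    dep = np.full((n_bins, n_bins), np.nan)
    for i in range(n_bins):
        for j in range(n_bins):
            m = ((x >= xq[i]) & (x <= xq[i + 1]) &
                 (y >= yq[j]) & (y <= yq[j + 1]))
            if m.sum() >= min_pts and x[m].std() > 0 and y[m].std() > 0:
                dep[i, j] = np.corrcoef(x[m], y[m])[0, 1]
    return dep
```

A single global correlation would collapse this whole grid into one number; the map keeps the regional structure visible.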

  13. Dependence assessment in human reliability analysis based on D numbers and AHP

    Energy Technology Data Exchange (ETDEWEB)

    Zhou, Xinyi; Deng, Xinyang [School of Computer and Information Science, Southwest University, Chongqing 400715 (China); Deng, Yong, E-mail: ydeng@swu.edu.cn [School of Computer and Information Science, Southwest University, Chongqing 400715 (China); Institute of Fundamental and Frontier Sciences, University of Electronic Science and Technology of China, Chengdu, Sichuan 610054 (China); Mahadevan, Sankaran [School of Engineering, Vanderbilt University, Nashville, TN 37235 (United States)

    2017-03-15

    Highlights: • D numbers and AHP are combined to implement dependence assessment in HRA. • A new tool, called D numbers, is used to deal with the uncertainty in HRA. • The proposed method addresses the fuzziness and subjectivity of linguistic assessment. • The proposed method is well suited to dependence assessment, which inherently involves a linguistic assessment process. - Abstract: Since human errors can cause heavy losses, especially in nuclear engineering, human reliability analysis (HRA) has attracted more and more attention. Dependence assessment plays a vital role in HRA, measuring the dependence degree of human errors. Much research has been done, but room for improvement remains. In this paper, a dependence assessment model based on D numbers and the analytic hierarchy process (AHP) is proposed. First, the factors used to measure the dependence level of two human operations are identified, and the anchor points for each factor are determined and quantified in terms of the suggested dependence levels. Second, D numbers and AHP are adopted in the model: experts evaluate the dependence level of the human operations for each factor; the evaluation results are represented as D numbers and fused by the D-number combination rule, yielding the dependence probability of the human operations for each factor; and the weights of the factors are determined by AHP. Third, based on the dependence probability for each factor and its corresponding weight, the dependence probability of the two human operations and its confidence can be obtained. The proposed method addresses the fuzziness and subjectivity of linguistic assessment and is well suited to assessing the dependence degree of human errors in HRA, which inherently involves a linguistic assessment process.
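The weighting-and-fusion step described above can be sketched in simplified form. The example below replaces the D-number combination rule with plain per-factor probabilities and keeps only the AHP eigenvector weighting and the final weighted combination; all numbers are hypothetical:

```python
import numpy as np

def ahp_weights(pairwise):
    """Principal-eigenvector weights from an AHP pairwise-comparison matrix."""
    vals, vecs = np.linalg.eig(np.asarray(pairwise, float))
    w = np.real(vecs[:, np.argmax(np.real(vals))])
    return w / w.sum()   # normalize to sum to 1

def dependence_probability(factor_probs, pairwise):
    """Weighted combination of per-factor dependence probabilities.
    A simplified stand-in for the paper's D-number fusion step."""
    return float(np.dot(ahp_weights(pairwise), factor_probs))

# Hypothetical example: three factors judged equally important
P = dependence_probability([0.2, 0.5, 0.35],
                           [[1, 1, 1], [1, 1, 1], [1, 1, 1]])
```

With equal pairwise judgments the weights are uniform and the result is simply the mean of the per-factor probabilities.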

  14. Hierarchical Simulation to Assess Hardware and Software Dependability

    Science.gov (United States)

    Ries, Gregory Lawrence

    1997-01-01

    This thesis presents a method for conducting hierarchical simulations to assess system hardware and software dependability. The method is intended to model embedded microprocessor systems. A key contribution of the thesis is the idea of using fault dictionaries to propagate fault effects upward from the level of abstraction where a fault model is assumed to the system level where the ultimate impact of the fault is observed. A second important contribution is the analysis of the software behavior under faults as well as the hardware behavior. The simulation method is demonstrated and validated in four case studies analyzing Myrinet, a commercial, high-speed networking system. One key result from the case studies shows that the simulation method predicts the same fault impact 87.5% of the time as is obtained by similar fault injections into a real Myrinet system. Reasons for the remaining discrepancy are examined in the thesis. A second key result shows the reduction in the number of simulations needed due to the fault dictionary method. In one case study, 500 faults were injected at the chip level, but only 255 propagated to the system level. Of these 255 faults, 110 shared identical fault dictionary entries at the system level and so did not need to be resimulated. The necessary number of system-level simulations was therefore reduced from 500 to 145. Finally, the case studies show how the simulation method can be used to improve the dependability of the target system. The simulation analysis was used to add recovery to the target software for the most common fault propagation mechanisms that would cause the software to hang. After the modification, the number of hangs was reduced by 60% for fault injections into the real system.
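The fault-dictionary reduction described in the case study amounts to grouping faults whose system-level dictionary entries are identical and simulating one representative per group. A minimal sketch, with hypothetical fault IDs and entries:

```python
def faults_to_simulate(fault_dictionary):
    """Group faults by identical system-level fault-dictionary entry; only one
    representative per distinct entry needs a full system-level simulation."""
    groups = {}
    for fault_id, entry in fault_dictionary.items():
        groups.setdefault(entry, []).append(fault_id)
    # one representative fault per distinct system-level effect
    return [ids[0] for ids in groups.values()]
```

In the thesis's numbers, 255 propagated faults shared only 145 distinct system-level entries, so 110 resimulations were avoided this way.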

  15. A REVIEW OF BUILDING SUSTAINABILITY ASSESSMENT METHODS

    Directory of Open Access Journals (Sweden)

    Jernej Markelj

    2013-01-01

    Building sustainability assessment methods have been in use globally for more than two decades. Their use initially spread slowly through the larger developed countries, but in recent years new, regionally adapted methods have been developing rapidly. The use of building assessment methods remains limited because they were developed on the basis of national legislation and local characteristics. In this article, the most important international building sustainability assessment methods (BREEAM, LEED, DGNB and SBTool) are investigated. With the help of content analysis we closely examine the aim, course and cost of the assessment, the number of certified projects, the different assessment schemes, the aspects of evaluation and the presentation of the final certificate. The result is a mutual comparison of the individual assessment methods. In the discussion we present some predictions for the further development of building sustainability assessment methods. In the final part we review the situation and the possibilities for implementing these methods in Slovenia.

  16. On Applicability of Formal Methods and Tools to Dependable Services

    Science.gov (United States)

    Ishikawa, Fuyuki; Honiden, Shinichi

    As a variety of digital services are provided through networks, more and more efforts are made to ensure dependability of software behavior implementing services. Formal methods and tools have been considered as promising means to support dependability in complex software systems during the development. On the other hand, there have been serious doubts on practical applicability of formal methods. This paper overviews the present state of formal methods and discusses their applicability, especially focusing on two representative methods (SPIN and B Method) and their recent industrial applications. This paper also discusses applications of formal methods to dependable networked software.

  17. Functional assessment of ubiquitin-depended processes under microgravity conditions

    Science.gov (United States)

    Zhabereva, Anastasia; Shenkman, Boris S.; Gainullin, Murat; Gurev, Eugeny; Kondratieva, Ekaterina; Kopylov, Arthur

    , were separated by SDS-PAGE and subjected to mass spectrometry-based analysis. With the described workflow, we identified more than 200 proteins, including 26S proteasome subunits, members of the SUMO (Small Ubiquitin-like Modifier) family and ubiquitylated substrates. On the whole, our results provide an unbiased view of the ubiquitylation state under microgravity conditions and thereby demonstrate the utility of the proposed combination of analytical methods for the functional assessment of ubiquitin-dependent processes. Acknowledgment - We thank the teams of the Institute of Biomedical Problems of the Russian Academy of Sciences and TsSKB “Progress” Samara for organization and preparation for spaceflight. This work is partially supported by the Russian Foundation for Basic Research (grant 12-04-01836).

  18. Lessons Learned from Dependency Usage in HERA: Implications for THERP-Related HRA Methods

    Energy Technology Data Exchange (ETDEWEB)

    April M. Whaley; Ronald L. Boring; Harold S. Blackman; Patrick H. McCabe; Bruce P. Hallbert

    2007-08-01

    Dependency occurs when the probability of success or failure on one action changes the probability of success or failure on a subsequent action. Dependency may serve as a modifier on the human error probabilities (HEPs) for successive actions in human reliability analysis (HRA) models. Discretion should be employed when determining whether a dependency calculation is warranted: dependency should not be assigned without strongly grounded reasons. Human reliability analysts may sometimes assign dependency in cases where it is unwarranted. This inappropriate assignment is attributed to a lack of clear guidance encompassing the range of scenarios human reliability analysts are addressing, and it produces inappropriately elevated HEP values. Lessons learned about dependency usage in the Human Event Repository and Analysis (HERA) system may provide clarification and guidance for analysts using first-generation HRA methods. This paper presents the HERA approach to dependency assessment and discusses considerations for dependency usage in HRA, including the cognitive basis for dependency, direction for determining when dependency should be assessed, considerations for determining the dependency level, temporal issues to consider when assessing dependency (e.g., task sequence versus overall event sequence, and dependency over long periods of time), and diagnosis and action influences on dependency.
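For context, first-generation dependence assessment of the kind discussed here typically uses the THERP (NUREG/CR-1278) conditional-probability formulas, in which the HEP of a subsequent task is raised according to the assigned dependence level. A sketch using the commonly cited formulas:

```python
def conditional_hep(hep, level):
    """Conditional HEP for a subsequent task under the THERP dependence
    levels (zero, low, moderate, high, complete), per the commonly cited
    NUREG/CR-1278 formulas."""
    formulas = {
        "ZD": lambda p: p,                   # zero dependence
        "LD": lambda p: (1 + 19 * p) / 20,   # low dependence
        "MD": lambda p: (1 + 6 * p) / 7,     # moderate dependence
        "HD": lambda p: (1 + p) / 2,         # high dependence
        "CD": lambda p: 1.0,                 # complete dependence
    }
    return formulas[level](hep)
```

This makes the paper's concern concrete: assigning, say, high dependence to a nominal HEP of 0.01 inflates it to about 0.5, so unwarranted dependence assignments dominate the result.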

  19. Screeners and brief assessment methods.

    Science.gov (United States)

    Pérez Rodrigo, Carmen; Morán Fagúndez, Luis Juan; Riobó Serván, Pilar; Aranceta Bartrina, Javier

    2015-02-26

    In the last two decades, easy-to-use, simple instruments have been developed and validated to assess specific aspects of the diet, or a general profile that can be compared with a reference dietary pattern such as the Mediterranean Diet or with the recommendations of the Dietary Guidelines. Brief instruments are rapid, simple and easy-to-use tools that can be implemented by unskilled personnel without specific training. These tools are useful in clinical settings, in Primary Health Care and in the community: as a triage tool, as a screening tool to identify individuals or groups of people at risk who require further care, and even in studies investigating associations between specific aspects of the diet and health outcomes. They are also used in interventions focused on changing eating behaviors, as a diagnostic tool, for self-evaluation purposes, or to provide tailored advice in web-based interventions or mobile apps. There are specific instruments for use in children, adults, the elderly and specific population groups.

  20. ASSESSMENT METHODS OF INTERNAL AUDIT

    Directory of Open Access Journals (Sweden)

    Elena RUSE

    2015-12-01

    Internal audit services are more and more needed within economic entities because, on the one hand, they are directly subordinated to the general manager and, on the other hand, there is increasing credit given to their recommendations, with internal audit seen as more than a simple compliance check against an established referral system. Our research focuses on evaluating the impact of theory and practice in the application of the internal audit process. The added value brought by the internal audit function to an economic entity is difficult to establish and requires effective ways and criteria of measurement. In this regard, we will try to present ways to analyze internal audit’s activity by reference to performance indicators or other specific methods. We used as research techniques: literature review, applied research and constructive research.

  1. Development on Vulnerability Assessment Methods of PPS

    Institute of Scientific and Technical Information of China (English)

    MIAO Qiang; ZHANG Wen-liang; BU Li-xin; YIN Hong-he; LI Xin-jun; FANG Xin

    2013-01-01

    By investigating information from home and abroad, combined with domestic assessment experience, we present a set of physical protection system (PPS) vulnerability assessment methods for operating nuclear power plants and for nuclear facilities under design. The methods will help to strengthen and upgrade the security measures of the nuclear facilities, improve the effectiveness and

  2. Enhancing Institutional Assessment Efforts through Qualitative Methods

    Science.gov (United States)

    Van Note Chism, Nancy; Banta, Trudy W.

    2007-01-01

    Qualitative methods can do much to describe context and illuminate the why behind patterns encountered in institutional assessment. Alone, or in combination with quantitative methods, they should be the approach of choice for many of the most important assessment questions. (Contains 1 table.)

  4. Impact of dependencies in risk assessments of power distribution systems

    OpenAIRE

    Alvehag, Karin

    2008-01-01

    Society has become increasingly dependent on electricity, so power system reliability is of crucial importance. However, while underinvestment leads to an unacceptable number of power outages, overinvestment will result in costs that are too high for society. The challenge is to find a socioeconomically adequate level of risk. In this risk assessment, not only the probability of power outages but also the severity of their consequences should be included. A risk assessment can be performe...

  5. Introduction to numerical methods for time dependent differential equations

    CERN Document Server

    Kreiss, Heinz-Otto

    2014-01-01

    Introduces both the fundamentals of time dependent differential equations and their numerical solutions. Introduction to Numerical Methods for Time Dependent Differential Equations delves into the underlying mathematical theory needed to solve time dependent differential equations numerically. Written as a self-contained introduction, the book is divided into two parts to emphasize both ordinary differential equations (ODEs) and partial differential equations (PDEs). Beginning with ODEs and their approximations, the authors provide a crucial presentation of fundamental notions, such as the t
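The simplest of the time-stepping approximations such a book begins with is the forward Euler method. A minimal sketch for y' = f(t, y):

```python
def euler(f, y0, t0, t1, n):
    """Forward Euler sketch for y' = f(t, y): advance n uniform steps of
    size h = (t1 - t0) / n and return the approximation of y(t1)."""
    h = (t1 - t0) / n
    t, y = t0, y0
    for _ in range(n):
        y = y + h * f(t, y)   # first-order update
        t += h
    return y
```

For y' = y with y(0) = 1, the result approaches e as the step size shrinks, with the global error decreasing linearly in h.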

  6. Self-assessment: an alternative method of assessing speaking skills

    Directory of Open Access Journals (Sweden)

    Ekaterini Chalkia

    2012-02-01

    The present study focuses on self-assessment as an alternative method of assessing the speaking skills of a group of sixth graders of a Greek State Primary School. The paper consists of two parts. In the first part, traditional and alternative assessment approaches are compared and a literature review on self-assessment is presented. In the second part the methodology and the findings of the study are presented. The study was carried out by means of a questionnaire and observation notes. This was done in order to draw conclusions on the benefits of self-assessment, the difficulties students faced while carrying out self-assessment as well as to reveal the extent to which students improved their speaking skills after being involved in self-assessment. The findings revealed that the students were positive towards self-assessment. Although self-assessment was of limited duration, it turned out to be a worthwhile activity as it fostered motivation and sensitized the students to take a more active role in the learning process. It also enabled them to notice their strengths and weaknesses and improve their speaking skills. The study also revealed the practical difficulties the students faced in carrying out their self-assessment. Finally, the study concludes with recommendations for further research into this specific assessment method.

  7. Assessing environmental dependence using asset and income measures

    DEFF Research Database (Denmark)

    Charlery, Lindy Callen; Walelign, Solomon Zena

    2015-01-01

    Understanding rural environmental dependence in a rural population is an important factor in the framing of environmental policy with the dual aim of tackling poverty and conserving nature. Firstly, this study compares the assessment of environmental dependence between poverty groupings based on income and asset measures. Using a composite asset index, we were able to distinguish the asset poor from the asset non-poor. We then combined income data with the asset index, enabling us to disentangle the stochastic and structural nature of poverty. The distribution of poor and non-poor households based on income measures was significantly different from that based on asset measures. The income poor are substantially more dependent on environmental resources than the income non-poor (about 15% difference) while strikingly minimal difference was observed in environmental dependence between...

  8. Risk assessment theory, methods, and applications

    CERN Document Server

    Rausand, Marvin

    2011-01-01

    With its balanced coverage of theory and applications along with standards and regulations, Risk Assessment: Theory, Methods, and Applications serves as a comprehensive introduction to the topic. The book serves as a practical guide to current risk analysis and risk assessment, emphasizing the possibility of sudden, major accidents across various areas of practice from machinery and manufacturing processes to nuclear power plants and transportation systems. The author applies a uniform framework to the discussion of each method, setting forth clear objectives and descriptions, while also shedding light on applications, essential resources, and advantages and disadvantages. Following an introduction that provides an overview of risk assessment, the book is organized into two sections that outline key theory, methods, and applications. * Introduction to Risk Assessment defines key concepts and details the steps of a thorough risk assessment along with the necessary quantitative risk measures. Chapters outline...

  9. Methods to assess iron and iodine status

    NARCIS (Netherlands)

    Zimmermann, M.B.

    2008-01-01

    Four methods are recommended for assessment of iodine nutrition: urinary iodine concentration, the goitre rate, and blood concentrations of thyroid stimulating hormone and thyroglobulin. These indicators are complementary, in that urinary iodine is a sensitive indicator of recent iodine intake

  10. ASSESSMENT OF THE EFFICIENCY OF DISINFECTION METHOD ...

    African Journals Online (AJOL)

    The efficiencies of three disinfection methods, namely boiling, WaterGuard and PUR purifier, were assessed. ... Water is an indispensable resource for supporting life systems [2- ...... developing country context: improving decisions.

  11. Personality, Assessment Methods and Academic Performance

    Science.gov (United States)

    Furnham, Adrian; Nuygards, Sarah; Chamorro-Premuzic, Tomas

    2013-01-01

    This study examines the relationship between personality and two different academic performance (AP) assessment methods, namely exams and coursework. It aimed to examine whether the relationship between traits and AP was consistent across self-reported versus documented exam results, two different assessment techniques and across different…

  12. Formal Method of Description Supporting Portfolio Assessment

    Science.gov (United States)

    Morimoto, Yasuhiko; Ueno, Maomi; Kikukawa, Isao; Yokoyama, Setsuo; Miyadera, Youzou

    2006-01-01

    Teachers need to assess learner portfolios in the field of education. However, they need support in the process of designing and practicing what kind of portfolios are to be assessed. To solve the problem, a formal method of describing the relations between the lesson forms and portfolios that need to be collected and the relations between…

  13. The STIG: A new SDI assessment method

    NARCIS (Netherlands)

    Nushi, B.; Van Loenen, B.; Crompvoets, J.

    2015-01-01

    To stimulate the Spatial Data Infrastructures (SDI) development effectively and efficiently, it is key to assess the progress and benefits of the SDI. Currently, several SDI assessment methods exist. However, these are still in an infant stage and none of these appear to meet the requirements of pra

  14. CONVERGENCE PROPERTIES OF THE DEPENDENT PRP CONJUGATE GRADIENT METHODS

    Institute of Scientific and Technical Information of China (English)

    Shujun LIAN; Changyu WANG; Lixia CAO

    2006-01-01

    In this paper, a new region of β_k with respect to β_k^PRP is given. With two Armijo-type line searches, the authors investigate the global convergence properties of the dependent PRP conjugate gradient methods, which extend the global convergence results for the PRP conjugate gradient method proved by Grippo and Lucidi (1997) and Dai and Yuan (2002).
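For reference, the PRP update at the heart of these methods chooses the search direction d_k = -g_k + β_k d_{k-1} with β_k^PRP = g_k·(g_k − g_{k-1}) / ||g_{k-1}||². The sketch below uses a fixed step and the common nonnegativity safeguard on β rather than the Armijo-type line searches analysed in the paper:

```python
import numpy as np

def prp_cg(grad, x0, iters=200, alpha=1e-2, tol=1e-8):
    """Sketch of nonlinear conjugate gradient with the Polak-Ribiere-Polyak
    beta, using a fixed step alpha and the common max(0, beta) safeguard
    (not the paper's line searches or its new beta region)."""
    x = np.asarray(x0, float)
    g = grad(x)
    d = -g
    for _ in range(iters):
        x = x + alpha * d
        g_new = grad(x)
        if np.linalg.norm(g_new) < tol:
            break
        # beta_k^PRP = g_k . (g_k - g_{k-1}) / ||g_{k-1}||^2
        beta = max(0.0, g_new @ (g_new - g) / (g @ g))
        d = -g_new + beta * d
        g = g_new
    return x
```

On a simple quadratic (gradient 2x) the iterates contract toward the minimizer at the origin.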

  15. Dependency in elderly people newly diagnosed with cancer - A mixed-method study

    DEFF Research Database (Denmark)

    Esbensen, Bente Appel; Thomé, Bibbi; Thomsen, Thordis

    2012-01-01

    PURPOSE: This study, based on data from an empirical investigation, combines quantitative and qualitative approaches in a mixed-method design to explore dependency in elderly people newly diagnosed with cancer. METHODS AND SAMPLE: 101 elderly people newly diagnosed with cancer were included. ... be achieved by assessing Activities of Daily Living (ADL) in the elderly. Receiving assistance from children seems to increase perceived dependency and to affect QoL negatively. CONCLUSIONS: The results of this mixed-method study indicate that dependency had a negative influence on the elderly with cancer...

  16. Spectral methods for time dependent partial differential equations

    Science.gov (United States)

    Gottlieb, D.; Turkel, E.

    1983-01-01

    The theory of spectral methods for time dependent partial differential equations is reviewed. When the domain is periodic, Fourier methods are presented, while for nonperiodic problems both Chebyshev and Legendre methods are discussed. The theory is presented for both hyperbolic and parabolic systems using both Galerkin and collocation procedures. While most of the review considers problems with constant coefficients, the extension to nonlinear problems is also discussed. Some results for problems with shocks are presented.
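For the periodic (Fourier) case mentioned above, spectral differentiation reduces to multiplying the Fourier coefficients by ik. A minimal numpy sketch:

```python
import numpy as np

def fourier_derivative(u, L=2 * np.pi):
    """Spectral derivative of a periodic sample u on [0, L): transform,
    multiply each Fourier coefficient by i*k, transform back."""
    n = len(u)
    k = np.fft.fftfreq(n, d=L / n) * 2 * np.pi * 1j  # i * angular wavenumbers
    return np.real(np.fft.ifft(k * np.fft.fft(u)))
```

For smooth periodic functions this is accurate to machine precision, which is the "spectral accuracy" that motivates these methods for time dependent PDEs.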

  17. Assessment and Evaluation Methods for Access Services

    Science.gov (United States)

    Long, Dallas

    2014-01-01

    This article serves as a primer for assessment and evaluation design by describing the range of methods commonly employed in library settings. Quantitative methods, such as counting and benchmarking measures, are useful for investigating the internal operations of an access services department in order to identify workflow inefficiencies or…

  18. Dependability modeling and assessment in UML-based software development.

    Science.gov (United States)

    Bernardi, Simona; Merseguer, José; Petriu, Dorina C

    2012-01-01

    Assessment of software nonfunctional properties (NFP) is an important problem in software development. In the context of model-driven development, an emerging approach for the analysis of different NFPs consists of the following steps: (a) to extend the software models with annotations describing the NFP of interest; (b) to transform automatically the annotated software model to the formalism chosen for NFP analysis; (c) to analyze the formal model using existing solvers; (d) to assess the software based on the results and give feedback to designers. Such a modeling→analysis→assessment approach can be applied to any software modeling language, be it general purpose or domain specific. In this paper, we focus on UML-based development and on the dependability NFP, which encompasses reliability, availability, safety, integrity, and maintainability. The paper presents the profile used to extend UML with dependability information, the model transformation to generate a DSPN formal model, and the assessment of the system properties based on the DSPN results.

  19. Critical evaluation of soil contamination assessment methods for trace metals.

    Science.gov (United States)

    Desaules, André

    2012-06-01

    Correctly distinguishing between natural and anthropogenic trace metal contents in soils is crucial for assessing soil contamination. A series of assessment methods is critically outlined. All methods rely on assumptions of reference values for natural content. According to the adopted reference values, which are based on various statistical and geochemical procedures, there is a considerable range and discrepancy in the assessed soil contamination results as shown by the five methods applied to three weakly contaminated sites. This is a serious indication of their high methodological specificity and bias. No method with off-site reference values could identify any soil contamination in the investigated trace metals (Pb, Cu, Zn, Cd, Ni), while the specific and sensitive on-site reference methods did so for some sites. Soil profile balances are considered to produce the most plausible site-specific results, provided the numerous assumptions are realistic and the required data reliable. This highlights the dilemma between model and data uncertainty. Data uncertainty, however, is a neglected issue in soil contamination assessment so far. And the model uncertainty depends much on the site-specific realistic assumptions of pristine natural trace metal contents. Hence, the appropriate assessment of soil contamination is a subtle optimization exercise of model versus data uncertainty and specification versus generalization. There is no general and accurate reference method and soil contamination assessment is still rather fuzzy, with negative implications for the reliability of subsequent risk assessments.
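One common convention behind the off-site reference methods examined here (not the profile-balance method the author favours) is the enrichment factor, which normalizes the trace-metal content by a conservative reference element in both the sample and the background material:

```python
def enrichment_factor(metal_sample, ref_sample, metal_background, ref_background):
    """Classic enrichment-factor convention: the metal/reference-element
    concentration ratio in the sample divided by the same ratio in the
    background material. Values well above 1 suggest anthropogenic input,
    subject to exactly the reference-value assumptions the article questions."""
    return (metal_sample / ref_sample) / (metal_background / ref_background)
```

The article's point is that the outcome hinges on the choice of reference values: different conventions can classify the same weakly contaminated site differently.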

  20. Assessing body composition: the skinfold method.

    Science.gov (United States)

    Talbot, L A; Lister, Z

    1995-12-01

    1. Excess body fat contributes to many chronic diseases. Using a case scenario, an initial screening assessment is performed on two clients. The occupational health nurse provides feedback on current lifestyle behaviors and educates the clients about relevant lifestyle changes. 2. Tables illustrate the step by step procedures for measuring body fat using the skinfold thickness method. Photographs show the multiple body sites used in the skinfold analysis. 3. Commonly asked client questions related to body fat are discussed in detail, and the use of body fat assessment as a screening method for the health promotion professional is described.
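The skinfold measurements described above are usually converted to percent body fat in two steps: a population-specific regression (e.g. Durnin-Womersley or Jackson-Pollock, coefficients not reproduced here) turns the skinfold sum into body density, and a two-compartment equation turns density into percent fat. A sketch of the common final step, the Siri equation:

```python
def siri_percent_fat(body_density: float) -> float:
    """Siri two-compartment equation: percent body fat from body density (g/cm^3)."""
    return 495.0 / body_density - 450.0

# A density of 1.10 g/cm^3 corresponds to roughly 0% fat on this scale;
# lower densities correspond to higher fat percentages.
```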

  1. USING COPULAS TO MODEL DEPENDENCE IN SIMULATION RISK ASSESSMENT

    Energy Technology Data Exchange (ETDEWEB)

    Dana L. Kelly

    2007-11-01

    Typical engineering systems in applications with high failure consequences such as nuclear reactor plants often employ redundancy and diversity of equipment in an effort to lower the probability of failure and therefore risk. However, it has long been recognized that dependencies exist in these redundant and diverse systems. Some dependencies, such as common sources of electrical power, are typically captured in the logic structure of the risk model. Others, usually referred to as intercomponent dependencies, are treated implicitly by introducing one or more statistical parameters into the model. Such common-cause failure models have limitations in a simulation environment. In addition, substantial subjectivity is associated with parameter estimation for these models. This paper describes an approach in which system performance is simulated by drawing samples from the joint distributions of dependent variables. The approach relies on the notion of a copula distribution, a notion which has been employed by the actuarial community for ten years or more, but which has seen only limited application in technological risk assessment. The paper also illustrates how equipment failure data can be used in a Bayesian framework to estimate the parameter values in the copula model. This approach avoids much of the subjectivity required to estimate parameters in traditional common-cause failure models. Simulation examples are presented for failures in time. The open-source software package R is used to perform the simulations. The open-source software package WinBUGS is used to perform the Bayesian inference via Markov chain Monte Carlo sampling.
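The sampling idea can be sketched in a few lines. The example below uses a Gaussian copula with exponential failure-time marginals (the copula family, rates, and correlation are illustrative assumptions; the paper itself performs the simulations in R and the Bayesian inference in WinBUGS):

```python
import math
import random
from statistics import NormalDist

def dependent_failure_times(rho, rate1, rate2, n, seed=1):
    """Sample n pairs of dependent exponential failure times via a Gaussian copula."""
    rng = random.Random(seed)
    std_normal = NormalDist()
    pairs = []
    for _ in range(n):
        # Correlated standard normals (Cholesky factor of a 2x2 correlation matrix)
        z1 = rng.gauss(0.0, 1.0)
        z2 = rho * z1 + math.sqrt(1.0 - rho * rho) * rng.gauss(0.0, 1.0)
        # Map to uniforms; the dependence structure (the copula) lives here
        u1, u2 = std_normal.cdf(z1), std_normal.cdf(z2)
        # Inverse-CDF transform to the exponential marginals
        pairs.append((-math.log(1.0 - u1) / rate1, -math.log(1.0 - u2) / rate2))
    return pairs

samples = dependent_failure_times(rho=0.8, rate1=1e-3, rate2=1e-3, n=5000)
```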

  2. Methods to assess iron and iodine status

    NARCIS (Netherlands)

    Zimmermann, M.B.

    2008-01-01

    Four methods are recommended for assessment of iodine nutrition: urinary iodine concentration, the goitre rate, and blood concentrations of thyroid stimulating hormone and thyroglobulin. These indicators are complementary, in that urinary iodine is a sensitive indicator of recent iodine intake (days

  3. A New Method to Assess Eye Dominance

    Science.gov (United States)

    Valle-Inclan, Fernando; Blanco, Manuel J.; Soto, David; Leiros, Luz

    2008-01-01

    People usually show a stable preference for one of their eyes when monocular viewing is required ("sighting dominance") or under dichoptic stimulation conditions ("sensory eye-dominance"). Current procedures to assess this "eye dominance" are prone to error. Here we present a new method that provides a continuous measure of eye dominance and…

  4. Methods of Assessment for Affected Family Members

    Science.gov (United States)

    Orford, Jim; Templeton, Lorna; Velleman, Richard; Copello, Alex

    2010-01-01

    The article begins by making the point that a good assessment of the needs and circumstances of family members is important if previous neglect of affected family members is to be reversed. The methods we have used in research studies are then described. They include a lengthy semi-structured interview covering seven topic areas and standard…

  5. [Dependence of uniformity on the radionuclide in SPECT: test methods].

    Science.gov (United States)

    Kalnischke, Heiko; Grebe, Gerhard; Zander, Andreas; Munz, Dieter Ludwig; Geworski, Lilli

    2004-01-01

    The aim of this study was to investigate test methods to clarify whether the non-uniformity of a gamma camera depends on individual radionuclides, and whether it is necessary to measure a separate correction matrix for each radionuclide used in single photon emission computed tomography (SPECT). Two methods were devised to verify the nuclide-dependence of the gamma camera. In order to test the energy correction of the detectors, the first approach was based on the evaluation of the intrinsic non-uniformity and on the production of images with an asymmetrical energy window. The second method was based on the production of correction matrices for different radionuclides, as well as on their subsequent application to phantom data that were also generated with different radionuclides. The investigation of a dual-head gamma camera produced the same results with both methods. One detector head was found to be weakly dependent on the radionuclide, due to the insufficient quality of its energy correction. In this case, the phantom or patient data should be corrected using a uniformity correction matrix measured with the same radionuclide. The second detector remained nuclide-independent; in this case the uniformity correction matrix acquired for only one radionuclide was sufficient.

  6. Assessment of 'dry skin': current bioengineering methods and test designs.

    Science.gov (United States)

    Fischer, T W; Wigger-Alberti, W; Elsner, P

    2001-01-01

    Dry skin is a frequent problem in dermatology and a sign of dysfunction of the epidermis, especially of the stratum corneum as the morphological equivalent of the skin barrier. It may occur as an individual disposition or as the leading symptom of atopic dermatitis or ichthyosis. Besides the visual examination of the skin, various bioengineering methods have been developed to assess the different pathological and adaptive changes in the skin. In addition to the assessment of skin humidity, barrier function and desquamation, the quantification of skin surface topography and the mechanical properties of skin are suitable methods to characterize a dry skin condition. For clinical assessment of moisturizing products and emollients the parameters of investigation have to be defined and integrated in an adapted study design depending on the composition and content of the active agent in the test product. Newly developed cosmetic products have to be investigated for safety and efficacy. Modern bioengineering methods are suitable to fulfill these challenges.

  7. An efficient method for evaluating energy-dependent sum rules

    CERN Document Server

    Dinur, Nir Nevo; Bacca, Sonia; Barnea, Nir

    2014-01-01

    Energy-dependent sum rules are useful tools in many fields of physics. In nuclear physics, they typically involve an integration of the response function over the nuclear spectrum with a weight function composed of integer powers of the energy. More complicated weight functions are also encountered, e.g., in nuclear polarization corrections of atomic spectra. Using the Lorentz integral transform method and the Lanczos algorithm, we derive a computationally efficient technique for evaluating such sum rules that avoids the explicit calculation of both the continuum states and the response function itself. Our numerical results for electric dipole sum rules of the Helium-4 nucleus with various energy-dependent weights show rapid convergence with respect to the number of Lanczos steps. This demonstrates the usefulness of the method in a variety of electroweak reactions.

  8. Geophysics Methods in Electrometric Assessment of Dams

    Energy Technology Data Exchange (ETDEWEB)

    Davydov, V. A., E-mail: davydov-va@yandex.ru; Baidikov, S. V., E-mail: badikek@mail.ru; Gorshkov, V. Yu., E-mail: vitalaa@yandex.ru; Malikov, A. V., E-mail: alex.mal.1986@mail.ru [Russian Academy of Sciences, Geophysical Institute, Ural Branch (Russian Federation)

    2016-07-15

    The safety assessment of hydraulic structures is proposed to be conducted via geoelectric measurements, which are capable of assessing the health of earth dams in their natural bedding without intervention in their structure. Geoelectric measurements are shown to be capable of pinpointing hazardous parts of a dam, including areas of elevated seepage. Applications of such methods are shown for a number of mini-dams in the Sverdlovsk region. A parameter (effective longitudinal conductivity) that may be used to monitor the safety of hydraulic structures is proposed. Quantitative estimates of this parameter are given in terms of the degree of safety.

  9. Statistical methods for assessment of blend homogeneity

    DEFF Research Database (Denmark)

    Madsen, Camilla

    2002-01-01

    In this thesis the use of various statistical methods to address some of the problems related to assessment of the homogeneity of powder blends in tablet production is discussed. It is not straight forward to assess the homogeneity of a powder blend. The reason is partly that in bulk materials......, it is shown how to set up parametric acceptance criteria for the batch that gives a high confidence that future samples with a probability larger than a specified value will pass the USP threeclass criteria. Properties and robustness of proposed changes to the USP test for content uniformity are investigated...

  10. Parametric methods for estimating covariate-dependent reference limits.

    Science.gov (United States)

    Virtanen, Arja; Kairisto, Veli; Uusipaikka, Esa

    2004-01-01

    Age-specific reference limits are required for many clinical laboratory measurements. Statistical assessment of calculated intervals must be performed to obtain reliable reference limits. When parametric, covariate-dependent limits are derived, normal distribution theory usually is applied due to its mathematical simplicity and relative ease of fitting. However, it is not always possible to transform data and achieve a normal distribution. Therefore, models other than those based on normal distribution theory are needed. Generalized linear model theory offers one such alternative. Regardless of the statistical model used, the assumptions behind the model should always be examined.
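The normal-theory approach described above can be sketched as an ordinary least-squares fit of the mean against age, with limits placed at the central 95% of the fitted normal distribution (a simplification: real implementations also report confidence intervals for the limits and check the distributional assumptions, as the abstract urges):

```python
from statistics import NormalDist

def reference_limits(ages, values, query_age, coverage=0.95):
    """Covariate-dependent reference limits assuming value ~ Normal(a + b*age, sigma^2)."""
    n = len(ages)
    mean_x = sum(ages) / n
    mean_y = sum(values) / n
    sxx = sum((x - mean_x) ** 2 for x in ages)
    slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(ages, values)) / sxx
    intercept = mean_y - slope * mean_x
    residuals = [y - (intercept + slope * x) for x, y in zip(ages, values)]
    sigma = (sum(r * r for r in residuals) / (n - 2)) ** 0.5  # residual SD
    z = NormalDist().inv_cdf(0.5 + coverage / 2.0)
    center = intercept + slope * query_age
    return center - z * sigma, center + z * sigma
```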

  11. Time-dependent coupled-cluster method for atomic nuclei

    CERN Document Server

    Pigg, D A; Nam, H; Papenbrock, T

    2012-01-01

    We study time-dependent coupled-cluster theory in the framework of nuclear physics. Based on Kvaal's bi-variational formulation of this method [S. Kvaal, arXiv:1201.5548], we explicitly demonstrate that observables that commute with the Hamiltonian are conserved under time evolution. We explore the role of the energy and of the similarity-transformed Hamiltonian under real and imaginary time evolution and relate the latter to similarity renormalization group transformations. Proof-of-principle computations of He-4 and O-16 in small model spaces, and computations of the Lipkin model illustrate the capabilities of the method.

  12. Dynamic ADI methods for elliptic equations with gradient dependent coefficients

    Energy Technology Data Exchange (ETDEWEB)

    Doss, S.

    1977-04-01

    The dynamic alternating direction implicit (DADI) methods, previously introduced and applied to elliptic problems with linear and nonlinear coefficients (a(u)), are applied here to elliptic problems with nonlinear gradient-dependent coefficients (a(grad u)), such as the minimal surface equation, the capillary surface equation, and the magnetostatic equation. Certain improvements of these methods are developed, and they are extended to "3-directional" or "3-dimensional" situations. 28 figures, 6 tables.

  13. Methods of geodiversity assessment and theirs application

    Science.gov (United States)

    Zwoliński, Zbigniew; Najwer, Alicja; Giardino, Marco

    2016-04-01

    The concept of geodiversity has rapidly gained the approval of scientists around the world (Wiedenbein 1993, Sharples 1993, Kiernan 1995, 1996, Dixon 1996, Eberhard 1997, Kostrzewski 1998, 2011, Gray 2004, 2008, 2013, Zwoliński 2004, Serrano, Ruiz-Flano 2007, Gordon et al. 2012). However, recognition of the problem is still at an early stage, and the concept is not yet explicitly understood and defined (Najwer, Zwoliński 2014). Despite widespread use of the concept, little progress has been made in its assessment and mapping; only within the last decade have methods for geodiversity assessment and its visualisation begun to be investigated, although many have acknowledged the importance of geodiversity evaluation (Kozłowski 2004, Gray 2004, Reynard, Panizza 2005, Zouros 2007, Pereira et al. 2007, Hjort et al. 2015). Hitherto, only a few authors have addressed such methodological issues. Geodiversity maps are created for a variety of purposes, and the methods behind them are accordingly manifold. The literature contains examples of geodiversity maps applied for geotourism purposes, based mainly on geological diversity, to indicate the scale of an area's tourist attractiveness (Zwoliński 2010, Serrano and Gonzalez Trueba 2011, Zwoliński and Stachowiak 2012). In some studies, geodiversity maps were created and applied to investigate spatial or genetic relationships with the richness of particular components of the natural environment (Burnett et al. 1998, Silva 2004, Jačková, Romportl 2008, Hjort et al. 2012, 2015, Mazurek et al. 2015, Najwer et al. 2014). There are also a few examples of geodiversity assessment for geoconservation and for efficient management and planning of protected natural areas (Serrano and Gonzalez Trueba 2011, Pellitero et al. 2011, 2014, Jaskulska et al. 2013, Melelli 2014, Martinez-Grana et al. 2015). The most popular method of assessing the diversity of abiotic components of the natural

  14. Landfill mining: Developing a comprehensive assessment method.

    Science.gov (United States)

    Hermann, Robert; Wolfsberger, Tanja; Pomberger, Roland; Sarc, Renato

    2016-11-01

    In Austria, the first basic technological and economic examinations of mass-waste landfills with the purpose to recover secondary raw materials have been carried out by the 'LAMIS - Landfill Mining Österreich' pilot project. A main focus of its research, and the subject of this article, is the first conceptual design of a comprehensive assessment method for landfill mining plans, including not only monetary factors (like costs and proceeds) but also non-monetary ones, such as the concerns of adjoining owners or the environmental impact. Detailed reviews of references, the identification of influences and system boundaries to be included in planning landfill mining, several expert workshops and talks with landfill operators have been performed followed by a division of the whole assessment method into preliminary and main assessment. Preliminary assessment is carried out with a questionnaire to rate juridical feasibility, the risk and the expenditure of a landfill mining project. The results of this questionnaire are compiled in a portfolio chart that is used to recommend, or not, further assessment. If a detailed main assessment is recommended, defined economic criteria are rated by net present value calculations, while ecological and socio-economic criteria are examined in a utility analysis and then transferred into a utility-net present value chart. If this chart does not support making a definite statement on the feasibility of the project, the results must be further examined in a cost-effectiveness analysis. Here, the benefit of the particular landfill mining project per capital unit (utility-net present value ratio) is determined to make a final distinct statement on the general benefit of a landfill mining project.
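The net present value calculation used in the economic part of the main assessment is standard; a minimal sketch (the discount rate and cash flows are hypothetical, and the utility-analysis side of the method is not shown):

```python
def net_present_value(cash_flows, discount_rate):
    """NPV of yearly cash flows; cash_flows[0] is the year-0 (investment) flow."""
    return sum(cf / (1.0 + discount_rate) ** t for t, cf in enumerate(cash_flows))

# Hypothetical landfill mining project: up-front excavation cost,
# then net proceeds from recovered secondary raw materials.
project_npv = net_present_value([-500_000, 120_000, 150_000, 150_000, 150_000], 0.05)
```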

  15. Assessing proprioception: A critical review of methods

    Directory of Open Access Journals (Sweden)

    Jia Han

    2016-03-01

    Full Text Available To control movement, the brain has to integrate proprioceptive information from a variety of mechanoreceptors. The role of proprioception in daily activities, exercise, and sports has been extensively investigated, using different techniques, yet the proprioceptive mechanisms underlying human movement control are still unclear. In the current work we have reviewed understanding of proprioception and the three testing methods: threshold to detection of passive motion, joint position reproduction, and active movement extent discrimination, all of which have been used for assessing proprioception. The origin of the methods, the different testing apparatus, and the procedures and protocols used in each approach are compared and discussed. Recommendations are made for choosing an appropriate technique when assessing proprioceptive mechanisms in different contexts.

  16. Method and apparatus to assess compartment syndrome

    Science.gov (United States)

    Ueno, Toshiaki (Inventor); Hargens, Alan R. (Inventor); Yost, William T. (Inventor)

    2008-01-01

    A method and apparatus for measuring pressure buildup in a body compartment that encases muscular tissue. The method includes assessing the body compartment configuration and identifying the effect of pulsatile components on at least one compartment dimension. This process is used in preventing tissue necrosis, and in decisions of whether to perform surgery on the body compartment for prevention of Compartment Syndrome. An apparatus is used for measuring excess pressure in the body compartment having components for imparting ultrasonic waves such as a transducer, placing the transducer to impart the ultrasonic waves, capturing the reflected imparted ultrasonic waves, and converting them to electrical signals, a pulsed phase-locked loop device for assessing a body compartment configuration and producing an output signal, and means for mathematically manipulating the output signal to thereby categorize pressure build-up in the body compartment from the mathematical manipulations.

  17. Method of assessing heterogeneity in images

    Energy Technology Data Exchange (ETDEWEB)

    Jacob, Richard E.; Carson, James P.

    2016-08-23

    A method of assessing heterogeneity in images is disclosed. 3D images of an object are acquired. The acquired images may be filtered and masked. Iterative decomposition is performed on the masked images to obtain image subdivisions that are relatively homogeneous. Comparative analysis, such as variogram analysis or correlogram analysis, is performed of the decomposed images to determine spatial relationships between regions of the images that are relatively homogeneous.
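Variogram analysis of the kind mentioned compares values at increasing spatial lags; homogeneous regions show low semivariance at short lags. A minimal one-dimensional sketch (illustrative: the patent's method operates on decomposed 3D image data):

```python
def semivariogram(values, max_lag):
    """Empirical semivariance gamma(h) = mean of 0.5 * (z[i+h] - z[i])^2 per lag h."""
    n = len(values)
    gamma = {}
    for h in range(1, max_lag + 1):
        sq_diffs = [(values[i + h] - values[i]) ** 2 for i in range(n - h)]
        gamma[h] = 0.5 * sum(sq_diffs) / len(sq_diffs)
    return gamma
```

A flat profile yields zero semivariance at every lag, while a strictly alternating profile peaks at lag 1 and returns to zero at lag 2.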

  18. [Quantitative method of representative contaminants in groundwater pollution risk assessment].

    Science.gov (United States)

    Wang, Jun-Jie; He, Jiang-Tao; Lu, Yan; Liu, Li-Ya; Zhang, Xiao-Liang

    2012-03-01

    To address the lack of an effective quantitative system for stress vulnerability assessment in groundwater pollution risk assessment, a new system was proposed based on representative contaminants and their corresponding emission quantities, derived from an analysis of groundwater pollution sources. A quantitative method for the representative contaminants in this system was established by analyzing the three properties of representative contaminants and determining the research emphasis using the analytic hierarchy process. The method was applied to the assessment of groundwater pollution risk in Beijing. The results demonstrated that the hazards of the representative contaminants depended greatly on the chosen research emphasis. There were also differences between the ranking of the three representative contaminants' hazards and the rankings of their corresponding properties, suggesting that the subjective choice of research emphasis has a decisive impact on the calculation results. In addition, normalizing the three properties by rank and unifying the quantified property results would amplify or attenuate the relative characteristics of different representative contaminants.
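The analytic hierarchy process step mentioned above derives property weights from a pairwise comparison matrix, conventionally via its principal eigenvector. A minimal sketch using power iteration (the comparison values below are illustrative, not the study's):

```python
def ahp_weights(pairwise, iterations=100):
    """Principal-eigenvector weights of a reciprocal pairwise comparison matrix."""
    n = len(pairwise)
    weights = [1.0 / n] * n
    for _ in range(iterations):
        v = [sum(pairwise[i][j] * weights[j] for j in range(n)) for i in range(n)]
        total = sum(v)
        weights = [x / total for x in v]  # renormalize each power-iteration step
    return weights

# Hypothetical judgements: toxicity twice as important as mobility,
# four times as important as persistence.
w = ahp_weights([[1.0, 2.0, 4.0],
                 [0.5, 1.0, 2.0],
                 [0.25, 0.5, 1.0]])
```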

  19. Dependability Evaluation of a Vehicle System by Simulation Method

    Institute of Scientific and Technical Information of China (English)

    YANG Yu-hang; LI Zhi-zhong; ZHENG Li; WANG Jin-chuan

    2008-01-01

    The vehicle system studied in this paper is a type of complex repairable system in which the subsystems follow various failure distributions and conform to arbitrary failure and repair distributions. Failure data for the subsystems are sometimes lacking, and reliability test sample sizes tend to be small. A Monte-Carlo technique combined with a Bayes method is used to evaluate the system's dependability (reliability and maintainability). Following the "first-in, first-out" queuing rule, the logic of the dependability model is established by means of repair priorities and event lists. The simulation outputs the entire history of a mission together with statistics of the reliability and maintainability parameters, and provides the basic data for system reliability design and maintainability management.
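The Monte-Carlo part of such an evaluation can be illustrated with a single repairable unit alternating between exponential failure and repair phases (a deliberately reduced sketch: the paper's model covers multiple subsystems, arbitrary distributions, repair priorities, and Bayes-estimated parameters):

```python
import random

def mission_availability(mttf, mttr, mission_time, n_runs=500, seed=7):
    """Monte-Carlo estimate of mean availability over a mission for one repairable unit."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_runs):
        t, uptime = 0.0, 0.0
        while t < mission_time:
            time_to_failure = rng.expovariate(1.0 / mttf)
            uptime += min(time_to_failure, mission_time - t)
            t += time_to_failure
            if t >= mission_time:
                break
            t += rng.expovariate(1.0 / mttr)  # unit is down while under repair
        total += uptime / mission_time
    return total / n_runs

a = mission_availability(mttf=100.0, mttr=1.0, mission_time=1000.0)
```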

  20. A classification scheme for risk assessment methods.

    Energy Technology Data Exchange (ETDEWEB)

    Stamp, Jason Edwin; Campbell, Philip LaRoche

    2004-08-01

    This report presents a classification scheme for risk assessment methods. This scheme, like all classification schemes, provides meaning by imposing a structure that identifies relationships. Our scheme is based on two orthogonal aspects: level of detail and approach. The resulting structure is shown in Table 1 and is explained in the body of the report. This report imposes structure on the set of risk assessment methods in order to reveal their relationships and thus optimize their usage. We present a two-dimensional structure in the form of a matrix, using three abstraction levels for the rows and three approaches for the columns. For each of the nine cells in the matrix we identify the method type by name and example. The matrix helps the user understand: (1) what to expect from a given method, (2) how it relates to other methods, and (3) how best to use it. Each cell in the matrix represents a different arrangement of strengths and weaknesses. Those arrangements shift gradually as one moves through the table, each cell being optimal for a particular situation. The intention of this report is to enable informed use of the methods, so that the method chosen is optimal for the situation given. The matrix, with type names in the cells, is introduced in Table 2 on page 13 below. Unless otherwise stated, we use the word 'method' in this report to refer to a 'risk assessment method', though often we use the full phrase. The terms 'risk assessment' and 'risk management' are close enough that we do not attempt to distinguish them in this report. The remainder of this report is organized as follows. In

  1. LCIA selection methods for assessing toxic releases

    DEFF Research Database (Denmark)

    Larsen, Henrik Fred; Birkved, Morten; Hauschild, Michael Zwicky

    2002-01-01

    Characterization of toxic emissions in life cycle impact assessment (LCIA) is in many cases severely limited by the lack of characterization factors for the emissions mapped in the inventory. The number of substances assigned characterization factors for (eco)toxicity included in the dominating LCA...... methods in use today (e.g. Eco-indicator 99 and EDIP) is in the range of 40 – 330 and often they only cover a minor part of the substances in the inventory. The user of the LCA method should in principle be able to calculate any missing factors (if needed substance data are available which is often....... The methods are evaluated against a set of pre-defined criteria (comprising consistency with characterization and data requirement) and applied to case studies and a test set of chemicals. The reported work is part of the EU-project OMNIITOX.

  2. On the stochastic dependence between photomultipliers in the TDCR method

    Energy Technology Data Exchange (ETDEWEB)

    Bobin, C., E-mail: christophe.bobin@cea.fr [CEA, LIST, Laboratoire National Henri Becquerel (LNE-LNHB), F-91191 Gif-sur-Yvette Cedex (France); Thiam, C.; Chauvenet, B.; Bouchard, J. [CEA, LIST, Laboratoire National Henri Becquerel (LNE-LNHB), F-91191 Gif-sur-Yvette Cedex (France)

    2012-04-15

    The TDCR method (Triple to Double Coincidence Ratio) is widely implemented in National Metrology Institutes for primary activity measurements based on liquid scintillation counting. The detection efficiency, and thereby the activity, are determined using a statistical and physical model. In this article, we propose to revisit the application of the classical TDCR model and its validity by introducing a prerequisite of stochastic independence between photomultiplier counting channels. In order to support the need for this condition, the demonstration is carried out by considering the simple case of a monoenergetic deposition in the scintillation cocktail. Simulations of triple and double coincidence counting are presented in order to point out the existence of stochastic dependence between photomultipliers that can be significant in the case of low-energy deposition in the scintillator. It is demonstrated that a problem of time dependence arises when the coincidence resolving time is shorter than the time distribution of scintillation photons; in addition, it is shown that this effect is at the origin of a bias in the detection efficiency calculation encountered for the standardization of ³H. This investigation is extended to the study of geometric dependence between photomultipliers related to the position of light emission inside the scintillation vial (the volume of the vial is not considered in the classical TDCR model). In that case, triple and double coincidences are calculated using a stochastic TDCR model based on the Monte-Carlo simulation code Geant4. This stochastic approach is also applied to the standardization of ⁵¹Cr by liquid scintillation; the difference observed in detection efficiencies calculated using the standard and stochastic models can be explained by such an effect of geometric dependence between photomultiplier channels. Highlights: ► The TDCR model is revisited by introducing the condition of stochastic
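The classical model being revisited can be stated compactly: with three independent photomultipliers each receiving on average one third of the Poisson-distributed photoelectrons, the triple- and double-coincidence efficiencies follow in closed form. A sketch of that baseline (it embodies exactly the independence assumption whose limits the paper explores):

```python
import math

def tdcr_classical(mean_photoelectrons):
    """Triple efficiency, logical-sum double efficiency, and their ratio (TDCR)."""
    p = 1.0 - math.exp(-mean_photoelectrons / 3.0)  # per-PMT detection probability
    triple = p ** 3                                  # all three PMTs fire
    double = 3.0 * p ** 2 - 2.0 * p ** 3             # at least two of three PMTs fire
    return triple, double, triple / double
```

As the light output grows, both efficiencies and their ratio approach 1; at low light output the ratio falls well below 1.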

  3. ASSESSMENT METHODS OF CORMORANT (Phalacrocorax carbo) DIET

    Directory of Open Access Journals (Sweden)

    Krešimir Terzić

    2008-10-01

    Various cormorant diet assessment methods are used to assess their daily meal in order to evaluate, using these and other data, the damage to commercial fish farms as well as the damage on open waters caused by cormorants. All of the parameters used for evaluating the damage to fish stock (number of birds, density and fish structure, daily meal, fish price, degree of protection and preservation, etc.) are specific for an individual fishpond or other body of water and can only be used for that locality and not elsewhere. The results on the lowest and highest values of fish mass that cormorants eat daily vary extensively. By examining the available literature, the following values for individual adults have been determined: pellets — 347 g, pellets of captive cormorants — 371 g, stomach content — 359.5 g, regurgitations — 260 to 539 g, energy requirements — 751 g, stomach temperature — 336±98 g.

  4. A Novel Method to Test Dependable Composed Service Components

    Directory of Open Access Journals (Sweden)

    Khaled Farj

    2016-05-01

    Assessing the performance and dependability of Web service systems is crucial for the development of today's applications. The performance and fault tolerance mechanisms (FTMs) of composed service components are hard to measure at design time, because service instability is often caused by the nature of the network conditions. Using a real Internet environment for testing systems is difficult to set up and control. We have introduced a fault injection toolkit that emulates a WAN within a LAN environment between composed service components and offers full control over the emulated environment, in addition to the capability to inject network-related faults and application-specific faults. The toolkit also generates background workloads on the system under test so as to produce more realistic results. We describe an experiment performed with our toolkit to examine the impact of fault tolerance protocols deployed at a service client.

  5. Assessment of anthropometric methods in headset design

    DEFF Research Database (Denmark)

    Stavrakos, Stavros-Konstantinos; Ahmed-Kristensen, Saeema

    2012-01-01

    Current approaches to assess consumer products for usability and comfort often involve expensive user trials. For external ear products such as headsets and bluetooth communication devices comfort is an issue leading to many concepts being rejected at the late stages of the product development...... industry. The intention of this approach is to investigate and evaluate the methods leading to a recommendation of their usage during the different phases of the product development process. The current study explores the complicated relationships between comfort, technology and humans through...

  6. [Retrospective exposure assessment in occupational epidemiology: principles and methods].

    Science.gov (United States)

    Cocco, P

    2010-01-01

    Occupational histories in case-control studies typically include a variety of past exposure circumstances and no monitoring data, posing serious challenges to the retrospective assessment of occupational exposures. METHODS: I use examples from the EPILYMPH case-control study on lymphoma risk to introduce principles and methods of retrospective assessment of occupational exposures. Exposure assessment consists of several indicators, such as frequency and intensity of exposure, as well as a confidence score expressing the occupational expert's own judgement on the reliability of the assessment itself. Testing the null hypothesis from multiple perspectives strengthens inference: while trends by the individual exposure indicators were all of borderline statistical significance, testing the association between CLL risk and exposure to ethylene oxide with Fisher's test for combined testing of multiple probabilities yielded a p-value of 0.003. Using the occupational expert assessment as the gold standard, the specificity of a prior job-exposure matrix for benzene was 93% and its sensitivity 40%, with positive and negative predictive values ranging from 71% to 77%. Once bias can be excluded, and assuming a true association between exposure and disease, retrospective exposure assessment only underestimates the true risk, the size of which also depends on the frequency of the exposure itself.
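The agreement measures quoted (sensitivity, specificity, predictive values) come from a standard 2x2 cross-classification of the job-exposure matrix against the expert assessment taken as gold standard. A sketch with illustrative counts (not the study's data):

```python
def agreement_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV and NPV from a 2x2 agreement table."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    ppv = tp / (tp + fp)  # positive predictive value
    npv = tn / (tn + fn)  # negative predictive value
    return sensitivity, specificity, ppv, npv

# Hypothetical counts: 40 true positives, 60 false negatives,
# 7 false positives, 93 true negatives
sens, spec, ppv, npv = agreement_metrics(tp=40, fp=7, fn=60, tn=93)
```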

  7. Statistical methods for assessment of blend homogeneity

    DEFF Research Database (Denmark)

    Madsen, Camilla

    2002-01-01

    as powder blends there is no natural unit or amount to define a sample from the blend, and partly that current technology does not provide a method of universally collecting small representative samples from large static powder beds. In the thesis a number of methods to assess (in)homogeneity are presented...... of internal factors to the blend e.g. the particle size distribution. The relation between particle size distribution and the variation in drug content in blend and tablet samples is discussed. A central problem is to develop acceptance criteria for blends and tablet batches to decide whether the blend...... blend or batch. In the thesis it is shown how to link sampling result and acceptance criteria to the actual quality (homogeneity) of the blend or tablet batch. Also it is discussed how the assurance related to a specific acceptance criteria can be obtained from the corresponding OC-curve. Further...

  8. Assessment of composite motif discovery methods

    Directory of Open Access Journals (Sweden)

    Johansen Jostein

    2008-02-01

Abstract Background Computational discovery of regulatory elements is an important area of bioinformatics research and more than a hundred motif discovery methods have been published. Traditionally, most of these methods have addressed the problem of single motif discovery – discovering binding motifs for individual transcription factors. In higher organisms, however, transcription factors usually act in combination with nearby bound factors to induce specific regulatory behaviours. Hence, recent focus has shifted from single motifs to the discovery of sets of motifs bound by multiple cooperating transcription factors, so-called composite motifs or cis-regulatory modules. Given the large number and diversity of methods available, independent assessment of methods becomes important. Although there have been several benchmark studies of single motif discovery, no similar studies have previously been conducted concerning composite motif discovery. Results We have developed a benchmarking framework for composite motif discovery and used it to evaluate the performance of eight published module discovery tools. Benchmark datasets were constructed based on real genomic sequences containing experimentally verified regulatory modules, and the module discovery programs were asked both to predict the locations of these modules and to specify the single motifs involved. To aid the programs in their search, we provided position weight matrices corresponding to the binding motifs of the transcription factors involved. In addition, selections of decoy matrices were mixed with the genuine matrices on one dataset to test the response of programs to varying levels of noise. Conclusion Although some of the methods tested tended to score somewhat better than others overall, there were still large variations between individual datasets and no single method performed consistently better than the rest in all situations. The variation in performance on individual

  9. Deriving Specifications of Dependable Systems: toward a Method

    CERN Document Server

    Mazzara, Manuel

    2010-01-01

This paper proposes a method for deriving formal specifications of systems. To accomplish this task we pass through a non-trivial number of steps, concepts and tools, where the first one, the most important, is the concept of method itself, since we realized that computer science has a proliferation of languages but very few methods. We also propose the idea of Layered Fault Tolerant Specification (LFTS) to make the method extensible to dependable systems. The principle is layering the specification, for the sake of clarity, in (at least) two different levels, the first one for the normal behavior and the others (if more than one) for the abnormal. The abnormal behavior is described in terms of an Error Injector (EI), which represents a model of the erroneous interference coming from the environment. This structure has been inspired by the notion of an idealized fault tolerant component, but the combination of LFTS and EI using rely/guarantee thinking to describe interference can be considered one of the main contr...

  10. Cardiovascular complications in acromegaly: methods of assessment.

    Science.gov (United States)

    Vitale, G; Pivonello, R; Galderisi, M; D'Errico, A; Spinelli, L; Lupoli, G; Lombardi, G; Colao, A

    2001-09-01

Cardiac involvement is common in acromegaly. Evidence for cardiac hypertrophy, dilation and diastolic filling abnormalities has been widely reported in the literature. Generally, ventricular hypertrophy is revealed by echocardiography, although early studies reported increased cardiac size on standard X-ray. Echocardiography also investigates cardiac function and valve disease. There are new technological advances in ultrasonic imaging. Pulsed Tissue Doppler is a new non-invasive ultrasound tool which extends Doppler applications beyond the analysis of intra-cardiac flow velocities to the quantitative assessment of regional myocardial left ventricular wall motion, directly measuring velocities and time intervals of the myocardium. Radionuclide techniques permit better study of cardiac performance. In fact, diastolic as well as systolic function can be assessed at rest and at peak exercise by equilibrium radionuclide angiography. This method has the main advantage of providing a direct evaluation of ventricular function while being operator independent. Coronary artery disease has been poorly studied, mainly because of the necessity to perform invasive procedures. Only a few cases have been reported with heart failure studied by coronarography and with perfusion alterations that improved after somatostatin analog treatment. More recently, a few data have been presented using perfusion scintigraphy in acromegaly, even if coronary artery disease does not seem very frequent in acromegaly. Doppler analysis of the carotid arteries can also be performed to investigate atherosclerosis; however, patients with active acromegaly show endothelial dysfunction more often than clear-cut atherosclerotic plaques. In conclusion, careful assessment of cardiac function, morphology and activity is needed in patients with acromegaly.

  11. Modelling and assessment of dependent performance shaping factors through Analytic Network Process

    Energy Technology Data Exchange (ETDEWEB)

    De Ambroggi, Massimiliano, E-mail: massimiliano.deambroggi@mail.polimi.i [Politecnico di Milano, Department of Management, Economics and Industrial Engineering, Piazza Leonardo da Vinci 32, Milan 20132 (Italy); Trucco, Paolo [Politecnico di Milano, Department of Management, Economics and Industrial Engineering, Piazza Leonardo da Vinci 32, Milan 20132 (Italy)

    2011-07-15

Despite continuous progress in research and applications, one of the major weaknesses of current HRA methods lies in their limited capability of modelling the mutual influences between performance shaping factors (PSFs). Indeed, at least two types of dependencies between PSFs can be defined: (i) dependency between the states of the PSFs; (ii) dependency between the influences (impacts) of the PSFs on human performance. This paper introduces a method, based on the Analytic Network Process (ANP), for the quantification of the latter, where the overall contribution of each PSF (weight) to the human error probability (HEP) is eventually returned. The core of the method is the modelling process, articulated into two steps: first, a qualitative network of dependencies between PSFs is identified; then, the importance of each PSF is quantitatively assessed using ANP. The model distinguishes two components of PSF influence: the direct influence, which the considered PSF exerts by itself, notwithstanding the presence of other PSFs, and the indirect influence, i.e. the incremental influence that the considered PSF exerts through its influence on other PSFs. A case study in Air Traffic Control is presented where the proposed approach is integrated into the cognitive simulator PROCOS. The results demonstrated a significant modification of the influence of PSFs on operator performance when dependencies are taken into account, underlining the importance of considering not only the possible correlation between the states of PSFs but also their mutual dependency in affecting human performance in complex systems.
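The ANP quantification step can be sketched numerically: pairwise dependencies among PSFs are arranged in a column-stochastic supermatrix, and its limit (a high matrix power) yields overall weights in which the indirect influences are folded into the direct ones. The 3×3 matrix below is invented for illustration, not taken from the paper:

```python
import numpy as np

# Illustrative column-stochastic supermatrix for three PSFs: entry (i, j)
# is the relative influence of PSF i as judged from the perspective of
# PSF j, so every column sums to 1.
W = np.array([
    [0.2, 0.5, 0.3],
    [0.5, 0.2, 0.4],
    [0.3, 0.3, 0.3],
])

# Limit supermatrix: repeated powers converge so that every column holds
# the same steady-state priority vector (direct + indirect influence).
L = np.linalg.matrix_power(W, 50)
weights = L[:, 0]          # any column of the limit matrix
print(weights.round(3))    # overall PSF weights, summing to 1
```

Comparing `weights` with the raw diagonal-free entries of `W` shows how a PSF can gain weight purely through its influence on other PSFs, which is the paper's indirect-influence component.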

  12. Dependence Assessment in Human Reliability Analysis Using Evidence Theory and AHP.

    Science.gov (United States)

    Su, Xiaoyan; Mahadevan, Sankaran; Xu, Peida; Deng, Yong

    2015-07-01

Dependence assessment among human errors in human reliability analysis (HRA) is an important issue. Many of the dependence assessment methods in HRA rely heavily on expert opinion and are thus subjective and sometimes inconsistent. In this article, we propose a computational model based on Dempster-Shafer evidence theory (DSET) and the analytic hierarchy process (AHP) to handle dependence in HRA. First, the factors influencing dependence between human tasks are identified and their weights are determined by experts using the AHP method. Second, a judgment on each factor is given by the analyst with reference to anchors and linguistic labels. Third, the judgments are represented as basic belief assignments (BBAs) and are integrated into a fused BBA by weighted average combination in DSET. Finally, the conditional human error probability (CHEP) is calculated based on the fused BBA. The proposed model can deal with ambiguity and the degree of confidence in the judgments, and is able to reduce subjectivity and improve consistency in the evaluation process.
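The third step, weighted-average combination of BBAs, can be sketched directly. The frame of discernment, factor weights, and masses below are invented for illustration; the paper's factors, anchors, and labels are more detailed:

```python
# Weighted-average combination of basic belief assignments (BBAs).
# Frame of discernment: subsets of {"low", "high"} dependence; the full
# set represents ignorance. Weights and masses are illustrative only.

def weighted_average_bba(bbas, weights):
    """Fuse several BBAs into one by weighted averaging of masses."""
    total = sum(weights)
    fused = {}
    for bba, w in zip(bbas, weights):
        for focal, mass in bba.items():
            fused[focal] = fused.get(focal, 0.0) + (w / total) * mass
    return fused

LOW, HIGH = frozenset({"low"}), frozenset({"high"})
IGNORANCE = frozenset({"low", "high"})

# One BBA per dependence-influencing factor (hypothetical expert judgments).
bbas = [
    {LOW: 0.6, HIGH: 0.3, IGNORANCE: 0.1},
    {LOW: 0.2, HIGH: 0.7, IGNORANCE: 0.1},
]
weights = [0.4, 0.6]  # AHP-derived factor weights (illustrative)

fused = weighted_average_bba(bbas, weights)
print(fused)  # masses still sum to 1; mass on IGNORANCE tracks confidence
```

The mass left on the full set `IGNORANCE` is what carries the analyst's degree of confidence through to the CHEP calculation.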

  13. Unified CFD Methods Via Flowfield-Dependent Variation Theory

    Science.gov (United States)

    Chung, T. J.; Schunk, Greg; Canabal, Francisco; Heard, Gary

    1999-01-01

This paper addresses the flowfield-dependent variation (FDV) methods, in which complex physical phenomena are taken into account in the final form of the partial differential equations to be solved, so that finite difference methods (FDM) or finite element methods (FEM) themselves do not dictate the physics, but are simply options for how to discretize between adjacent nodal points or within an element. The variation parameters introduced in the formulation are calculated from the current flowfield based on changes of Mach numbers, Reynolds numbers, Peclet numbers, and Damkohler numbers between adjacent nodal points, and they play many significant roles, such as adjusting the governing equations (hyperbolic, parabolic, and/or elliptic), resolving various physical phenomena, and controlling the accuracy and stability of the numerical solution. The theory is verified by a number of example problems addressing the physical implications of the variation parameters, which resemble the flowfield itself, the shock capturing mechanism, transitions, and interactions between inviscid/viscous, compressible/incompressible, and laminar/turbulent flows.

  14. Direct method for calculating temperature-dependent transport properties

    Science.gov (United States)

    Liu, Yi; Yuan, Zhe; Wesselink, R. J. H.; Starikov, Anton A.; van Schilfgaarde, Mark; Kelly, Paul J.

    2015-06-01

    We show how temperature-induced disorder can be combined in a direct way with first-principles scattering theory to study diffusive transport in real materials. Excellent (good) agreement with experiment is found for the resistivity of Cu, Pd, Pt (and Fe) when lattice (and spin) disorder are calculated from first principles. For Fe, the agreement with experiment is limited by how well the magnetization (of itinerant ferromagnets) can be calculated as a function of temperature. By introducing a simple Debye-like model of spin disorder parameterized to reproduce the experimental magnetization, the temperature dependence of the average resistivity, the anisotropic magnetoresistance, and the spin polarization of a Ni80Fe20 alloy are calculated and found to be in good agreement with existing data. Extension of the method to complex, inhomogeneous materials as well as to the calculation of other finite-temperature physical properties within the adiabatic approximation is straightforward.

  15. A copula method for modeling directional dependence of genes

    Directory of Open Access Journals (Sweden)

    Park Changyi

    2008-05-01

Abstract Background Genes interact with each other as basic building blocks of life, forming a complicated network. The relationship between groups of genes with different functions can be represented as gene networks. With the deposition of huge microarray data sets in public domains, study of gene networking is now possible. In recent years, there has been an increasing interest in the reconstruction of gene networks from gene expression data. Recent work includes linear models, Boolean network models, and Bayesian networks. Among them, Bayesian networks seem to be the most effective in constructing gene networks. A major problem with the Bayesian network approach is the excessive computational time. This problem is due to the interactive feature of the method, which requires a large search space. Since fitting a model by using copulas does not require iterations, elicitation of priors, or complicated calculations of posterior distributions, the need to traverse extensive search spaces can be eliminated, leading to manageable computational effort. The Bayesian network approach produces a discrete representation of conditional probabilities. Discreteness of the characteristics is not required in the copula approach, which uses a uniform representation of the continuous random variables. Our method is able to overcome the limitation of the Bayesian network method for gene-gene interaction, i.e. information loss due to binary transformation. Results We analyzed the gene interactions for two gene data sets (one group is eight histone genes and the other group is 19 genes, which include DNA polymerases, DNA helicase, type B cyclin genes, DNA primases, radiation sensitive genes, repair-related genes, a replication protein A encoding gene, a DNA replication initiation factor, a securin gene, a nucleosome assembly factor, and a subunit of the cohesin complex) by adopting a measure of directional dependence based on a copula function. We have compared
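The copula step can be sketched generically: map each margin to uniform ranks, transform the ranks to normal scores, and correlate the scores. This is plain Gaussian-copula fitting, a deliberate simplification of the directional-dependence measure the authors use, and the data below are simulated, not gene expression:

```python
import numpy as np
from scipy.stats import norm

def gaussian_copula_corr(x, y):
    """Fit a Gaussian copula to two samples: map each margin to uniform
    ranks in (0, 1), then to standard-normal scores, and correlate the
    scores. The result depends only on the joint ranks, not the margins."""
    def normal_scores(v):
        ranks = np.argsort(np.argsort(v)) + 1        # ranks 1..n
        u = ranks / (len(v) + 1.0)                   # keep u strictly in (0, 1)
        return norm.ppf(u)
    return np.corrcoef(normal_scores(x), normal_scores(y))[0, 1]

rng = np.random.default_rng(0)
x = rng.normal(size=500)
y = 0.8 * x + rng.normal(scale=0.6, size=500)        # dependent pair
print(gaussian_copula_corr(x, y))  # close to the underlying correlation (0.8)
```

Because only ranks enter, the same estimate is obtained after any monotone transformation of either variable, which is the property that lets copulas avoid the binary discretization the abstract criticizes.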

  16. Clinical management methods for out-patients with alcohol dependence

    Directory of Open Access Journals (Sweden)

    Boulze Isabelle

    2006-02-01

Abstract Background In France, outpatient centres for the care of alcoholics are healthcare establishments providing medical, psychological and social support. Although they meet the practical needs of these patients, their degree of use in each of these domains and the respective mobilisation of different skills by the care team are not well understood. Our aim was therefore to determine in detail the management involved as a function of the severity of alcohol dependence. For this purpose, all the procedures involved were compiled in a thesaurus describing their type (psychological, medical, social, reception), their scheduled or unscheduled nature, their method (face-to-face, telephone, letter) and their duration. The severity of dependence was evaluated using the Addiction Severity Index (ASI). Results 45 patients were included and followed up for 291 ± 114 days. The mean initial ASI scores (± SD) were: medical (M) = 0.39 ± 0.3, working-income (ER) = 0.5 ± 0.3, alcohol (A) = 0.51 ± 0.2, illicit drugs (D) = 0.07 ± 0.08, legal (L) = 0.06 ± 0.13, familial and social environment (FS) = 0.34 ± 0.26, psychological (P) = 0.39 ± 0.22. The total number of procedures was 1341 (29.8 per patient), corresponding to 754.4 hours (16.7 per patient). The intensity of management peaked during the first month of treatment and then declined rapidly; the maximum incidence of abstinence was observed during the 3rd month of management. Interviews with patients, group therapy and staff meetings represented 68.7%, 9.9% and 13.9% of all procedures, respectively. In patients with severe dependence, as compared to moderate dependence, management was twice as intense in the psychological and social domains, but not in the medical domain. The ASI questionnaire was completed a second time by 24 patients, after an average of 3.2 months. The improvement was significant in the M, A, D and P domains only. Conclusion This study provided an overview of the methods employed in managing a sample of

  17. Diagnostic methods to assess inspiratory and expiratory muscle strength.

    Science.gov (United States)

    Caruso, Pedro; Albuquerque, André Luis Pereira de; Santana, Pauliane Vieira; Cardenas, Leticia Zumpano; Ferreira, Jeferson George; Prina, Elena; Trevizan, Patrícia Fernandes; Pereira, Mayra Caleffi; Iamonti, Vinicius; Pletsch, Renata; Macchione, Marcelo Ceneviva; Carvalho, Carlos Roberto Ribeiro

    2015-01-01

Impairment of (inspiratory and expiratory) respiratory muscles is a common clinical finding, not only in patients with neuromuscular disease but also in patients with primary disease of the lung parenchyma or airways. Although such impairment is common, its recognition is usually delayed because its signs and symptoms are nonspecific and late. This delayed recognition, or even the lack thereof, occurs because the diagnostic tests used in the assessment of respiratory muscle strength are not widely known and available. There are various methods of assessing respiratory muscle strength during the inspiratory and expiratory phases. These methods are divided into two categories: volitional tests, which require patient understanding and cooperation, and non-volitional tests. Volitional tests, such as those that measure maximal inspiratory and expiratory pressures, are the most commonly used because they are readily available. Non-volitional tests depend on magnetic stimulation of the phrenic nerve accompanied by the measurement of inspiratory mouth pressure, inspiratory esophageal pressure, or inspiratory transdiaphragmatic pressure. Another method that has come to be widely used is ultrasound imaging of the diaphragm. We believe that pulmonologists involved in the care of patients with respiratory diseases should be familiar with the tests used in order to assess respiratory muscle function. Therefore, the aim of the present article is to describe the advantages, disadvantages, procedures, and clinical applicability of the main tests used in the assessment of respiratory muscle strength.

  18. An interpolation method for stream habitat assessments

    Science.gov (United States)

    Sheehan, Kenneth R.; Welsh, Stuart A.

    2015-01-01

    Interpolation of stream habitat can be very useful for habitat assessment. Using a small number of habitat samples to predict the habitat of larger areas can reduce time and labor costs as long as it provides accurate estimates of habitat. The spatial correlation of stream habitat variables such as substrate and depth improves the accuracy of interpolated data. Several geographical information system interpolation methods (natural neighbor, inverse distance weighted, ordinary kriging, spline, and universal kriging) were used to predict substrate and depth within a 210.7-m2 section of a second-order stream based on 2.5% and 5.0% sampling of the total area. Depth and substrate were recorded for the entire study site and compared with the interpolated values to determine the accuracy of the predictions. In all instances, the 5% interpolations were more accurate for both depth and substrate than the 2.5% interpolations, which achieved accuracies up to 95% and 92%, respectively. Interpolations of depth based on 2.5% sampling attained accuracies of 49–92%, whereas those based on 5% percent sampling attained accuracies of 57–95%. Natural neighbor interpolation was more accurate than that using the inverse distance weighted, ordinary kriging, spline, and universal kriging approaches. Our findings demonstrate the effective use of minimal amounts of small-scale data for the interpolation of habitat over large areas of a stream channel. Use of this method will provide time and cost savings in the assessment of large sections of rivers as well as functional maps to aid the habitat-based management of aquatic species.
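Of the interpolators compared, inverse distance weighting is the simplest to sketch: each unsampled point is estimated as a distance-weighted average of the sampled values, with nearer samples weighted more heavily. A minimal illustration with made-up depth samples, not the study's data:

```python
import math

def idw(sample_points, query, power=2.0):
    """Inverse-distance-weighted estimate at `query` from (x, y, value)
    samples: each sample gets weight 1 / d**power, so nearer samples
    dominate the estimate."""
    num = den = 0.0
    for x, y, v in sample_points:
        d = math.hypot(query[0] - x, query[1] - y)
        if d == 0.0:
            return v  # query coincides with a sample point
        w = d ** -power
        num += w * v
        den += w
    return num / den

# Illustrative depth samples (x, y, depth in m) from a stream reach.
samples = [(0, 0, 0.30), (10, 0, 0.50), (0, 10, 0.40), (10, 10, 0.60)]
print(idw(samples, (5, 5)))  # centre estimate: all distances equal, so 0.45
```

Raising `power` makes the surface honour local samples more tightly, which is one of the knobs that separates IDW from the kriging and spline methods in the comparison above.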

  19. Quantitative methods for assessing drug synergism.

    Science.gov (United States)

    Tallarida, Ronald J

    2011-11-01

Two or more drugs that individually produce overtly similar effects will sometimes display greatly enhanced effects when given in combination. When the combined effect is greater than that predicted by their individual potencies, the combination is said to be synergistic. A synergistic interaction allows the use of lower doses of the combination constituents, a situation that may reduce adverse reactions. Drug combinations are quite common in the treatment of cancers, infections, pain, and many other diseases and situations. The determination of synergism is a quantitative pursuit that involves a rigorous demonstration that the combination effect is greater than that which is expected from the individual drugs' potencies. The basis of that demonstration is the concept of dose equivalence, which is discussed here and applied to an experimental design and data analysis known as isobolographic analysis. That method, and a related method of analysis that also uses dose equivalence, are presented in this brief review, which provides the mathematical basis for assessing synergy and an optimization strategy for determining the dose combination.
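Dose equivalence leads directly to the interaction index used in isobolographic analysis: for a combination (a, b) and single-drug doses A and B that each produce the target effect alone, additivity predicts a/A + b/B = 1, values below 1 indicate synergy, and values above 1 indicate sub-additivity. A minimal sketch with made-up doses:

```python
def interaction_index(dose_a, ed_a, dose_b, ed_b):
    """Isobolographic interaction index at a fixed effect level:
    gamma = a/A + b/B, where A and B are the doses of each drug alone
    that produce the target effect. gamma < 1 -> synergy,
    gamma = 1 -> additivity, gamma > 1 -> sub-additivity."""
    return dose_a / ed_a + dose_b / ed_b

# Illustrative: each drug alone needs 10 mg for the target effect,
# but 3 mg + 4 mg in combination achieves it.
gamma = interaction_index(3, 10, 4, 10)
print(gamma)  # 0.7 -> the combination point lies below the additivity isobole
```

Geometrically, gamma locates the combination point relative to the straight-line isobole joining (A, 0) and (0, B), which is exactly the plot isobolographic analysis is built around.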

  20. Evaluation of methods to assess physical activity

    Science.gov (United States)

    Leenders, Nicole Y. J. M.

Epidemiological evidence has accumulated that demonstrates that the amount of physical activity-related energy expenditure during a week reduces the incidence of cardiovascular disease, diabetes, obesity, and all-cause mortality. To further understand the amount of daily physical activity and related energy expenditure that are necessary to maintain or improve the functional health status and quality of life, instruments that estimate total (TDEE) and physical activity-related energy expenditure (PAEE) under free-living conditions should be determined to be valid and reliable. Without evaluation of the various methods that estimate TDEE and PAEE against the doubly labeled water (DLW) method in females, there will be significant limitations on assessing the efficacy of physical activity interventions on health status in this population. A triaxial accelerometer (Tritrac-R3D (TT)), a uniaxial activity monitor (Computer Science and Applications Inc. (CSA)), a Yamax Digiwalker-500 (YX-stepcounter), heart rate responses (HR method) and a 7-d Physical Activity Recall questionnaire (7-d PAR) were compared with the "criterion method" of DLW during a 7-d period in female adults. The DLW-TDEE was underestimated on average by 9, 11 and 15% using the 7-d PAR, HR method and TT. The underestimation of DLW-PAEE by the 7-d PAR was 21%, compared to 47% and 67% for the TT and YX-stepcounter. Approximately 56% of the variance in DLW-PAEE·kg⁻¹ is explained by the registration of body movement with accelerometry. A larger proportion of the variance in DLW-PAEE·kg⁻¹ was explained by jointly incorporating information from the vertical and horizontal movement measured with the CSA and Tritrac-R3D (r² = 0.87). Although only a small amount of variance in DLW-PAEE·kg⁻¹ is explained by the number of steps taken per day, because of its low cost and ease of use, the Yamax-stepcounter is useful in studies promoting daily walking. Thus, studies involving the

  1. Operation safety risk analysis method of hydropower project considering time-dependent effect

    Institute of Scientific and Technical Information of China (English)

    Zhang Sherong; Yan Lei

    2012-01-01

In order to consider the time-dependent characteristics of the risk factors of a hydropower project, the method of stochastic processes simulating structural resistance and load effect is adopted. On the basis of analyzing the structural characteristics and mode of operation, an operation safety risk rate assessment model for hydropower projects is established through the comprehensive application of the improved analytic hierarchy process, time-dependent reliability theory and a risk rate threshold. The time-dependent risk rate assessment method is demonstrated on the example of an earth-rock dam. The example shows that the operation safety risk rate is closely related to both the service period and the design standard; when time-dependent effects are considered, the risk rate increases with time, and its intersection with the risk rate threshold reflects the technical service life of the structure. By predicting the trend of the risk rate, the model could provide a scientific basis for the operation safety and risk decisions of hydropower projects.
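The underlying idea, resistance modeled as a degrading stochastic process so that the failure (risk) rate grows with service time, can be sketched with a Monte Carlo estimate. All distributions and degradation rates below are invented for illustration, not calibrated to any dam:

```python
import random

def failure_probability(t_years, n=100_000, seed=1):
    """Monte Carlo estimate of P(load effect > resistance) at service
    time t. Illustrative model: resistance is Gaussian and degrades
    linearly with time; the load effect is Gaussian, truncated at 0."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n):
        resistance = rng.gauss(100.0, 10.0) * (1.0 - 0.005 * t_years)
        load = max(0.0, rng.gauss(60.0, 15.0))
        if load > resistance:
            failures += 1
    return failures / n

for t in (0, 25, 50):
    print(t, failure_probability(t))  # risk rate grows with service time
```

Comparing the curve of `failure_probability(t)` against an acceptable risk-rate threshold gives the service-life reading described in the abstract: the crossing time is the technical service life.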

  2. Assessment of Benzodiazepine dependence in alcohol and drug dependent outpatients: A research report

    NARCIS (Netherlands)

    Kan, C.C.; Breteler, M.H.M.; Ven, A.H.G.S. van der; Timmermans, M.A.Y.; Zitman, F.G.

    2001-01-01

    In this study on 99 outpatients who were being treated for alcohol and/or drug dependence and also using benzodiazepines (BZDs), prevalence rates of DSM-III-R and ICD-10 substance dependence diagnoses were ascertained and scalability, reliability and validity of the scales of the Benzodiazepine Depe

  3. ASSESSMENT OF WORK-SPACE AND WORK-METHOD DESIGNS ...

    African Journals Online (AJOL)

    ASSESSMENT OF WORK-SPACE AND WORK-METHOD DESIGNS IN NIGERIA AUTOMOBILE SERVICE AND REPAIR INDUSTRY. ... Nigerian Journal of Technology ... This research assessed work-space (WsD) and work-method designs ...

  4. Genome-Environmental Risk Assessment of Cocaine Dependence

    Directory of Open Access Journals (Sweden)

    Changshuai eWei

    2012-05-01

Cocaine-associated biomedical and psychosocial problems are substantial 21st century global burdens of disease. This burden is largely driven by a cocaine dependence process that becomes engaged with increasing occasions of cocaine product use. For this reason, the development of a risk prediction model for cocaine dependence may be of special value. Ultimately, success in building such a risk prediction model may help promote personalized cocaine dependence prediction, prevention, and treatment approaches not presently available. As an initial step toward this goal, we conducted a genome-environmental risk prediction study for cocaine dependence, simultaneously considering 948,658 single nucleotide polymorphisms (SNPs), six potentially cocaine-related facets of environment, and three personal characteristics. In this study, a novel statistical approach was applied to 1045 case-control samples from the Family Study of Cocaine Dependence. The results identify 330 low- to medium-effect size SNPs (i.e., those with a single-locus p-value of less than 10⁻⁴) that made a substantial contribution to cocaine dependence risk prediction (AUC = 0.718). Inclusion of six facets of environment and three personal characteristics yielded greater accuracy (AUC = 0.809). Of special importance was childhood abuse (CA) among trauma experiences, with a potentially important interaction of CA and the GBE1 gene in cocaine dependence risk prediction. Genome-environmental risk prediction models may become more promising in future risk prediction research, once a more substantial array of environmental facets is taken into account, sometimes with model improvement when gene-by-environment product terms are included as part of these risk prediction models.
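A gene-by-environment product term enters a risk model as an extra predictor whose coefficient captures the interaction, and the AUC then measures how well the combined score separates cases from controls. A self-contained sketch on simulated data; the coefficients, prevalence, and resulting AUC are illustrative and unrelated to the study's 0.718/0.809 values:

```python
import math
import random

# Simulate genotype g (0/1/2 copies of a risk allele), a binary
# environmental exposure e, and an outcome whose log-odds include a
# gene-by-environment (GxE) product term. All coefficients are made up.
rng = random.Random(42)
rows = []
for _ in range(2000):
    g = rng.choice([0, 1, 2])
    e = 1 if rng.random() < 0.3 else 0
    logit = -2.0 + 0.3 * g + 0.8 * e + 0.6 * g * e   # GxE interaction term
    y = 1 if rng.random() < 1.0 / (1.0 + math.exp(-logit)) else 0
    rows.append((g, e, y))

# Rank-based AUC of the combined genome+environment score. To keep the
# sketch short we score with the true coefficients instead of fitting.
scores = [(0.3 * g + 0.8 * e + 0.6 * g * e, y) for g, e, y in rows]
pos = [s for s, y in scores if y == 1]
neg = [s for s, y in scores if y == 0]
auc = sum((p > q) + 0.5 * (p == q) for p in pos for q in neg) / (len(pos) * len(neg))
print(round(auc, 3))  # clearly above the 0.5 of an uninformative score
```

Dropping the `0.6 * g * e` term from the score and recomputing the AUC shows the incremental discrimination the interaction contributes, which mirrors the model-improvement point made above.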

  5. A time-dependent method of characteristics formulation with time derivative propagation

    Science.gov (United States)

    Hoffman, Adam J.

    We developed a new time-dependent neutron transport method for nuclear reactor kinetics using method of characteristics (MOC) with angular flux time derivative propagation. In contrast to conventional time integration methods which use local finite difference approximations to treat the time derivative, the new method solves for the spatially-dependent angular flux time derivative by propagation along characteristics in space. This results in the angular flux time derivative being recast in terms of the neutron source time derivatives, and thus the new method is called Source Derivative Propagation (SDP). We developed three SDP methods using different approximations, and they require much less memory than the conventional methods. For SDP, we approximate the source derivatives using backward differences. This is analogous to the backward differentiation formula (BDF), and our results confirmed that the high-order SDP approximations reproduced the high-order angular flux derivative approximation of equivalent order BDF. We assessed SDP by comparison to conventional time-dependent MOC methods. This included both a reference method (RBDC) which stored the angular flux and a popular approximate method (IBDC). We performed error analysis for SDP, RBDC, and IBDC. This informed the refinement of the SDP methods, and clarified when SDP will be accurate. We tested SDP using the computer code DeCART, which was used to model three transients based on the TWIGL and C5G7 benchmarks. A fine time step reference solution was generated using RBDC. The SDP methods converged to the reference when the time step was refined and the BDF order increased. In addition, we observed that SDP accurately replicated the RBDC solution when the same time step and BDF order was used. This indicates that the propagated angular flux time derivative of SDP reproduced the RBDC angular flux derivative. SDP was much more accurate than the IBDC. We assessed the efficiency of SDP by comparing the run
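The backward-difference treatment of the time derivative can be made concrete with the generic BDF formulas (this is a sketch of the finite-difference idea only, not DeCART or SDP code):

```python
import math

def bdf_derivative(values, dt, order):
    """Backward-differentiation approximation of the time derivative at
    the newest point, from equally spaced history values (oldest first).
    order 1: (u_n - u_{n-1}) / dt
    order 2: (3 u_n - 4 u_{n-1} + u_{n-2}) / (2 dt)"""
    if order == 1:
        return (values[-1] - values[-2]) / dt
    if order == 2:
        return (3 * values[-1] - 4 * values[-2] + values[-3]) / (2 * dt)
    raise ValueError("only orders 1 and 2 are sketched here")

dt = 0.01
ts = [0.98, 0.99, 1.00]
u = [math.exp(t) for t in ts]   # du/dt = e^t, so the exact value at t=1 is e
err1 = abs(bdf_derivative(u, dt, 1) - math.e)
err2 = abs(bdf_derivative(u, dt, 2) - math.e)
print(err1, err2)  # the second-order formula is markedly more accurate
```

The same pattern, higher-order backward formulas reusing stored history, is what lets the SDP source-derivative approximations reproduce the equivalent-order BDF flux derivatives mentioned above.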

  6. Model-driven dependability assessment of software systems

    CERN Document Server

    Bernardi, Simona; Petriu, Dorina C

    2013-01-01

    In this book, the authors present cutting-edge model-driven techniques for modeling and analysis of software dependability. Most of them are based on the use of UML as software specification language. From the software system specification point of view, such techniques exploit the standard extension mechanisms of UML (i.e., UML profiling). UML profiles enable software engineers to add non-functional properties to the software model, in addition to the functional ones. The authors detail the state of the art on UML profile proposals for dependability specification and rigorously describe the t

  7. Assessment Methods and Tools for Architectural Curricula

    Science.gov (United States)

    Marriott, Christine A.

    2012-01-01

    This research explores the process of assessment within the arena of architectural education by questioning traditional assessment practices and probing into the conditions that necessitate change. As architectural educators we have opened our studios to digital technologies for the purposes of design and representation, but how do we measure and…

  8. Methods for assessment of keel bone damage in poultry.

    Science.gov (United States)

    Casey-Trott, T; Heerkens, J L T; Petrik, M; Regmi, P; Schrader, L; Toscano, M J; Widowski, T

    2015-10-01

    Keel bone damage (KBD) is a critical issue facing the laying hen industry today as a result of the likely pain leading to compromised welfare and the potential for reduced productivity. Recent reports suggest that damage, while highly variable and likely dependent on a host of factors, extends to all systems (including battery cages, furnished cages, and non-cage systems), genetic lines, and management styles. Despite the extent of the problem, the research community remains uncertain as to the causes and influencing factors of KBD. Although progress has been made investigating these factors, the overall effort is hindered by several issues related to the assessment of KBD, including quality and variation in the methods used between research groups. These issues prevent effective comparison of studies, as well as difficulties in identifying the presence of damage leading to poor accuracy and reliability. The current manuscript seeks to resolve these issues by offering precise definitions for types of KBD, reviewing methods for assessment, and providing recommendations that can improve the accuracy and reliability of those assessments. © 2015 Poultry Science Association Inc.

  9. A NEW METHOD FOR EXTRACTING SPIN-DEPENDENT NEUTRON STRUCTURE FUNCTIONS FROM NUCLEAR DATA

    Energy Technology Data Exchange (ETDEWEB)

    Kahn, Y.F.; Melnitchouk, W.

    2009-01-01

    High-energy electrons are currently the best probes of the internal structure of nucleons (protons and neutrons). By collecting data on electrons scattering off light nuclei, such as deuterium and helium, one can extract structure functions (SFs), which encode information about the quarks that make up the nucleon. Spin-dependent SFs, which depend on the relative polarization of the electron beam and the target nucleus, encode quark spins. Proton SFs can be measured directly from electron-proton scattering, but those of the neutron must be extracted from proton data and deuterium or helium-3 data because free neutron targets do not exist. At present, there is no reliable method for accurately determining spin-dependent neutron SFs in the low-momentum-transfer regime, where nucleon resonances are prominent and the functions are not smooth. The focus of this study was to develop a new method for extracting spin-dependent neutron SFs from nuclear data. An approximate convolution formula for nuclear SFs reduces the problem to an integral equation, for which a recursive solution method was designed. The method was then applied to recent data from proton and deuterium scattering experiments to perform a preliminary extraction of spin-dependent neutron SFs in the resonance region. The extraction method was found to reliably converge for arbitrary test functions, and the validity of the extraction from data was verified using a Bjorken integral, which relates integrals of SFs to a known quantity. This new information on neutron structure could be used to assess quark-hadron duality for the neutron, which requires detailed knowledge of SFs in all kinematic regimes.
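
A minimal numerical sketch of the recursive idea described in the abstract; the smearing kernel, grid, and test function below are hypothetical stand-ins, not the authors' actual convolution formula. The nuclear function is modeled as a discretized smearing `g_A = S @ g_N`, and a fixed-point iteration recovers `g_N`:

```python
import numpy as np

# Fixed-point recursion g_N <- g_N + (g_A - S @ g_N); it converges
# whenever the spectral radius of (I - S) is below one, and its limit
# satisfies S @ g_N = g_A exactly.
def extract_recursive(S, g_A, n_iter=500):
    g_N = g_A.copy()                      # zeroth approximation: no smearing
    for _ in range(n_iter):
        g_N = g_N + (g_A - S @ g_N)
    return g_N

# Synthetic check with a known "nucleon" function and a smooth kernel.
n = 50
x = np.linspace(0.0, 1.0, n)
S = 0.9 * np.eye(n) + 0.1 * np.exp(-((x[:, None] - x[None, :]) / 0.1) ** 2)
S /= S.sum(axis=1, keepdims=True)         # normalize rows (unit smearing)
g_true = np.sin(3.0 * x) * (1.0 - x)
g_A = S @ g_true                          # "measured" smeared function
g_rec = extract_recursive(S, g_A)
```

The iteration reliably converges for this class of diagonally dominant smearing matrices, mirroring the convergence behaviour reported for arbitrary test functions.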

  10. Assessment of the active method to determine soil moisture

    Directory of Open Access Journals (Sweden)

    José Luis Serna Farfan

    2017-07-01

    In recent years, fiber-optic distributed temperature sensing (FO-DTS) methods have been successfully used to investigate a wide range of hydrological applications. In particular, two methods have been developed to monitor the soil water content (θ) with the FO-DTS technology: the passive and the active methods. This work presents an assessment of the active method to determine the θ of a sandy soil. In this method, fiber-optic cables with metallic armoring are used and a voltage difference is applied between the two ends of the cable to warm it during a specified time period. Then, an empirical relationship is used to relate θ with a parameter called cumulative temperature (Tcum). To apply the active method, we propose a potential relationship defined by stretches, which depends on the hydrodynamic properties of the soil studied. Different experiments were carried out to assess the active method. These experiments had different heat pulse durations (2, 5, 10 and 20 min) with electrical powers of 2.1, 2.6, 2.3 and 2.4 W/m, respectively, and allowed determining the optimum heat pulse duration (tf), the optimum temporal integration interval (Δt), the optimum final time of integration (t0) used in the calculation of the cumulative temperature, and the optimum current (I) that should circulate through the fiber-optic cable to generate the heat pulse. Results show that the optimum operating parameters are: tf = 1200 s, Δt = 150 s, t0 = tf, and I ≈ 17 A (2.43 W/m). Our analysis allowed obtaining volumetric water contents ranging from 0.14 to 0.46 m3/m3, with errors smaller than 0.08 m3/m3.
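
A hedged sketch of the post-processing step the abstract describes: the cumulative temperature is the time integral of the cable's temperature rise over the heat pulse, and an empirical power law maps it to water content. The calibration coefficients `a` and `b` below are hypothetical; in the study they must be fitted to the soil in question.

```python
import numpy as np

def cumulative_temperature(T_rise, dt):
    # trapezoidal time integral of the temperature rise (K * s)
    return float(np.sum(0.5 * (T_rise[1:] + T_rise[:-1])) * dt)

def theta_from_tcum(t_cum, a=3.5, b=-0.35):
    # hypothetical calibration: wetter soil conducts heat away faster,
    # so a smaller T_cum corresponds to a larger water content theta
    return a * t_cum ** b

dt, tf = 150.0, 1200.0                    # s, the study's optimum values
T_rise = np.full(int(tf / dt) + 1, 2.0)   # constant 2 K rise (synthetic)
t_cum = cumulative_temperature(T_rise, dt)  # = 2 K * 1200 s = 2400 K*s
```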

  11. alternative assessment methods: implications for environmental ...

    African Journals Online (AJOL)

    integrated approach to education and training may be achieved and social ... Outcomes-based instruction and learning requires a paradigm shift .... school subjects that they have chosen. The ..... assessment is complex in nature. The use of.

  12. Time-dependent ROC curve analysis in medical research: current methods and applications.

    Science.gov (United States)

    Kamarudin, Adina Najwa; Cox, Trevor; Kolamunnage-Dona, Ruwanthi

    2017-04-07

    ROC (receiver operating characteristic) curve analysis is well established for assessing how well a marker is capable of discriminating between individuals who experience disease onset and individuals who do not. The classical (standard) approach of ROC curve analysis considers event (disease) status and marker value for an individual as fixed over time, however in practice, both the disease status and marker value change over time. Individuals who are disease-free earlier may develop the disease later due to longer study follow-up, and also their marker value may change from baseline during follow-up. Thus, an ROC curve as a function of time is more appropriate. However, many researchers still use the standard ROC curve approach to determine the marker capability ignoring the time dependency of the disease status or the marker. We comprehensively review currently proposed methodologies of time-dependent ROC curves which use single or longitudinal marker measurements, aiming to provide clarity in each methodology, identify software tools to carry out such analysis in practice and illustrate several applications of the methodology. We have also extended some methods to incorporate a longitudinal marker and illustrated the methodologies using a sequential dataset from the Mayo Clinic trial in primary biliary cirrhosis (PBC) of the liver. From our methodological review, we have identified 18 estimation methods of time-dependent ROC curve analyses for censored event times and three other methods can only deal with non-censored event times. Despite the considerable numbers of estimation methods, applications of the methodology in clinical studies are still lacking. The value of time-dependent ROC curve methods has been re-established. We have illustrated the methods in practice using currently available software and made some recommendations for future research.
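
To make the cumulative/dynamic notion concrete, here is a minimal sketch for the uncensored case: at horizon t, "cases" are subjects with event time ≤ t and "controls" those still event-free, and AUC(t) is the probability that a random case has a higher marker value than a random control (ties count one half). The censoring-adjusted estimators reviewed above reduce to this when no censoring is present; the data below are synthetic.

```python
import numpy as np

def cumulative_dynamic_auc(marker, event_time, t):
    # cases: event by time t; controls: still event-free at t
    cases = marker[event_time <= t]
    controls = marker[event_time > t]
    if len(cases) == 0 or len(controls) == 0:
        return np.nan
    greater = (cases[:, None] > controls[None, :]).sum()
    ties = (cases[:, None] == controls[None, :]).sum()
    return (greater + 0.5 * ties) / (len(cases) * len(controls))

rng = np.random.default_rng(0)
event_time = rng.exponential(5.0, size=200)
perfect_marker = -event_time              # perfectly orders the event times
noisy_marker = perfect_marker + rng.normal(0.0, 5.0, size=200)
```

A marker that perfectly orders the event times yields AUC(t) = 1 at every horizon, while a noisy version of it falls between 0.5 and 1.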

  13. Assessment of nuclear power plant siting methods

    Energy Technology Data Exchange (ETDEWEB)

    Rowe, M.D.; Hobbs, B.F.; Pierce, B.L.; Meier, P.M.

    1979-11-01

    Several different methods have been developed for selecting sites for nuclear power plants. This report summarizes the basic assumptions and formal requirements of each method and evaluates conditions under which each is correctly applied to power plant siting problems. It also describes conditions under which different siting methods can produce different results. Included are criteria for evaluating the skill with which site-selection methods have been applied.

  14. Beam-propagation method - Analysis and assessment

    Science.gov (United States)

    van Roey, J.; van der Donk, J.; Lagasse, P. E.

    1981-07-01

    A method for the calculation of the propagation of a light beam through an inhomogeneous medium is presented. A theoretical analysis of this beam-propagation method is given, and a set of conditions necessary for the accurate application of the method is derived. The method is illustrated by the study of a number of integrated-optic structures, such as thin-film waveguides and gratings.
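
A hedged sketch of one split-step (FFT-based) beam-propagation step for the simplest case, a homogeneous medium; the grid and beam parameters are illustrative. Each step applies the paraxial free-space propagator in the spatial-frequency domain; an inhomogeneous structure such as a waveguide or grating would additionally apply a phase screen per step.

```python
import numpy as np

def bpm_step(field, dx, dz, wavelength, n0=1.0):
    k0 = 2 * np.pi / wavelength
    kx = 2 * np.pi * np.fft.fftfreq(field.size, d=dx)
    # paraxial free-space propagator over a distance dz
    H = np.exp(-1j * kx ** 2 * dz / (2 * n0 * k0))
    return np.fft.ifft(np.fft.fft(field) * H)

x = np.linspace(-50e-6, 50e-6, 512)
dx = x[1] - x[0]
field = np.exp(-(x / 5e-6) ** 2)          # Gaussian beam, 5 um waist
out = field.copy()
for _ in range(20):                       # propagate 20 steps of 10 um
    out = bpm_step(out, dx, dz=10e-6, wavelength=1.0e-6)
power_in = np.sum(np.abs(field) ** 2)
power_out = np.sum(np.abs(out) ** 2)
```

Because the propagator has unit modulus, the step conserves power exactly, while the beam diffracts and its peak amplitude drops.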

  15. Investigation of Suitability of Cascading Outage Assessment Methods for Real-Time Assessment

    DEFF Research Database (Denmark)

    Petersen, Pauli Fríðheim; Jóhannsson, Hjörtur; Nielsen, Arne Hejde

    This paper investigates the suitability of assessment methods for cascading outages for real-time assessment. A total of ten assessment methods for cascading outages are investigated, and for all of the investigated methods a complexity assessment is performed to assess the suitability of the method to real-time assessment. The investigation revealed that two of the methods are of special interest for further study on real-time assessment of cascading outages. These are the betweenness centrality model, based on network topology, and the Manchester model, based on AC power flow.

  16. Using Corporate-Based Methods To Assess Technical Communication Programs.

    Science.gov (United States)

    Faber, Brenton; Bekins, Linn; Karis, Bill

    2002-01-01

    Investigates methods of program assessment used by corporate learning sites and profiles value added methods as a way to both construct and evaluate academic programs in technical communication. Examines and critiques assessment methods from corporate training environments including methods employed by corporate universities and value added…

  17. Jet Methods in Time-Dependent Lagrangian Biomechanics

    CERN Document Server

    Ivancevic, Tijana T

    2009-01-01

    In this paper we propose the time-dependent generalization of an `ordinary' autonomous human biomechanics, in which total mechanical + biochemical energy is not conserved. We introduce a general framework for time-dependent biomechanics in terms of jet manifolds associated to the extended musculo-skeletal configuration manifold, called the configuration bundle. We start with an ordinary configuration manifold of human body motion, given as the set of all its active degrees of freedom (DOF) for a particular movement. This is a Riemannian manifold with a material metric tensor given by the total mass-inertia matrix of the human body segments. This is the base manifold for standard autonomous biomechanics. To make its time-dependent generalization, we need to extend it with a real time axis. By this extension, using techniques from fibre bundles, we defined the biomechanical configuration bundle. On the biomechanical bundle we define vector-fields, differential forms and affine connections, as well as the associated …

  18. Assessment of seismic margin calculation methods

    Energy Technology Data Exchange (ETDEWEB)

    Kennedy, R.P.; Murray, R.C.; Ravindra, M.K.; Reed, J.W.; Stevenson, J.D.

    1989-03-01

    Seismic margin review of nuclear power plants requires that the High Confidence of Low Probability of Failure (HCLPF) capacity be calculated for certain components. The candidate methods for calculating the HCLPF capacity as recommended by the Expert Panel on Quantification of Seismic Margins are the Conservative Deterministic Failure Margin (CDFM) method and the Fragility Analysis (FA) method. The present study evaluated these two methods using some representative components in order to provide further guidance in conducting seismic margin reviews. It is concluded that either of the two methods could be used for calculating HCLPF capacities. 21 refs., 9 figs., 6 tabs.
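
As a hedged illustration of the fragility-analysis (FA) route to a HCLPF value: in the standard lognormal fragility model, the HCLPF capacity (high confidence of a low probability of failure) is commonly approximated from the median capacity Am and the logarithmic standard deviations for randomness and uncertainty. The numbers below are illustrative, not from the report.

```python
import numpy as np

def hclpf(Am, beta_R, beta_U):
    # common lognormal-fragility approximation:
    # HCLPF = Am * exp(-1.65 * (beta_R + beta_U))
    return Am * np.exp(-1.65 * (beta_R + beta_U))

# illustrative example: median capacity 1.2 g, typical beta values
capacity = hclpf(Am=1.2, beta_R=0.25, beta_U=0.35)
```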

  19. PRODUCT FORMULA METHODS FOR TIME-DEPENDENT SCHRODINGER PROBLEMS

    NARCIS (Netherlands)

    HUYGHEBAERT, J; DERAEDT, H

    1990-01-01

    This paper introduces a family of explicit and unconditionally stable algorithms for solving linear differential equations which contain a time-dependent Hermitian operator. Rigorous upper bounds are derived for two different `time-ordered' approximation schemes and for the errors resulting from the approximations.

  20. Fuzzy Assessment Method and Its Application to Selecting Project Managers

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    Open competition is a new form of the assessment of candidates and selection of project managers. This has many merits compared to the traditional administrative method of appointment. This article introduces a method of fuzzy assessment of project manager candidates. Fuzzy assessment unifies objective qualitative and quantitative appraisal and can be used for improving decision-making in the selection process.

  1. Assessment of plaque assay methods for alphaviruses.

    Science.gov (United States)

    Juarez, Diana; Long, Kanya C; Aguilar, Patricia; Kochel, Tadeusz J; Halsey, Eric S

    2013-01-01

    Viruses from the Alphavirus genus are responsible for numerous arboviral diseases impacting human health throughout the world. Confirmation of acute alphavirus infection is based on viral isolation, identification of viral RNA, or a fourfold or greater increase in antibody titers between acute and convalescent samples. In convalescence, the specificity of antibodies to an alphavirus may be confirmed by plaque reduction neutralization test. To identify the best method for alphavirus and neutralizing antibody recognition, the standard solid method using a cell monolayer overlay with 0.4% agarose and the semisolid method using a cell suspension overlay with 0.6% carboxymethyl cellulose (CMC) overlay were evaluated. Mayaro virus, Una virus, Venezuelan equine encephalitis virus (VEEV), and Western equine encephalitis virus (WEEV) were selected to be tested by both methods. The results indicate that the solid method showed consistently greater sensitivity than the semisolid method. Also, a "semisolid-variant method" using a 0.6% CMC overlay on a cell monolayer was assayed for virus titration. This method provided the same sensitivity as the solid method for VEEV and also had greater sensitivity for WEEV titration. Modifications in plaque assay conditions significantly affect results, and therefore the performance of each new assay needs to be evaluated.
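
The titration step common to all the overlay variants compared above reduces to the standard plaque-count arithmetic: the titre in PFU/mL is the plaque count divided by the dilution factor times the inoculum volume. A minimal sketch with illustrative numbers:

```python
def titer_pfu_per_ml(mean_plaques, dilution, inoculum_ml):
    # PFU/mL = mean plaque count / (dilution factor * inoculum volume)
    return mean_plaques / (dilution * inoculum_ml)

# e.g. 50 plaques on a plate inoculated with 0.1 mL of a 1e-6 dilution
t = titer_pfu_per_ml(50, 1e-6, 0.1)   # 5e8 PFU/mL
```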

  2. METHODS OF ASSESSING THE COMMUNICATIVE COMPETENCE

    Directory of Open Access Journals (Sweden)

    Sergey Nikolaevich ZUBAREV

    2015-01-01

    The author describes various techniques for evaluating communicative competence. In the context of the study, taking into consideration the specificity of communicative competence and the sphere of professional activities of graduates from humanitarian higher educational institutions, the author draws conclusions about the methods of demonstrating the competence. The author also presents methods of evaluating the competence of graduates from humanitarian higher educational institutions in the form of a detailed description of each method. Finally, the author draws conclusions on choosing the best method.

  3. Methods and Strategies: The Reflective Assessment Technique

    Science.gov (United States)

    Kennedy, Cathleen; Long, Kathy; Camins, Arthur

    2009-01-01

    Teachers often rely on student questions, their observations of students at work, and their own intuition to monitor how well students are learning. However, the authors found that teachers learn more about their students when they use the four-step Reflective Assessment Technique that draws on guided teacher reflections to inform classroom…

  4. A Method for Assessing Quality of Service in Broadband Networks

    DEFF Research Database (Denmark)

    Bujlow, Tomasz; Riaz, M. Tahir; Pedersen, Jens Myrup

    2012-01-01

    Monitoring of Quality of Service (QoS) in high-speed Internet infrastructure is a challenging task. However, precise assessments must take into account the fact that the requirements for the given quality level are service-dependent. Backbone QoS monitoring and analysis requires processing of large … It was found that currently existing means of collecting data (classification by ports, Deep Packet Inspection, statistical classification, public data sources) are not sufficient and they do … taken from the description of system sockets. This paper proposes a new method for measuring the Quality of Service (QoS) level in broadband networks, based on our Volunteer-Based System for collecting the training data, Machine Learning Algorithms for generating the classification rules and application … and provide C5.0 high-quality training data, divided into groups corresponding to different types of applications.

  5. Assessing Whether Oil Dependency in Venezuela Contributes to National Instability

    Directory of Open Access Journals (Sweden)

    Adam Kott

    2012-08-01

    The focus of this article is on what role, if any, oil has on Venezuela's instability. When trying to explain why a resource-rich country experiences slow or negative growth, experts often point to the resource curse. The following pages explore the traditional theory behind the resource curse as well as alternative perspectives to this theory such as ownership structure and the correlation between oil prices and democracy. This article also explores the various forms of instability within Venezuela and their causes. Finally, the article looks at President Hugo Chavez's political and economic policies as well as the stagnation of the state oil company, Petroleos de Venezuela (PDVSA). This article dispels the myth that the resource curse is the source of destabilization in many resource dependent countries. Rather than a cause of instability, this phenomenon is a symptom of a much larger problem that is largely structural.

  6. Comparing assessment methods as predictors of student learning in an undergraduate mathematics course

    Science.gov (United States)

    Shorter, Nichole A.; Young, Cynthia Y.

    2011-12-01

    This experiment was designed to determine which assessment method: continuous assessment (in the form of daily in-class quizzes), cumulative assessment (in the form of online homework), or project-based learning, best predicted student learning (measured by post-test grades) in an undergraduate mathematics course. Participants included 117 university-level undergraduate freshmen enrolled in a course titled 'Mathematics for Calculus'. A stepwise regression model was formulated to model the relationship between the predictor variables (the continuous assessment, cumulative assessment, and project scores) versus the outcome variable (the post-test scores). Results indicated that ultimately the continuous assessment scores best predicted students' post-test scores.
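
A hedged sketch of the selection logic behind such a stepwise model: candidate predictors are added one at a time, each step keeping the predictor that most increases R² of a least-squares fit of the post-test scores. The scores below are synthetic (only the sample size of 117 is taken from the record), constructed so that the continuous-assessment score dominates.

```python
import numpy as np

def r_squared(X, y):
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    return 1.0 - resid @ resid / ((y - y.mean()) @ (y - y.mean()))

def forward_select(predictors, y):
    # greedy forward stepwise selection by R^2 improvement
    order, remaining = [], list(predictors)
    while remaining:
        best = max(remaining, key=lambda name: r_squared(
            np.column_stack([predictors[m] for m in order + [name]]), y))
        order.append(best)
        remaining.remove(best)
    return order

rng = np.random.default_rng(1)
n = 117                                   # sample size from the study
scores = {
    "continuous": rng.normal(70, 10, n),
    "cumulative": rng.normal(75, 8, n),
    "project":    rng.normal(80, 5, n),
}
# synthetic post-test driven mostly by the continuous-assessment score
post = 0.8 * scores["continuous"] + 0.1 * scores["cumulative"] + rng.normal(0, 3, n)
order = forward_select(scores, post)
```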

  8. Cyber Assessment Methods For SCADA Security

    Energy Technology Data Exchange (ETDEWEB)

    May Robin Permann; Kenneth Rohde

    2005-06-01

    The terrorist attacks of September 11, 2001 brought to light threats and vulnerabilities that face the United States. In response, the U.S. Government is directing the effort to secure the nation's critical infrastructure by creating programs to implement the National Strategy to Secure Cyberspace (1). One part of this effort involves assessing Supervisory Control and Data Acquisition (SCADA) systems. These systems are essential to the control of critical elements of our national infrastructure, such as electric power, oil, and gas production and distribution. Since their incapacitation or destruction would have a debilitating impact on the defense or economic security of the United States, one of the main objectives of this program is to identify vulnerabilities and encourage the public and private sectors to work together to design secure control systems that resolve these weaknesses. This paper describes vulnerability assessment methodologies used in ongoing research and assessment activities designed to identify and resolve vulnerabilities so as to improve the security of the nation's critical infrastructure.

  9. Comparative study of environmental impact assessment methods ...

    African Journals Online (AJOL)

    STORAGESEVER

    2009-07-20

    Then both approaches are required to be compared and contrasted. Among these methods … should include political powers (factors related to the policy test) and … environmental system and predict the component behavior. Using the …

  10. Evaluation of binding strength depending on the adhesive binding methods

    Directory of Open Access Journals (Sweden)

    Suzana Pasanec Preprotić

    2015-05-01

    A book with personal value is worth preserving, since it represents the specific interests of an individual: the author of the book. The original is therefore the first issue of a book, and it is always bound manually. Due to cost-effectiveness, adhesive binding is most commonly used in author's editions, in both paperback and hardback. Adhesive binding methods differ only when the binding unit is a single paper leaf. The subject of this research is the quality of book block binding for two binding methods, with and without mull fabric. The assumption is that the double-fan adhesive binding method shows extraordinary binding quality compared to the rough-spine method. For the needs of this research, the book block parameters remained unaltered: paper type, size and book volume. The strength results were obtained by using an experimental method of tensile strength for individual paper leaves. The rating of book block quality was conducted in accordance with the FOGRA Nr. 71006 guidelines for the page pull-test. Furthermore, strength results for both methods were compared in order to evaluate the importance of the change in adhesive binding quality. Analysis of variance (ANOVA) and Fisher's F-test were used to evaluate the quality of book block binding.

  11. Improved method for assessing iron stores in the bone marrow

    NARCIS (Netherlands)

    Phiri, K.S.; Calis, J.C.J.; Kachala, D.; Borgstein, E.; Waluza, J.; Bates, I.; Brabin, B.; Boele van Hensbroek, M.

    2009-01-01

    BACKGROUND: Bone marrow iron microscopy has been the "gold standard" method of assessing iron deficiency. However, the commonly used method of grading marrow iron remains highly subjective. AIM: To improve the bone marrow grading method by developing a detailed protocol that assesses iron in fragments …

  13. Time-dependent predictors in clinical research, performance of a novel method.

    Science.gov (United States)

    van de Bosch, Joan; Atiqi, Roya; Cleophas, Ton J

    2010-01-01

    Individual patients' predictors of survival may change across time, because people may change their lifestyles. Standard statistical methods do not allow adjustments for time-dependent predictors. In the past decade, time-dependent factor analysis has been introduced as a novel approach adequate for the purpose. Using examples from survival studies, we assess the performance of the novel method. SPSS statistical software is used (SPSS Inc., Chicago, IL). Cox regression is a major simplification of real life; it assumes that the ratio of the risks of dying in parallel groups is constant over time. It is, therefore, inadequate to analyze, for example, the effect of elevated low-density lipoprotein cholesterol on survival, because the relative hazard of dying is different in the first, second, and third decades. The time-dependent Cox regression model allowing for nonproportional hazards is applied and provides a better precision than the usual Cox regression (P = 0.117 versus 0.0001). Elevated blood pressure produces the highest risk at the time it is highest. An overall analysis of the effect of blood pressure on survival is not significant, but after adjustment for the periods with highest blood pressures using the segmented time-dependent Cox regression method, blood pressure is a significant predictor of survival (P = 0.04). In a long-term therapeutic study, treatment modality is a significant predictor of survival, but after the inclusion of the time-dependent low-density lipoprotein cholesterol variable, the precision of the estimate improves from a P value of 0.02 to 0.0001. Predictors of survival may change across time, e.g., the effect of smoking, cholesterol, and increased blood pressure in cardiovascular research and patients' frailty in oncology research. Analytical models for survival analysis adjusting such changes are welcome. The time-dependent and segmented time-dependent predictors are adequate for the purpose. The usual multiple Cox regression
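
A hedged sketch of the data layout that makes time-dependent predictors tractable: each subject contributes one or more (start, stop] episodes with a constant covariate value (this is the counting-process format that SPSS and other packages use for segmented time-dependent Cox regression). The Cox partial log-likelihood is evaluated with the risk set at each event time containing every episode spanning it; the hazard-ratio coefficient is found here by a simple grid search on toy data, not the authors' clinical data.

```python
import numpy as np

def partial_loglik(beta, episodes):
    # Cox partial log-likelihood for (start, stop, event, x) episodes;
    # the risk set at an event time t contains episodes with start < t <= stop
    ll = 0.0
    for (start, stop, event, x) in episodes:
        if not event:
            continue
        risk = [xj for (sj, tj, ej, xj) in episodes if sj < stop <= tj]
        ll += beta * x - np.log(np.sum(np.exp(beta * np.array(risk))))
    return ll

# toy data: one subject's covariate switches from 0 to 1 at t = 2
# (two rows, no event on the first)
episodes = [
    (0, 1, 1, 1), (0, 2, 1, 1), (0, 3, 1, 1),   # exposed, die early
    (0, 4, 1, 0), (0, 5, 1, 0), (0, 6, 1, 0),   # unexposed, die late
    (0, 2, 0, 0), (2, 5.5, 1, 1),               # covariate changes at t = 2
]
grid = np.linspace(-3.0, 3.0, 601)
beta_hat = grid[np.argmax([partial_loglik(b, episodes) for b in grid])]
```

With the exposed subjects dying earlier, the estimated log hazard ratio comes out positive, as expected.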

  14. Primary Teachers’ Views Concerning the Assessment Methods Used by Them

    Directory of Open Access Journals (Sweden)

    Kader Birinci Konur

    2011-12-01

    The purpose of this study is to determine primary teachers' views about the assessment methods they use. A semi-structured interview consisting of five open-ended questions was used as the data-gathering instrument. While the scope of the research is teachers who teach in primary schools in Rize, the sample is twenty-five primary school teachers selected from five schools. In conclusion, teachers use both traditional and alternative assessment methods. However, they do not consider all of the assessment methods in the new curriculum useful in their schools because of lack of time and resources. Teachers have adapted these methods to their local conditions. Based on these conclusions, importance should be given to alternative assessment approaches in the classes given to students at university, and information should be presented about how to use these assessment methods.

  15. Assessment of User Home Location Geoinference Methods

    Energy Technology Data Exchange (ETDEWEB)

    Harrison, Joshua J.; Bell, Eric B.; Corley, Courtney D.; Dowling, Chase P.; Cowell, Andrew J.

    2015-05-29

    This study presents an assessment of multiple approaches to determine the home and/or other important locations to a Twitter user. In this study, we present a unique approach to the problem of geotagged data sparsity in social media when performing geoinferencing tasks. Given the sparsity of explicitly geotagged Twitter data, the ability to perform accurate and reliable user geolocation from a limited number of geotagged posts has proven to be quite useful. In our survey, we have achieved accuracy rates of over 86% in matching Twitter user profile locations with their inferred home locations derived from geotagged posts.
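
As a hedged illustration of the simplest baseline such a survey would compare against: infer a user's home location as the modal cell of their geotagged posts on a coarse lat/lon grid (about 0.1 degree, roughly 10 km). The surveyed methods are more elaborate, but most reduce to finding the dominant spatial cluster of a user's posts; the coordinates below are synthetic.

```python
import numpy as np
from collections import Counter

def infer_home(coords, cell=0.1):
    # bin posts into grid cells and return the centre of the modal cell
    cells = Counter((int(round(la / cell)), int(round(lo / cell)))
                    for la, lo in coords)
    (ia, io), _ = cells.most_common(1)[0]
    return ia * cell, io * cell

rng = np.random.default_rng(2)
home = (47.61, -122.33)                      # e.g. Seattle (illustrative)
posts = [(home[0] + rng.normal(0, 0.02), home[1] + rng.normal(0, 0.02))
         for _ in range(40)]
posts += [(40.71 + rng.normal(0, 0.02), -74.01 + rng.normal(0, 0.02))
          for _ in range(5)]                 # a short trip elsewhere
lat, lon = infer_home(posts)
```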

  16. Biomedical Technology Assessment The 3Q Method

    CERN Document Server

    Weinfurt, Phillip

    2010-01-01

    Evaluating biomedical technology poses a significant challenge in light of the complexity and rate of introduction in today's healthcare delivery system. Successful evaluation requires an integration of clinical medicine, science, finance, and market analysis. Little guidance, however, exists for those who must conduct comprehensive technology evaluations. The 3Q Method meets these present day needs. The 3Q Method is organized around 3 key questions dealing with 1) clinical and scientific basis, 2) financial fit and 3) strategic and expertise fit. Both healthcare providers (e.g., hospitals) an

  17. Methods of Vessel Casualty Process Assessment

    Directory of Open Access Journals (Sweden)

    Jaroslaw Soliwoda

    2014-06-01

    Full Text Available Maritime casualty is an event of considerable economic and social impact. For this reason, implemented the reporting systems of accidents at sea, and the Administration was obligated to establish a Commission of Maritime Accidents. On the basis of casualty analysis and reports are developed proposals preventing similar casualties in the future. However, there is no uniform evaluation system which check references of existing regulations and recommendations to the occurred casualties. This paper presents a method to evaluate the used methods of casualty prediction with respect to the real incident and catastrophe.

  18. Extrapolation Method for System Reliability Assessment

    DEFF Research Database (Denmark)

    Qin, Jianjun; Nishijima, Kazuyoshi; Faber, Michael Havbro

    2012-01-01

    The present paper presents a new scheme for probability integral solution for system reliability analysis, which takes basis in the approaches by Naess et al. (2009) and Bucher (2009). The idea is to evaluate the probability integral by extrapolation, based on a sequence of MC approximations of integrals with scaled domains. The performance of this class of approximation depends on the approach applied for the scaling and the functional form utilized for the extrapolation. A scheme for this task is derived here taking basis in the theory of asymptotic solutions to multinormal probability integrals. The scheme is extended so that it can be applied to cases where the asymptotic property may not be valid and/or the random variables are not normally distributed. The performance of the scheme is investigated by four principal series and parallel systems and some practical examples. The results indicate …
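
A hedged sketch of the extrapolation idea attributed above to Naess et al. and Bucher, on a toy series system rather than the paper's examples: the failure probability p = P(max_i X_i > c) is estimated by crude Monte Carlo at scaled-down thresholds λ·c where failures are cheap to observe, a Gaussian-type tail model ln p(λ) ≈ a + b·λ² is fitted, and the target p(λ = 1) is obtained by extrapolation.

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(3)
n_comp, c, N = 3, 4.0, 400_000
M = rng.standard_normal((N, n_comp)).max(axis=1)   # series-system demand

# crude MC at scaled thresholds, then fit ln p against lambda^2
lambdas = np.array([0.4, 0.5, 0.6, 0.7, 0.8])
p_hat = np.array([(M > lam * c).mean() for lam in lambdas])
slope, intercept = np.polyfit(lambdas ** 2, np.log(p_hat), 1)
p_extrap = float(np.exp(intercept + slope))        # extrapolated to lambda = 1

# exact reference value: p = 1 - Phi(c)^n for independent standard normals
Phi_c = 0.5 * (1.0 + erf(c / sqrt(2.0)))
p_exact = 1.0 - Phi_c ** n_comp
```

The extrapolated estimate lands within a small factor of the exact value, at a fraction of the samples a direct MC estimate of a probability of this order would need.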

  19. Microbiological methods for assessing soil quality

    NARCIS (Netherlands)

    Bloem, J.; Hopkins, D.W.; Benedetti, A.

    2006-01-01

    This book provides a selection of microbiological methods that are already applied in regional or national soil quality monitoring programs. It is split into two parts: part one gives an overview of approaches to monitoring, evaluating and managing soil quality. Part two provides a selection of methods.

  20. METHODS OF AVAILABLE POTASSIUM ASSESSMENT IN ...

    African Journals Online (AJOL)

    AGROSEARCH UIL

    Soil potassium (K+) exists in solution, exchangeable, and non-exchangeable ... evaluating K availability under intensive cropping; as those soils considered sufficient in ... response to potassium, soil test methods should have a high correlation with .... loamy sand to sandy loam in texture with kaolinite being the dominant.

  1. Time-Dependent Reliability Modeling and Analysis Method for Mechanics Based on Convex Process

    Directory of Open Access Journals (Sweden)

    Lei Wang

    2015-01-01

    The objective of the present study is to evaluate the time-dependent reliability for dynamic mechanics with insufficient time-varying uncertainty information. In this paper, the nonprobabilistic convex process model, which contains autocorrelation and cross-correlation, is first employed for the quantitative assessment of the time-variant uncertainty in structural performance characteristics. By combination of the set-theory method and the regularization treatment, the time-varying properties of structural limit state are determined and a standard convex process with autocorrelation for describing the limit state is formulated. By virtue of the classical first-passage method in random process theory, a new nonprobabilistic measure index of time-dependent reliability is proposed and its solution strategy is mathematically conducted. Furthermore, the Monte-Carlo simulation method is also discussed to illustrate the feasibility and accuracy of the developed approach. Three engineering cases clearly demonstrate that the proposed method may provide a reasonable and more efficient way to estimate structural safety than Monte-Carlo simulations throughout a product life-cycle.
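
A Monte-Carlo sketch of the first-passage notion the abstract invokes (the load model and parameters below are illustrative, not the paper's convex process): a structure fails within [0, T] if an autocorrelated load process ever exceeds the limit state, so the time-dependent failure probability is at least the instantaneous one and grows with the horizon.

```python
import numpy as np

rng = np.random.default_rng(4)
n_paths, n_steps, rho, limit = 20_000, 50, 0.8, 2.5

# AR(1) load process: stationary, autocorrelated, unit variance
S = np.zeros((n_paths, n_steps))
S[:, 0] = rng.standard_normal(n_paths)
for k in range(1, n_steps):
    S[:, k] = rho * S[:, k - 1] + np.sqrt(1 - rho ** 2) * rng.standard_normal(n_paths)

pf_instant = (S[:, 0] > limit).mean()              # point-in-time failure
pf_first_passage = (S.max(axis=1) > limit).mean()  # failure within [0, T]
```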

  2. A comparative assessment of statistical methods for extreme weather analysis

    Science.gov (United States)

    Schlögl, Matthias; Laaha, Gregor

    2017-04-01

    Extreme weather exposure assessment is of major importance for scientists and practitioners alike. We compare different extreme value approaches and fitting methods with respect to their value for assessing extreme precipitation and temperature impacts. Based on an Austrian data set from 25 meteorological stations representing diverse meteorological conditions, we assess the added value of partial duration series over the standard annual maxima series in order to give recommendations for performing extreme value statistics of meteorological hazards. Results show the merits of the robust L-moment estimation, which yielded better results than maximum likelihood estimation in 62 % of all cases. At the same time, the results question the general assumption that the threshold excess approach (employing partial duration series, PDS) is superior to the block maxima approach (employing annual maxima series, AMS) due to information gain. For low return periods (non-extreme events) the PDS approach tends to overestimate return levels compared to the AMS approach, whereas the opposite behavior was found for high return levels (extreme events). In extreme cases, an inappropriate threshold was shown to lead to considerable biases that may far outweigh the possible gain of information from including additional extreme events. This effect was visible neither from the square-root criterion nor from the standard graphical diagnostics (mean residual life plot), but only from a direct comparison of AMS and PDS in synoptic quantile plots. We therefore recommend performing the AMS and PDS approaches simultaneously in order to select the best-suited approach. This will make the analyses more robust, both in cases where threshold selection and dependency introduce biases to the PDS approach and in cases where the AMS contains non-extreme events that may introduce similar biases. For assessing the performance of extreme events we recommend conditional performance measures that focus
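    The AMS versus PDS comparison described above can be sketched with SciPy: a GEV fit to annual maxima against a GPD fit to threshold excesses, each yielding return levels. Note that SciPy fits by maximum likelihood; the L-moment estimation favored in the study would need a dedicated library. The synthetic gamma-distributed "precipitation" record and the 99.5% threshold are assumptions for illustration.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Synthetic "daily precipitation" for 50 years of 365 days (illustrative).
years, days = 50, 365
daily = rng.gamma(shape=0.6, scale=8.0, size=(years, days))

# Block maxima approach (AMS): fit a GEV to the annual maxima.
ams = daily.max(axis=1)
gev = stats.genextreme.fit(ams)

# Threshold excess approach (PDS): fit a GPD to excesses over a high threshold.
u = np.quantile(daily, 0.995)
excesses = daily[daily > u] - u
gpd_c, _, gpd_scale = stats.genpareto.fit(excesses, floc=0.0)
lam = excesses.size / years  # mean number of exceedances per year

def return_level_ams(T):
    return float(stats.genextreme.ppf(1.0 - 1.0 / T, *gev))

def return_level_pds(T):
    q = 1.0 - 1.0 / (lam * T)
    return float(u + stats.genpareto.ppf(q, gpd_c, loc=0.0, scale=gpd_scale))

for T in (2, 10, 50):
    print(T, round(return_level_ams(T), 1), round(return_level_pds(T), 1))
```

    Plotting the two return-level curves side by side is the kind of synoptic comparison the authors recommend for choosing between the approaches.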

  3. Testing an Automated Accuracy Assessment Method on Bibliographic Data

    Directory of Open Access Journals (Sweden)

    Marlies Olensky

    2014-12-01

    Full Text Available This study investigates automated data accuracy assessment as described in the data quality literature for its suitability to assess bibliographic data. The data samples comprise the publications of two Nobel Prize winners in the field of Chemistry for a 10-year publication period, retrieved from the two bibliometric data sources Web of Science and Scopus. The bibliographic records are assessed against the original publication (gold standard) and an automatic assessment method is compared to a manual one. The results show that the manual assessment method reflects the true accuracy scores more closely. The automated assessment method would need to be extended by additional rules that reflect specific characteristics of bibliographic data. Both data sources had higher accuracy scores per field than accumulated per record. This study contributes to the research on finding a standardized assessment method for bibliographic data accuracy as well as defining the impact of data accuracy on the citation matching process.

  4. Surface water quality assessment by environmetric methods.

    Science.gov (United States)

    Boyacioglu, Hülya; Boyacioglu, Hayal

    2007-08-01

    This environmetric study deals with the interpretation of river water monitoring data from the basin of the Buyuk Menderes River and its tributaries in Turkey. Eleven variables were measured to estimate water quality at 17 sampling sites. Factor analysis was applied to explain the correlations between the observations in terms of underlying factors. Results revealed that water quality was strongly affected by agricultural uses. Cluster analysis was used to classify stations with similar properties, and the results distinguished three groups of stations. Water quality downstream of the river was quite different from that of the other parts. It is recommended that environmetric data treatment be adopted as a substantial procedure in the assessment of water quality data.

  5. Resampling method for applying density-dependent habitat selection theory to wildlife surveys.

    Directory of Open Access Journals (Sweden)

    Olivia Tardy

    Full Text Available Isodar theory can be used to evaluate fitness consequences of density-dependent habitat selection by animals. A typical habitat isodar is a regression curve plotting competitor densities in two adjacent habitats when individual fitness is equal. Despite the increasing use of habitat isodars, their application remains largely limited to areas composed of pairs of adjacent habitats that are defined a priori. We developed a resampling method that uses data from wildlife surveys to build isodars in heterogeneous landscapes without having to predefine habitat types. The method consists of randomly placing blocks over the survey area and dividing those blocks into two adjacent sub-blocks of the same size. Animal abundance is then estimated within the two sub-blocks. This process is repeated 100 times. Different functional forms of isodars can be investigated by relating animal abundance and differences in habitat features between sub-blocks. We applied this method to abundance data of raccoons and striped skunks, two of the main hosts of rabies virus in North America. Habitat selection by raccoons and striped skunks depended on both conspecific abundance and the difference in landscape composition and structure between sub-blocks. When conspecific abundance was low, raccoons and striped skunks favored areas with relatively high proportions of forests and anthropogenic features, respectively. Under high conspecific abundance, however, both species preferred areas with rather large corn-forest edge densities and corn field proportions. Based on random sampling techniques, we provide a robust method that is applicable to a broad range of species, including medium- to large-sized mammals with high mobility. The method is sufficiently flexible to incorporate multiple environmental covariates that can reflect key requirements of the focal species. We thus illustrate how isodar theory can be used with wildlife surveys to assess density-dependent habitat selection.
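    A minimal numerical sketch of the resampling idea, with synthetic point data and a single hypothetical covariate (animal density rising with "forest" toward the east); the block sizes, counts, and linear isodar form are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic survey: 800 animal locations in a 100 x 100 area whose density
# increases with a hypothetical covariate ("forest") rising from west to east.
side = 100.0
x = side * np.sqrt(rng.uniform(size=800))
y = rng.uniform(0, side, size=800)

def count_in(x0, y0, w, h):
    """Number of animals inside an axis-aligned rectangle."""
    return int(np.sum((x >= x0) & (x < x0 + w) & (y >= y0) & (y < y0 + h)))

# Resampling step: place random blocks over the survey area, divide each into
# two adjacent sub-blocks of the same size, and estimate abundance in each.
n_blocks, w, h = 100, 20.0, 10.0
rich, poor = [], []
for _ in range(n_blocks):
    bx = rng.uniform(0, side - w)
    by = rng.uniform(0, side - h)
    left = count_in(bx, by, w / 2, h)
    right = count_in(bx + w / 2, by, w / 2, h)
    # orient each pair by the covariate: the eastern sub-block is "richer"
    rich.append(right)
    poor.append(left)

# One candidate functional form: a linear isodar relating abundance in the
# richer sub-block to abundance in the poorer one.
slope, intercept = np.polyfit(poor, rich, 1)
print(slope, intercept)
```

    Other functional forms can be examined by swapping the final regression, and habitat covariates enter by replacing the simple east-west orientation rule.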

  6. Enhanced Performance of the Eurostat Method for Comprehensive Assessment of Urban Metabolism

    NARCIS (Netherlands)

    Voskamp, Ilse M.; Stremke, Sven; Spiller, Marc; Perrotti, Daniela; Hoek, van der Jan Peter; Rijnaarts, Huub H.M.

    2016-01-01

    Sustainable urban resource management depends essentially on a sound understanding of a city's resource flows. One established method for analyzing the urban metabolism (UM) is the Eurostat material flow analysis (MFA). However, for a comprehensive assessment of the UM, this method has its limita

  7. Safety assessment of infrastructures using a new Bayesian Monte Carlo method

    NARCIS (Netherlands)

    Rajabalinejad, M.; Demirbilek, Z.

    2011-01-01

    A recently developed Bayesian Monte Carlo (BMC) method and its application to safety assessment of structures are described in this paper. We use a one-dimensional BMC method that was proposed in 2009 by Rajabalinejad in order to develop a weighted logical dependence between successive Monte Carlo s

  8. Assessment of dental plaque by optoelectronic methods

    Science.gov (United States)

    Negrutiu, Meda-Lavinia; Sinescu, Cosmin; Bortun, Cristina Maria; Levai, Mihaela-Codrina; Topala, Florin Ionel; Crǎciunescu, Emanuela Lidia; Cojocariu, Andreea Codruta; Duma, Virgil Florin; Podoleanu, Adrian Gh.

    2016-03-01

    The formation of dental biofilm follows specific mechanisms of initial colonization on the surface, microcolony formation, development of organized three-dimensional community structures, and detachment from the surface. The structure of the plaque biofilm might restrict the penetration of antimicrobial agents, while bacteria on a surface grow slowly and display a novel phenotype; the consequence of the latter is a reduced sensitivity to inhibitors. The aim of this study was to evaluate the morphological characteristics of the dental biofilm with different optoelectronic methods. The study was performed on samples from 25 patients aged between 18 and 35 years. The methods used in this study were Spectral Domain Optical Coherence Tomography (SD-OCT) working at 870 nm for in vivo evaluations and Scanning Electron Microscopy (SEM) for validation. For each patient a sample of dental biofilm was obtained directly from the vestibular surface of the teeth. SD-OCT produced C- and B-scans that were used to generate three-dimensional (3D) reconstructions of the sample. The results were compared with the SEM evaluations. The biofilm network was dramatically destroyed after the professional dental cleaning. Noninvasive OCT methods can act as a valuable tool for the 3D characterization of dental biofilms.

  9. Methodical approaches to the assessment of personnel competitiveness

    National Research Council Canada - National Science Library

    Semikina Anna Valerievna

    2014-01-01

    On the basis of generalization and systematization of methodical approaches the advanced scientific and methodical approach to the assessment of competitiveness as of an indicator of the human capital quality is offered...

  10. Microbial diversity in fecal samples depends on DNA extraction method

    DEFF Research Database (Denmark)

    Mirsepasi, Hengameh; Persson, Søren; Struve, Carsten

    2014-01-01

    BACKGROUND: There are challenges, when extracting bacterial DNA from specimens for molecular diagnostics, since fecal samples also contain DNA from human cells and many different substances derived from food, cell residues and medication that can inhibit downstream PCR. The purpose of the study...... was to evaluate two different DNA extraction methods in order to choose the most efficient method for studying intestinal bacterial diversity using Denaturing Gradient Gel Electrophoresis (DGGE). FINDINGS: In this study, a semi-automatic DNA extraction system (easyMag®, BioMérieux, Marcy I'Etoile, France......) and a manual one (QIAamp DNA Stool Mini Kit, Qiagen, Hilden, Germany) were tested on stool samples collected from 3 patients with Inflammatory Bowel disease (IBD) and 5 healthy individuals. DNA extracts obtained by the QIAamp DNA Stool Mini Kit yield a higher amount of DNA compared to DNA extracts obtained...

  11. Time-dependent optimal heater control using finite difference method

    Energy Technology Data Exchange (ETDEWEB)

    Li, Zhen Zhe; Heo, Kwang Su; Choi, Jun Hoo; Seol, Seoung Yun [Chonnam National Univ., Gwangju (Korea, Republic of)

    2008-07-01

    Thermoforming is one of the most versatile and economical processes for producing polymer products. Its drawback is that the thickness of the final product is difficult to control. The temperature distribution affects the thickness distribution of the final product, but the temperature difference between the surface and the center of the sheet is difficult to decrease because of the low thermal conductivity of ABS material. In order to decrease the temperature difference between surface and center, the heating profile must be expressed in exponential function form. In this study, the Finite Difference Method was used to find the coefficients of optimal heating profiles. The results show that with the optimal profiles the temperature difference between the surface and the center of the sheet can be remarkably reduced while keeping the sheet within the forming-window temperature.
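    The forward problem behind this optimization can be sketched with an explicit finite difference scheme: symmetric heating of a sheet modeled over its half thickness, with an exponentially decaying surface heat flux. The material values and profile coefficients below are rough assumed numbers, not those of the study:

```python
import numpy as np

# Explicit 1-D finite difference model of radiant heating of a polymer sheet,
# using the half thickness (symmetric heating) with an insulated center line.
L = 2e-3        # half thickness of the sheet [m]
alpha = 1.0e-7  # thermal diffusivity, order of magnitude for ABS [m^2/s]
k = 0.2         # thermal conductivity [W/(m K)]
n = 21
dx = L / (n - 1)
dt = 0.4 * dx**2 / alpha          # explicit stability: alpha*dt/dx**2 <= 0.5
T = np.full(n, 20.0)              # initial temperature [deg C]

def step(T, q_surface):
    """One explicit time step; flux boundary at the surface (node 0),
    symmetry (zero-flux) boundary at the center (last node)."""
    Tn = T.copy()
    r = alpha * dt / dx**2
    Tn[1:-1] = T[1:-1] + r * (T[2:] - 2.0 * T[1:-1] + T[:-2])
    Tn[0] = T[0] + r * (2.0 * (T[1] - T[0]) + 2.0 * dx * q_surface / k)
    Tn[-1] = T[-1] + r * 2.0 * (T[-2] - T[-1])
    return Tn

t, t_end, tau, q0 = 0.0, 30.0, 10.0, 3000.0
while t < t_end:
    q = q0 * np.exp(-t / tau)     # exponentially decaying heating profile
    T = step(T, q)
    t += dt

print(T[0] - T[-1])  # surface-to-center temperature difference [K]
```

    Optimizing the profile then amounts to searching over the coefficients (here q0 and tau) for the smallest surface-to-center difference that still reaches the forming window.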

  12. Analytical Methods for Environmental Risk Assessment of Acid Sulfate Soils: A Review

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    Assessment of acid sulfate soil risk is an important step for acid sulfate soil management and its reliability depends very much on the suitability and accuracy of various analytical methods for estimating sulfide-derived potential acidity, actual acidity and acid-neutralizing capacity in acid sulfate soils. This paper critically reviews various analytical methods that are currently used for determination of the above parameters, as well as their implications for environmental risk assessment of acid sulfate soils.

  13. Methods for assessing mitochondrial function in diabetes.

    Science.gov (United States)

    Perry, Christopher G R; Kane, Daniel A; Lanza, Ian R; Neufer, P Darrell

    2013-04-01

    A growing body of research is investigating the potential contribution of mitochondrial function to the etiology of type 2 diabetes. Numerous in vitro, in situ, and in vivo methodologies are available to examine various aspects of mitochondrial function, each requiring an understanding of their principles, advantages, and limitations. This review provides investigators with a critical overview of the strengths, limitations and critical experimental parameters to consider when selecting and conducting studies on mitochondrial function. In vitro (isolated mitochondria) and in situ (permeabilized cells/tissue) approaches provide direct access to the mitochondria, allowing for study of mitochondrial bioenergetics and redox function under defined substrate conditions. Several experimental parameters must be tightly controlled, including assay media, temperature, oxygen concentration, and in the case of permeabilized skeletal muscle, the contractile state of the fibers. Recently developed technology now offers the opportunity to measure oxygen consumption in intact cultured cells. Magnetic resonance spectroscopy provides the most direct way of assessing mitochondrial function in vivo with interpretations based on specific modeling approaches. The continuing rapid evolution of these technologies offers new and exciting opportunities for deciphering the potential role of mitochondrial function in the etiology and treatment of diabetes.

  14. CAPABILITY ASSESSMENT OF MEASURING EQUIPMENT USING STATISTIC METHOD

    Directory of Open Access Journals (Sweden)

    Pavel POLÁK

    2014-10-01

    Full Text Available Capability assessment of the measurement device is one of the methods of process quality control. Only if the measurement device is capable can the capability of the measurement and, consequently, of the production process be assessed. This paper deals with the assessment of the capability of a measuring device using the indices Cg and Cgk.
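    Under one common convention for gauge capability studies, Cg compares a fixed share (often 20%) of the tolerance with the spread of repeated measurements of a reference standard, and Cgk additionally penalizes bias. The exact formulas vary between company guidelines, and the data below are made up; a sketch:

```python
import statistics

def cg_cgk(measurements, reference, tolerance, share=0.2):
    """Gauge capability indices: Cg requires a fraction `share` of the
    tolerance T to cover 6 standard deviations of repeated measurements of
    a reference standard; Cgk additionally penalizes bias against it."""
    s = statistics.stdev(measurements)
    bias = abs(statistics.mean(measurements) - reference)
    cg = (share * tolerance) / (6 * s)
    cgk = (share / 2 * tolerance - bias) / (3 * s)
    return cg, cgk

# 20 repeated measurements of a 10.0 reference standard (made-up numbers)
data = [10.02, 9.98, 10.01, 10.00, 9.99, 10.03, 9.97, 10.01, 10.00, 10.02,
        9.99, 10.01, 10.00, 9.98, 10.02, 10.00, 10.01, 9.99, 10.00, 10.01]
cg, cgk = cg_cgk(data, reference=10.0, tolerance=0.5)
print(round(cg, 2), round(cgk, 2))
```

    A device is typically judged capable when both indices exceed a fixed limit (often 1.33); Cgk is never larger than Cg, since it subtracts the bias.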

  15. Methods of assessing reading used by Iranian EFL teachers

    Directory of Open Access Journals (Sweden)

    Somaye Ketabi

    2016-04-01

    Full Text Available Investigating the effects of different methods of assessing reading on students' reading comprehension has been one of the major topics among Iranian EFL researchers (Atai & Nikuinezhad, 2006; Delgoshaei, Kharrazi, & Talkhabi, 2011; Shams & Tavakoli, 2014). However, the popularity of these methods among teachers has received far less attention. The present study examined the popularity of different methods of assessing reading among teachers of adult and young adult learners and also investigated the difference in the frequency of the methods used by these two groups of teachers. Categories of methods were chosen based on Brown's (2004) taxonomy and the study conducted by Cheng, Rogers, and Hu (2004). A background information questionnaire and an assessment questionnaire were used to collect the data. The results revealed that Iranian teachers did not use a variety of reading assessment methods in their classes. The most common method of assessing reading among Iranian teachers was reported to be reading aloud; other methods, e.g. preparing summaries and oral questioning, were far less common. Alternative methods of assessment, e.g. journals and portfolios, were the least common methods reported by Iranian EFL teachers.

  16. Methods for land use impact assessment: A review

    Energy Technology Data Exchange (ETDEWEB)

    Perminova, Tataina, E-mail: tatiana.perminova@utt.fr [Research Centre for Environmental Studies and Sustainability, University of Technology of Troyes, CNRS UMR 6281, 12 Rue Marie Curie CS 42060, F-10004 Troyes Cedex (France); Department of Geoecology and Geochemistry, Institute of Natural Resources, National Research Tomsk Polytechnic University, 30 Lenin Avenue, 634050 Tomsk (Russian Federation); Sirina, Natalia, E-mail: natalia.sirina@utt.fr [Research Centre for Environmental Studies and Sustainability, University of Technology of Troyes, CNRS UMR 6281, 12 Rue Marie Curie CS 42060, F-10004 Troyes Cedex (France); Laratte, Bertrand, E-mail: bertrand.laratte@utt.fr [Research Centre for Environmental Studies and Sustainability, University of Technology of Troyes, CNRS UMR 6281, 12 Rue Marie Curie CS 42060, F-10004 Troyes Cedex (France); Baranovskaya, Natalia, E-mail: natalya.baranovs@mail.ru [Department of Geoecology and Geochemistry, Institute of Natural Resources, National Research Tomsk Polytechnic University, 30 Lenin Avenue, 634050 Tomsk (Russian Federation); Rikhvanov, Leonid, E-mail: rikhvanov@tpu.ru [Department of Geoecology and Geochemistry, Institute of Natural Resources, National Research Tomsk Polytechnic University, 30 Lenin Avenue, 634050 Tomsk (Russian Federation)

    2016-09-15

    Many types of methods to assess land use impact have been developed, and a systematic synthesis of these approaches is necessary to highlight the most commonly used and most effective ones. Given the growing interest in this area of research, a review of the different methods of assessing land use impact (LUI) was performed using bibliometric analysis. One hundred eighty-seven articles in the agricultural and biological sciences and environmental sciences were examined. According to our results, the most frequently used land use assessment methods are Life-Cycle Assessment, Material Flow Analysis/Input–Output Analysis, Environmental Impact Assessment and Ecological Footprint. Comparison of the methods allowed their specific features to be identified and led to the conclusion that a combination of several methods is the best basis for a comprehensive analysis of land use impact assessment. - Highlights: • We identified the most frequently used methods in land use impact assessment. • A comparison of the methods based on several criteria was carried out. • Agricultural land use is by far the most common area of study within the methods. • Incentive-driven methods, like LCA, arouse the most interest in this field.

  17. Time Domain Stability Margin Assessment Method

    Science.gov (United States)

    Clements, Keith

    2017-01-01

    The baseline stability margins for NASA's Space Launch System (SLS) launch vehicle were generated via the classical approach of linearizing the system equations of motion and determining the gain and phase margins from the resulting frequency domain model. To improve the fidelity of the classical methods, the linear frequency domain approach can be extended by replacing static, memoryless nonlinearities with describing functions. This technique, however, does not address the time varying nature of the dynamics of a launch vehicle in flight. An alternative technique for the evaluation of the stability of the nonlinear launch vehicle dynamics along its trajectory is to incrementally adjust the gain and/or time delay in the time domain simulation until the system exhibits unstable behavior. This technique has the added benefit of providing a direct comparison between the time domain and frequency domain tools in support of simulation validation.

  18. Predictive validity of the Hand Arm Risk assessment Method (HARM)

    NARCIS (Netherlands)

    Douwes, M.; Boocock, M.; Coenen, P.; Heuvel, S. van den; Bosch, T.

    2014-01-01

    The Hand Arm Risk assessment Method (HARM) is a simplified risk assessment method for determining musculoskeletal symptoms to the arm, neck and/or shoulder posed by hand-arm tasks of the upper body. The purpose of this study was to evaluate the predictive validity of HARM using data collected from a

  19. Alternative method for assessing coking coal plasticity

    Energy Technology Data Exchange (ETDEWEB)

    Dzuy Nguyen; Susan Woodhouse; Merrick Mahoney [University of Adelaide (Australia). BHP Billiton Newcastle Technology Centre

    2008-07-15

    Traditional plasticity measurements for coal have a number of limitations associated with the reproducibility of the tests and their use in predicting coking behaviour. This report reviews alternative rheological methods for characterising the plastic behaviour of coking coals. It reviews the application of more fundamental rheological measurements to the coal system as well as applications of rheology to other physical systems that may act as potential models for the application of fundamental rheological measurements to cokemaking. The systems considered were polymer melts, coal ash melts, lava, bread making and ice cream. These systems were chosen because they exhibit processes physically equivalent to those occurring during cokemaking, e.g. the generation of bubbles within a softened system that then resolidifies. A number of recommendations were made: that the steady and oscillatory shear squeeze flow techniques be further investigated, to determine whether the measured rheological characteristics are related to transformations within the coke oven and to the characteristics of the resultant coke; and that modification of Gieseler plastometers for more fundamental rheology measurements not be attempted.

  20. Inventory of LCIA selection methods for assessing toxic releases. Methods and typology report part B

    DEFF Research Database (Denmark)

    Larsen, Henrik Fred; Birkved, Morten; Hauschild, Michael Zwicky

    This report describes an inventory of Life Cycle Impact Assessment (LCIA) selection methods for assessing toxic releases. It consists of an inventory of current selection methods and other Chemical Ranking and Scoring (CRS) methods assessed to be relevant for the development of (a) new selection...... method(s) in Work package 8 (WP8) of the OMNIITOX project. The selection methods and the other CRS methods are described in detail, a set of evaluation criteria are developed and the methods are evaluated against these criteria. This report (Deliverable 11B (D11B)) gives the results from task 7.1d, 7.1e...

  1. Assessment methods in surgical training in the United Kingdom

    Directory of Open Access Journals (Sweden)

    Evgenios Evgeniou

    2013-02-01

    Full Text Available A career in surgery in the United Kingdom demands a commitment to a long journey of assessment. The assessment methods used must ensure that the appropriate candidates are selected into a programme of study or a job and must guarantee public safety by regulating the progression of surgical trainees and the certification of trained surgeons. This review attempts to analyse the psychometric properties of various assessment methods used in the selection of candidates to medical school, job selection, progression in training, and certification. Validity is an indicator of how well an assessment measures what it is designed to measure. Reliability informs us whether a test is consistent in its outcome by measuring the reproducibility and discriminating ability of the test. In the long journey of assessment in surgical training, the same assessment formats are frequently being used for selection into a programme of study, job selection, progression, and certification. Although similar assessment methods are being used for different purposes in surgical training, the psychometric properties of these assessment methods have not been examined separately for each purpose. Because of the significance of these assessments for trainees and patients, their reliability and validity should be examined thoroughly in every context where the assessment method is being used.

  2. Feasibility of Ecological Momentary Assessment Using Cellular Telephones in Methamphetamine Dependent Subjects

    Directory of Open Access Journals (Sweden)

    John Mendelson

    2008-01-01

    Full Text Available Background: Predictors of relapse to methamphetamine use are poorly understood. State variables may play an important role in relapse, but they have been difficult to measure at frequent intervals in outpatients. Methods: We conducted a feasibility study of the use of cellular telephones to collect state variable data from outpatients. Six subjects in treatment for methamphetamine dependence were called three times per weekday for approximately seven weeks. Seven questionnaires were administered that assessed craving, stress, affect, and current type of location and social environment. Results: 395/606 (65%) of attempted calls were completed. The mean time to complete each call was 4.9 (s.d. 1.8) minutes and the mean time to complete each item was 8.4 (s.d. 4.8) seconds. Subjects rated the acceptability of the procedures as good. All six cellular phones and battery chargers were returned undamaged. Conclusion: Cellular telephones are a feasible method for collecting state data from methamphetamine-dependent outpatients.

  3. Assessing Theory Uncertainties in EFT Power Countings from Residual Cutoff Dependence

    CERN Document Server

    Griesshammer, Harald W

    2015-01-01

    I summarise a method to quantitatively assess the consistency of power-counting proposals in Effective Field Theories which are non-perturbative at leading order. It uses the fact that the Renormalisation Group evolution of an observable predicts the functional form of its residual cutoff-dependence on the EFT breakdown scale, on the low-momentum scales, and on the order of the calculation. The criterion serves as a non-trivial test of a suggested power counting whose exact nature is disputed. For example, in Chiral EFT with more than one nucleon, a lack of universally accepted analytic solutions obfuscates the relation between convergence pattern and numerical results, and led to proposals which predict different numbers of Low Energy Coefficients at the same chiral order. The method may provide independent confirmation whether an observable is properly renormalised at a given order, and allows one to estimate both the breakdown scale and the momentum-dependent order-by-order convergence pattern of an EFT. C...

  4. Prospective Assessment of Cannabis Withdrawal in Adolescents with Cannabis Dependence: A Pilot Study

    Science.gov (United States)

    Milin, Robert; Manion, Ian; Dare, Glenda; Walker, Selena

    2008-01-01

    A study to identify and assess the withdrawal symptoms in adolescents afflicted with cannabis dependence is conducted. Results conclude that withdrawal symptoms of cannabis were present in adolescents seeking treatment for this substance abuse.

  6. The time line method for assessing galloping exposure

    Energy Technology Data Exchange (ETDEWEB)

    Richardson, A.S.

    1982-08-01

    The design of double circuit transmission structures is often determined by the need to allow sufficient electrical clearance between phases under galloping span conditions. In the past such designs have been arrived at according to certain "galloping ellipse" criteria in which the ellipse geometry is based on mid-span sag. The new method, disclosed herein, starts with the statistical history of the weather in the particular region, as to wind speed, wind direction, temperature, and ice, leading to an exposure rate (hrs./yr.) for the normal component of wind speed. These data are combined with estimates of galloping motion, including amplitude dependence on wind speed, gusting, and frequency mismatch between galloping and horizontal (swinging) movement at mid-span. A comparison is included between untreated and treated spans, the latter having galloping control devices with only 50% amplitude reduction capability. A range of span lengths between 750 ft (227 m) and 1,500 ft (454 m) is considered. The new method, for the first time, provides a means to assess the benefits of alternative designs in quantitative terms.
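    The exposure-rate step of the time line method can be sketched as a simple screening of an hourly weather record: compute the wind component normal to the line and count the hours per year in which icing temperatures coincide with a wind band that can sustain galloping. The synthetic weather record, line azimuth, and band limits below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(7)

# Illustrative hourly weather record for one year: wind speed [m/s],
# direction [deg], and air temperature [deg C]; real use takes station data.
hours = 8760
speed = rng.weibull(2.0, hours) * 6.0
direction = rng.uniform(0.0, 360.0, hours)
temp = rng.normal(4.0, 8.0, hours)

line_azimuth = 60.0  # orientation of the transmission line [deg]

# Component of the wind normal to the line.
normal = speed * np.abs(np.sin(np.radians(direction - line_azimuth)))

# Exposure rate: hours per year with near-freezing temperatures (icing
# possible) and a normal wind component in a band that can sustain galloping.
galloping = (temp <= 0.0) & (normal >= 4.0) & (normal <= 15.0)
exposure_hours_per_year = int(galloping.sum())
print(exposure_hours_per_year)
```

    Repeating the count for alternative designs (e.g. with a control device that shrinks the susceptible band) gives the quantitative comparison the abstract describes.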

  7. Interlaboratory Validation of the Leaching Environmental Assessment Framework (LEAF) Method 1313 and Method 1316

    Science.gov (United States)

    This document summarizes the results of an interlaboratory study conducted to generate precision estimates for two parallel batch leaching methods which are part of the Leaching Environmental Assessment Framework (LEAF). These methods are: (1) Method 1313: Liquid-Solid Partition...

  8. Identifying patients at risk of nursing home admission: The Leeds Elderly Assessment Dependency Screening tool (LEADS)

    Directory of Open Access Journals (Sweden)

    Fear Jon

    2006-03-01

    Full Text Available Abstract Background Discharge from hospital to a nursing home represents a major event in the life of an older person and should only follow a comprehensive functional and medical assessment. A previous study identified 3 dependency scales able to discriminate across outcomes for older people admitted to an acute setting. We wished to determine whether a single dependency scale could be created from the 3 scales and, in addition, whether this new scale with other predictors could be used as a comprehensive tool to identify patients at risk of nursing home admission. Methods Items from the 3 scales were combined and analysed using Rasch analysis. Sensitivity and specificity analysis and ROC curves were applied to identify the most appropriate cut score. Binary logistic regression using this cut-off and other predictive variables was used to create a predictive algorithm score. Sensitivity, specificity and likelihood ratio scores of the algorithm were used to identify the best predictive score for risk of nursing home placement. Results A 17-item scale (LEADS) was derived which, together with four other indicators, had a sensitivity of 88% for patients at risk of nursing home placement, and a specificity of 85% for not needing a nursing home placement, within 2 weeks of admission. Conclusion A combined short 17-item scale of dependency plus other predictive variables can assess the risk of nursing home placement for older people in an acute care setting within 2 weeks of admission. This gives an opportunity for either early discharge planning or therapeutic intervention to offset the risk of placement.
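    The cut-score step (sensitivity/specificity analysis on a ROC curve) can be sketched by maximizing Youden's J over candidate cut-offs; the simulated dependency scores below are assumptions for illustration, not LEADS data:

```python
import numpy as np

def best_cut_score(scores, outcome):
    """Scan candidate cut-offs and keep the one maximizing Youden's J
    (sensitivity + specificity - 1), the usual ROC-based choice."""
    best = None
    for c in np.unique(scores):
        pred = scores >= c
        sens = float(np.mean(pred[outcome == 1]))
        spec = float(np.mean(~pred[outcome == 0]))
        youden = sens + spec - 1.0
        if best is None or youden > best[0]:
            best = (youden, float(c), sens, spec)
    return best

rng = np.random.default_rng(2)
# Simulated dependency scores: patients later admitted to a nursing home
# (cases) score higher on average than those who are not (controls).
controls = rng.normal(6.0, 3.0, 200).round()
cases = rng.normal(12.0, 3.0, 60).round()
scores = np.concatenate([controls, cases])
outcome = np.concatenate([np.zeros(200, int), np.ones(60, int)])

j, cut, sens, spec = best_cut_score(scores, outcome)
print(cut, round(sens, 2), round(spec, 2))
```

    The chosen cut-off, together with other predictors, could then feed a logistic regression as in the study's predictive algorithm.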

  9. Disordered Speech Assessment Using Automatic Methods Based on Quantitative Measures

    Directory of Open Access Journals (Sweden)

    Christine Sapienza

    2005-06-01

    Full Text Available Speech quality assessment methods are necessary for evaluating and documenting treatment outcomes of patients suffering from degraded speech due to Parkinson's disease, stroke, or other disease processes. Subjective methods of speech quality assessment are more accurate and more robust than objective methods but are time-consuming and costly. We propose a novel objective measure of speech quality assessment that builds on traditional speech processing techniques such as dynamic time warping (DTW and the Itakura-Saito (IS distortion measure. Initial results show that our objective measure correlates well with the more expensive subjective methods.
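    The two building blocks named in the abstract, dynamic time warping and the Itakura-Saito distortion, can be combined in a few lines: DTW aligns two frame sequences while IS serves as the local distance between positive spectral vectors. The random "frames" below stand in for real speech features:

```python
import numpy as np

def itakura_saito(p, q):
    """Itakura-Saito distortion between two positive (power-spectrum) vectors."""
    r = p / q
    return float(np.sum(r - np.log(r) - 1.0))

def dtw(seq_a, seq_b, dist):
    """Dynamic time warping cost between two frame sequences, using `dist`
    as the local distance; the standard O(n*m) dynamic program."""
    n, m = len(seq_a), len(seq_b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            c = dist(seq_a[i - 1], seq_b[j - 1])
            D[i, j] = c + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return float(D[n, m])

rng = np.random.default_rng(3)
ref = rng.uniform(0.5, 2.0, size=(20, 8))        # reference "spectral frames"
warped = ref[np.repeat(np.arange(20), 2)]        # same utterance, slowed down
other = rng.uniform(0.5, 2.0, size=(20, 8))      # unrelated utterance

cost_same = dtw(warped, ref, itakura_saito)
cost_other = dtw(other, ref, itakura_saito)
print(cost_same, cost_other)  # 0.0 for the warped copy, > 0 otherwise
```

    Because DTW absorbs timing differences, the alignment cost reflects spectral degradation rather than speaking rate, which is what makes it usable as an objective quality score.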

  10. A new assessment method for image fusion quality

    Science.gov (United States)

    Li, Liu; Jiang, Wanying; Li, Jing; Yuchi, Ming; Ding, Mingyue; Zhang, Xuming

    2013-03-01

    Image fusion quality assessment plays a critically important role in the field of medical imaging. To evaluate image fusion quality effectively, many assessment methods have been proposed, including mutual information (MI), root mean square error (RMSE) and the universal image quality index (UIQI). These methods, however, do not reflect human visual perception effectively. To address this problem, we have proposed a novel image fusion assessment method which combines the nonsubsampled contourlet transform (NSCT) with regional mutual information. In the proposed method, the source medical images are first decomposed into different levels by the NSCT. Then the maximum NSCT coefficients of the decomposed directional images at each level are used to compute the regional mutual information (RMI). Finally, multi-channel RMI is computed as the weighted sum of the RMI values obtained at the various levels of the NSCT. The advantage of the proposed method lies in the fact that the NSCT represents image information at multiple directions and scales and therefore conforms to the multi-channel characteristic of the human visual system, leading to its strong image assessment performance. Experimental results using CT and MRI images demonstrate that the proposed method outperforms assessment measures such as MI and UIQI in evaluating image fusion quality, and that it provides results consistent with human visual assessment.
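
Regional mutual information builds on the standard histogram-based mutual-information formula. A minimal sketch for two intensity regions is given below; the bin count and the [0, 1] intensity scaling are illustrative choices, and the paper's RMI additionally weights regions across NSCT levels.

```python
import math
from collections import Counter

def mutual_information(x, y, bins=8):
    """Histogram-based mutual information between two intensity regions.

    x, y: equal-length lists of intensities scaled to [0, 1].
    """
    bx = [min(int(v * bins), bins - 1) for v in x]  # quantize to bins
    by = [min(int(v * bins), bins - 1) for v in y]
    n = len(x)
    joint = Counter(zip(bx, by))        # joint histogram counts
    px, py = Counter(bx), Counter(by)   # marginal histogram counts
    mi = 0.0
    for (i, j), c in joint.items():
        p = c / n
        # p * log( p / (p_x * p_y) ), with counts folded into one ratio
        mi += p * math.log(p * n * n / (px[i] * py[j]))
    return mi
```

Identical regions yield the entropy of the intensity distribution; statistically unrelated regions yield values near zero.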

  11. Comparing methods for assessing the effectiveness of subnational REDD+ initiatives

    Science.gov (United States)

    Bos, Astrid B.; Duchelle, Amy E.; Angelsen, Arild; Avitabile, Valerio; De Sy, Veronique; Herold, Martin; Joseph, Shijo; de Sassi, Claudio; Sills, Erin O.; Sunderlin, William D.; Wunder, Sven

    2017-07-01

    The central role of forests in climate change mitigation, as recognized in the Paris agreement, makes it increasingly important to develop and test methods for monitoring and evaluating the carbon effectiveness of REDD+. Over the last decade, hundreds of subnational REDD+ initiatives have emerged, presenting an opportunity to pilot and compare different approaches to quantifying impacts on carbon emissions. This study (1) develops a Before-After-Control-Intervention (BACI) method to assess the effectiveness of these REDD+ initiatives; (2) compares the results at the meso (initiative) and micro (village) scales; and (3) compares BACI with the simpler Before-After (BA) results. Our study covers 23 subnational REDD+ initiatives in Brazil, Peru, Cameroon, Tanzania, Indonesia and Vietnam. As a proxy for deforestation, we use annual tree cover loss. We aggregate data into two periods (before and after the start of each initiative). Analysis using control areas (‘control-intervention’) suggests better REDD+ performance, although the effect is more pronounced at the micro than at the meso level. Yet, BACI requires more data than BA, and is subject to possible bias in the before period. Selection of proper control areas is vital, but at either scale is not straightforward. Low absolute deforestation numbers and peak years influence both our BA and BACI results. In principle, BACI is superior, with its potential to effectively control for confounding factors. We conclude that the more local the scale of performance assessment, the more relevant is the use of the BACI approach. For various reasons, we find overall minimal impact of REDD+ in reducing deforestation on the ground thus far. Incorporating results from micro and meso level monitoring into national reporting systems is important, since overall REDD+ impact depends on land use decisions on the ground.
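
The BA and BACI estimators compared in the study reduce to simple differences of aggregated tree-cover-loss rates. A sketch under that reading, with purely illustrative numbers:

```python
def ba_effect(intervention_before, intervention_after):
    """Before-After: change inside the intervention area only."""
    return intervention_after - intervention_before

def baci_effect(intervention_before, intervention_after,
                control_before, control_after):
    """Before-After-Control-Intervention: change in the intervention
    area net of the change observed in matched control areas.
    Negative values indicate deforestation reduced relative to control."""
    return ((intervention_after - intervention_before)
            - (control_after - control_before))
```

The contrast in the abstract is visible here: if deforestation fell in the intervention area while rising in the controls, BACI credits the initiative with a larger effect than BA alone.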

  12. Health smart home: towards an assistant tool for automatic assessment of the dependence of elders.

    Science.gov (United States)

    Le, Xuan Hoa Binh; Di Mascolo, Maria; Gouin, Alexia; Noury, Norbert

    2007-01-01

    In order to help elders living alone to age in place independently and safely, it can be useful to have an assistant tool that automatically assesses their dependence and issues an alert if there is any loss of autonomy. Dependence can be assessed from how well elders perform activities of daily living. This article presents an approach enabling activity recognition for an elder living alone in a Health Smart Home equipped with noninvasive sensors.

  13. DIM : A systematic and lightweight method for identifying dependencies between requirements

    OpenAIRE

    Gomez, Arturo; Rueda, Gema

    2010-01-01

    Dependencies between requirements are a crucial factor for any software development effort since they impact many project areas. Nevertheless, their identification remains a challenge. Some methods have been proposed, but none of them are really applicable to real projects due to their high cost or low accuracy. DIM is a lightweight method for identifying dependencies proposed in a previous paper. This paper presents an experiment comparing the sets of dependencies found by DIM and a method based on ...

  14. Influence of expertise on rockfall hazard assessment using empirical methods

    Science.gov (United States)

    Delonca, Adeline; Verdel, Thierry; Gunzburger, Yann

    2016-07-01

    To date, many rockfall hazard assessment methods still rely on qualitative observations within their analysis. On this basis, knowledge and expertise are assumed to be major parameters of rockfall assessment. To test this hypothesis, an experiment was carried out to evaluate the influence of knowledge and expertise on rockfall hazard assessment. Three populations with different levels of expertise were selected: (1) students in geosciences, (2) researchers in geosciences and (3) confirmed experts. These three populations evaluated the rockfall hazard level on the same site using two different methods: the Laboratoire des Ponts et Chaussées (LPC) method and a method partly based on the "slope mass rating" (SMR) method. To complement the analysis, each population was also asked to perform an "a priori" assessment of the rockfall hazard without using any method. The LPC method is the most widely used method in France for official hazard mapping. It combines two main indicators: the predisposition to instability and the expected magnitude. Conversely, the SMR-based method was used as an ad hoc quantitative method to investigate the effect of quantification within a method. These procedures were applied on a test site divided into three sectors. A statistical treatment of the results (descriptive statistical analysis, chi-square independence test and ANOVA) shows that there is a significant influence of the method used on the rockfall hazard assessment, whatever the sector. However, there is no significant influence of the level of expertise of the population in sectors 2 and 3. In sector 1, there is a significant influence of the level of expertise, explained by the importance of the temporal probability assessment in the rockfall hazard assessment process. The SMR-based method seems highly sensitive to the "site activity" indicator and exhibits an important dispersion in its results. However, the results are more similar

  15. Nursing-care dependency : Development of an assessment scale for demented and mentally handicapped patients

    NARCIS (Netherlands)

    Dijkstra, Ate; Buist, Girbe; Dassen, T

    1996-01-01

    This article describing the first phase in the development of an assessment scale of nursing-care dependency (NCD) for Dutch demented and mentally handicapped patients focuses on the background to the study and the content validation of the nursing-care dependency scale. The scale aims to

  16. A new embedding quality assessment method for manifold learning

    CERN Document Server

    Zhang, Peng; Zhang, Bo

    2011-01-01

    Manifold learning is a hot research topic in the field of computer science. A crucial issue with current manifold learning methods is that they lack a natural quantitative measure to assess the quality of learned embeddings, which greatly limits their applications to real-world problems. In this paper, a new embedding quality assessment method for manifold learning, named Normalization Independent Embedding Quality Assessment (NIEQA), is proposed. Compared with current assessment methods, which are limited to isometric embeddings, the NIEQA method has a much larger application range due to two features. First, it is based on a new measure which can effectively evaluate how well local neighborhood geometry is preserved under normalization, hence it can be applied to both isometric and normalized embeddings. Second, it can provide both local and global evaluations to output an overall assessment. Therefore, NIEQA can serve as a natural tool in model selection and evaluation tasks for manifold learning. Experi...

  17. Assessment of hip dysplasia and osteoarthritis: Variability of different methods

    Energy Technology Data Exchange (ETDEWEB)

    Troelsen, Anders; Elmengaard, Brian; Soeballe, Kjeld (Orthopedic Research Unit, Univ. Hospital of Aarhus, Aarhus (Denmark)), e-mail: a_troelsen@hotmail.com; Roemer, Lone (Dept. of Radiology, Univ. Hospital of Aarhus, Aarhus (Denmark)); Kring, Soeren (Dept. of Orthopedic Surgery, Aabenraa Hospital, Aabenraa (Denmark))

    2010-03-15

    Background: Reliable assessment of hip dysplasia and osteoarthritis is crucial in young adults who may benefit from joint-preserving surgery. Purpose: To investigate the variability of different methods for diagnostic assessment of hip dysplasia and osteoarthritis. Material and Methods: Each of four observers performed two assessments by vision and two by angle construction. For both methods, the intra- and interobserver variability of center-edge and acetabular index angle assessment was analyzed. The observers' ability to diagnose hip dysplasia and osteoarthritis was assessed. All measures were compared to those made on computed tomography scans. Results: Intra- and interobserver variability of angle assessment was lower when angles were drawn than when they were assessed by vision, and the observers' ability to diagnose hip dysplasia improved when angles were drawn. Assessment of osteoarthritis in general showed poor agreement with findings on computed tomography scans. Conclusion: We recommend that angles always be drawn for assessment of hip dysplasia on pelvic radiographs. Given the inherent variability of diagnostic assessment of hip dysplasia, a computed tomography scan could be considered in patients with relevant hip symptoms and a center-edge angle between 20 and 30 degrees. Osteoarthritis should be assessed by measuring the joint space width or by classifying the Toennis grade as either 0-1 or 2-3.

  18. Revisiting Individual Creativity Assessment: Triangulation in Subjective and Objective Assessment Methods

    Science.gov (United States)

    Park, Namgyoo K.; Chun, Monica Youngshin; Lee, Jinju

    2016-01-01

    Compared to the significant development of creativity studies, individual creativity research has not reached a meaningful consensus regarding the most valid and reliable method for assessing individual creativity. This study revisited 2 of the most popular methods for assessing individual creativity: subjective and objective methods. This study…

  19. Dependent data in social sciences research forms, issues, and methods of analysis

    CERN Document Server

    Eye, Alexander; Wiedermann, Wolfgang

    2015-01-01

    This volume presents contributions on handling data in which the postulate of independence in the data matrix is violated. When this postulate is violated and when the methods assuming independence are still applied, the estimated parameters are likely to be biased, and statistical decisions are very likely to be incorrect. Problems associated with dependence in data have been known for a long time, and led to the development of tailored methods for the analysis of dependent data in various areas of statistical analysis. These methods include, for example, methods for the analysis of longitudinal data, corrections for dependency, and corrections for degrees of freedom. This volume contains the following five sections: growth curve modeling, directional dependence, dyadic data modeling, item response modeling (IRT), and other methods for the analysis of dependent data (e.g., approaches for modeling cross-section dependence, multidimensional scaling techniques, and mixed models). Researchers and graduate stud...

  20. Time- and Site-Dependent Life Cycle Assessment of Thermal Waste Treatment

    Energy Technology Data Exchange (ETDEWEB)

    Hellweg, Stefanie; Hofstetter, Thomas B.; Hungerbuehler, Konrad [Swiss Federal Inst. of Technology (ETH), Zuerich (Switzerland). Chemical Engineering Dept.

    2002-12-01

    largely depends on the assessment of heavy metal emissions from landfills and the weighting of the corresponding impacts at different points in time. Unfortunately, common LCA methods hardly consider spatial and temporal aspects. Several methodological innovations are suggested in this work. In order to quantify the impact of landfill leachates with respect to groundwater contamination, a simplified geochemical landfill model is proposed. The results indicate that slag landfills might release heavy metals over very long time periods ranging from a few thousand years in the case of Cd to more than 100,000 years in the case of Cu. The dissolved concentrations in the leachate exceed the quality goals set by the Swiss Water Protection Law by a factor of at least 50. The classification of the mobility of heavy metal cations in the subsoil of the landfill was performed with a generic guideline developed for this purpose. The method is easily applicable to individual landfill sites. The results indicate that the geological conditions below the landfills play an important role. Depending on these conditions, the retardation of the heavy metals ranged from a few days to many thousand years at different sites. The long emission period of the heavy metals from landfills and the retardation of these pollutants in the subsoil raise the question whether impacts at different points in time should be weighted alike. For instance, the magnitude of damage might change as a consequence of a changing background contamination. It is concluded that possible future changes in the magnitude of damage should be considered in scenario analysis in the characterization phase of LCA. The proposed methodological innovations for the assessment of heavy metal transport have been applied to the case study of Cd and Cu emissions from three slag landfills at different sites in Switzerland. The emissions of heavy metals to the subsoil as a function of time were calculated with the geochemical landfill model.

  1. Culture-Dependent and -Independent Methods Capture Different Microbial Community Fractions in Hydrocarbon-Contaminated Soils

    Science.gov (United States)

    Stefani, Franck O. P.; Bell, Terrence H.; Marchand, Charlotte; de la Providencia, Ivan E.; El Yassimi, Abdel; St-Arnaud, Marc; Hijri, Mohamed

    2015-01-01

    Bioremediation is a cost-effective and sustainable approach for treating polluted soils, but our ability to improve on current bioremediation strategies depends on our ability to isolate microorganisms from these soils. Although culturing is widely used in bioremediation research and applications, it is unknown whether the composition of cultured isolates closely mirrors the indigenous microbial community from contaminated soils. To assess this, we paired culture-independent (454-pyrosequencing of total soil DNA) with culture-dependent (isolation using seven different growth media) techniques to analyse the bacterial and fungal communities from hydrocarbon-contaminated soils. Although bacterial and fungal rarefaction curves were saturated for both methods, only 2.4% and 8.2% of the bacterial and fungal OTUs, respectively, were shared between datasets. Isolated taxa increased the total recovered species richness by only 2% for bacteria and 5% for fungi. Interestingly, none of the bacteria that we isolated were representative of the major bacterial OTUs recovered by 454-pyrosequencing. Isolation of fungi was moderately more effective at capturing the dominant OTUs observed by culture-independent analysis, as 3 of 31 cultured fungal strains ranked among the 20 most abundant fungal OTUs in the 454-pyrosequencing dataset. This study is one of the most comprehensive comparisons of microbial communities from hydrocarbon-contaminated soils using both isolation and high-throughput sequencing methods. PMID:26053848
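
The 2.4% and 8.2% shared-OTU figures above are set-overlap statistics between the culture-derived and sequencing-derived OTU lists. One common definition divides the shared count by the size of the union (the Jaccard index); the paper's exact denominator may differ, and the OTU identifiers below are invented.

```python
def shared_otu_fraction(culture_otus, sequencing_otus):
    """Fraction of the combined OTU set detected by both methods
    (Jaccard index over the two OTU sets)."""
    a, b = set(culture_otus), set(sequencing_otus)
    return len(a & b) / len(a | b)
```

For example, two methods that each find two OTUs and agree on one share 1 of 3 distinct OTUs.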

  2. Culture-Dependent and -Independent Methods Capture Different Microbial Community Fractions in Hydrocarbon-Contaminated Soils.

    Directory of Open Access Journals (Sweden)

    Franck O P Stefani

    Full Text Available Bioremediation is a cost-effective and sustainable approach for treating polluted soils, but our ability to improve on current bioremediation strategies depends on our ability to isolate microorganisms from these soils. Although culturing is widely used in bioremediation research and applications, it is unknown whether the composition of cultured isolates closely mirrors the indigenous microbial community from contaminated soils. To assess this, we paired culture-independent (454-pyrosequencing of total soil DNA) with culture-dependent (isolation using seven different growth media) techniques to analyse the bacterial and fungal communities from hydrocarbon-contaminated soils. Although bacterial and fungal rarefaction curves were saturated for both methods, only 2.4% and 8.2% of the bacterial and fungal OTUs, respectively, were shared between datasets. Isolated taxa increased the total recovered species richness by only 2% for bacteria and 5% for fungi. Interestingly, none of the bacteria that we isolated were representative of the major bacterial OTUs recovered by 454-pyrosequencing. Isolation of fungi was moderately more effective at capturing the dominant OTUs observed by culture-independent analysis, as 3 of 31 cultured fungal strains ranked among the 20 most abundant fungal OTUs in the 454-pyrosequencing dataset. This study is one of the most comprehensive comparisons of microbial communities from hydrocarbon-contaminated soils using both isolation and high-throughput sequencing methods.

  3. Dependence of dietary intake estimates on the time frame of assessment

    NARCIS (Netherlands)

    Löwik, M.R.H.; Hulshof, K.F.A.M.; Brussaard, J.H.; Kistemaker, C.

    1999-01-01

    Food chemical risk management needs, among other things, assessment of exposure. For dietary intake food consumption surveys are the data source to be used. One complicating factor in the usage of these data is the dependence of dietary intake estimates on the time frame of assessment. Central to th

  4. Critical Assessment of Correction Methods for Fisheye Lens Distortion

    Science.gov (United States)

    Liu, Y.; Tian, C.; Huang, Y.

    2016-06-01

    A fisheye lens is widely used to create wide panoramic or hemispherical images. It is an ultra wide-angle lens that produces strong visual distortion. Modeling and estimating fisheye lens distortion are crucial steps for fisheye lens calibration and image rectification in computer vision and close-range photography. There are two kinds of distortion: radial and tangential. Radial distortion is large in fisheye imaging and critical for subsequent image processing. Although many researchers have developed calibration algorithms for the radial distortion of fisheye lenses, quantitative evaluation of correction performance has remained a challenge. This is the first paper that intuitively and objectively evaluates the performance of five different calibration algorithms. Up-to-date research on fisheye lens calibration is comprehensively reviewed to identify the research need. To differentiate their performance in terms of precision and ease of use, the five methods are tested on a diverse set of actual checkerboard images taken at Wuhan University, China under varying lighting conditions, shadows, and shooting angles. The rational function model, which has generally been used for wide-angle lens correction, outperforms the other methods. However, the one-parameter division model is easy to use in practice without compromising precision too much, because it relies on linear structure in the image and requires no preceding calibration; it represents a tradeoff between correction precision and ease of use. By critically assessing the strengths and limitations of the existing algorithms, the paper provides valuable insight and guidelines for future practice and algorithm development that are important for fisheye lens calibration. It is promising for the optimal design of lens correction models suitable for the millions of portable imaging devices.

  5. Static Dependency Pair Method based on Strong Computability for Higher-Order Rewrite Systems

    CERN Document Server

    Kusakari, Keiichirou; Sakai, Masahiko; Blanqui, Frédéric

    2011-01-01

    Higher-order rewrite systems (HRSs) and simply-typed term rewriting systems (STRSs) are computational models of functional programs. We recently proposed an extremely powerful method, the static dependency pair method, which is based on the notion of strong computability, in order to prove termination in STRSs. In this paper, we extend the method to HRSs. Since HRSs include λ-abstraction but STRSs do not, we restructure the static dependency pair method to allow λ-abstraction, and show that the static dependency pair method also works well on HRSs without new restrictions.

  6. Methods and dimensions of electronic health record data quality assessment: enabling reuse for clinical research.

    Science.gov (United States)

    Weiskopf, Nicole Gray; Weng, Chunhua

    2013-01-01

    To review the methods and dimensions of data quality assessment in the context of electronic health record (EHR) data reuse for research. A review of the clinical research literature discussing data quality assessment methodology for EHR data was performed. Using an iterative process, the aspects of data quality being measured were abstracted and categorized, as well as the methods of assessment used. Five dimensions of data quality were identified, which are completeness, correctness, concordance, plausibility, and currency, and seven broad categories of data quality assessment methods: comparison with gold standards, data element agreement, data source agreement, distribution comparison, validity checks, log review, and element presence. Examination of the methods by which clinical researchers have investigated the quality and suitability of EHR data for research shows that there are fundamental features of data quality, which may be difficult to measure, as well as proxy dimensions. Researchers interested in the reuse of EHR data for clinical research are recommended to consider the adoption of a consistent taxonomy of EHR data quality, to remain aware of the task-dependence of data quality, to integrate work on data quality assessment from other fields, and to adopt systematic, empirically driven, statistically based methods of data quality assessment. There is currently little consistency or potential generalizability in the methods used to assess EHR data quality. If the reuse of EHR data for clinical research is to become accepted, researchers should adopt validated, systematic methods of EHR data quality assessment.

  7. A RISK ASSESSMENT METHOD OF THE WIRELESS NETWORK SECURITY

    Institute of Scientific and Technical Information of China (English)

    Zhao Dongmei; Wang Changguang; Ma Jianfeng

    2007-01-01

    The core of network security is risk assessment. In this letter, a risk assessment method is introduced to estimate wireless network security. The method, which combines the Analytic Hierarchy Process (AHP) with fuzzy logic, is applied to the risk assessment. Fuzzy logic is used to judge the importance of each factor in terms of its probability, influence and uncontrollability, rather than to judge the importance directly. The risk assessment is divided into three layers using the AHP method, and the sort weight of the third layer is calculated by the fuzzy logic method. Finally, the importance is calculated by the AHP method. By comparing the importance of each factor, the risks that can be controlled by taking measures are identified. The case study shows that the method can easily be applied to risk assessment of wireless network security and that its results conform to the actual situation.
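
The AHP weighting referred to above derives a priority vector from a pairwise-comparison matrix. A common approximation (column normalization followed by row averaging) is sketched below; the matrix entries are illustrative, not the letter's actual risk factors.

```python
def ahp_weights(pairwise):
    """Approximate AHP priority vector from a reciprocal
    pairwise-comparison matrix: normalize each column to sum to 1,
    then average across each row."""
    n = len(pairwise)
    col_sums = [sum(pairwise[i][j] for i in range(n)) for j in range(n)]
    norm = [[pairwise[i][j] / col_sums[j] for j in range(n)]
            for i in range(n)]
    return [sum(row) / n for row in norm]
```

For a perfectly consistent matrix, the approximation reproduces the exact eigenvector weights: if factor A is judged twice as important as factor B, the weights are 2/3 and 1/3.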

  8. The professional portfolio: an evidence-based assessment method.

    Science.gov (United States)

    Byrne, Michelle; Schroeter, Kathryn; Carter, Shannon; Mower, Julie

    2009-12-01

    Competency assessment is critical for a myriad of disciplines, including medicine, law, education, and nursing. Many nurse managers and educators are responsible for nursing competency assessment, and assessment results are often used for annual reviews, promotions, and satisfying accrediting agencies' requirements. Credentialing bodies continually seek methods to measure and document the continuing competence of licensees or certificants. Many methods and frameworks for continued competency assessment exist. The portfolio process is one method to validate personal and professional accomplishments in an interactive, multidimensional manner. This article illustrates how portfolios can be used to assess competence. One specialty nursing certification board's process of creating an evidence-based portfolio for recertification or reactivation of a credential is used as an example. The theoretical background, development process, implementation, and future implications may serve as a template for other organizations in developing their own portfolio models.

  9. Orohanditest: A new method for orofacial damage assessment

    Directory of Open Access Journals (Sweden)

    Inês Morais Caldas

    2013-01-01

    Conclusion: Orohanditest provides a reliable, precise, and complete orofacial damage description and quantification. Therefore, this method can be useful as an auxiliary tool in the orofacial damage assessment process.

  10. Comparison of Cognitive Assessment Methods With Heterosocially Anxious College Women.

    Science.gov (United States)

    Myszka, Michael T.; And Others

    1986-01-01

    Investigated comparability of self-statements generated by different cognitive assessment methods; effect of an assessment delay on cognitive phenomena; and interrelationships among different cognitive variables. Subjects were heterosocially anxious women (N=64) who engaged in a conversation with a male confederate. Self-statements generated by…

  11. Three methods for the assessment of communication skills

    NARCIS (Netherlands)

    Smit, G.N.; van der Molen, H.T.

    1996-01-01

    Assessment of students' communication skills after a course in problem-clarifying skills requires an assessment method different from the traditional written examination. In this article we describe the construction and evaluation of simulations, video tests and paper-and-pencil tests. The results

  13. Models and Methods for Assessing Refugee Mental Health Needs.

    Science.gov (United States)

    Deinard, Amos S.; And Others

    This background paper on refugee needs assessment discusses the assumptions, goals, objectives, strategies, models, and methods that the state refugee programs can consider in designing their strategies for assessing the mental health needs of refugees. It begins with a set of background assumptions about the ethnic profile of recent refugee…

  14. Minimal Residual Disease Assessment in Lymphoma: Methods and Applications.

    Science.gov (United States)

    Herrera, Alex F; Armand, Philippe

    2017-09-21

    Standard methods for disease response assessment in patients with lymphoma, including positron emission tomography and computed tomography scans, are imperfect. In other hematologic malignancies, particularly leukemias, the ability to detect minimal residual disease (MRD) is increasingly influencing treatment paradigms. However, in many subtypes of lymphoma, the application of MRD assessment techniques, like flow cytometry or polymerase chain reaction-based methods, has been challenging because of the absence of readily detected circulating disease or canonic chromosomal translocations. Newer MRD detection methods that use next-generation sequencing have yielded promising results in a number of lymphoma subtypes, fueling the hope that MRD detection may soon be applicable in clinical practice for most patients with lymphoma. MRD assessment can provide real-time information about tumor burden and response to therapy, noninvasive genomic profiling, and monitoring of clonal dynamics, allowing for many possible applications that could significantly affect the care of patients with lymphoma. Further validation of MRD assessment methods, including the incorporation of MRD assessment into clinical trials in patients with lymphoma, will be critical to determine how best to deploy MRD testing in routine practice and whether MRD assessment can ultimately bring us closer to the goal of personalized lymphoma care. In this review article, we describe the methods available for detecting MRD in patients with lymphoma and their relative advantages and disadvantages. We discuss preliminary results supporting the potential applications for MRD testing in the care of patients with lymphoma and strategies for including MRD assessment in lymphoma clinical trials.

  15. Assessment of dependence and anxiety among benzodiazepine users in a provincial municipality in Rio Grande do Sul, Brazil

    Directory of Open Access Journals (Sweden)

    Janaína Barden Schallemberger

    Full Text Available Abstract Introduction: Benzodiazepines are among the most prescribed drugs for anxiety and one of the most used drug classes in the world and have a high potential for addiction. The objective of this study was to assess levels of dependence and anxiety among users of these drugs in the public health system. Methods: This was a cross-sectional, descriptive and quantitative study. Benzodiazepine users treated on the public health system were selected. Anxiety levels were assessed with the Hamilton Anxiety Scale and dependency with the Benzodiazepine Dependence Self-Report Questionnaire. Results: Benzodiazepine use was higher among women and in older age groups. Duration of benzodiazepine use was greater than 1 year for all respondents. The dependence assessment indicated that more than half of users were dependent on taking benzodiazepines and most had a severe degree of anxiety. Conclusion: This study found evidence of prolonged and inappropriate use of benzodiazepines. It is necessary to educate users about the risks of these drugs and to develop strategies to rationalize use of these drugs by working with prescribers and dispensers.

  16. Assessing numerical dependence in gene expression summaries with the jackknife expression difference.

    Directory of Open Access Journals (Sweden)

    John R Stevens

    Full Text Available Statistical methods to test for differential expression traditionally assume that each gene's expression summaries are independent across arrays. When certain preprocessing methods are used to obtain those summaries, this assumption is not necessarily true. In general, the erroneous assumption of independence results in a loss of statistical power. We introduce a diagnostic measure of numerical dependence for gene expression summaries from any preprocessing method and discuss the relative performance of several common preprocessing methods with respect to this measure. Some common preprocessing methods introduce non-trivial levels of numerical dependence. The issue of (between-array) dependence has received little if any attention in the literature, and researchers working with gene expression data should not take such properties for granted, or they risk unnecessarily losing statistical power.
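The between-array dependence at issue can be demonstrated numerically. The sketch below is illustrative, not the paper's jackknife diagnostic: it uses quantile normalization as the preprocessing step and shows that perturbing a single value on one array changes the normalized summaries of a different, untouched array.

```python
import numpy as np

def quantile_normalize(x):
    # Replace each array's (column's) values by the mean of the values
    # having the same rank across all arrays.
    ranks = np.argsort(np.argsort(x, axis=0), axis=0)
    rank_means = np.sort(x, axis=0).mean(axis=1)
    return rank_means[ranks]

rng = np.random.default_rng(0)
x = rng.normal(size=(200, 5))        # 200 genes x 5 arrays
base = quantile_normalize(x)

y = x.copy()
y[0, 0] += 10.0                      # perturb one value on array 0 only
pert = quantile_normalize(y)

# Array 4's summaries change although its raw data did not:
diff = np.abs(pert[:, 4] - base[:, 4]).max()
print(diff > 0)                      # True -> summaries are numerically dependent
```

Any preprocessing step that pools information across arrays (normalization, background models fit jointly, etc.) can induce this kind of numerical dependence.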

  17. Assessment of New Calculation Method for Toxicological Sums-of-Fractions for Hanford Tank Farm Wastes

    Energy Technology Data Exchange (ETDEWEB)

    Mahoney, Lenna A.

    2006-10-18

    The toxicological source terms used for potential accident assessment in the Hanford Tank Farms DSA are based on toxicological sums-of-fractions (SOFs) that were calculated based on the Best Basis Inventory (BBI) from May 2002, using a method that depended on thermodynamic equilibrium calculations of the compositions of liquid and solid phases. The present report describes a simplified SOF-calculation method that is to be used in future toxicological updates and assessments and compares its results (for the 2002 BBI) to those of the old method.
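The sum-of-fractions itself is a simple construct: each analyte's concentration is divided by its toxicological limit and the fractions are summed. A minimal sketch with hypothetical analytes, concentrations, and limits (illustrative values, not Hanford inventory data):

```python
# Hypothetical airborne concentrations and toxicological limits (mg/m3);
# all names and numbers are illustrative assumptions.
concentration = {"NaOH": 0.4, "NaNO2": 1.2, "Pb": 0.01}
limit         = {"NaOH": 2.0, "NaNO2": 10.0, "Pb": 0.05}

sof = sum(concentration[a] / limit[a] for a in concentration)
print(round(sof, 3))   # 0.52 -- a SOF of 1 or more would exceed the guideline
```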

  18. Assessment methods and management of hypersexuality and paraphilic disorders.

    Science.gov (United States)

    Turner, Daniel; Schöttle, Daniel; Bradford, John; Briken, Peer

    2014-11-01

    The recent implementation of the Diagnostic and Statistical Manual of Mental Disorders, fifth edition, introduced some important changes in the conceptualization of hypersexuality and paraphilic disorders. The destigmatization of nonnormative sexual behaviors could be viewed as positive; however, other changes are more controversial. In order to stimulate new research approaches and provide mental healthcare providers with appropriate treatment regimens, validated assessment and treatment methods are needed. The purpose of this article is to review the studies published between January 2013 and July 2014 that aimed at assessing the psychometric properties of the currently applied assessment instruments and treatment approaches for hypersexuality and hypersexual disorders or paraphilias and paraphilic disorder. Currently existing instruments can validly assess hypersexual behaviors in different populations (e.g. college students, gay and bisexual men, and patients with neurodegenerative disorders) and cultural backgrounds (e.g. Germany, Spain, and USA). Concerning the assessment of paraphilias, it was shown that combining different assessment methods shows better performance in distinguishing between patients with paraphilias and control groups. In addition to psychotherapeutic treatment, pharmacological agents aiming at a reduction of serum testosterone levels are used for hypersexual behaviors as well as paraphilic disorders. Although the currently applied assessment and treatment methods seem to perform quite well, more research on assessment and evidence-based treatment is needed. This would help to overcome the existing unresolved issues concerning the conceptualization of hypersexual and paraphilic disorders.

  19. Human Health Risk Assessment Applied to Rural Populations Dependent on Unregulated Drinking Water Sources: A Scoping Review

    Science.gov (United States)

    Ford, Lorelei; Bharadwaj, Lalita; McLeod, Lianne; Waldner, Cheryl

    2017-01-01

    Safe drinking water is a global challenge for rural populations dependent on unregulated water. A scoping review of research on human health risk assessments (HHRA) applied to this vulnerable population may be used to improve assessments applied by government and researchers. This review aims to summarize and describe the characteristics of HHRA methods, publications, and current literature gaps of HHRA studies on rural populations dependent on unregulated or unspecified drinking water. Peer-reviewed literature was systematically searched (January 2000 to May 2014) and identified at least one drinking water source as unregulated (21%) or unspecified (79%) in 100 studies. Only 7% of reviewed studies identified a rural community dependent on unregulated drinking water. Source water and hazards most frequently cited included groundwater (67%) and chemical water hazards (82%). Most HHRAs (86%) applied deterministic methods with 14% reporting probabilistic and stochastic methods. Publications increased over time with 57% set in Asia, and 47% of studies identified at least one literature gap in the areas of research, risk management, and community exposure. HHRAs applied to rural populations dependent on unregulated water are poorly represented in the literature even though almost half of the global population is rural. PMID:28788087

  20. Spectral Method for Solving Time Dependent Flow of Upper-Convected Maxwell Fluid in Tube

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    The time dependent flow of an upper-convected Maxwell fluid in a horizontal circular pipe is studied by a spectral method. The time dependent problem is mathematically reduced to a partial differential equation of second order. By using the spectral method, the partial differential equation is reduced to a system of ordinary differential equations for the different terms of the Chebyshev polynomial approximations. The ordinary differential equations are solved by the Laplace transform and the eigenvalue method, which leads to an analytical form of the solutions.
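The appeal of Chebyshev approximations in such schemes is their spectral accuracy for smooth profiles: a short truncated series represents the solution and its derivatives almost to machine precision. A small NumPy sketch (the profile u(r) is illustrative, not the Maxwell-fluid velocity field):

```python
import numpy as np
from numpy.polynomial.chebyshev import Chebyshev

# Approximate u(r) = sin(pi r) on [0, 1] by a degree-20 Chebyshev series
# and differentiate the series term by term.
r = np.linspace(0.0, 1.0, 200)
u = Chebyshev.fit(r, np.sin(np.pi * r), deg=20)
du = u.deriv()

err = np.max(np.abs(du(r) - np.pi * np.cos(np.pi * r)))
print(err < 1e-8)   # True: spectral (near machine precision) accuracy
```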

  1. Valuation methods within the framework of life cycle assessment

    Energy Technology Data Exchange (ETDEWEB)

    Finnveden, G.

    1996-05-01

    Life Cycle Assessment Valuation methods are discussed. Different approaches for valuation are discussed as well as presently available valuation methods in relation to: * the values involved in the valuation, * the LCA framework, and * different applications of LCA. Among the conclusions are: * ethical and ideological valuations are involved not only when applying valuation weighting factors, but also when choosing valuation method and also when choosing whether to perform a valuation weighting or not, * it can be questioned whether straight distance-to-target methods are valuation methods, * it is still an open question whether presently available valuation methods produce meaningful and reliable information, * further development of quantitative valuation methods could concentrate both on different types of monetarisation methods and panel methods, * in many applications of LCA, the expected result is an identification of critical areas rather than a one-dimensional score, reducing the need for valuation methods. 88 refs, 3 figs, 4 tabs

  3. Ab initio time-dependent method to study the hydrogen molecule exposed to intense ultrashort laser pulses

    Energy Technology Data Exchange (ETDEWEB)

    Sanz-Vicario, J.L. [Departamento de Quimica, C-IX, Universidad Autonoma de Madrid, 28049-Madrid (Spain); Sede de Investigacion Universitaria (SIU). Instituto de Fisica, Universidad de Antioquia, Medellin (Colombia)], E-mail: joseluis.sanzvicario@uam.es; Palacios, A. [Departamento de Quimica, C-IX, Universidad Autonoma de Madrid, 28049-Madrid (Spain); Cardona, J.C. [Sede de Investigacion Universitaria (SIU). Instituto de Fisica, Universidad de Antioquia, Medellin (Colombia); Bachau, H. [Centre des Lasers Intenses et Applications, UMR 5107 du CNRS-Universite bordeaux I-CEA, Universite Bordeaux I, 33405 Talence Cedex (France); Martin, F. [Departamento de Quimica, C-IX, Universidad Autonoma de Madrid, 28049-Madrid (Spain)

    2007-10-15

    An ab initio non-perturbative time dependent method to describe ionization of molecular systems by ultrashort (femtosecond) laser pulses has been developed. The method allows one to describe competing processes such as non dissociative ionization, dissociative ionization and dissociation into neutrals, including the possibility of autoionization. In this work we assess the validity of the method by applying it to different physical situations and by comparing with results previously obtained within stationary perturbation theory. In particular, it is shown that inclusion of the nuclear motion is essential to describe H{sub 2} resonance enhanced multiphoton ionization and interferences mediated by H{sub 2} autoionizing states.

  4. [Study on application of two risk assessment methods in coal dust occupational health risk assessment].

    Science.gov (United States)

    Wu, B; Zhang, Y L; Chen, Y Q

    2017-04-20

    Objective: To evaluate the applicability of the quantitative grading method (GBZ/T 229.1-2010) and the occupational hazard risk index method in coal dust occupational health risk assessment. Methods: Four coal mines were taken as the objects of risk assessment, and occupational health field testing and investigation were conducted. Based on the two risk assessment methods, we analysed the health risk levels of 20 occupations exposed to coal dust in the workplace. Results: Coal dust-exposed posts had different risk levels across the 4 coal mines; the higher-risk posts were mainly concentrated in the underground workplaces, especially in the coal mining and tunneling systems. Both assessments showed that the risk levels of coal-mining machine drivers and tunneling machine drivers were the highest. The risk levels assigned to coal dust posts by the two methods showed no significant difference (P>0.05) and were highly correlated (r=0.821, P<0.05); the results of both risk assessment methods were supported by the field investigation and the literature. Conclusion: The two risk assessment methods can be used in coal dust occupational health risk assessment.
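The agreement check reported above can be reproduced in miniature by correlating the ordinal risk grades that two methods assign to the same posts. The grades below are invented for illustration; the resulting r is not the study's 0.821.

```python
import numpy as np

# Risk grades (1 = low ... 4 = very high) for ten posts under two methods.
method_a = np.array([4, 4, 3, 3, 2, 2, 2, 1, 1, 1])
method_b = np.array([4, 3, 3, 3, 2, 2, 1, 1, 1, 1])

r = np.corrcoef(method_a, method_b)[0, 1]
print(r > 0.8)   # True: the two gradings are highly correlated
```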

  5. Methods for Assessing Honeycomb Sandwich Panel Wrinkling Failures

    Science.gov (United States)

    Zalewski, Bart F.; Dial, William B.; Bednarcyk, Brett A.

    2012-01-01

    Efficient closed-form methods for predicting the facesheet wrinkling failure mode in sandwich panels are assessed. Comparisons were made with finite element model predictions for facesheet wrinkling, and a validated closed-form method was implemented in the HyperSizer structure sizing software.
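Closed-form wrinkling checks of this kind typically reduce to a one-line estimate; a classical Hoff-type formula, sigma_wr = Q(E_f E_c G_c)^(1/3) with Q about 0.5, is a common textbook example. The sketch below uses illustrative material properties and is not necessarily the specific method validated in the paper:

```python
# Hoff-type facesheet wrinkling estimate; material values are illustrative.
Q   = 0.5     # empirical knockdown coefficient
E_f = 70e9    # facesheet Young's modulus (Pa), e.g. aluminium
E_c = 200e6   # core Young's modulus (Pa)
G_c = 80e6    # core shear modulus (Pa)

sigma_wr = Q * (E_f * E_c * G_c) ** (1.0 / 3.0)
print(sigma_wr / 1e6)   # predicted wrinkling stress in MPa
```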

  6. Exploring valid and reliable assessment methods for care management education.

    Science.gov (United States)

    Gennissen, Lokke; Stammen, Lorette; Bueno-de-Mesquita, Jolien; Wieringa, Sietse; Busari, Jamiu

    2016-07-04

    Purpose It is assumed that the use of valid and reliable assessment methods can facilitate the development of medical residents' management and leadership competencies. To justify this assertion, the perceptions of an expert panel of health care leaders were explored on assessment methods used for evaluating care management (CM) development in Dutch residency programs. This paper aims to investigate how assessors and trainees value these methods and examine for any inherent benefits or shortcomings when they are applied in practice. Design/methodology/approach A Delphi survey was conducted among members of the platform for medical leadership in The Netherlands. This panel of experts was made up of clinical educators, practitioners and residents interested in CM education. Findings Of the respondents, 40 (55.6 per cent) and 31 (43 per cent) participated in the first and second rounds of the Delphi survey, respectively. The respondents agreed that assessment methods currently being used to measure residents' CM competencies were weak, though feasible for use in many residency programs. Multi-source feedback (MSF, 92.1 per cent), portfolio/e-portfolio (86.8 per cent) and knowledge testing (76.3 per cent) were identified as the most commonly known assessment methods with familiarity rates exceeding 75 per cent. Practical implications The findings suggested that an "assessment framework" comprising MSF, portfolios, individual process improvement projects or self-reflections and observations in clinical practice should be used to measure CM competencies in residents. Originality/value This study reaffirms the need for objective methods to assess CM skills in post-graduate medical education, as there was not a single assessment method that stood out as the best instrument.

  7. Development of a resilience scale for Thai substance-dependent women: A mixed methods approach.

    Science.gov (United States)

    Sakunpong, Nanchatsan; Choochom, Oraphin; Taephant, Nattasuda

    2016-08-01

    The purpose of this study was to develop a resilience scale based on the experiences of substance-dependent women in Thailand and evaluate its validity and reliability. A sequential exploratory mixed methods design was employed as the main methodology, in which the resilience scale was developed from qualitative data obtained by analyzing focus group discussions with 13 participants. Then, the scale was administered to 252 substance-dependent women from four substance-treatment centers. The psychometric properties were explored with an index of item objective congruence (IOC), Pearson correlation, second-order confirmatory factor analysis and Cronbach's alpha coefficient to estimate the quantitative data. The qualitative results showed that resilience is defined by three themes: individual, family and community factors, comprising 13 categories. The quantitative results also revealed that all 71 items in the resilience scale met the IOC criteria and demonstrated convergent and construct validity. The goodness-of-fit indices demonstrated that the resilience model was consistent with the empirical data (Chi-square=74.28, df=59, p-value=0.08, RMSEA=0.03, SRMR=0.04, NNFI=0.99, CFI=0.99, GFI=0.96). The internal consistency, assessed by a Cronbach's alpha score of 0.92, can be interpreted as demonstrating high reliability. Furthermore, the structure of the resilience scale was confirmed by the available resilience literature. This study can help clinicians gain a more comprehensive understanding regarding the complex process of resilience among substance-dependent women and aid them in providing these women with the appropriate interventions.
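The reported reliability of 0.92 is a Cronbach's alpha, computable directly from the respondent-by-item score matrix as alpha = k/(k-1) * (1 - sum of item variances / variance of total scores). A sketch on simulated data (the latent-trait model and all numbers are assumptions, not the Thai sample):

```python
import numpy as np

def cronbach_alpha(items):
    # items: respondents x items matrix of scores
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_vars / total_var)

rng = np.random.default_rng(1)
trait = rng.normal(size=300)                        # latent resilience level
items = trait[:, None] + rng.normal(scale=0.8, size=(300, 10))

alpha = cronbach_alpha(items)
print(alpha > 0.8)   # True: strongly inter-correlated items give a high alpha
```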

  8. Systematic evaluation of observational methods assessing biomechanical exposures at work

    DEFF Research Database (Denmark)

    Takala, Esa-Pekka; Pehkonen, Irmeli; Forsman, Mikael

    2010-01-01

    OBJECTIVES: This systematic review aimed to identify published observational methods assessing biomechanical exposures in occupational settings and evaluate them with reference to the needs of different users. METHODS: We searched scientific databases and the internet for material from 1965 to September 2008. Methods were included if they were primarily based on the systematic observation of work, the observation target was the human body, and the method was clearly described in the literature. A systematic evaluation procedure was developed to assess concurrent and predictive validity … the use of technical instruments. Generally, the observations showed moderate to good agreement with the corresponding assessments made from video recordings; agreement was the best for large-scale body postures and work actions. Postures of wrist and hand as well as trunk rotation seemed to be more …

  9. Reporting methods of blinding in randomized trials assessing nonpharmacological treatments.

    Directory of Open Access Journals (Sweden)

    Isabelle Boutron

    2007-02-01

    Full Text Available BACKGROUND: Blinding is a cornerstone of treatment evaluation. Blinding is more difficult to obtain in trials assessing nonpharmacological treatment and frequently relies on "creative" (nonstandard) methods. The purpose of this study was to systematically describe the strategies used to obtain blinding in a sample of randomized controlled trials of nonpharmacological treatment. METHODS AND FINDINGS: We systematically searched in Medline and the Cochrane Methodology Register for randomized controlled trials (RCTs) assessing nonpharmacological treatment with blinding, published during 2004 in high-impact-factor journals. Data were extracted using a standardized extraction form. We identified 145 articles, with the method of blinding described in 123 of the reports. Methods of blinding of participants and/or health care providers and/or other caregivers concerned mainly use of sham procedures such as simulation of surgical procedures, similar attention-control interventions, or a placebo with a different mode of administration for rehabilitation or psychotherapy. Trials assessing devices reported various placebo interventions such as use of a sham prosthesis, identical apparatus (e.g., identical but inactivated machine or use of activated machine with a barrier to block the treatment), or simulation of using a device. Blinding participants to the study hypothesis was also an important method of blinding. The methods reported for blinding outcome assessors relied mainly on centralized assessment of paraclinical examinations, clinical examinations (i.e., use of video, audiotape, or photography), or adjudications of clinical events. CONCLUSIONS: This study classifies blinding methods and provides a detailed description of methods that could overcome some barriers of blinding in clinical trials assessing nonpharmacological treatment, and provides information for readers assessing the quality of results of such trials.

  10. Assessment methods for solid waste management: A literature review.

    Science.gov (United States)

    Allesch, Astrid; Brunner, Paul H

    2014-06-01

    Assessment methods are common tools to support decisions regarding waste management. The objective of this review article is to provide guidance for the selection of appropriate evaluation methods. For this purpose, frequently used assessment methods are reviewed, categorised, and summarised. In total, 151 studies have been considered in view of their goals, methodologies, systems investigated, and results regarding economic, environmental, and social issues. A goal shared by all studies is the support of stakeholders. Most studies are based on life cycle assessments, multi-criteria-decision-making, cost-benefit analysis, risk assessments, and benchmarking. Approximately 40% of the reviewed articles are life cycle assessment-based; and more than 50% apply scenario analysis to identify the best waste management options. Most studies focus on municipal solid waste and consider specific environmental loadings. Economic aspects are considered by approximately 50% of the studies, and only a small number evaluate social aspects. The choice of system elements and boundaries varies significantly among the studies; thus, assessment results are sometimes contradictory. Based on the results of this review, we recommend the following considerations when assessing waste management systems: (i) a mass balance approach based on a rigid input-output analysis of the entire system, (ii) a goal-oriented evaluation of the results of the mass balance, which takes into account the intended waste management objectives; and (iii) a transparent and reproducible presentation of the methodology, data, and results.
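Recommendation (i), a rigid input-output analysis, amounts to verifying that every process in the system balances: inputs equal outputs (plus any stock change). A toy sketch with invented process names and tonnages:

```python
# Toy waste-system flows in tonnes/year; all names and numbers are
# illustrative assumptions.
inputs  = {"sorting": 1000.0, "recycling": 400.0, "incineration": 600.0}
outputs = {"sorting":      {"recycling": 400.0, "incineration": 600.0},
           "recycling":    {"secondary_material": 380.0, "residue": 20.0},
           "incineration": {"bottom_ash": 120.0, "flue_gas_and_energy_loss": 480.0}}

for process, inflow in inputs.items():
    outflow = sum(outputs[process].values())
    assert abs(inflow - outflow) < 1e-9, f"{process} does not balance"
print("mass balance closed")   # prints only if every process balances
```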

  11. Methods of assessing total doses integrated across pathways

    Energy Technology Data Exchange (ETDEWEB)

    Grzechnik, M.; Camplin, W.; Clyne, F. [Centre for Environment, Fisheries and Aquaculture Science, Lowestoft (United Kingdom); Allott, R. [Environment Agency, London (United Kingdom); Webbe-Wood, D. [Food Standards Agency, London (United Kingdom)

    2006-07-01

    Calculated doses for comparison with limits resulting from discharges into the environment should be summed across all relevant pathways and food groups to ensure adequate protection. The current methodology for assessments used in the Radioactivity in Food and the Environment (R.I.F.E.) reports separates doses from pathways related to liquid discharges of radioactivity to the environment from those due to gaseous releases. Surveys of local inhabitants' food consumption and occupancy rates are conducted in the vicinity of nuclear sites. Information has been recorded in an integrated way, such that the data for each individual are recorded for all pathways of interest. These can include consumption of foods such as fish, crustaceans, molluscs, fruit and vegetables, milk and meats. Occupancy times over beach sediments and time spent in close proximity to the site are also recorded for inclusion of external and inhalation radiation dose pathways. The integrated habits survey data may be combined with monitored environmental radionuclide concentrations to calculate total dose. The criteria for successful adoption of a method for this calculation were: Reproducibility: can others easily use the approach and reassess doses? Rigour and realism: how good is the match with reality? Transparency: a measure of the ease with which others can understand how the calculations are performed and what they mean. Homogeneity: is the group receiving the dose relatively homogeneous with respect to age, diet and those aspects that affect the dose received? Five methods of total dose calculation were compared and ranked according to their suitability. Each method was labelled (A to E) and given a short, relevant name for identification. The methods are described below. A) Individual: doses to individuals are calculated and critical group selection is dependent on the dose received.
B) Individual Plus: as in A, but consumption and occupancy rates for high-dose individuals are used to derive rates for application in future

  12. CART IV: improving automatic camouflage assessment with assistance methods

    Science.gov (United States)

    Müller, Thomas; Müller, Markus

    2010-04-01

    In order to facilitate systematic, computer aided improvements of camouflage and concealment assessment methods, the software system CART (Camouflage Assessment in Real-Time) was developed for the camouflage assessment of objects in multispectral image sequences (see contributions to SPIE 2007, SPIE 2008 and SPIE 2009 [1], [2], [3]). It comprises a semi-automatic marking of target objects (ground truth generation) including their propagation over the image sequence and the evaluation via user-defined feature extractors. The conspicuity of camouflaged objects due to their movement can be assessed with a purpose-built processing method named the MTI snail track algorithm. This paper presents the enhancements made over the past year and addresses procedures to assist the camouflage assessment of moving objects in image data with strong noise or image artefacts. This extends the evaluation methods significantly to a broader application range. For example, some noisy infrared image data can be evaluated for the first time by applying the presented methods, which explore the correlations between camouflage assessment, MTI (moving target indication) and dedicated noise filtering.

  13. Orohanditest: A new method for orofacial damage assessment

    Science.gov (United States)

    Caldas, Inês Morais; Magalhães, Teresa; Matos, Eduarda; Afonso, Américo

    2013-01-01

    Background: Currently, orofacial sequelae are recognized as strongly influencing the quality of life of victims of orofacial damage. Therefore, correct forensic assessment for indemnification purposes is mandatory. However, orofacial damage is frequently reduced to its organic components, resulting in an inadequate forensic assessment process. This study aims to improve orofacial damage assessment through the development of an auxiliary tool, the orohanditest. Materials and Methods: A preliminary inventory was constructed using relevant bibliographic elements and a retrospective study of forensic examination reports concerning orofacial trauma. This inventory was then utilized in the assessment of 265 orofacial trauma victims for validation. Validity was studied by analyzing the internal construct validity (exploring factorial validity and assessing internal consistency) and the external construct validity (assessing convergent validity and discriminant validity). The level of significance was defined as P < 0.05. Results: The final inventory (orohanditest) comprised the three components of body (8 items), functions (10 items) and situations (24 items), which were found to be statistically reliable and valid for assessment. The final score (orofacial damage coefficient) reflects the orofacial damage severity. Conclusion: Orohanditest provides a reliable, precise, and complete orofacial damage description and quantification. Therefore, this method can be useful as an auxiliary tool in the orofacial damage assessment process. PMID:24379863

  14. New mobile methods for dietary assessment: review of image-assisted and image-based dietary assessment methods.

    Science.gov (United States)

    Boushey, C J; Spoden, M; Zhu, F M; Delp, E J; Kerr, D A

    2016-12-12

    For nutrition practitioners and researchers, assessing dietary intake of children and adults with a high level of accuracy continues to be a challenge. Developments in mobile technologies have created a role for images in the assessment of dietary intake. The objective of this review was to examine peer-reviewed published papers covering development, evaluation and/or validation of image-assisted or image-based dietary assessment methods from December 2013 to January 2016. Images taken with handheld devices or wearable cameras have been used to assist traditional dietary assessment methods for portion size estimations made by dietitians (image-assisted methods). Image-assisted approaches can supplement either dietary records or 24-h dietary recalls. In recent years, image-based approaches integrating application technology for mobile devices have been developed (image-based methods). Image-based approaches aim at capturing all eating occasions by images as the primary record of dietary intake, and therefore follow the methodology of food records. The present paper reviews several image-assisted and image-based methods, their benefits and challenges; followed by details on an image-based mobile food record. Mobile technology offers a wide range of feasible options for dietary assessment, which are easier to incorporate into daily routines. The presented studies illustrate that image-assisted methods can improve the accuracy of conventional dietary assessment methods by adding eating occasion detail via pictures captured by an individual (dynamic images). All of the studies reduced underreporting with the help of images compared with results with traditional assessment methods. Studies with larger sample sizes are needed to better delineate attributes with regards to age of user, degree of error and cost.

  15. Methods of synthesizing qualitative research studies for health technology assessment.

    Science.gov (United States)

    Ring, Nicola; Jepson, Ruth; Ritchie, Karen

    2011-10-01

    Synthesizing qualitative research is an important means of ensuring the needs, preferences, and experiences of patients are taken into account by service providers and policy makers, but the range of methods available can appear confusing. This study presents the methods for synthesizing qualitative research most used in health research to date and, specifically, those with a potential role in health technology assessment. To identify reviews conducted using the eight main methods for synthesizing qualitative studies, nine electronic databases were searched using key terms including meta-ethnography and synthesis. A summary table groups the identified reviews by their use of the eight methods, highlighting the methods used most generally and specifically in relation to health technology assessment topics. Although there is debate about how best to identify and quality appraise qualitative research for synthesis, 107 reviews were identified using one of the eight main methods. Four methods (meta-ethnography, meta-study, meta-summary, and thematic synthesis) have been most widely used and have a role within health technology assessment. Meta-ethnography is the leading method for synthesizing qualitative health research. Thematic synthesis is also useful for integrating qualitative and quantitative findings. Four other methods (critical interpretive synthesis, grounded theory synthesis, meta-interpretation, and cross-case analysis) have been under-used in health research and their potential in health technology assessments is currently under-developed. Synthesizing individual qualitative studies has become increasingly common in recent years. Although this is still an emerging research discipline, such an approach is one means of promoting the patient-centeredness of health technology assessments.

  16. A new approximation method for time-dependent problems in quantum mechanics

    Energy Technology Data Exchange (ETDEWEB)

    Amore, Paolo [Facultad de Ciencias, Universidad de Colima, Bernal Diaz del Castillo 340, Colima, Colima (Mexico)]. E-mail: paolo@ucol.mx; Aranda, Alfredo [Facultad de Ciencias, Universidad de Colima, Bernal Diaz del Castillo 340, Colima, Colima (Mexico)]. E-mail: fefo@ucol.mx; Fernandez, Francisco M. [INIFTA (Conicet, UNLP), Diag. 113 y 64 S/N, Sucursal 4, Casilla de Correo 16, 1900 La Plata (Argentina)]. E-mail: fernande@quimica.unlp.edu.ar; Jones, Hugh [Department of Physics, Imperial College, London SW7 2AZ (United Kingdom)]. E-mail: h.f.jones@imperial.ac.uk

    2005-06-06

    We propose an approximate solution of the time-dependent Schroedinger equation using the method of stationary states combined with a variational matrix method for finding the energies and eigenstates. We illustrate the effectiveness of the method by applying it to the time development of the wave-function in the quantum-mechanical version of the inflationary slow-roll transition.
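The method of stationary states reduces time evolution to phase factors on eigenstate coefficients: diagonalize H, expand psi(0) as sum_n c_n |n>, then psi(t) = sum_n c_n exp(-i E_n t) |n> (with hbar = 1). A two-level numerical sketch with an illustrative Hamiltonian:

```python
import numpy as np

H = np.array([[1.0, 0.3],
              [0.3, 2.0]])            # illustrative Hermitian Hamiltonian
E, V = np.linalg.eigh(H)              # eigenvalues E_n, eigenvectors as columns

psi0 = np.array([1.0, 0.0], dtype=complex)
c = V.conj().T @ psi0                 # expansion coefficients c_n = <n|psi(0)>

t = 0.7
psi_t = V @ (np.exp(-1j * E * t) * c) # psi(t) = sum_n c_n e^{-i E_n t} |n>

print(abs(np.vdot(psi_t, psi_t) - 1.0) < 1e-12)   # True: norm is conserved
```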

  17. Assessment methods of injection moulded nano-patterned surfaces

    DEFF Research Database (Denmark)

    Menotti, S.; Bisacco, G.; Hansen, H. N.

    2014-01-01

    Assessment of nano-patterned surfaces requires measurements with nano-metric resolution. In order to enable the optimization of the moulding process it is necessary to develop a robust method for quantitative characterization of the replication quality of random nano-patterned surfaces. In this work two different methods for quantitative characterization of random nano-patterned surfaces were compared and assessed. One method is based on the estimation of the roughness amplitude parameters Sa and Sz (ISO 25178). The second method is based on pore and particle analysis using the watershed algorithm for feature recognition. To compare the methods, the mould insert and a number of replicated nano-patterned surfaces, injection moulded with an induction heating aid, were measured on nominally identical locations by means of an atomic force microscope mounted on a manual CMM.

  18. An Integrated Method of Supply Chains Vulnerability Assessment

    Directory of Open Access Journals (Sweden)

    Jiaguo Liu

    2016-01-01

    Full Text Available Supply chain vulnerability identification and evaluation are extremely important for mitigating supply chain risk. We present an integrated method to assess supply chain vulnerability. The potential failure modes of supply chain vulnerability are analyzed through the SCOR model. Combining fuzzy theory and gray theory, the correlation degree of each vulnerability indicator can be calculated and targeted improvements can be carried out. In order to verify the effectiveness of the proposed method, we use Kendall's tau coefficient to measure the agreement among different methods. The result shows that the presented method has the highest consistency in the assessment compared with the other two methods.
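The gray-correlation step mentioned in the abstract can be sketched with standard gray relational analysis (an illustrative sketch only; the fuzzy pre-processing and the SCOR-derived indicators are omitted, and variable names are assumptions):

```python
def gray_relational_grades(reference, alternatives, rho=0.5):
    """Gray relational analysis: correlation degree of each alternative
    series against a reference series; rho is the distinguishing
    coefficient. Series are assumed already normalized to comparable
    scales."""
    deltas = [[abs(r - x) for r, x in zip(reference, alt)]
              for alt in alternatives]
    d_min = min(min(row) for row in deltas)
    d_max = max(max(row) for row in deltas)
    grades = []
    for row in deltas:
        # Gray relational coefficient per indicator, then its mean.
        coeffs = [(d_min + rho * d_max) / (d + rho * d_max) for d in row]
        grades.append(sum(coeffs) / len(coeffs))
    return grades

ref = [1.0, 1.0, 1.0]                       # ideal indicator profile
grades = gray_relational_grades(ref, [[1.0, 1.0, 1.0], [0.5, 0.8, 0.9]])
```

A series identical to the reference gets grade 1.0; the farther an alternative's indicators deviate, the lower its grade.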

  19. Discontinuous Time Relaxation Method for the Time-Dependent Navier-Stokes Equations

    Directory of Open Access Journals (Sweden)

    Monika Neda

    2010-01-01

    is considered. A fully discrete scheme using discontinuous finite elements is proposed and analyzed. Optimal velocity error estimates are derived. The dependence of these estimates on the Reynolds number Re is O(Re√Re), which is an improvement with respect to the continuous finite element method, where the dependence is O(Re√Re³).
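Time relaxation regularizations of this kind typically add a lower-order damping term to the Navier-Stokes equations; a generic form (notation assumed here, not taken from the paper) is:

```latex
% Navier-Stokes with a time relaxation term \chi(u - \bar{u}),
% where \bar{u} is a filtered (averaged) velocity and \chi > 0
% is the relaxation parameter:
u_t + u\cdot\nabla u - \nu\Delta u + \nabla p + \chi\,(u - \bar{u}) = f,
\qquad \nabla\cdot u = 0 .
```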

  20. Companies Credit Risk Assessment Methods for Investment Decision Making

    Directory of Open Access Journals (Sweden)

    Dovilė Peškauskaitė

    2017-06-01

    Full Text Available As banks have tightened lending requirements, companies look for alternative sources of external funding. One of these is a bond issue. Unfortunately, corporate bond issues as a source of funding are rare in Lithuania. This occurs because companies face a lack of information and investors fear taking on credit risk. Credit risk is defined as a borrower's failure to meet its obligations. Investors, in order to avoid credit risk, have to assess the state of the companies. The goal of the article is to determine the most informative methods of credit risk assessment. The article summarizes corporate lending sources, analyzes the causes of corporate default, and reviews credit risk assessment methods. The study, based on a SWOT analysis, shows that investors, before making an investment decision, should evaluate both the business risk, using the qualitative CAMPARI method, and the financial risk, using financial ratio analysis.
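The financial-ratio side of such an assessment can be sketched as follows (illustrative ratios and inputs only; CAMPARI itself is a qualitative checklist — Character, Ability, Margin, Purpose, Amount, Repayment, Insurance — and is not computed here):

```python
def financial_risk_ratios(current_assets, current_liabilities,
                          total_debt, equity, ebit, interest_expense):
    """Three common ratios used in credit-risk screening.

    Illustrative only; a real assessment combines many more indicators
    and compares them against industry benchmarks.
    """
    return {
        "current_ratio": current_assets / current_liabilities,   # liquidity
        "debt_to_equity": total_debt / equity,                   # leverage
        "interest_coverage": ebit / interest_expense,            # solvency
    }

# Hypothetical balance-sheet figures (thousands of EUR).
ratios = financial_risk_ratios(
    current_assets=500.0, current_liabilities=250.0,
    total_debt=300.0, equity=600.0,
    ebit=120.0, interest_expense=30.0)
```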

  1. A comprehensive environmental impact assessment method for shale gas development

    Directory of Open Access Journals (Sweden)

    Renjin Sun

    2015-03-01

    Full Text Available The great success of commercial shale gas exploitation in the US has stimulated shale gas development in China, and corresponding supporting policies were issued in the 12th Five-Year Plan. The US experience shows, however, that the resulting environmental threats are an unavoidable issue, yet no uniform and standard evaluation system has been set up in China. The comprehensive environment refers to the combination of the natural ecological environment and the external macro-environment. In view of this, we conducted a series of studies on how to set up a comprehensive environmental impact assessment system as well as the related evaluation methodology and models. First, we made an in-depth investigation into shale gas development procedures and their possible environmental impacts, and then compared, screened and modified environmental impact assessment methods for shale gas development. We also established an evaluation system and assessment models according to the different natures of the above two types of environment: the correlation matrix method was employed to assess the impacts on the natural ecological environment, and the optimization distance method was modified to evaluate the impacts on the external macro-environment. Finally, we substituted the two sub-indexes into the comprehensive environmental impact assessment model and obtained the final numerical result of the environmental impact assessment. This model can be used to evaluate whether a shale gas project has any impact on the environment, to compare the impacts before and after a shale gas development project, or to compare the impacts of different projects.

  2. The Determining Method about the Conflict between the Null Constraints and the Set of Functional Dependencies

    Institute of Scientific and Technical Information of China (English)

    刘惟一

    1989-01-01

    In this paper the conflict between the null constraints and the set of functional dependencies is defined.Some rules for determining the conflicts and a method for processing the conflicts are obtained.

  3. Safety assessment and detection methods of genetically modified organisms.

    Science.gov (United States)

    Xu, Rong; Zheng, Zhe; Jiao, Guanglian

    2014-01-01

    Genetically modified organisms (GMOs) are gaining importance in agriculture as well as in the production of food and feed. Along with the development of GMOs, health and food safety concerns have been raised. These concerns make it necessary to set up a strict system for the food safety assessment of GMOs. This review discusses the food safety assessment of GMOs, the current development status of safe and precise transgenic technologies, and GMO detection. Recent patents related to GMOs and their detection methods are also reviewed. The review provides an elementary introduction on how to assess and detect GMOs.

  4. Operational Safety Assessment of Turbo Generators with Wavelet Rényi Entropy from Sensor-Dependent Vibration Signals

    Science.gov (United States)

    Zhang, Xiaoli; Wang, Baojian; Chen, Xuefeng

    2015-01-01

    With the rapid development of sensor technology, various professional sensors are installed on modern machinery to monitor operational processes and assure operational safety, which play an important role in industry and society. In this work a new operational safety assessment approach with wavelet Rényi entropy utilizing sensor-dependent vibration signals is proposed. On the basis of a professional sensor and the corresponding system, sensor-dependent vibration signals are acquired and analyzed by a second generation wavelet package, which reflects time-varying operational characteristic of individual machinery. Derived from the sensor-dependent signals’ wavelet energy distribution over the observed signal frequency range, wavelet Rényi entropy is defined to compute the operational uncertainty of a turbo generator, which is then associated with its operational safety degree. The proposed method is applied in a 50 MW turbo generator, whereupon it is proved to be reasonable and effective for operation and maintenance. PMID:25894934

  7. The Impact of Harmonics Calculation Methods on Power Quality Assessment in Wind Farms

    DEFF Research Database (Denmark)

    Kocewiak, Lukasz Hubert; Hjerrild, Jesper; Bak, Claus Leth

    2010-01-01

    Different methods of calculating harmonics in measurements obtained from offshore wind farms are shown in this paper. Appropriate data processing methods are suggested for harmonics with different origins and natures. Enhancements of the discrete Fourier transform application are proposed in order to reduce measurement data processing errors, and are compared with classical methods. A comparison of signal processing methods for harmonic studies is presented, and their application depending on harmonic origin and nature is recommended. Certain aspects related to magnitude and phase calculation in stationary measurement data are analysed and described. Qualitative indices of measurement data harmonic analysis are suggested and used in order to assess the calculation accuracy.
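The DFT-based magnitude calculation discussed in the record can be sketched as a direct per-harmonic DFT (a minimal stdlib sketch; it assumes the record length contains an integer number of fundamental periods — otherwise the windowing and interpolation enhancements the paper discusses are needed):

```python
import cmath
import math

def harmonic_magnitudes(samples, fs, f0, n_harmonics):
    """Estimate single-sided amplitudes of f0 and its harmonics by
    evaluating the DFT directly at each harmonic's bin."""
    n = len(samples)
    mags = []
    for h in range(1, n_harmonics + 1):
        k = h * f0 * n / fs            # DFT bin of the h-th harmonic
        x = sum(s * cmath.exp(-2j * math.pi * k * i / n)
                for i, s in enumerate(samples))
        mags.append(2 * abs(x) / n)    # single-sided amplitude
    return mags

# 50 Hz fundamental (amplitude 1.0) plus a 5th harmonic (amplitude 0.2),
# sampled at 5 kHz over exactly 10 fundamental cycles.
fs, f0 = 5000.0, 50.0
sig = [math.sin(2 * math.pi * f0 * i / fs)
       + 0.2 * math.sin(2 * math.pi * 5 * f0 * i / fs)
       for i in range(1000)]
mags = harmonic_magnitudes(sig, fs, f0, 5)
```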

  8. Assessment of dependency by the FFDI: Comparisons to the PID-5 and maladaptive agreeableness.

    Science.gov (United States)

    Gore, Whitney L; Widiger, Thomas A

    2015-11-01

    The present study explores the validity of the Five Factor Dependency Inventory (FFDI), a measure of dependent personality traits from the perspective of the five factor model, examined across three separate samples and two studies. The first study examined the FFDI with respect to the traits assigned to assess dependent personality disorder (DPD) by the DSM-5 work group, two measures of DSM-IV-TR DPD and three measures of dependent traits, sampling 184 Mechanical Turk participants and 83 students (the latter oversampled for DPD features). Based on responses from an additional 137 students, the second study investigated the role of maladaptive agreeableness in dependency by examining the FFDI in relation to the interpersonal circumplex using three alternative measures. Discriminant validity was provided with respect to DSM-5 traits and the interpersonal circumplex. Incremental validity was provided with respect to the ability of the FFDI to account for variance within DPD measures beyond the variance explained by DSM-5 traits. Implications for the assessment of dependency and the proposed DSM-5 dimensional trait model are discussed.

  9. The round robin site assessment method: A new approach to wind energy site assessment

    Energy Technology Data Exchange (ETDEWEB)

    Lackner, Matthew A.; Rogers, Anthony L.; Manwell, James F. [Renewable Energy Research Laboratory, Department of Mechanical and Industrial Engineering, University of Massachusetts Amherst, 160 Governors Dr., Amherst, MA 01003 (United States)

    2008-09-15

    Portability is one of the many potential advantages of utilizing ground-based measurement devices such as SODARs and LIDARs instead of meteorological towers for wind resource assessment. This paper investigates the use of a monitoring strategy that leverages the portability of ground-based devices, dubbed the "round robin site assessment method". The premise is to measure the wind resource at multiple sites in a single year using a single portable device, but to discontinuously distribute the measurement time at each site over the whole year, so that the total measurement period comprises smaller segments of measured data. This measured data set is then utilized in the measure-correlate-predict (MCP) process to predict the long-term wind resource at the site. This method aims to increase the number of sites assessed in a single year, without the sacrifice in accuracy and precision that usually accompanies shorter measurement periods. The performance of the round robin site assessment method was compared to the standard method, in which the measured data are continuous. The results demonstrate that the round robin site assessment method is an effective monitoring strategy that improves the accuracy and reduces the uncertainty of MCP predictions for measurement periods less than 1 year. In fact, the round robin site assessment method compares favorably to the accuracy and uncertainty of a full year of resource assessment. While there are some tradeoffs to be made by using the round robin site assessment method, it is potentially a very useful strategy for wind resource assessment. (author)
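The MCP step mentioned above can be sketched with its simplest linear-regression variant (one of several MCP algorithms; the variable names and toy data are illustrative):

```python
def mcp_linear(site_speeds, ref_speeds, ref_longterm):
    """Linear-regression MCP: fit concurrent site vs. reference wind
    speeds by ordinary least squares, then apply the fit to long-term
    reference data to predict the long-term resource at the site."""
    n = len(site_speeds)
    mx = sum(ref_speeds) / n
    my = sum(site_speeds) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(ref_speeds, site_speeds))
    sxx = sum((x - mx) ** 2 for x in ref_speeds)
    slope = sxy / sxx
    intercept = my - slope * mx
    return [slope * x + intercept for x in ref_longterm]

# Concurrent period: the candidate site runs ~20% faster than the
# reference mast, with a small offset (toy data, m/s).
ref = [4.0, 5.0, 6.0, 7.0, 8.0]
site = [1.2 * x + 0.5 for x in ref]
pred = mcp_linear(site, ref, [5.0, 10.0])
```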

  10. TENCompetence Assessment Model and Related Tools for Non Traditional Methods of Assessment

    NARCIS (Netherlands)

    Petrov, Milen; Aleksieva-Petrova, Adelina; Stefanov, Krassen; Schoonenboom, Judith; Miao, Yongwu

    2008-01-01

    Petrov, M., Aleksieva-Petrova, A., Stefanov, K., Schoonenboom, J., & Miao, Y. (2008). TENCompetence Assessment Model and Related Tools for Non Traditional Methods of Assessment. In H. W. Sligte & R. Koper (Eds). Proceedings of the 4th TENCompetence Open Workshop. Empowering Learners for Lifelong Com

  11. Assessment of cognitive impairment in long-term oxygen therapy-dependent COPD patients

    Directory of Open Access Journals (Sweden)

    Karamanli H

    2015-09-01

    Full Text Available Harun Karamanli,1 Faik Ilik,2 Fatih Kayhan,3 Ahmet Cemal Pazarli4 1Department of Pulmonology, 2Department of Neurology, 3Department of Psychiatry, Faculty of Medicine, Mevlana University, Konya, Turkey; 4Department of Pulmonology, Elbistan State Hospital, Elbistan, Turkey Background: A number of studies have shown that COPD, particularly in its later and more severe stages, is associated with various cognitive deficits. Thus, the primary goal of the present study was to elucidate the extent of cognitive impairment in patients with long-term oxygen therapy-dependent (LTOTD) COPD. In addition, this study aimed to determine the effectiveness of two cognitive screening tests, the Mini-Mental State Examination (MMSE) and the Montreal Cognitive Assessment (MoCA), for COPD patients and the ability of oxygen therapy to mitigate COPD-related deficits in cognitive function. Methods: The present study enrolled 45 subjects: 24 nonuser and 21 regular-user LTOTD-COPD patients. All subjects had a similar grade of education, and there were no significant differences regarding age or sex. The MoCA (cutoff: <26 points) and MMSE (cutoff: ≤24 points) scores were compared between these two groups. Results: The nonuser LTOTD-COPD group had a significantly lower MoCA score than that of the regular-user LTOTD-COPD group (19.38±2.99 vs 21.68±2.14, respectively) as well as a significantly lower MMSE score. Moreover, the absence of supplemental oxygen therapy increased the risk of cognitive impairment (MoCA, P=0.007; MMSE, P=0.014), and the MoCA and MMSE scores significantly correlated with the number of emergency admissions and the number of hospitalizations in the last year. Conclusion: In the present study, the nonuser LTOTD-COPD group exhibited a significant decrease in cognitive status compared with the regular-user LTOTD-COPD group. This suggests that the assessment of cognitive function in nonuser LTOTD-COPD patients and the use of protective strategies, such as

  12. Implementation of Relevant Methods in Assessing Traffic-Technological Projects

    Directory of Open Access Journals (Sweden)

    Danijela Barić

    2007-09-01

    Full Text Available The assessment of investment traffic-technological projects means a set of activities whose basic aim is to determine the justification and feasibility of the projects. The decision-making process, including decision-making on investments, is an extremely complex process, and the decision-maker has to have a vision of the future and make decisions accordingly in a modern and flexible manner. Therefore, the decisions need to be the result of a planning and research process based on relevant scientific methods. The work includes the selected, analysed and presented methods of cost-benefit analysis, methods of multi-criteria decision-making and SWOT (Strengths, Weaknesses, Opportunities, and Threats) analysis methods. Regarding their basic characteristics, the mentioned methods have been compared, the order of their implementation has been determined, and then they have been implemented in assessing traffic-technological projects of reconstruction with the aim of selecting the optimal variant solution.

  13. Criteria and methods for indicator assessment and selection

    DEFF Research Database (Denmark)

    Gudmundsson, Henrik; Tennøy, Aud; Joumard, Robert

    2010-01-01

    How can indicators be assessed and selected? How can several indicators be jointly considered? And how can indicators be used in planning and decision making? Firstly, we provide a definition of 'indicator of environmental sustainability in transport'. The functions, strengths and weaknesses of indicators as measurement tools and as decision-support tools are considered, as are the needs for indicators and assessments. As the decision-making context influences the perceived and actual needs for indicators and methods, we also analysed the dimensions and context of decision making. We derived criteria and methods for the assessment and selection of indicators of environmental sustainability in transport, in terms of measurement, monitoring and management. The methods and the criteria are exemplified for seven chains of causality. Methods for a comprehensive joint consideration of environmental sustainability indicators, such as aggregated or composite indicators, are analyzed and evaluated.

  14. Rapid assessment methods in eye care: An overview

    Directory of Open Access Journals (Sweden)

    Srinivas Marmamula

    2012-01-01

    Full Text Available Reliable information is required for the planning and management of eye care services. While classical research methods provide reliable estimates, they are prohibitively expensive and resource intensive. Rapid assessment (RA methods are indispensable tools in situations where data are needed quickly and where time- or cost-related factors prohibit the use of classical epidemiological surveys. These methods have been developed and field tested, and can be applied across almost the entire gamut of health care. The 1990s witnessed the emergence of RA methods in eye care for cataract, onchocerciasis, and trachoma and, more recently, the main causes of avoidable blindness and visual impairment. The important features of RA methods include the use of local resources, simplified sampling methodology, and a simple examination protocol/data collection method that can be performed by locally available personnel. The analysis is quick and easy to interpret. The entire process is inexpensive, so the survey may be repeated once every 5-10 years to assess the changing trends in disease burden. RA survey methods are typically linked with an intervention. This article provides an overview of the RA methods commonly used in eye care, and emphasizes the selection of appropriate methods based on the local need and context.

  15. A New Method to Quickly Assess the Inhibitor Efficiency

    Institute of Scientific and Technical Information of China (English)

    GENG Chunlei; XU Yongmo; WENG Duan

    2008-01-01

    A new method to quickly assess the efficiency of corrosion inhibitors was developed by electrically accelerating the diffusion of chloride ions onto the surface of a steel bar embedded in concrete, thereby inducing corrosion. Potentiodynamic polarization scanning and the linear polarization method were used to evaluate the corrosion states, which were compared with direct observation of the bar surface after breaking the sample. The test duration was about two days, and the results clearly show the differences in efficiency of the inhibitors tested.
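Linear polarization measurements of the kind mentioned above are commonly interpreted through the Stern-Geary relation; a minimal sketch (the Tafel slopes and resistance value below are illustrative assumptions, not values from the paper):

```python
def corrosion_current(rp_ohm_cm2, beta_a=0.12, beta_c=0.12):
    """Stern-Geary relation: i_corr = B / Rp, where
    B = beta_a * beta_c / (2.303 * (beta_a + beta_c)).

    beta_a, beta_c: anodic/cathodic Tafel slopes (V/decade), assumed
    defaults here. Rp: polarization resistance (ohm*cm^2) taken from
    the slope of the linear polarization curve near E_corr.
    Returns corrosion current density in A/cm^2.
    """
    b = (beta_a * beta_c) / (2.303 * (beta_a + beta_c))
    return b / rp_ohm_cm2

# A lower i_corr for an inhibited sample vs. a control indicates a
# more effective inhibitor.
i_corr = corrosion_current(rp_ohm_cm2=5000.0)
```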

  16. A Modified Generalized Fisher Method for Combining Probabilities from Dependent Tests

    Directory of Open Access Journals (Sweden)

    Hongying (Daisy) Dai

    2014-02-01

    Full Text Available Rapid developments in molecular technology have yielded a large amount of high throughput genetic data to understand the mechanism for complex traits. The increase of genetic variants requires hundreds and thousands of statistical tests to be performed simultaneously in analysis, which poses a challenge to control the overall Type I error rate. Combining p-values from multiple hypothesis testing has shown promise for aggregating effects in high-dimensional genetic data analysis. Several p-value combining methods have been developed and applied to genetic data; see [Dai, et al. 2012b] for a comprehensive review. However, there is a lack of investigations conducted for dependent genetic data, especially for weighted p-value combining methods. Single nucleotide polymorphisms (SNPs are often correlated due to linkage disequilibrium. Other genetic data, including variants from next generation sequencing, gene expression levels measured by microarray, protein and DNA methylation data, etc. also contain complex correlation structures. Ignoring correlation structures among genetic variants may lead to severe inflation of Type I error rates for omnibus testing of p-values. In this work, we propose modifications to the Lancaster procedure by taking the correlation structure among p-values into account. The weight function in the Lancaster procedure allows meaningful biological information to be incorporated into the statistical analysis, which can increase the power of the statistical testing and/or remove the bias in the process. Extensive empirical assessments demonstrate that the modified Lancaster procedure largely reduces the Type I error rates due to correlation among p-values, and retains considerable power to detect signals among p-values. We applied our method to reassess published renal transplant data, and identified a novel association between B cell pathways and allograft tolerance.
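The classical unweighted, independence-assuming Fisher combination that the Lancaster procedure generalizes can be sketched as follows (the correlation adjustment and weight function of the modified procedure described in the record are beyond this fragment):

```python
import math

def fisher_combine(pvalues):
    """Fisher's method: X = -2 * sum(ln p_i) follows a chi-square
    distribution with 2k degrees of freedom under H0 when the k
    p-values are independent. Returns the combined p-value using the
    closed-form chi-square survival function for even df:
    P(chi2_{2k} > x) = exp(-x/2) * sum_{i=0}^{k-1} (x/2)^i / i!
    """
    k = len(pvalues)
    x = -2.0 * sum(math.log(p) for p in pvalues)
    half = x / 2.0
    term, total = 1.0, 1.0
    for i in range(1, k):
        term *= half / i
        total += term
    return math.exp(-half) * total

p_combined = fisher_combine([0.5, 0.5])
```

Correlated p-values (e.g. SNPs in linkage disequilibrium) violate the independence assumption, inflating Type I error — which is exactly the problem the modified Lancaster procedure addresses.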

  17. Problem Decomposition Method to Compute an Optimal Cover for a Set of Functional Dependencies

    Directory of Open Access Journals (Sweden)

    Vitalie COTELEA

    2011-12-01

    Full Text Available The paper proposes a problem decomposition method for building an optimal cover for a set of functional dependencies in order to decrease the solving time. At the beginning, the paper gives an overview of covers of functional dependencies. Definitions and properties of non-redundant covers for sets of functional dependencies, reduced and canonical covers, equivalence classes of functional dependencies, and minimum and optimal covers are considered. Then, a theoretical tool for inference of functional dependencies is proposed, which possesses the uniqueness property. Finally, the set of attributes of the relational schema is divided into equivalence classes of attributes that serve as the basis for building an optimal cover for the set of functional dependencies.
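The core machinery behind any such cover construction is attribute-set closure plus redundancy elimination. Below is a minimal sketch of a canonical (minimal) cover — a standard textbook algorithm, not the paper's decomposition-based optimal cover:

```python
def closure(attrs, fds):
    """Closure of an attribute set under a list of (lhs, rhs) FDs."""
    result = set(attrs)
    changed = True
    while changed:
        changed = False
        for lhs, rhs in fds:
            if lhs <= result and not rhs <= result:
                result |= rhs
                changed = True
    return result

def minimal_cover(fds):
    """Canonical cover: singleton right-hand sides, no extraneous
    left-hand attributes, no redundant dependencies.
    Input: iterable of (lhs, rhs) pairs of attribute sets."""
    # 1. Split right-hand sides into single attributes (deduplicated).
    work = list({(frozenset(l), frozenset([a])) for l, r in fds for a in r})
    # 2. Remove extraneous attributes from left-hand sides.
    changed = True
    while changed:
        changed = False
        for i, (l, r) in enumerate(work):
            for a in l:
                if len(l) > 1 and r <= closure(l - {a}, work):
                    work[i] = (frozenset(l - {a}), r)
                    changed = True
                    break
    work = list(dict.fromkeys(work))  # re-deduplicate after reduction
    # 3. Drop dependencies implied by the remaining ones.
    result = list(work)
    for fd in work:
        rest = [f for f in result if f != fd]
        if fd[1] <= closure(fd[0], rest):
            result = rest
    return result

# A -> BC, B -> C, AB -> C reduces to the cover {A -> B, B -> C}.
cover = minimal_cover([({"A"}, {"B", "C"}), ({"B"}, {"C"}),
                       ({"A", "B"}, {"C"})])
```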

  18. Analysis and Comparison of Objective Methods for Image Quality Assessment

    Directory of Open Access Journals (Sweden)

    P. S. Babkin

    2014-01-01

    Full Text Available The purpose of this work is the study and modification of reference objective methods for image quality assessment. The ultimate goal is to obtain a modification of the formal assessments that corresponds more closely to subjective expert estimates (MOS). In considering the formal reference objective methods for image quality assessment we used the results of other authors, who offer comparative analyses of the most effective algorithms. Based on these investigations we chose the two most successful algorithms, PQS and MS-SSIM, which were analysed further in MATLAB 7.8 (R2009a). The publication focuses on features of the algorithms that are of great importance in practical implementation but are insufficiently covered in publications by other authors. In the implemented modification of the PQS algorithm, the Kirsch edge detector was replaced by the Canny edge detector. Further experiments were carried out according to the method of ITU-R BT.500-13 (01/2012) using monochrome images treated with different types of filters (it should be emphasized that the objective image quality assessment PQS is applicable only to monochrome images). The images were obtained with a thermal imaging surveillance system. The experimental results proved the effectiveness of this modification; in the specialized literature on formal image evaluation methods, this type of modification has not been mentioned. The method described in the publication can be applied in various practical implementations of digital image processing. The advisability and effectiveness of using the modified PQS method to assess structural differences between images are shown in the article, and this will be used in solving problems of identification and automatic control.
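The SSIM family referenced above (MS-SSIM applies the comparison over multiple scales and local windows) reduces, in its simplest global form, to a single luminance/contrast/structure ratio; a stdlib sketch on flat gray-level lists, not the paper's MATLAB implementation:

```python
def global_ssim(img_a, img_b, dynamic_range=255.0):
    """Single-window SSIM computed over whole images given as flat
    lists of gray levels. C1 and C2 are the standard stabilizing
    constants (K1=0.01, K2=0.03)."""
    n = len(img_a)
    c1 = (0.01 * dynamic_range) ** 2
    c2 = (0.03 * dynamic_range) ** 2
    mu_a = sum(img_a) / n
    mu_b = sum(img_b) / n
    var_a = sum((x - mu_a) ** 2 for x in img_a) / n
    var_b = sum((x - mu_b) ** 2 for x in img_b) / n
    cov = sum((x - mu_a) * (y - mu_b) for x, y in zip(img_a, img_b)) / n
    return ((2 * mu_a * mu_b + c1) * (2 * cov + c2)) / \
           ((mu_a ** 2 + mu_b ** 2 + c1) * (var_a + var_b + c2))

a = [10.0, 50.0, 90.0, 130.0]
s_same = global_ssim(a, a)                       # identical images
s_diff = global_ssim(a, [130.0, 90.0, 50.0, 10.0])  # reversed structure
```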

  19. [Study on the risk assessment method of regional groundwater pollution].

    Science.gov (United States)

    Yang, Yan; Yu, Yun-Jiang; Wang, Zong-Qing; Li, Ding-Long; Sun, Hong-Wei

    2013-02-01

    Based on the boundary elements of system risk assessment, a regional groundwater pollution risk assessment index system was preliminarily established, which included regional groundwater specific vulnerability assessment, assessment of the characteristics of regional pollution sources, and health risk assessment of regional featured pollutants. The three sub-evaluation systems were coupled with the multi-index comprehensive method, the risk was characterized with spatial analysis in ArcMap, and a new method to evaluate regional groundwater pollution risk, suitable for different natural conditions and different types of pollution, was established. Taking Changzhou as an example, the risk of shallow groundwater pollution was studied with the new method. The vulnerability index of groundwater in Changzhou was found to be high and unevenly distributed; the distribution of pollution sources is concentrated and has a great impact on groundwater pollution risks; and, influenced by the pollutants and pollution sources, health risk values are high in the urban area of Changzhou. The pollution risk of shallow groundwater is high, unevenly distributed, and concentrated to the north of the Anjia-Xuejia-Zhenglu line, in the city center and in the southeast, where human activities are more intense and pollution sources are dense.

  20. Biological methods used to assess surface water quality

    Directory of Open Access Journals (Sweden)

    Szczerbińska Natalia

    2015-12-01

    Full Text Available In accordance with the guidelines of the Water Framework Directive 2000/60 (WFD), both ecological and chemical statuses determine the assessment of surface waters. The profile of ecological status is based on the analysis of various biological components, and physicochemical and hydromorphological indicators complement this assessment. The aim of this article is to present the biological methods used in the assessment of water status, with a special focus on bioassays, as well as to provide a review of methods of monitoring water status. Biological test methods include both biomonitoring and bioanalytics. Water biomonitoring is used to assess and forecast the status of water; these studies aim to collect data on water pollution and forecast its impact. Biomonitoring uses organisms which are characterized by particular vulnerability to contaminants. Bioindicator organisms include algae, fungi, bacteria, larval invertebrates, cyanobacteria, macroinvertebrates, and fish. Bioanalytics is based on receptors for contaminants, which can be biologically active substances. In bioanalytics, biosensors based on viruses, bacteria, antibodies, and enzymes, as well as biotests, are used to assess degrees of pollution.

  1. [Assessing forest ecosystem health I. Model, method, and index system].

    Science.gov (United States)

    Chen, Gao; Dai, Limin; Ji, Lanzhu; Deng, Hongbing; Hao, Zhanqing; Wang, Qingli

    2004-10-01

    Ecosystem health assessment is one of the main research areas and urgent tasks of ecosystem science in the 21st century. An operational definition of ecosystem health and a comprehensive, simple, easily operated and standard index system, which are the foundation of ecosystem health assessment, are necessary for obtaining a simple and applicable assessment theory and method. Taking the Korean pine and broadleaved mixed forest ecosystem as an example, an original idea on ecosystem health and its assessment was put forward in this paper, based on the idea of a model ecosystem set and the idea of forest ecosystem health. This idea can help in understanding what ecosystem health is. A formula was then deduced based on a new and effective health assessment method, health distance (HD), which is put forward for the first time in China. In addition, a health index system for the Korean pine and broadleaved mixed forest ecosystem, a compound ecosystem with natural, economic and social properties, was put forward. It is concrete enough for measuring the sub-indices, and thus forms the foundation for assessing the ecosystem health of the Korean pine and broadleaved mixed forest in further research.

  2. Student evaluation of teaching and assessment methods in pharmacology

    Directory of Open Access Journals (Sweden)

    Badyal Dinesh

    2010-01-01

    Full Text Available Background: The students are in the best position to comment on the effectiveness of any teaching system, and they may be regarded as the best judges of teaching and evaluation methods. Objective: This study was designed to obtain student feedback on teaching and assessment methods in the subject of pharmacology and to use it for improvement. Materials and Methods: Based on student feedback from the 2006 batch, innovative strategies were implemented. To determine the effect of these strategies, feedback was obtained from the subsequent 2007 batch using a written validated questionnaire covering various aspects of teaching and assessment methods. Results: Students were satisfied with all teaching methods except lectures, seminars and pharmacy exercises. The majority of the students showed a preference for tutorials, short answer questions and revision classes. All students felt that there should be more time for clinical pharmacology and bedside teaching. The performance scores of the students (2007 batch) showed improvement (12%) when the earlier feedback suggestions were implemented. The pass percentage of the subsequent batch in university examinations improved from 90 to 100%. Conclusion: The implementation of suggestions obtained from students resulted in improvement in their performance. Hence, it is essential to synchronize teaching and evaluation methods with the special requirements of medical students.

  3. Significance and challenges of stereoselectivity assessing methods in drug metabolism

    Directory of Open Access Journals (Sweden)

    Zhuowei Shen

    2016-02-01

    Stereoselectivity in drug metabolism can not only directly influence the pharmacological activity, tolerability, safety and bioavailability of drugs, but can also cause various kinds of drug–drug interactions. Assessing stereoselectivity in drug metabolism is therefore of great significance for pharmaceutical research and development (R&D) and for rational clinical use. Although various methods are available for assessing stereoselectivity in drug metabolism, many of them have shortcomings. Indirect chromatographic methods are applicable only to samples with functional groups that can be derivatized or can form a complex with a chiral selector, while the direct approach using chiral stationary phases (CSPs) is expensive. As a detector for chromatographic methods, mass spectrometry (MS) is highly sensitive and specific, although matrix interference remains a challenge to overcome. In addition, the use of nuclear magnetic resonance (NMR) and immunoassay in chiral analysis is worth noting. This review presents several typical examples of stereoselective drug metabolism and provides a literature-based evaluation of current chiral analytical techniques to show the significance and challenges of methods for assessing stereoselectivity in drug metabolism.

  4. Microbial composition during Chinese soy sauce koji-making based on culture dependent and independent methods.

    Science.gov (United States)

    Yan, Yin-zhuo; Qian, Yu-lin; Ji, Feng-di; Chen, Jing-yu; Han, Bei-zhong

    2013-05-01

    Koji-making is a key process in the production of high-quality soy sauce. The microbial composition during koji-making was investigated by culture-dependent and culture-independent methods to determine the predominant bacterial and fungal populations. The culture-dependent methods were direct culture with colony morphology observation, followed by PCR amplification of 16S/26S rDNA fragments and sequencing analysis; the culture-independent method was based on the analysis of 16S/26S rDNA clone libraries. The results obtained by the different methods differed, but overlapped sufficiently to identify potentially significant microbial groups. Sixteen and 20 bacterial species were identified using the culture-dependent and culture-independent methods, respectively, with seven species identified by both. The most predominant bacterial genera were Weissella and Staphylococcus. Six fungal species were identified by each of the two approaches, but only three species were identified by both. The most predominant fungi were Aspergillus and Candida species. This work illustrates the importance of a comprehensive polyphasic approach to analysing microbial composition during soy sauce koji-making, knowledge of which will enable further optimization of the microbial composition and quality control of koji to upgrade traditional Chinese soy sauce products.

  5. Method and apparatus of assessing down-hole drilling conditions

    Science.gov (United States)

    Hall, David R.; Pixton, David S.; Johnson, Monte L.; Bartholomew, David B.; Fox, Joe

    2007-04-24

    A method and apparatus for use in assessing down-hole drilling conditions are disclosed. The apparatus includes a drill string, a plurality of sensors, a computing device, and a down-hole network. The sensors are distributed along the length of the drill string and are capable of sensing localized down-hole conditions while drilling. The computing device is coupled to at least one sensor of the plurality of sensors. The data is transmitted from the sensors to the computing device over the down-hole network. The computing device analyzes data output by the sensors and representative of the sensed localized conditions to assess the down-hole drilling conditions. The method includes sensing localized drilling conditions at a plurality of points distributed along the length of a drill string during drilling operations; transmitting data representative of the sensed localized conditions to a predetermined location; and analyzing the transmitted data to assess the down-hole drilling conditions.

  6. The AHP method used in assessment of foundry enterprise position

    Directory of Open Access Journals (Sweden)

    J. Szymszal

    2008-10-01

    A complex assessment of the activity of a selected foundry enterprise, based on the modern AHP (Analytic Hierarchy Process) method, is presented. After defining the areas of analysis, which include marketing (products, distribution channels, sales organisation and client concentration), personnel (skills, managerial abilities, organisational climate, effectiveness of incentives, personnel fluctuation), production (availability of raw materials, technical level of production, effective use of production capacities) and organisation and management (foundry structure, organisational culture, management performance), the analysis was made using a weighted sum of evaluations. The second step was a comparative assessment of the foundry's position using Saaty's scale, as modified by Weber, and the AHP method, examining a hierarchy structure that develops the main (parent) problem into sub-problems. The AHP assessment of the foundry's position enables the introduction of changes and/or innovations expected to improve overall production effectiveness.
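    The pairwise-comparison machinery behind AHP can be sketched as follows. The judgment matrix below is hypothetical (it is not data from the paper), covering the four areas of analysis named above; priorities are derived with the row geometric-mean approximation to the principal eigenvector, and Saaty's consistency ratio is checked.

```python
import math

def ahp_priorities(M):
    """Approximate AHP priority vector via row geometric means
    (a standard stand-in for the principal eigenvector)."""
    n = len(M)
    gm = [math.prod(row) ** (1.0 / n) for row in M]
    total = sum(gm)
    return [g / total for g in gm]

def consistency_ratio(M, w, ri={3: 0.58, 4: 0.90, 5: 1.12}):
    """Saaty consistency ratio CR = CI / RI, with CI = (lambda_max - n)/(n - 1)."""
    n = len(M)
    # lambda_max estimated from (M w) / w, averaged over rows
    mw = [sum(M[i][j] * w[j] for j in range(n)) for i in range(n)]
    lam = sum(mw[i] / w[i] for i in range(n)) / n
    ci = (lam - n) / (n - 1)
    return ci / ri[n]

# Hypothetical pairwise judgments (Saaty 1-9 scale) for:
# marketing, personnel, production, organisation and management
M = [
    [1,   3,   2,   4],
    [1/3, 1,   1/2, 2],
    [1/2, 2,   1,   3],
    [1/4, 1/2, 1/3, 1],
]
w = ahp_priorities(M)       # priority weights, summing to 1
cr = consistency_ratio(M, w)  # should be < 0.1 for acceptable consistency
```

    With these judgments, marketing receives the largest weight and the matrix is consistent (CR well below Saaty's 0.1 threshold).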

  7. Integrating methods for ecosystem service assessment and valuation

    NARCIS (Netherlands)

    Hattam, Caroline; Bohnke-Henrichs, Anne; Börger, Tobias; Burdon, Daryl; Hadjimichael, Maria; Delaney, Alyne; Atkins, Jonathan P.; Garrard, Samantha; Austen, Melanie C.

    2015-01-01

    A mixed-method approach was used to assess and value the ecosystem services derived from the Dogger Bank, an extensive shallow sandbank in the southern North Sea. Three parallel studies were undertaken that 1) identified and quantified, where possible, how indicators for ecosystem service provision…

  8. Myths and Misconceptions about Using Qualitative Methods in Assessment

    Science.gov (United States)

    Harper, Shaun R.; Kuh, George D.

    2007-01-01

    The value of qualitative assessment approaches has been underestimated primarily because they are often juxtaposed against long-standing quantitative traditions and the widely accepted premise that the best research produces generalizable and statistically significant findings. Institutional researchers avoid qualitative methods for at least three…

  9. The Annemarie Roeper Method of Qualitative Assessment: My Journey

    Science.gov (United States)

    Beneventi, Anne

    2016-01-01

    The Annemarie Roeper Method of Qualitative Assessment (QA) establishes an extremely rich set of procedures for revealing students' strengths as well as opportunities for the development of bright young people. This article explores the ways in which the QA process serves as a sterling example of a holistic, authentic system for recognizing…

  10. Using Empirical Article Analysis to Assess Research Methods Courses

    Science.gov (United States)

    Bachiochi, Peter; Everton, Wendi; Evans, Melanie; Fugere, Madeleine; Escoto, Carlos; Letterman, Margaret; Leszczynski, Jennifer

    2011-01-01

    Developing students who can apply their knowledge of empirical research is a key outcome of the undergraduate psychology major. This learning outcome was assessed in two research methods courses by having students read and analyze a condensed empirical journal article. At the start and end of the semester, students in multiple sections of an…

  11. Student Teachers' Views about Assessment and Evaluation Methods in Mathematics

    Science.gov (United States)

    Dogan, Mustafa

    2011-01-01

    This study aimed to find out assessment and evaluation approaches in a Mathematics Teacher Training Department based on the views and experiences of student teachers. The study used a descriptive survey method, with the research sample consisting of 150 third- and fourth-year Primary Mathematics student teachers. Data were collected using a…

  12. Assessment method for buildings' Rehabilitation needs: Development and application

    NARCIS (Netherlands)

    Vilhena, A.; Costa Branco De Oliveira Pedro, J.A.; Vasconcelos de Paiva, J.

    2010-01-01

    The purpose of this study was to develop a method for assessing a building's rehabilitation needs. A building was considered to need rehabilitation if it did not comply with the functional requirements defined in Portuguese legislation or determined by good practices of design and construction…

  13. Assessing Students' Writing Skills: A Comparison of Direct & Indirect Methods.

    Science.gov (United States)

    Koffler, Stephen L.

    This research examined the results from direct and indirect writing assessments to determine the most effective method of discrimination. The New Jersey State Department of Education developed a test for ninth-grade students which was designed to measure the ability to apply writing mechanics to written text and to communicate effectively in…

  14. Using Empirical Article Analysis to Assess Research Methods Courses

    Science.gov (United States)

    Bachiochi, Peter; Everton, Wendi; Evans, Melanie; Fugere, Madeleine; Escoto, Carlos; Letterman, Margaret; Leszczynski, Jennifer

    2011-01-01

    Developing students who can apply their knowledge of empirical research is a key outcome of the undergraduate psychology major. This learning outcome was assessed in two research methods courses by having students read and analyze a condensed empirical journal article. At the start and end of the semester, students in multiple sections of an…

  15. The Annemarie Roeper Method of Qualitative Assessment: My Journey

    Science.gov (United States)

    Beneventi, Anne

    2016-01-01

    The Annemarie Roeper Method of Qualitative Assessment (QA) establishes an extremely rich set of procedures for revealing students' strengths as well as opportunities for the development of bright young people. This article explores the ways in which the QA process serves as a sterling example of a holistic, authentic system for recognizing…

  16. Myths and Misconceptions about Using Qualitative Methods in Assessment

    Science.gov (United States)

    Harper, Shaun R.; Kuh, George D.

    2007-01-01

    The value of qualitative assessment approaches has been underestimated primarily because they are often juxtaposed against long-standing quantitative traditions and the widely accepted premise that the best research produces generalizable and statistically significant findings. Institutional researchers avoid qualitative methods for at least three…

  17. A simple method to assess bacterial attachment to surfaces

    Digital Repository Service at National Institute of Oceanography (India)

    Sonak, S.; Bhosle, N.B.

    …sensitive and less time-consuming, so many samples can be analysed in a short period of time. When the calibrated method was employed to assess the attachment of Vibrio sp. to polystyrene, stainless steel and copper, it gave a fairly reliable estimate…

  18. Application of a solvable model to test the accuracy of the time-dependent Hartree-Fock method

    Energy Technology Data Exchange (ETDEWEB)

    Bouayad, N. [Blida Univ. (Algeria). Inst. de Phys.; Zettili, N. [Blida Univ. (Algeria). Inst. de Phys.]|[Department of Physics, King Fahd University of Petroleum and Minerals, Dhahran 31261 (Saudi Arabia)

    1996-11-11

    This work deals with the application of a solvable model to study the accuracy of a nuclear many-body approximation method. Using a new exactly solvable model, we carry out here a quantitative test of the accuracy of the time-dependent Hartree-Fock (TDHF) method. The application of the TDHF method to the model reveals that the model is suitable for describing various forms of collective motion: harmonic and anharmonic oscillations as well as rotations. We then show that, by properly quantizing the TDHF results, the TDHF approximation method yields energy spectra that are in very good agreement with their exact counterparts. This work shows that the model offers a rich and comprehensive framework for studying the various aspects of the TDHF method and also for assessing quantitatively its accuracy. (orig.).
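    For orientation, the TDHF scheme and the requantization step the abstract alludes to take the standard textbook form (a generic summary, not the paper's exact equations):

```latex
i\hbar\,\frac{\partial \rho}{\partial t} = \bigl[\,h[\rho],\,\rho\,\bigr],
\qquad
\oint \bigl\langle \Phi(t) \bigm| i\hbar\,\partial_t \bigm| \Phi(t) \bigr\rangle \, dt = 2\pi\hbar\, n ,
```

    where ρ is the one-body density matrix, h[ρ] the self-consistent mean-field Hamiltonian, and the loop integral runs over one period of a periodic TDHF orbit; the second relation is the semiclassical requantization condition that yields the discrete spectra compared against the exact results.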

  19. Application of a solvable model to test the accuracy of the time-dependent Hartree-Fock method

    Science.gov (United States)

    Bouayad, Nouredine; Zettili, Nouredine

    1996-02-01

    This work deals with the application of a solvable model to study the accuracy of a nuclear many-body approximation method. Using a new exactly solvable model, we carry out here a quantitative test of the accuracy of the time-dependent Hartree-Fock (TDHF) method. The application of the TDHF method to the model reveals that the model is suitable for describing various forms of collective motion: harmonic and anharmonic oscillations as well as rotations. We then show that, by properly quantizing the TDHF results, the TDHF approximation method yields energy spectra that are in very good agreement with their exact counterparts. This work shows that the model offers a rich and comprehensive framework for studying the various aspects of the TDHF method and also for assessing quantitatively its accuracy.

  20. Orohanditest: A new method for orofacial damage assessment.

    Science.gov (United States)

    Caldas, Inês Morais; Magalhães, Teresa; Matos, Eduarda; Afonso, Américo

    2013-11-01

    Orofacial sequelae are now recognized as having a major influence on the quality of life of victims of orofacial damage, so correct forensic assessment for compensation purposes is mandatory. However, orofacial damage is frequently reduced to its organic components, which makes the forensic assessment process inadequate. This study aims to improve orofacial damage assessment through the development of an auxiliary tool, the orohanditest. A preliminary inventory was constructed using relevant bibliographic elements and a retrospective study of forensic examination reports concerning orofacial trauma. This inventory was then used in the assessment of 265 orofacial trauma victims for validation. Validity was studied by analyzing internal construct validity (exploring factorial validity and assessing internal consistency) and external construct validity (assessing convergent and discriminant validity). The level of significance was set at P < 0.05. The resulting score (orofacial damage coefficient) reflects the severity of the orofacial damage. The orohanditest provides a reliable, precise and complete description and quantification of orofacial damage, and can therefore be useful as an auxiliary tool in the orofacial damage assessment process.

  1. Assessment of composite index methods for agricultural vulnerability to climate change.

    Science.gov (United States)

    Wiréhn, Lotten; Danielsson, Åsa; Neset, Tina-Simone S

    2015-06-01

    A common way of quantifying and communicating climate vulnerability is to calculate composite indices from indicators, visualizing these as maps. Inherent methodological uncertainties in vulnerability assessments, however, require greater attention. This study examines Swedish agricultural vulnerability to climate change, the aim being to review various indicator approaches for assessing agricultural vulnerability to climate change and to evaluate differences in climate vulnerability depending on the weighting and summarizing methods. The reviewed methods are evaluated by being tested at the municipal level. Three weighting and summarizing methods, representative of climate vulnerability indices in general, are analysed. The results indicate that 34 of 36 method combinations differ significantly from each other. We argue that representing agricultural vulnerability in a single composite index might be insufficient to guide climate adaptation. We emphasize the need for further research into how to measure and visualize agricultural vulnerability and into how to communicate uncertainties in both data and methods. Copyright © 2015 Elsevier Ltd. All rights reserved.
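    The weighting-and-summarizing step such studies compare can be sketched as below. The indicator values and equal weights are hypothetical (not the Swedish municipal data); the point is that additive and geometric aggregation treat compensation between indicators differently, which is one source of the method-dependent differences the abstract describes.

```python
def minmax(values):
    """Min-max normalize indicator values to [0, 1]."""
    lo, hi = min(values), max(values)
    if hi == lo:
        return [0.0 for _ in values]
    return [(v - lo) / (hi - lo) for v in values]

def composite(indicators, weights, method="additive"):
    """Combine normalized indicators into one score per spatial unit:
    additive (weighted mean, fully compensatory) or geometric
    (penalizes units scoring near zero on any indicator)."""
    norm = [minmax(ind) for ind in indicators]
    scores = []
    for u in range(len(indicators[0])):
        if method == "additive":
            s = sum(w * norm[i][u] for i, w in enumerate(weights))
        else:
            s = 1.0
            for i, w in enumerate(weights):
                s *= (norm[i][u] + 1e-9) ** w
        scores.append(s)
    return scores

# Hypothetical indicator values for three municipalities
exposure    = [0.2, 0.8, 0.5]
sensitivity = [10, 40, 25]
capacity    = [0.9, 0.3, 0.6]          # high capacity lowers vulnerability
indicators = [exposure, sensitivity, [1 - c for c in capacity]]
add = composite(indicators, [1/3, 1/3, 1/3], "additive")
geo = composite(indicators, [1/3, 1/3, 1/3], "geometric")
```

    Here both schemes rank the second municipality most vulnerable, but the geometric score collapses toward zero for any unit with one very low indicator, so rankings can diverge sharply on real data.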

  2. Addressing dependability by applying an approach for model-based risk assessment

    Energy Technology Data Exchange (ETDEWEB)

    Gran, Bjorn Axel [Institutt for energiteknikk, OECD Halden Reactor Project, NO-1751 Halden (Norway)]. E-mail: bjorn.axel.gran@hrp.no; Fredriksen, Rune [Institutt for energiteknikk, OECD Halden Reactor Project, NO-1751 Halden (Norway)]. E-mail: rune.fredriksen@hrp.no; Thunem, Atoosa P.-J. [Institutt for energiteknikk, OECD Halden Reactor Project, NO-1751 Halden (Norway)]. E-mail: atoosa.p-j.thunem@hrp.no

    2007-11-15

    This paper describes how an approach for model-based risk assessment (MBRA) can be applied to address different dependability factors in a critical application. Dependability factors such as availability, reliability, safety and security are important when assessing the dependability of total systems involving digital instrumentation and control (I and C) sub-systems. In order to identify risk sources, their roles with regard to intentional system aspects such as system functions, component behaviours and intercommunications must be clarified. Traditional risk assessment is based on fault or risk models of the system. In contrast, MBRA utilizes success-oriented models describing all intended system aspects, including functional, operational and organizational aspects of the target. The EU-funded CORAS project developed a tool-supported methodology for the application of MBRA to security-critical systems. The methodology has been tried out in the telemedicine and e-commerce areas, and through a series of seven trials provided a sound basis for risk assessments. In this paper the results of the CORAS project are presented, and it is discussed how the MBRA approach meets the needs of a risk-informed Man-Technology-Organization (MTO) model and how the methodology can be applied as part of trust case development.

  3. Withdrawal of corticosteroids in inflammatory bowel disease patients after dependency periods ranging from 2 to 45 years: a proposed method.

    LENUS (Irish Health Repository)

    Murphy, S J

    2012-02-01

    BACKGROUND: Even in the biologic era, corticosteroid dependency in IBD patients is common and causes considerable morbidity, but methods of withdrawal are not well described. AIM: To assess the effectiveness of a corticosteroid withdrawal method. METHODS: Twelve patients (10 men, 2 women; 6 ulcerative colitis, 6 Crohn's disease), median age 53.5 years (range 29-75), were included. IBD patients with quiescent disease refractory to conventional weaning were transitioned to oral dexamethasone, educated about the symptoms of corticosteroid withdrawal syndrome (CWS) and weaned under the supervision of an endocrinologist. When patients failed to wean despite a slow weaning pace and quiescent IBD, low-dose synthetic ACTH stimulation testing was performed to assess for adrenal insufficiency. Multivariate analysis was performed to assess predictors of a slow wean. RESULTS: Median durations of disease and corticosteroid dependency were 21 (range 3-45) and 14 (range 2-45) years, respectively. Ten patients (83%) were successfully weaned, with a median follow-up from final wean of 38 months (range 5-73). Disease flares occurred in two patients, CWS in five, and ACTH testing was performed in 10. Multivariate analysis showed that a longer duration of corticosteroid use appeared to be associated with a slower wean (P = 0.056). CONCLUSIONS: Corticosteroid withdrawal using this protocol had a high success rate and a durable effect, and was effective in patients with long-standing (up to 45 years) dependency. As symptoms of CWS mimic those of IBD flares, gastroenterologists may have difficulty distinguishing them, which may contribute to the frequency of corticosteroid dependency in IBD patients.

  4. Time-dependent density-functional theory in the projector augmented-wave method

    DEFF Research Database (Denmark)

    Walter, Michael; Häkkinen, Hannu; Lehtovaara, Lauri

    2008-01-01

    We present the implementation of the time-dependent density-functional theory both in linear-response and in time-propagation formalisms using the projector augmented-wave method in real-space grids. The two technically very different methods are compared in the linear-response regime where we…

  5. A Maximum Likelihood Method for Latent Class Regression Involving a Censored Dependent Variable.

    Science.gov (United States)

    Jedidi, Kamel; And Others

    1993-01-01

    A method is proposed to simultaneously estimate regression functions and subject membership in "k" latent classes or groups given a censored dependent variable for a cross-section of subjects. Maximum likelihood estimates are obtained using an EM algorithm. The method is illustrated through a consumer psychology application. (SLD)
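    The censored-likelihood building block of such a method can be sketched as follows. This is a generic Tobit-style left-censored regression likelihood with made-up data, not the authors' exact formulation; in the latent-class version, the EM E-step would weight each subject's class membership by per-class likelihoods of this form before the M-step re-estimates each class's regression.

```python
import math

def norm_pdf(z):
    """Standard normal density."""
    return math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)

def norm_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def censored_loglik(b0, b1, sigma, x, y, c=0.0):
    """Log-likelihood of a left-censored regression y* = b0 + b1*x + e,
    e ~ N(0, sigma^2), observed as y = max(y*, c). Censored points
    contribute log Phi((c - mu)/sigma); uncensored points contribute
    the usual normal log-density."""
    ll = 0.0
    for xi, yi in zip(x, y):
        mu = b0 + b1 * xi
        if yi <= c:
            ll += math.log(max(norm_cdf((c - mu) / sigma), 1e-300))
        else:
            ll += math.log(norm_pdf((yi - mu) / sigma) / sigma)
    return ll

# Tiny made-up data set, left-censored at 0
x = [0.0, 1.0, 2.0, 3.0]
y = [0.0, 0.0, 1.1, 2.0]
ll_fit  = censored_loglik(-1.0, 1.0, 0.5, x, y)   # near the generating line
ll_null = censored_loglik(0.0, 0.0, 0.5, x, y)    # flat model, fits worse
```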

  6. Assessment of chemoselective neoglycosylation methods using chlorambucil as a model.

    Science.gov (United States)

    Goff, Randal D; Thorson, Jon S

    2010-11-25

    To systematically assess the impact of glycosylation and the corresponding chemoselective linker upon the anticancer activity/selectivity of the drug chlorambucil, herein we report the synthesis and anticancer activities of a 63-member library of chlorambucil-based neoglycosides. A comparison of N-alkoxyamine-, N-acylhydrazine-, and N-hydroxyamine-based chemoselective glycosylation of chlorambucil revealed sugar- and linker-dependent partitioning among open- and closed-ring neoglycosides and corresponding sugar-dependent variant biological activity. Cumulatively, this study represents the first neoglycorandomization of a synthetic drug and expands our understanding of the impact of sugar structure upon product distribution/equilibria in the context of N-alkoxyamino-, N-hydroxyamino-, and N-acylhydrazine-based chemoselective glycosylation. This study also revealed several analogues with increased in vitro anticancer activity, most notably D-threoside 60 (NSC 748747), which displayed much broader tumor specificity and notably increased potency over the parent drug.

  7. Multi-item direct behavior ratings: Dependability of two levels of assessment specificity.

    Science.gov (United States)

    Volpe, Robert J; Briesch, Amy M

    2015-09-01

    Direct Behavior Rating-Multi-Item Scales (DBR-MIS) have been developed as formative measures of behavioral assessment for use in school-based problem-solving models. Initial research has examined the dependability of composite scores generated by summing all items comprising the scales. However, it has been argued that DBR-MIS may offer assessment of 2 levels of behavioral specificity (i.e., item-level, global composite-level). Further, it has been argued that scales can be individualized for each student to improve efficiency without sacrificing technical characteristics. The current study examines the dependability of 5 items comprising a DBR-MIS designed to measure classroom disruptive behavior. A series of generalizability theory and decision studies were conducted to examine the dependability of each item (calls out, noisy, clowns around, talks to classmates and out of seat), as well as a 3-item composite that was individualized for each student. Seven graduate students rated the behavior of 9 middle-school students on each item over 3 occasions. Ratings were based on 10-min video clips of students during mathematics instruction. Separate generalizability and decision studies were conducted for each item and for a 3-item composite that was individualized for each student based on the highest rated items on the first rating occasion. Findings indicate favorable dependability estimates for 3 of the 5 items and exceptional dependability estimates for the individualized composite.
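    The decision-study logic behind such dependability estimates can be sketched with a simplified absolute dependability (phi) coefficient: person variance over person variance plus error variances averaged across the numbers of raters and occasions. The variance components below are hypothetical (not the study's estimates), and interaction components are folded into the residual for brevity.

```python
def phi_coefficient(var_p, var_r, var_o, var_res, n_r, n_o):
    """Simplified absolute dependability (phi) for a
    persons x raters x occasions G study."""
    error = var_r / n_r + var_o / n_o + var_res / (n_r * n_o)
    return var_p / (var_p + error)

# Hypothetical variance components: person, rater, occasion, residual
one_rater_once  = phi_coefficient(0.50, 0.05, 0.02, 0.30, n_r=1, n_o=1)
three_raters_x3 = phi_coefficient(0.50, 0.05, 0.02, 0.30, n_r=3, n_o=3)
```

    Averaging over more raters and occasions shrinks the error term, which is exactly the trade-off a decision study quantifies when asking how much assessment is needed for dependable scores.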

  8. Statistical methods for assessing agreement between continuous measurements

    DEFF Research Database (Denmark)

    Sokolowski, Ineta; Hansen, Rikke Pilegaard; Vedsted, Peter

    …), concordance coefficient, Bland-Altman limits of agreement and percentage of agreement to assess the agreement between patient-reported and doctor-reported delay in the diagnosis of cancer in general practice. Key messages: The correct statistical approach is not obvious. Many studies give the product-moment correlation coefficient (r) between the results of the two measurement methods as an indicator of agreement, which is wrong. Several alternative methods have been proposed, which we describe together with the preconditions for their use.
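    The Bland-Altman computation mentioned above is simple enough to sketch directly. The paired delays below are made-up numbers, not the study's data; they illustrate the key message that a near-perfect correlation can coexist with systematic disagreement.

```python
import statistics as st

def bland_altman(a, b):
    """Bias (mean difference) and 95% limits of agreement for paired data."""
    diffs = [x - y for x, y in zip(a, b)]
    bias = st.mean(diffs)
    sd = st.stdev(diffs)          # sample SD of the differences
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Hypothetical paired delays (days): patient-reported vs doctor-reported
patient = [10, 20, 30, 40, 50]
doctor  = [12, 24, 33, 44, 52]
bias, lo, hi = bland_altman(patient, doctor)
# The two series correlate almost perfectly, yet the doctor-reported
# delays run systematically longer (bias = -3 days): correlation
# measures association, not agreement.
```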

  9. Semiclassical Method to Schrödinger Equation with Position-Dependent Effective Mass

    Institute of Scientific and Technical Information of China (English)

    CHEN Gang; XUAN Pei-Cai; CHEN Zi-Dong

    2006-01-01

    In this paper, two novel semiclassical methods, based on the standard and supersymmetric WKB quantization conditions, are suggested for the Schrödinger equation with position-dependent effective mass. Through a proper coordinate transformation, the Schrödinger equation with position-dependent effective mass is mapped onto an isospectral one with constant mass; therefore, for a given mass distribution and physical potential function, the bound-state energy spectrum can be determined easily by the above methods together with a simple integral formula. It is shown that the method gives analytical results for some exactly solvable quantum systems.
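    As a generic sketch of the ingredients involved (assuming the common BenDaniel-Duke ordering of the kinetic term; the paper's exact quantization conditions may differ), the position-dependent-mass Schrödinger equation and the corresponding standard WKB condition read:

```latex
-\frac{\hbar^{2}}{2}\,\frac{d}{dx}\!\left(\frac{1}{m(x)}\,\frac{d\psi}{dx}\right) + V(x)\,\psi = E\,\psi ,
\qquad
\int_{x_{1}}^{x_{2}} \sqrt{\,2\,m(x)\bigl(E_{n}-V(x)\bigr)}\;dx
  = \Bigl(n+\tfrac{1}{2}\Bigr)\pi\hbar , \quad n = 0, 1, 2, \dots
```

    where x₁ and x₂ are the classical turning points at energy Eₙ; the coordinate transformation mentioned in the abstract absorbs m(x) into a new variable, reducing the integral to the familiar constant-mass form.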

  10. Standard guide for three methods of assessing buried steel tanks

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    1998-01-01

    1.1 This guide covers procedures to be implemented prior to the application of cathodic protection for evaluating the suitability of a tank for upgrading by cathodic protection alone. 1.2 Three procedures are described and identified as Methods A, B, and C. 1.2.1 Method A—Noninvasive with primary emphasis on statistical and electrochemical analysis of external site environment corrosion data. 1.2.2 Method B—Invasive ultrasonic thickness testing with external corrosion evaluation. 1.2.3 Method C—Invasive permanently recorded visual inspection and evaluation including external corrosion assessment. 1.3 This guide presents the methodology and the procedures utilizing site and tank specific data for determining a tank's condition and the suitability for such tanks to be upgraded with cathodic protection. 1.4 The tank's condition shall be assessed using Method A, B, or C. Prior to assessing the tank, a preliminary site survey shall be performed pursuant to Section 8 and the tank shall be tightness test...

  11. A Structured Framework for Assessing the "Goodness" of Agile Methods

    CERN Document Server

    Soundararajan, Shvetha

    2010-01-01

    Agile Methods are designed for customization; they offer an organization or a team the flexibility to adopt a set of principles and practices based on their culture and values. While that flexibility is consistent with the agile philosophy, it can lead to the adoption of principles and practices that can be sub-optimal relative to the desired objectives. We question then, how can one determine if adopted practices are "in sync" with the identified principles, and to what extent those principles support organizational objectives? In this research, we focus on assessing the "goodness" of an agile method adopted by an organization based on (1) its adequacy, (2) the capability of the organization to provide the supporting environment to competently implement the method, and (3) its effectiveness. To guide our assessment, we propose the Objectives, Principles and Practices (OPP) framework. The design of the OPP framework revolves around the identification of the agile objectives, principles that support the achiev...

  12. Application of Method of Multicriteria Alternatives for Land Assessment

    Directory of Open Access Journals (Sweden)

    Pavel V. Grigorev

    2017-06-01

    This article discusses the use of the multicriteria alternatives method for the assessment of a real estate object, taking into account the system of standards, rules and requirements in the field of valuation activities, including international valuation standards. The main types of work and the costs associated with the allotment and development of the built-up area are indicated. Four sites are assessed with respect to three parameters: the driving distance from the construction site to the centre; the cost of 1 ha of land at each site; and the deterioration of the centralized heat-supply networks. The results show that the multicriteria alternatives method is objective and optimal when comparing land sites on criteria with different units of measurement. The advantage of the method is that it can be applied to valuation in different areas of the economy.

  13. Total System Performance Assessment-License Application Methods and Approach

    Energy Technology Data Exchange (ETDEWEB)

    J. McNeish

    2002-09-13

    "Total System Performance Assessment-License Application (TSPA-LA) Methods and Approach" provides the top-level method and approach for conducting the TSPA-LA model development and analyses. The method and approach are responsive to the criteria set forth in Total System Performance Assessment Integration (TSPAI) Key Technical Issue (KTI) agreements, the "Yucca Mountain Review Plan" (CNWRA 2002 [158449]), and 10 CFR Part 63. This introductory section provides an overview of the TSPA-LA, the projected TSPA-LA documentation structure, and the goals of the document. It also provides a brief discussion of the regulatory framework, the approach to risk management of the development and analysis of the model, and the overall organization of the document. The section closes with some important conventions that are utilized in this document.

  14. Pesticide risk assessment in free-ranging bees is weather and landscape dependent.

    Science.gov (United States)

    Henry, Mickaël; Bertrand, Colette; Le Féon, Violette; Requier, Fabrice; Odoux, Jean-François; Aupinel, Pierrick; Bretagnolle, Vincent; Decourtye, Axel

    2014-07-10

    The risk assessment of plant protection products on pollinators is currently based on the evaluation of lethal doses through repeatable laboratory toxicity trials. Recent advances in honeybee toxicology have, however, raised interest in assessing sublethal effects in free-ranging individuals. Here, we show that the sublethal effects of a neonicotinoid pesticide are modified in magnitude by environmental interactions specific to the landscape and the time of exposure events. Field sublethal assessment is therefore context dependent and should be addressed in a temporally and spatially explicit way, especially regarding weather and landscape physiognomy. We further develop an analytical Effective Dose (ED) framework to help disentangle context-induced from treatment-induced effects and thus alleviate uncertainty in field studies. Although the ED framework involves trials at concentrations above the expected field exposure levels, it allows the climatic and landscape contexts that should be targeted for in-depth higher-tier risk assessment to be explicitly delineated.

  15. A Comparison between Two Instruments for Assessing Dependency in Daily Activities: Agreement of the Northwick Park Dependency Score with the Functional Independence Measure

    Directory of Open Access Journals (Sweden)

    Siv Svensson

    2012-01-01

    Background. There is a need for tools to assess dependency among persons with severe impairments. Objectives. The aim was to compare the Functional Independence Measure (FIM) and the Northwick Park Dependency Score (NPDS) in a sample from in-patient rehabilitation. Material and Methods. Data from 115 persons (20 to 65 years of age) with neurological impairments were gathered. Analyses were made of sensitivity, specificity, positive predictive value, and negative predictive value. Agreement of the scales was assessed with kappa, and concordance with Goodman-Kruskal's gamma. Scale structures were explored using the Rank-Transformable Pattern of Agreement (RTPA). Content validation was performed. Results. The sensitivity of the NPDS as compared to the FIM varied between 0.53 (feeding) and 1.0 (mobility), and specificity between 0.64 (mobility) and 1.0 (bladder). The positive predictive value varied from 0.62 (mobility) to 1.0 (bladder), and the negative predictive value from 0.48 (bowel) to 1.0 (mobility). Agreement between the scales was moderate to good (four items) and excellent (three items). Concordance was good, with a gamma of −.856, an asymptotic error (ase) of .025, and P < .001. The parallel reliability between the FIM and the NPDS showed a tendency for the NPDS to be more sensitive (having more categories) when dependency is high. Conclusion. The FIM and NPDS complement each other. The NPDS can be used as a measure for severely injured patients, as it is more sensitive when the need for nursing time is high.
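
    The sensitivity, specificity, and predictive values reported above come from 2×2 cross-classifications of the two instruments, item by item. A minimal sketch of how such figures are computed, treating the FIM as the reference standard (the counts below are illustrative, not the study's data):

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV and NPV from a 2x2 table,
    treating one instrument (e.g. the FIM) as the reference."""
    sensitivity = tp / (tp + fn)   # reference-positive, flagged by index test
    specificity = tn / (tn + fp)   # reference-negative, cleared by index test
    ppv = tp / (tp + fp)           # positive predictive value
    npv = tn / (tn + fn)           # negative predictive value
    return sensitivity, specificity, ppv, npv

# Illustrative counts for one item (not the study's data)
sens, spec, ppv, npv = diagnostic_metrics(tp=40, fp=5, fn=10, tn=60)
print(round(sens, 2), round(spec, 2), round(ppv, 2), round(npv, 2))
```

    Each metric conditions on a different margin of the table, which is why an instrument can score high on one and low on another, as the NPDS does across items.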

  16. Comparison study on qualitative and quantitative risk assessment methods for urban natural gas pipeline network.

    Science.gov (United States)

    Han, Z Y; Weng, W G

    2011-05-15

    In this paper, a qualitative and a quantitative risk assessment method for urban natural gas pipeline networks are proposed. The qualitative method comprises an index system that includes a causation index, an inherent risk index, a consequence index, and their corresponding weights. The quantitative method consists of a probability assessment, a consequence analysis, and a risk evaluation. The outcome of the qualitative method is a qualitative risk value; for the quantitative method, the outcomes are individual risk and societal risk. In comparison with previous research, the qualitative method proposed in this paper is particularly suitable for urban natural gas pipeline networks, and the quantitative method takes different accident consequences into consideration, such as toxic gas diffusion, jet flame, fireball combustion, and unconfined vapor cloud explosion (UVCE). Two sample urban natural gas pipeline networks are used to demonstrate the two methods. Both methods can be applied in practice; the choice between them depends on the basic data available for the gas pipelines and the precision requirements of the risk assessment.
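
    The qualitative risk value described above is, in essence, a weighted aggregation of the three indices. A minimal sketch under assumed index names, scores, and weights (none of which are taken from the paper):

```python
def qualitative_risk(scores, weights):
    """Weighted aggregation of index scores into a single qualitative
    risk value. Index names, scores and weights are illustrative
    assumptions, not the paper's calibrated values."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9  # weights normalised
    return sum(scores[k] * weights[k] for k in weights)

weights = {"causation": 0.4, "inherent": 0.3, "consequence": 0.3}
scores = {"causation": 0.7, "inherent": 0.5, "consequence": 0.8}  # 0-1 scale
print(round(qualitative_risk(scores, weights), 2))  # 0.67
```

    In a real application the weights would come from an expert elicitation or an analytic hierarchy process rather than being set by hand.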

  17. Validating the JobFit system functional assessment method

    Energy Technology Data Exchange (ETDEWEB)

    Jenny Legge; Robin Burgess-Limerick

    2007-05-15

    Workplace injuries are costing the Australian coal mining industry and its communities $410 million a year. This ACARP study aims to address the problem by developing a safe, reliable and valid pre-employment functional assessment tool. All JobFit System Pre-Employment Functional Assessments (PEFAs) consist of a musculoskeletal screen, balance test, aerobic fitness test, and job-specific postural tolerances and material handling tasks. The results of each component are compared to the applicant's job demands, and an overall PEFA score between 1 and 4 is given, with 1 being the best score. The reliability study and validity study were conducted concurrently. The reliability study examined test-retest, intra-tester and inter-tester reliability of the JobFit System Functional Assessment Method. Overall, good to excellent reliability was found, sufficient for comparison with injury data when determining the validity of the assessment. The overall assessment score and material handling tasks had the greatest reliability. The validity study compared the assessment results of 336 records from a Queensland underground and open cut coal mine with their injury records. A predictive relationship was found between PEFA score and the risk of a back/trunk/shoulder injury from manual handling. An association was also found between a PEFA score of 1 and increased length of employment. Lower aerobic fitness test results had an inverse relationship with injury rates. The study found that underground workers, regardless of PEFA score, were more likely to have an injury than workers in other departments. No relationship was found between age and risk of injury. These results confirm the validity of the JobFit System Functional Assessment Method.

  18. Assessment and comparison of methods for solar ultraviolet radiation measurements

    Energy Technology Data Exchange (ETDEWEB)

    Leszczynski, K.

    1995-06-01

    In this study, different methods of measuring solar ultraviolet radiation are compared. The methods included are spectroradiometric, erythemally weighted broadband, and multi-channel measurements. The comparison of the different methods is based on a literature review and on assessments of the optical characteristics of the Optronic 742 spectroradiometer of the Finnish Centre for Radiation and Nuclear Safety (STUK) and of the erythemally weighted Robertson-Berger type broadband radiometers (Solar Light models 500 and 501) of the Finnish Meteorological Institute and STUK. An introduction to the sources of error in solar UV measurements and to methods for the radiometric characterization of UV radiometers, together with methods for error reduction, is presented. Reviews of experiences from worldwide UV monitoring efforts and instrumentation, as well as of results from international UV radiometer intercomparisons, are also presented. (62 refs.)

  19. Credit Assessment of Contractors: A Rough Set Method

    Institute of Scientific and Technical Information of China (English)

    LIU Gaojun; ZHU Yan

    2006-01-01

    A rough set method is presented in this paper to assess the credit of contractors. Unlike traditional methods, the rough set method deduces credit-classifying rules from actual data to predict new cases. The method uses a contractors' database with a genetic algorithm and an exhaustive reduction implemented in the ROSETTA software, which integrates rough set methods. The classification accuracy of the rough set model is not as good as that of decision tree, logistic regression, and neural network models, but the rough set model more accurately predicts contractors with bad credit. The results show that the rough set model is especially useful for detecting corporations with bad credit in the currently disordered Chinese construction market.
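
    The core rough-set machinery behind such rule deduction is the indiscernibility relation and the lower/upper approximations of a decision class. A minimal sketch with a hypothetical contractor table (the attribute names and records are invented for illustration; a full analysis, as in ROSETTA, would also compute reducts and decision rules):

```python
from collections import defaultdict

def approximations(objects, attrs, decision):
    """Lower/upper approximations of the set {x : decision(x) == 1}
    under the indiscernibility relation induced by `attrs`."""
    classes = defaultdict(set)
    for name, row in objects.items():
        classes[tuple(row[a] for a in attrs)].add(name)
    target = {n for n, row in objects.items() if row[decision] == 1}
    lower, upper = set(), set()
    for cls in classes.values():
        if cls <= target:
            lower |= cls      # certainly in the class (certain rules)
        if cls & target:
            upper |= cls      # possibly in the class (possible rules)
    return lower, upper

# Hypothetical contractor records: condition attributes + credit decision
data = {
    "c1": {"debt": "low",  "history": "good", "credit": 1},
    "c2": {"debt": "low",  "history": "good", "credit": 1},
    "c3": {"debt": "high", "history": "good", "credit": 1},
    "c4": {"debt": "high", "history": "good", "credit": 0},  # conflicts with c3
    "c5": {"debt": "high", "history": "bad",  "credit": 0},
}
lower, upper = approximations(data, ["debt", "history"], "credit")
print(sorted(lower), sorted(upper))
```

    The gap between upper and lower approximations (here c3 and c4, which are indiscernible but have different decisions) is exactly the boundary region where the data support only "possible", not certain, credit rules.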

  20. Methodical Approaches to the Assessment of Personnel Adaptation System

    Directory of Open Access Journals (Sweden)

    Elena Aleksandrovna Petrova

    2015-12-01

    The formation of a personnel adaptation system is a necessary link in personnel management. Unfortunately, HR departments often do not take career guidance and employee adaptation measures seriously, and many state enterprises and commercial organizations still have no basic adaptation programs. Introducing an adaptation management system in an enterprise is a rather complex challenge, but it supports the solution of important tasks: reducing start-up costs, reducing staff turnover, and possibly achieving the employer's target performance indicators faster. The adaptation process demands certain investments of time and money from the company, so a mechanism for assessing the efficiency of these expenditures must be offered. It is shown that at present only three types of general indicators are used to assess the efficiency of adaptation: professional, psychophysiological, and socio-psychological. It is also concluded that these are not enough for a comprehensive evaluation of the results of this process: the criteria by which a quantitative assessment can be carried out have to be linked with the purposes and problems of adaptation. Using a concrete organization as an example, the article proposes and describes a universal tool for assessing the effectiveness of a personnel adaptation system that can be used in any organization.

  1. Methodical support of assessment of enterprise corporate culture

    Directory of Open Access Journals (Sweden)

    M.I. Ovcharenko

    2013-06-01

    The aim of the article. The article summarizes the existing theoretical approaches to the assessment of enterprise corporate culture and identifies the benefits and shortcomings of existing assessment methods. The results of the analysis. In particular, present methods for assessing such a complex phenomenon as an organization's culture are conventionally divided into three groups: holistic (the researcher is deeply immersed in the culture of the organization and acts as a participating observer); metaphorical (the researcher uses language samples from documents, reports, stories, and conversations to help identify the imprints of culture); and quantitative (the researcher evaluates many points of view to assess the attributes of the organization's culture). The authors conclude that it is important to develop a methodology based on empirical evidence obtained by combining quantitative and holistic techniques, which will maximize the comprehensiveness of corporate culture assessment. A thorough analysis of existing scientific research and theoretical developments on the mentioned problems revealed the absence of adequate methodological approaches to quantifying the level of corporate culture in a company. When calculating the integral indicator LCC (Level of Corporate Culture), the authors performed a mathematical formalization of corporate culture assessment; the calculation algorithm includes seven stages. A hierarchical three-level structure of corporate culture was developed for assessing the corporate culture of a company, and the limits of each criterion were determined. To relate each qualitative assessment of the value of the i-th parameter to the appropriate level of corporate culture (low, medium, high, very high), the authors used T. Saaty's method of analysis. A formula for calculating the integral index, defined by the range and quality level of

  2. Risk assessment method of major unsafe hydroelectric project

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    Based on the characteristics of major unsafe hydroelectric projects and data from field detection, in situ monitoring, and regular safety inspection, the fundamental principles of operation risk assessment are proposed in this paper. Meanwhile, a three-layer hierarchical system is constructed, and an improved analytic hierarchy process combining a genetic algorithm with the analytic hierarchy process is established, with a corresponding program. The operation risk of an unsafe dam was assessed with the principles, method, and program presented in this paper, and the major factors affecting the operation of the dam were pointed out.

  3. Risk assessment method of major unsafe hydroelectric project

    Institute of Scientific and Technical Information of China (English)

    WU ZhongRu; SU HuaiZhi; GUO HaiQing

    2008-01-01

    Based on the characteristics of major unsafe hydroelectric projects and data from field detection, in situ monitoring, and regular safety inspection, the fundamental principles of operation risk assessment are proposed in this paper. Meanwhile, a three-layer hierarchical system is constructed, and an improved analytic hierarchy process combining a genetic algorithm with the analytic hierarchy process is established, with a corresponding program. The operation risk of an unsafe dam was assessed with the principles, method, and program presented in this paper, and the major factors affecting the operation of the dam were pointed out.

  4. Solving Ratio-Dependent Predator-Prey System with Constant Effort Harvesting Using Homotopy Perturbation Method

    Directory of Open Access Journals (Sweden)

    Abdoul R. Ghotbi

    2008-01-01

    Due to the wide range of interest in the use of bioeconomic models to gain insight into the scientific management of renewable resources like fisheries and forestry, the homotopy perturbation method is employed to approximate the solution of the ratio-dependent predator-prey system with constant-effort prey harvesting. The results are compared with those obtained by the Adomian decomposition method. The results show that the new approach requires fewer computations than the Adomian decomposition method.
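
    For readers who want a concrete baseline against which series solutions such as HPM or ADM can be checked, one common form of the ratio-dependent predator-prey system with constant-effort prey harvesting can be integrated numerically. The exact equations and all parameter values below are illustrative assumptions, not taken from the paper:

```python
def simulate(x0, y0, dt=0.01, steps=2000,
             a=1.0, c=0.8, m=0.5, d=0.3, f=0.6, E=0.2):
    """RK4 integration of one common ratio-dependent predator-prey
    model with constant-effort prey harvesting (parameters are
    illustrative assumptions):
        x' = a*x*(1 - x) - c*x*y/(x + m*y) - E*x   (prey, harvested)
        y' = y*(-d + f*x/(x + m*y))                 (predator)
    """
    def rhs(x, y):
        denom = x + m * y
        return (a * x * (1 - x) - c * x * y / denom - E * x,
                y * (-d + f * x / denom))

    for _ in range(steps):
        k1 = rhs(x0, y0)
        k2 = rhs(x0 + dt / 2 * k1[0], y0 + dt / 2 * k1[1])
        k3 = rhs(x0 + dt / 2 * k2[0], y0 + dt / 2 * k2[1])
        k4 = rhs(x0 + dt * k3[0], y0 + dt * k3[1])
        x0 += dt / 6 * (k1[0] + 2 * k2[0] + 2 * k3[0] + k4[0])
        y0 += dt / 6 * (k1[1] + 2 * k2[1] + 2 * k3[1] + k4[1])
    return x0, y0

x, y = simulate(0.5, 0.5)
print(x > 0 and y > 0)  # populations remain positive
```

    An HPM or ADM series solution would be compared against such a numerical trajectory over the interval where the series converges.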

  5. Neutron Scattering in Hydrogenous Moderators, Studied by Time Dependent Reaction Rate Method

    Energy Technology Data Exchange (ETDEWEB)

    Larsson, L.G.; Moeller, E.; Purohit, S.N.

    1966-03-15

    The moderation and absorption of a neutron burst in water, poisoned with the non-1/v absorbers cadmium and gadolinium, has been followed on the time scale by multigroup calculations, using scattering kernels for the proton gas and the Nelkin model. The time dependent reaction rate curves for each absorber display clear differences for the two models, and the separation between the curves does not depend much on the absorber concentration. An experimental method for the measurement of infinite medium reaction rate curves in a limited geometry has been investigated. This method makes the measurement of the time dependent reaction rate generally useful for thermalization studies in a small geometry of a liquid hydrogenous moderator, provided that the experiment is coupled to programs for the calculation of scattering kernels and time dependent neutron spectra. Good agreement has been found between the reaction rate curve, measured with cadmium in water, and a calculated curve, where the Haywood kernel has been used.

  6. Application of geosites assessment method in geopark context

    Science.gov (United States)

    Martin, Simon; Perret, Amandine; Renau, Pierre; Cartier-Moulin, Olivier; Regolini-Bissig, Géraldine

    2014-05-01

    The regional natural park of the Monts d'Ardèche (Ardèche and Haute-Loire departments, France) is a candidate for the European Geoparks Network (EGN) in 2014. The area has a wide geodiversity, with rocks from the Cambrian to the Pleistocene (basalt flows), and interesting features like phonolitic protrusions, maars, and granite boulder fields. Around 115 sites were selected and documented through a geosites inventory carried out in the territory. This pre-selection was supervised by the Ardèche Geological Society and is therefore based on expert advice. In the context of the EGN candidature, these potential geosites were assessed with a simplified method. It follows the spirit of the method from the University of Lausanne (Reynard et al., 2007) and its recent developments: assessment of the (central) scientific value and of a set of additional values (ecological and cultural). As this assessment aimed to offer a management tool to the future geopark's authorities, a special focus was given to management aspects. In particular, the opportunities to use each site for education (from schools to universities) and for tourism, as well as the existence of protection measures and of interpretive facilities, were documented and assessed. Several interesting conclusions may be drawn from this case study: (1) expert assessment is effective when it is based on a pre-existing inventory that is well structured and documented; (2) even simplified, an assessment method is a very useful framework for expert assessment, as it focuses discussion on the most important points and helps to balance the assessment; (3) whereas the inventory can be extensively detailed and partly academic, the assessment in the geopark context is objective-driven in order to answer management needs. The place of the geosites assessment among the three key players of a geopark construction process (i.e., the territory's managers, local geoscientists, and the EGN) is also discussed. This place can be defined as the point of consensus of needs

  7. Assessment of disinfection of hospital surfaces using different monitoring methods

    Directory of Open Access Journals (Sweden)

    Adriano Menis Ferreira

    2015-06-01

    OBJECTIVE: to assess the efficiency of cleaning/disinfection of surfaces in an Intensive Care Unit. METHOD: descriptive-exploratory study with a quantitative approach conducted over the course of four weeks. Visual inspection, adenosine triphosphate bioluminescence, and microbiological indicators were used to assess cleanliness/disinfection. Five surfaces (bed rails, bedside tables, infusion pumps, the nurses' counter, and the medical prescription table) were assessed before and after the use of rubbing alcohol at 70% (w/v), totaling 160 samples for each method. Non-parametric tests were used, with differences considered statistically significant at p<0.05. RESULTS: after the cleaning/disinfection process, 87.5, 79.4 and 87.5% of the surfaces were considered clean by visual inspection, adenosine triphosphate bioluminescence, and microbiological analysis, respectively. A statistically significant decrease was observed in the disapproval rates after the cleaning process for all three assessment methods; visual inspection was the least reliable. CONCLUSION: the cleaning/disinfection method was efficient in reducing the microbial load and organic matter on surfaces; however, these findings require further study to clarify aspects related to the efficiency of friction, its frequency, and whether association with other products would achieve improved cleaning/disinfection results.

  8. Fuzzy Comprehensive Assessment Method to Determine Tectonic Stress Patterns

    Institute of Scientific and Technical Information of China (English)

    ZHANG Hai; QI Lan; HAO Caizhe; GUO Lei

    2007-01-01

    The tectonic stress patterns were determined by a fuzzy comprehensive assessment method, utilizing data from in-situ surveys and fault information. First, by considering pressure and tension in the along-river, cross-river, clockwise-shear, and counter-clockwise-shear directions, 26 types of tectonic stress patterns were defined, and the stress vector of each pattern was obtained with FE software by taking unit displacement as the boundary load. Then, taking the 26 types of tectonic stress patterns as the index set and the 3 principal stresses as the factor set, and choosing various operators, the directions of the computed stress vectors were compared with the surveyed stress vectors, and the most probable tectonic stress pattern was obtained. Taking the 26 types of tectonic stress patterns as the index set and the strike angle as the factor set, the relationship between fault formation and tectonic stress was examined, and the tectonic stress patterns were assessed against known fault information. By summarizing the above assessment results, the most probable tectonic stress pattern was obtained. Finally, an engineering case is quoted to validate that the method is more feasible and reliable than the traditional empirical method.
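
    Fuzzy comprehensive assessment of this kind typically combines a factor weight vector W with a membership matrix R through max-min composition, B = W ∘ R, and selects the pattern with the largest resulting membership. A minimal sketch with invented weights and membership degrees (not the paper's survey data):

```python
def fuzzy_evaluate(weights, R):
    """Fuzzy comprehensive assessment via max-min composition:
    b_j = max_i min(w_i, r_ij). Weights and membership degrees
    below are illustrative, not the paper's data."""
    n_patterns = len(R[0])
    return [max(min(w, row[j]) for w, row in zip(weights, R))
            for j in range(n_patterns)]

# 3 factors (principal stresses) x 4 candidate stress patterns
weights = [0.5, 0.3, 0.2]
R = [[0.9, 0.4, 0.1, 0.2],   # membership of pattern j under factor i
     [0.3, 0.8, 0.5, 0.1],
     [0.2, 0.6, 0.7, 0.4]]
B = fuzzy_evaluate(weights, R)
print(B, B.index(max(B)))  # pattern 0 has the largest membership
```

    Other composition operators (e.g. weighted-sum instead of max-min) trade off dominance of the strongest factor against averaging, which is why the paper evaluates "various operators".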

  9. How to assess the quality of your analytical method?

    Science.gov (United States)

    Topic, Elizabeta; Nikolac, Nora; Panteghini, Mauro; Theodorsson, Elvar; Salvagno, Gian Luca; Miler, Marijana; Simundic, Ana-Maria; Infusino, Ilenia; Nordin, Gunnar; Westgard, Sten

    2015-10-01

    Laboratory medicine is amongst the fastest growing fields in medicine, crucial in diagnosis, support of prevention and in the monitoring of disease for individual patients and for the evaluation of treatment for populations of patients. Therefore, high quality and safety in laboratory testing has a prominent role in high-quality healthcare. Applied knowledge and competencies of professionals in laboratory medicine increase the clinical value of laboratory results by decreasing laboratory errors, increasing appropriate utilization of tests, and increasing cost effectiveness. This collective paper provides insights into how to validate laboratory assays and assess the quality of methods. It is a synopsis of the lectures at the 15th European Federation of Clinical Chemistry and Laboratory Medicine (EFLM) Continuing Postgraduate Course in Clinical Chemistry and Laboratory Medicine entitled "How to assess the quality of your method?" (Zagreb, Croatia, 24-25 October 2015). The leading topics discussed include who should do what, and when, in the validation/verification of methods, verification of imprecision and bias, verification of reference intervals, verification of qualitative test procedures, verification of blood collection systems, comparability of results among methods and analytical systems, limit of detection, limit of quantification and limit of decision, how to assess measurement uncertainty, the optimal use of Internal Quality Control and External Quality Assessment data, Six Sigma metrics, performance specifications, as well as biological variation. This article, which continues the annual tradition of collective papers from the EFLM continuing postgraduate courses in clinical chemistry and laboratory medicine, aims to provide further contributions by discussing the quality of laboratory methods and measurements and, at the same time, to offer continuing professional development to the attendees.

  10. The Utility of the Bifactor Method for Unidimensionality Assessment When Other Methods Disagree

    Directory of Open Access Journals (Sweden)

    Yong Luo

    2016-10-01

    This article provides an empirical illustration of the utility of the bifactor method for unidimensionality assessment when other methods disagree. Specifically, we used two popular methods for unidimensionality assessment: (a) evaluating the model fit of a one-factor model using Mplus, and (b) DIMTEST, to show that different unidimensionality methods may lead to different results, and argued that in such cases the bifactor method can be particularly useful. These procedures were applied to the English Placement Test (EPT), a high-stakes English proficiency test in Saudi Arabia, to determine whether the EPT is unidimensional, so that a unidimensional item response theory (IRT) model can be used for calibration and scoring. We concluded that despite the inconsistency between the one-factor model approach and DIMTEST, the bifactor method indicates that, for practical purposes, the unidimensionality assumption holds for the EPT.
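
    One common bifactor-based statistic for judging whether unidimensionality holds "for practical purposes" is the explained common variance (ECV) of the general factor. A minimal sketch with hypothetical factor loadings (the EPT loadings are not reported here):

```python
def ecv(general, group):
    """Explained common variance: the share of common variance captured
    by the general factor in a bifactor solution. `general` is the list
    of general-factor loadings; `group` is a list of group-factor
    loading lists. All loadings below are hypothetical."""
    g = sum(l * l for l in general)
    s = sum(l * l for factor in group for l in factor)
    return g / (g + s)

general = [0.7, 0.6, 0.65, 0.5, 0.55, 0.6]          # one loading per item
group = [[0.3, 0.25, 0.2], [0.2, 0.15, 0.3]]        # two group factors
print(round(ecv(general, group), 3))
```

    The higher the ECV, the more the common variance is dominated by the general factor, supporting the use of a unidimensional IRT model despite some multidimensionality.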

  11. Dispersal per recruit: An efficient method for assessing sustainability in marine reserve networks

    OpenAIRE

    D. M. Kaplan; Botsford, L W; Jorgensen, S.

    2006-01-01

    Marine reserves are an increasingly important tool for the management of marine ecosystems around the world. However, the effects of proposed marine reserve configurations on sustainability and yield of populations are typically not estimated because of the computational intensity of direct simulation and uncertainty in larval dispersal and density-dependent recruitment. Here we develop a method for efficiently assessing a marine reserve configuration for persistence a...

  12. Diagnostic accuracy of the MMPI-2 in assessing maladjustment in people with substance dependence

    Directory of Open Access Journals (Sweden)

    Pablo González-Romero

    2017-07-01

    The acceptance of and respect for the rules governing society and the family unit are essential pillars for the development of a therapeutic program for people with substance dependence disorders. This study has a double objective regarding the MMPI-2 scales that detect such maladjustments: to establish what information they can provide, and what the diagnostic accuracy of the MMPI-2 is in assessing these maladjustments. As reference scales, psychopathic deviate (Pd), social introversion (Si), antisocial practices (ASP), social responsibility (Re), social discomfort (SOD), introversion/low positive emotionality (PSY-INTR), family problems (FAM), and marital distress (MDS) were taken. Of the 226 participants, 113 are people with substance dependence and 113 have neither dependence nor any other pathology. Group differences and diagnostic accuracy were analysed through the ROC curve. The results showed that the scales differ in their contribution and diagnostic accuracy.
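
    Diagnostic accuracy via the ROC curve reduces, for a single scale, to the area under the curve, which equals the Mann-Whitney probability that a randomly chosen case outscores a randomly chosen control. A minimal sketch with invented scale scores (not the study's data):

```python
def auc(pos_scores, neg_scores):
    """Area under the ROC curve via the Mann-Whitney U statistic:
    the probability that a randomly chosen case scores higher than a
    randomly chosen control (ties count half)."""
    wins = 0.0
    for p in pos_scores:
        for n in neg_scores:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos_scores) * len(neg_scores))

# Hypothetical scale scores for dependent vs. non-dependent groups
dependent = [72, 68, 75, 60, 80]
non_dependent = [55, 62, 58, 61, 70]
print(auc(dependent, non_dependent))  # 0.84
```

    An AUC of 0.5 indicates no discrimination and 1.0 perfect discrimination, which is the yardstick against which each MMPI-2 scale's accuracy is compared.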

  13. Dogmas in the assessment of usability evaluation methods

    DEFF Research Database (Denmark)

    Hornbæk, Kasper

    2010-01-01

    Usability evaluation methods (UEMs) are widely recognised as an essential part of systems development. Assessments of the performance of UEMs, however, have been criticised for low validity and limited reliability. The present study extends this critique by describing seven dogmas in recent work on UEMs. The dogmas include using inadequate procedures and measures for assessment, focusing on win-lose outcomes, holding simplistic models of how usability evaluators work, concentrating on evaluation rather than on design, and working from the assumption that usability problems are real. We discuss research approaches that may help move beyond the dogmas. In particular, we emphasise detailed studies of evaluation processes, assessments of the impact of UEMs on design carried out in real-world systems development, and analyses of how UEMs may be combined.

  14. Going beyond the Millennium Ecosystem Assessment: an index system of human dependence on ecosystem services.

    Science.gov (United States)

    Yang, Wu; Dietz, Thomas; Liu, Wei; Luo, Junyan; Liu, Jianguo

    2013-01-01

    The Millennium Ecosystem Assessment (MA) estimated that two thirds of ecosystem services on the earth have degraded or are in decline due to the unprecedented scale of human activities during recent decades. These changes will have tremendous consequences for human well-being, and offer both risks and opportunities for a wide range of stakeholders. Yet these risks and opportunities have not been well managed, due in part to the lack of quantitative understanding of human dependence on ecosystem services. Here, we propose an index of dependence on ecosystem services (IDES) system to quantify human dependence on ecosystem services. We demonstrate the construction of the IDES system using household survey data. We show that the overall index and sub-indices can reflect the general pattern of households' dependence on ecosystem services, and its variation across time, space, and different forms of capital (i.e., natural, human, financial, manufactured, and social capital). We support the proposition that the poor are more dependent on ecosystem services, and further generalize this proposition by arguing that disadvantaged groups who possess low levels of any form of capital except natural capital are more dependent on ecosystem services than those with greater control of capital. A higher value of the overall IDES or of a sub-index represents greater dependence on the corresponding ecosystem services, and thus higher vulnerability to their degradation or decline. The IDES system improves our understanding of human dependence on ecosystem services. It also provides insights into strategies for alleviating poverty, for targeting priority groups of conservation programs, and for managing risks and opportunities due to changes in ecosystem services at multiple scales.
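
    A simplified reading of the IDES construction is the share of a household's total net benefits that derives from ecosystem services, so that higher values indicate higher dependence and vulnerability. A minimal sketch with hypothetical household figures (not the paper's exact formula or survey data):

```python
def ides(es_net_benefit, total_net_benefit):
    """Simplified index of dependence on ecosystem services: the share
    of a household's total net benefits that comes from ecosystem
    services. The numbers below are hypothetical."""
    if total_net_benefit <= 0:
        raise ValueError("total net benefit must be positive")
    return es_net_benefit / total_net_benefit

# Two hypothetical households with the same ES income but different totals
poor = ides(es_net_benefit=800, total_net_benefit=1000)     # 0.8
wealthy = ides(es_net_benefit=800, total_net_benefit=5000)  # 0.16
print(poor > wealthy)  # the poorer household is more ES-dependent
```

    This illustrates the paper's core proposition: with little non-natural capital, the same flow of ecosystem-service benefits makes up a much larger share of a household's livelihood.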

  15. Methods for sustainable assessment of housing: A comparative analysis of five international methods

    Directory of Open Access Journals (Sweden)

    Felipe Quesada Molina

    2014-06-01

    This paper compares the most important international methods of residential building assessment (BREEAM, LEED, VERDE, CASBEE, and Qualitel) in order to establish bases for future development. The article is divided into three parts: the first reviews the emergence and objectives of the methods; the second presents the methodology used in the study; and the third analyzes and compares the methods. To conclude, the dimensions and categories the methods address are established, as well as their methodological structure, rating systems, and limits.

  16. Methods for Developing Emissions Scenarios for Integrated Assessment Models

    Energy Technology Data Exchange (ETDEWEB)

    Prinn, Ronald [MIT; Webster, Mort [MIT

    2007-08-20

    The overall objective of this research was to contribute data and methods to support the future development of new emissions scenarios for integrated assessment of climate change. Specifically, this research had two main objectives: 1. Use historical data on economic growth and energy efficiency changes, and develop probability density functions (PDFs) for the appropriate parameters for two or three commonly used integrated assessment models. 2. Using the parameter distributions developed through the first task and previous work, we will develop methods of designing multi-gas emission scenarios that usefully span the joint uncertainty space in a small number of scenarios. Results on the autonomous energy efficiency improvement (AEEI) parameter are summarized, an uncertainty analysis of elasticities of substitution is described, and the probabilistic emissions scenario approach is presented.
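
    A common way to turn parameter PDFs into a small set of scenarios that usefully span the joint uncertainty space is Latin hypercube sampling. A minimal sketch over independent uniform ranges (the parameter names and ranges below are illustrative assumptions, not the report's fitted distributions):

```python
import random

def latin_hypercube(n_samples, params, seed=0):
    """Latin hypercube sample over independent uniform parameter
    ranges: one stratum per sample and parameter, so a small set of
    scenarios covers every part of each marginal distribution."""
    rng = random.Random(seed)
    samples = [dict() for _ in range(n_samples)]
    for name, (lo, hi) in params.items():
        strata = list(range(n_samples))
        rng.shuffle(strata)  # decorrelate strata across parameters
        for s, k in zip(samples, strata):
            u = (k + rng.random()) / n_samples  # point within stratum k
            s[name] = lo + u * (hi - lo)
    return samples

# Illustrative uncertain parameters for an integrated assessment model
params = {"aeei": (0.002, 0.015),     # autonomous efficiency gain per year
          "elasticity": (0.3, 1.2)}   # elasticity of substitution
scenarios = latin_hypercube(5, params)
print(len(scenarios))  # 5 multi-parameter scenarios
```

    For non-uniform PDFs, the stratified uniform draws would be pushed through the inverse CDF of each fitted distribution before being handed to the model.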

  17. Assessing proprioception: A critical review of methods

    Institute of Scientific and Technical Information of China (English)

    Jia Han; Gordon Waddington; Roger Adams; Judith Anson; Yu Liu

    2016-01-01

    To control movement, the brain has to integrate proprioceptive information from a variety of mechanoreceptors. The role of proprioception in daily activities, exercise, and sports has been extensively investigated using different techniques, yet the proprioceptive mechanisms underlying human movement control are still unclear. In the current work we review the understanding of proprioception and three testing methods: threshold to detection of passive motion, joint position reproduction, and active movement extent discrimination, all of which have been used for assessing proprioception. The origin of the methods, the different testing apparatus, and the procedures and protocols used in each approach are compared and discussed. Recommendations are made for choosing an appropriate technique when assessing proprioceptive mechanisms in different contexts.

  18. Theory of and Method for Nontraditional Mining Assessment

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    NONTRADITIONAL MINING ASSESSMENT THEORY. Nontraditional mineral resources are the potential mineral resources that are ignored, undiscovered, or unutilized under present technical, economic, and environmental conditions. The research scope can be listed as follows: (1) Nontraditional mineral resources refer to new types, new depths, new scopes, new techniques, and new forms of utilization. (2) Nontraditional theories and methods include new theories, new technologies, and new methods in the aspects of ore-forming, prospecting, mining, metallurgy, and mining assessment, such as nontraditional ore-forming prediction. (3) Nontraditional mining refers to clean and unpolluting mining, intensive mining, high value-added mining, high-technology mining, the post-mining economy, and comprehensive-service mining.

  19. A system boundary identification method for life cycle assessment

    DEFF Research Database (Denmark)

    Li, Tao; Zhang, Hongchao; Liu, Zhichao

    2014-01-01

    Life cycle assessment (LCA) is a useful tool for quantifying the overall environmental impacts of a product, process, or service. The scientific scope and boundary definition are important to ensure the accuracy of LCA results. Defining the boundary in LCA is difficult and there are no commonly accepted scientific methods yet. The objective of this research is to present a comprehensive discussion of system boundaries in LCA and to develop an appropriate boundary delimitation method. A product system is partitioned into the primary system and interrelated subsystems. It is found from this research that the system boundary curve describes the growth of life cycle impact assessment (LCIA) results with the number of processes considered, and that the gradient of the fitting curve tends to zero gradually. According to the threshold rules, a relatively accurate system boundary can be obtained. The hierarchical relationship …

  20. Cardiotocography (CTG) as the screening method of fetal condition assessment

    Directory of Open Access Journals (Sweden)

    V. Zulčić-Nakić

    2007-02-01

    A basic function of fetal monitoring is the analysis of fetal cardiac action. Cardiotocography (CTG) cannot provide all necessary information for assessment of the fetal condition, as it is not sufficiently reliable and gives a large number of false positive results that increase the number of cesarean sections. The objective of this work was to establish the reliability of CTG as a method for assessment of the intrapartal fetal condition. Based on CTG parameters (baseline fetal heart rate, fetal heart rate variability, oscillations and decelerations), 100 pathological CTG records, collected at the Obstetrics and Gynecology Department of the Tuzla University Clinic Hospital from 01.12.2004 to 05.08.2005, were identified. Using binomial distribution they were classified as non-pathological (indicating absence of asphyxia) and pathological (indicating possible presence of asphyxia). After the delivery, the condition of the newborns was assessed according to the Apgar score. Comparison between particular pathological parameters of the CTG records and the newborns' condition at birth showed high positive predictive values, whereas sensitivity and accuracy were low. A first Apgar score of 7 or above was given to 96 (96%) newborns, whereas a second Apgar score of 7 or above was given to all the newborns with previous pathological CTG records. The results confirm that CTG can be used only as a screening method for assessment of the intrapartal fetal condition.

  1. Improved GIS-based Methods for Traffic Noise Impact Assessment

    DEFF Research Database (Denmark)

    Nielsen, Otto Anker; Bloch, Karsten Sand

    1996-01-01

    In the traditional noise buffer technique, buffers are drawn around roads and the number of buildings within the buffers is enumerated. This technique provides an inaccurate assessment of the noise diffusion since it does not correct for building barriers and reflection of noise. The paper presents the results from a research project where the traditional noise buffer technique was compared with a new method which includes these corrections. Both methods follow the Common Nordic Noise Calculation Model, although the traditional buffer technique ignores parts of the model. The basis for the work was a digital map of roads and building polygons, combined with a traffic- and road-model derived from the 2D digital map by utilising the information in the BBR-register. Thus, the new method can also estimate the noise on each floor, and it takes care of the differences in barriers from tall buildings versus low buildings. The practical testing of the methods in Middelfart showed that the traditional …

  2. Employment of kernel methods on wind turbine power performance assessment

    DEFF Research Database (Denmark)

    Skrimpas, Georgios Alexandros; Sweeney, Christian Walsted; Marhadi, Kun S.

    2015-01-01

    A power performance assessment technique is developed for the detection of power production discrepancies in wind turbines. The method employs a widely used nonparametric pattern recognition technique, the kernel methods. The evaluation is based on the trending of a feature extracted from the kernel matrix, called the similarity index, which is introduced by the authors for the first time. The operation of the turbine, and consequently the computation of the similarity indexes, is classified into five power bins, offering better resolution and thus more consistent root cause analysis. The accurate …

  3. Comparison of two nutritional assessment methods in gastroenterology patients

    Institute of Scientific and Technical Information of China (English)

    Branka F Filipović; Milan Gajić; Nikola Milinić; Branislav Milovanović; Branislav R Filipović; Mirjana Cvetković; Nela Šibalić

    2010-01-01

    AIM: To investigate and compare the efficacy of and differences between two methods for the nutritional status evaluation of gastroenterology patients: subjective global assessment (SGA) and the nutritional risk index (NRI). METHODS: The investigation was performed on 299 hospitalized patients, aged 18-84 years (mean 55.57 ± 12.84), with different gastrointestinal pathology, admitted to the Department of Gastroenterohepatology, Clinical and Hospital Center "Bezanijska Kosa" during a period of 180 d. All the …

  4. Combining different methods improves assessment of competence in colonoscopy

    DEFF Research Database (Denmark)

    Konge, Lars; Svendsen, Morten Bo Søndergaard; Preisler, Louise

    2017-01-01

    Composite score calculations were used to explore different combinations of the measures. RESULTS: Twenty physicians were included in the study. The reliability (Cronbach's alpha) was 0.92, 0.57, 0.87 and 0.55 for the subjective score assessed under direct observation, time to cecum, distance between the operator's hands, and colonoscopy progression score, respectively. Equal weight (25%) to all four methods resulted in a reliability of 0.91, and optimal weighting of the methods (55%, 10%, 25% and 10%, respectively) resulted in a maximum reliability of 0.95. CONCLUSION: Combining subjective expert ratings …

  5. The Current Status of Peer Assessment Techniques and Sociometric Methods.

    Science.gov (United States)

    Bukowski, William M; Castellanos, Melisa; Persram, Ryan J

    2017-09-01

    Current issues in the use of peer assessment techniques and sociometric methods are discussed. Attention is paid to the contributions of the four articles in this volume. Together these contributions point to the continual level of change and progress in these techniques. They also show that the paradigm underlying these methods has been unchanged for decades. It is argued that this domain is ripe for a paradigm change that takes advantage of recent developments in statistical techniques and technology. © 2017 Wiley Periodicals, Inc.

  6. A Novel Situation Assessment Method for Network Survivability

    Institute of Scientific and Technical Information of China (English)

    WANG Jian; WANG Huiqiang; ZHAO Guosheng

    2006-01-01

    Survivability has emerged as a new phase in the development of network security techniques, and quantifying survivability helps to evaluate a network system exactly in different environments. In this paper, we adopt a stochastic method called sequential Monte Carlo and try to reflect the dynamic evolution of the network survivability situation over several time sequences. The experimental results show that this method has the features of quantitative description, real-time calculation and dynamic tracking, and that it is a good situation assessment solution for network survivability.
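
    The abstract gives no model details, but the flavor of a time-sequence Monte Carlo survivability estimate can be sketched as follows. All parameters (component count, failure and repair probabilities) are hypothetical; many runs of a time-stepped failure/repair process are averaged into a survivability-versus-time curve.

```python
import random

def simulate(n_steps, n_runs, n_components=3, p_fail=0.05, p_repair=0.3, seed=42):
    """Estimate P(service available) at each time step by Monte Carlo.

    The service is considered available while at least one of the
    redundant components is up; components fail and are repaired
    independently with fixed per-step probabilities.
    """
    rng = random.Random(seed)
    avail_counts = [0] * n_steps
    for _ in range(n_runs):
        up = [True] * n_components
        for t in range(n_steps):
            # up components may fail; down components may be repaired
            up = [(rng.random() > p_fail) if u else (rng.random() < p_repair)
                  for u in up]
            if any(up):
                avail_counts[t] += 1
    return [c / n_runs for c in avail_counts]

curve = simulate(n_steps=20, n_runs=2000)
```

Averaging over runs is what gives the "quantitative description" the abstract mentions; tracking the curve as new observations arrive is the dynamic-tracking aspect.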

  7. [Methods of radiological bone age assessment (author's transl)].

    Science.gov (United States)

    Fendel, H

    1976-09-01

    An assessment of bone age can be made in different manners. Numerical methods that count the number of existing ossification centers are too inaccurate. The use of "age-of-appearance" tables gives a more accurate evaluation. In both methods, however, x-ray films of several body parts must be made; they are therefore complicated and lead to higher patient radiation exposure. Methods using the hand and wrist as a representative area of the whole skeleton are of greater value for routine bone-age assessments. The Greulich-Pyle atlas is in widespread use, and the atlas method is fully sufficient in the great majority of cases when certain rules are observed. More detailed information can be obtained by using the so-called "bone-by-bone" evaluation. A score system introduced by Tanner and Whitehouse should be used to a greater extent than it is at present. Metrical methods give no real information about bone age, but they provide additional information that can be helpful in follow-up examinations at short intervals.

  8. Numerical methods for assessment of the ship's pollutant emissions

    Science.gov (United States)

    Jenaru, A.; Acomi, N.

    2016-08-01

    The maritime transportation sector constitutes a source of atmospheric pollution. To avoid or minimize ships' pollutant emissions, the first step is to assess them. Two methods for estimating ships' emissions are proposed in this paper. These methods prove their utility, from a practical perspective, for shipboard and shore-based management personnel. The methods were demonstrated for a product tanker vessel on which a permanent monitoring system for pollutant emissions had previously been fitted. The values of the polluting agents in the exhaust gas were determined for the ship at shipyard delivery and were used as a starting point. Based on these values, the paper aims at numerically assessing the ship's emissions in order to determine ways of avoiding environmental pollution: an analytical method of determining the concentrations of the exhaust gas components, using the computation program MathCAD, and a graphical method of determining the concentrations of the exhaust gas components, using variation diagrams of the parameters into which the results of the on-board measurements were introduced, following the application of pertinent correction factors. The results should be regarded as a supporting tool in the decision-making process linked to the reduction of a ship's pollutant emissions.

  9. A fast RCS accuracy assessment method for passive radar calibrators

    Science.gov (United States)

    Zhou, Yongsheng; Li, Chuanrong; Tang, Lingli; Ma, Lingling; Liu, Qi

    2016-10-01

    In microwave radar radiometric calibration, the corner reflector acts as the standard reference target, but its structure is usually deformed during transportation and installation, or deformed by wind and gravity while permanently installed outdoors, which decreases the RCS accuracy and therefore the radiometric calibration accuracy. A fast RCS accuracy measurement method based on a 3-D measuring instrument and RCS simulation is proposed in this paper for tracking the characteristic variation of the corner reflector. In the first step, an RCS simulation algorithm is selected and its simulation accuracy assessed. In the second step, the 3-D measuring instrument is selected and its measuring accuracy evaluated. Once the accuracy of the selected RCS simulation algorithm and 3-D measuring instrument is satisfactory for the RCS accuracy assessment, the 3-D structure of the corner reflector is obtained by the 3-D measuring instrument, and the RCSs of the obtained 3-D structure and the corresponding ideal structure are calculated based on the selected RCS simulation algorithm. The final RCS accuracy is the absolute difference of the two RCS calculation results. The advantage of the proposed method is that it can easily be applied outdoors, avoiding the correlation among the plate edge length error, plate orthogonality error and plate curvature error. The accuracy of this method is higher than that of the method using a distortion equation. At the end of the paper, a measurement example is presented to show the performance of the proposed method.
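
    The paper simulates the RCS of the full measured 3-D structure; as a simplified stand-in, the idea of "accuracy as the difference between ideal and as-built RCS" can be sketched with the closed-form peak-RCS expression for a triangular trihedral corner reflector, σ = 4πa⁴/(3λ²). The edge lengths and wavelength below are hypothetical, not taken from the paper.

```python
import math

def trihedral_peak_rcs(a, wavelength):
    """Peak RCS (m^2) of a triangular trihedral corner reflector with
    edge length a, from the standard physical-optics result
    sigma = 4*pi*a^4 / (3*lambda^2)."""
    return 4 * math.pi * a**4 / (3 * wavelength**2)

def rcs_accuracy_db(a_ideal, a_measured, wavelength):
    """Absolute difference between ideal and as-built peak RCS, in dB."""
    s_ideal = trihedral_peak_rcs(a_ideal, wavelength)
    s_meas = trihedral_peak_rcs(a_measured, wavelength)
    return abs(10 * math.log10(s_meas / s_ideal))

# hypothetical example: 1 mm edge shortening on a 30 cm reflector
# observed at a 5.6 cm wavelength
delta = rcs_accuracy_db(0.30, 0.299, 0.056)
```

Because RCS scales as a⁴, even a millimeter-scale deformation produces a measurable dB offset, which is why tracking the deformed geometry matters for calibration.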

  10. Assessing Security of Supply: Three Methods Used in Finland

    Science.gov (United States)

    Sivonen, Hannu

    Public Private Partnership (PPP) has an important role in securing supply in Finland. Three methods are used in assessing the level of security of supply. First, in national expert groups, a linear mathematical model has been used. The model is based on interdependency estimates. It ranks societal functions or its more detailed components, such as items in the food supply chain, according to the effect and risk pertinent to the interdependencies. Second, the security of supply is assessed in industrial branch committees (clusters and pools) in the form of indicators. The level of security of supply is assessed against five generic factors (dimension 1) and tens of business branch specific functions (dimension 2). Third, in two thousand individual critical companies, the maturity of operational continuity management is assessed using Capability Maturity Model (CMM) in an extranet application. The pool committees and authorities obtain an anonymous summary. The assessments are used in allocating efforts for securing supply. The efforts may be new instructions, training, exercising, and in some cases, investment and regulation.

  11. Assessment of disinfection of hospital surfaces using different monitoring methods.

    Science.gov (United States)

    Ferreira, Adriano Menis; de Andrade, Denise; Rigotti, Marcelo Alessandro; de Almeida, Margarete Teresa Gottardo; Guerra, Odanir Garcia; dos Santos Junior, Aires Garcia

    2015-01-01

    To assess the efficiency of cleaning/disinfection of surfaces of an Intensive Care Unit. A descriptive-exploratory study with a quantitative approach conducted over the course of four weeks. Visual inspection, adenosine triphosphate bioluminescence and microbiological indicators were used to indicate cleanliness/disinfection. Five surfaces (bed rails, bedside tables, infusion pumps, the nurses' counter, and the medical prescription table) were assessed before and after the use of rubbing alcohol at 70% (w/v), totaling 160 samples for each method. Non-parametric tests were used, considering differences statistically significant at p < 0.05. After the disinfection process, 87.5, 79.4 and 87.5% of the surfaces were considered clean using visual inspection, adenosine triphosphate bioluminescence and microbiological analyses, respectively. A statistically significant decrease was observed in the disapproval rates after the cleaning process for all three assessment methods; visual inspection was the least reliable. The cleaning/disinfection method was efficient in reducing the microbial load and organic matter of surfaces; however, these findings require further study to clarify aspects related to the efficiency of friction, its frequency, and whether or not there is an association with other inputs to achieve improved results of the cleaning/disinfection process.

  12. Economic Benefits: Metrics and Methods for Landscape Performance Assessment

    Directory of Open Access Journals (Sweden)

    Zhen Wang

    2016-04-01

    This paper introduces an expanding research frontier in the landscape architecture discipline, landscape performance research, which embraces the scientific dimension of landscape architecture through evidence-based designs that are anchored in quantitative performance assessment. Specifically, this paper summarizes metrics and methods for determining landscape-derived economic benefits that have been utilized in the Landscape Performance Series (LPS) initiated by the Landscape Architecture Foundation. This paper identifies 24 metrics and 32 associated methods for the assessment of economic benefits found in 82 published case studies. Common issues arising through research in quantifying economic benefits for the LPS are discussed, and the various approaches taken by researchers are clarified. The paper also provides an analysis of three case studies from the LPS that are representative of common research methods used to quantify economic benefits. The paper suggests that high(er) levels of sustainability in the built environment require the integration of economic benefits into landscape performance assessment portfolios in order to forecast project success and reduce uncertainties. Therefore, evidence-based design approaches increase the scientific rigor of landscape architecture education and research, and elevate the status of the profession.

  13. Integrated Prevention of Social Dependencies in Adolescents through the Scenario Method

    Directory of Open Access Journals (Sweden)

    Marina A. Maznichenko

    2015-06-01

    This article provides a rationale for the need to take an integrated approach to the prevention of social dependencies in adolescents. Through this approach, the authors fine-tune the definition of the phenomenon of prevention of social dependencies and bring to light the potential of the scenario method in resolving the above objective. The authors describe the theoretical and practical aspects of scenario planning, concretize its objects, and provide a rationale for the method's effectiveness in studying the processes of origination and operation of social dependencies in adolescents and in projecting the process of their prevention. The authors propound a scenario-planning algorithm. The authors identify and describe model scenarios for the origination of social dependencies in adolescents: "Dependency as an outcome of interaction with an asocial group/person", "Dependency as a response to a provocation", "Dependency as a means of deriving pleasure", "Dependency as a way to escape one's life problems", "Dependency as an outcome of the change of a constructive way of interacting with the object of dependency to an unconstructive one", and "Dependency as a way to express the adolescent's protest"; the authors determine unproductive stratagems for these scenarios. The authors identify the mechanisms underlying the origination of dependency: "the motive-to-goal shift", emotional-positive conditioning, social contagion, substitution, compensation for negative emotions, and unblocking. The authors classify by degree of productivity, and correlate with adolescents' behavioral scenarios, model plots and social-pedagogical scenarios for the interaction of pedagogues and parents in resolving objectives in the integrated prevention of social dependencies. The authors provide plots and scenarios for the interaction of pedagogues and parents within the "field of cooperation" and "field of building up cooperation" …

  14. Assessment of self-consistent field convergence in spin-dependent relativistic calculations

    Science.gov (United States)

    Nakano, Masahiko; Seino, Junji; Nakai, Hiromi

    2016-07-01

    This Letter assesses the self-consistent field (SCF) convergence behavior in the generalized Hartree-Fock (GHF) method. Four acceleration algorithms were implemented for efficient SCF convergence in the GHF method: the damping algorithm, the conventional direct inversion in the iterative subspace (DIIS), the energy-DIIS (EDIIS), and a combination of DIIS and EDIIS. Four different systems with varying complexity were used to investigate the SCF convergence using these algorithms, ranging from atomic systems to metal complexes. The numerical assessments demonstrated the effectiveness of a combination of DIIS and EDIIS for GHF calculations in comparison with the other discussed algorithms.
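
    Of the four algorithms compared, damping (linear mixing) is the simplest and can be illustrated generically. The sketch below applies it to a scalar toy self-consistency condition rather than to a GHF density matrix; the function, mixing parameter and tolerance are illustrative only.

```python
import math

def damped_fixed_point(f, x0, alpha=0.5, tol=1e-10, max_iter=200):
    """Damping (linear mixing): x_{k+1} = (1 - alpha)*x_k + alpha*f(x_k).

    With alpha = 1 this is plain fixed-point iteration; alpha < 1
    damps oscillations, which is the role the algorithm plays in
    stabilizing SCF convergence.
    """
    x = x0
    for k in range(max_iter):
        x_new = (1 - alpha) * x + alpha * f(x)
        if abs(x_new - x) < tol:
            return x_new, k + 1
        x = x_new
    raise RuntimeError("did not converge")

# toy self-consistency condition: x = cos(x)
root, iters = damped_fixed_point(math.cos, x0=1.0)
```

DIIS and EDIIS refine this idea by mixing several previous iterates with extrapolated weights instead of a fixed alpha.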

  15. Analysis of the Toolkit method for the time-dependent Schrödinger equation

    CERN Document Server

    Baudouin, Lucie; Turinici, Gabriel

    2009-01-01

    The goal of this paper is to provide an analysis of the "toolkit" method used in the numerical approximation of the time-dependent Schrödinger equation. The "toolkit" method is based on precomputation of elementary propagators and has been seen to be very efficient in the optimal control framework. Our analysis shows that this method provides better results than second-order Strang operator splitting. In addition, we present two improvements of the method in the limits of low- and large-intensity control fields.

  17. Impact of Different Obesity Assessment Methods after Acute Coronary Syndromes

    Directory of Open Access Journals (Sweden)

    Caroline N. M. Nunes

    2014-07-01

    Background: Abdominal obesity is an important cardiovascular risk factor. Therefore, identifying the best method for measuring waist circumference (WC) is a priority. Objective: To evaluate eight methods of measuring WC in patients with acute coronary syndrome (ACS) as predictors of cardiovascular complications during hospitalization. Methods: Prospective study of patients with ACS. The measurement of WC was performed by eight known methods: (1) midpoint between the last rib and the iliac crest, (2) point of minimum circumference, (3) immediately above the iliac crest, (4) umbilicus, (5) one inch above the umbilicus, (6) one centimeter above the umbilicus, (7) smallest rib, and (8) the point of greatest circumference around the waist. Complications included: angina, arrhythmia, heart failure, cardiogenic shock, hypotension, pericarditis and death. Logistic regression tests were used for predictive factors. Results: A total of 55 patients were evaluated. During the hospitalization period, which corresponded on average to seven days, 37 (67%) patients had complications, with the exception of death, which was not observed in any of the cases. Of these complications, the only one that was associated with WC was angina, and with every cm of WC increase, the risk for angina increased from 7.5 to 9.9%, depending on the measurement site. Notably, there was no difference between the different methods of measuring WC as predictors of angina. Conclusion: All eight methods of measuring WC are predictors of recurrent angina after acute coronary syndromes.

  18. Assessment of Wind Turbine for Site-Specific Conditions using Probabilistic Methods

    DEFF Research Database (Denmark)

    Heras, Enrique Gómez de las; Gutiérrez, Roberto; Azagra, Elena

    2013-01-01

    This paper describes a new approach to assessing the structural integrity of wind turbines for site-specific conditions using probabilistic methods, taking into account the particular uncertainties associated with each site. The new approach intends to improve the site suitability analysis of wind turbines, helping the decision making during the site assessment phase of wind farm design. First, the design equation for the failure mode of interest is defined, where the loads associated with the site-specific wind conditions are compared with the design limits of the structural component. A limit state equation is defined, making the loads and resistance depend on a set of stochastic variables representing the uncertainties. In this paper, special focus is put on the uncertainties related to the assessment of wind data, which is the main input for the site-specific load assessment, and can …
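
    The limit state formulation can be sketched with a minimal Monte Carlo reliability estimate. The distributions and parameters below are hypothetical, not taken from the paper: the failure probability is estimated as the fraction of samples in which the site-specific load effect S exceeds the resistance R, i.e. the limit state g = R - S is negative.

```python
import random

def failure_probability(n_samples=200_000, seed=1):
    """Monte Carlo estimate of P(g < 0) for the limit state g = R - S,
    with resistance R ~ Normal(mu_R, sig_R) and site-specific load
    effect S ~ Normal(mu_S, sig_S). All distribution parameters are
    illustrative placeholders."""
    rng = random.Random(seed)
    mu_R, sig_R = 10.0, 1.0   # structural resistance
    mu_S, sig_S = 6.0, 1.5    # site-specific load effect
    failures = sum(
        1 for _ in range(n_samples)
        if rng.gauss(mu_R, sig_R) - rng.gauss(mu_S, sig_S) < 0.0
    )
    return failures / n_samples

pf = failure_probability()
```

Widening the load uncertainty (e.g. from poorly assessed wind data) directly inflates the estimated failure probability, which is the sensitivity the paper focuses on.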

  19. Measuring temperature-dependent activation energy in thermally activated processes: a 2D Arrhenius plot method.

    Science.gov (United States)

    Li, Jian V; Johnston, Steven W; Yan, Yanfa; Levi, Dean H

    2010-03-01

    Thermally activated processes are characterized by two key quantities, the activation energy (Ea) and the pre-exponential factor (ν0), both of which may be temperature dependent. The accurate measurement of Ea, ν0, and their temperature dependence is critical for understanding the thermal activation mechanisms of non-Arrhenius processes. However, the classic 1D Arrhenius-plot-based methods cannot unambiguously measure Ea, ν0, and their temperature dependence, due to the mathematical impossibility of resolving two unknown 1D arrays from one 1D experimental data array. Here, we propose a 2D Arrhenius plot method to solve this fundamental problem. Our approach measures Ea at any temperature by matching the first and second moments of the data, calculated with respect to temperature and rate in the 2D temperature-rate plane, and is therefore able to unambiguously resolve Ea, ν0, and their temperature dependence. A case study of deep level emission in a Cu(In,Ga)Se2 solar cell using the 2D Arrhenius plot method reveals clearly temperature-dependent behavior of Ea and ν0 that had not been observable with its 1D predecessors.
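
    For context, the classic 1D Arrhenius plot that the 2D method generalizes can be sketched as follows: when Ea and ν0 are temperature independent, rate = ν0·exp(-Ea/(kB·T)), so a linear least-squares fit of ln(rate) against 1/T recovers both. The synthetic Ea and ν0 values below are illustrative.

```python
import math

K_B = 8.617e-5  # Boltzmann constant in eV/K

def emission_rate(T, Ea, nu0):
    """Arrhenius rate law: rate = nu0 * exp(-Ea / (kB * T))."""
    return nu0 * math.exp(-Ea / (K_B * T))

def arrhenius_fit(temps, rates):
    """Classic 1D Arrhenius plot: least-squares fit of ln(rate) vs 1/T.
    The slope is -Ea/kB and the intercept is ln(nu0)."""
    xs = [1.0 / T for T in temps]
    ys = [math.log(r) for r in rates]
    n = len(xs)
    xbar, ybar = sum(xs) / n, sum(ys) / n
    slope = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
             / sum((x - xbar) ** 2 for x in xs))
    intercept = ybar - slope * xbar
    return -slope * K_B, math.exp(intercept)  # (Ea in eV, nu0)

temps = list(range(200, 321, 20))  # K
rates = [emission_rate(T, Ea=0.35, nu0=1e12) for T in temps]
Ea_fit, nu0_fit = arrhenius_fit(temps, rates)
```

The fit is exact here precisely because Ea and ν0 are constants; when either varies with temperature, a single slope/intercept pair no longer describes the data, which is the ambiguity the paper's 2D moment-matching method resolves.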

  20. Results of phacoemulsification of cataract complicated by lens subluxation depending on the ring setting method

    Directory of Open Access Journals (Sweden)

    N. G. Zavgorodnjaja

    2015-02-01

    Actuality. Extraction of cataract complicated by zonular weakness remains one of the urgent problems of eye microsurgery. Aim. To increase the efficiency of surgery for cataract complicated by lens subluxation through comparative analysis of the frequency and structure of intra- and postoperative complications, as well as the functional results of surgical treatment, in patients grouped by the ring setting method. Methods and results. 91 patients (93 eyes) who were operated on for complicated cataract were examined and divided into 2 groups depending on the method of ring setting. Conclusion. It was established that the proposed method of capsular ring implantation makes it possible to shorten patients' treatment by 2.6 days, to decrease complications by 28.43%, and to avoid such severe operative complications as vitreous prolapse, displacement of lens fragments into the vitreous body, and damage to the capsular bag.

  1. A NEW METHOD TO COMPENSATE CLUTTER RANGE DEPENDENCE FOR FORWARD LOOKING AIRBORNE RADARS

    Institute of Scientific and Technical Information of China (English)

    Jiang Dongchu; He Fei

    2010-01-01

    The clutter direction-Doppler curves are not aligned over the near-range bins for forward looking airborne radar. As a result, the performance of clutter suppression by Space-Time Adaptive Processing (STAP) degrades greatly because of the clutter range dependence. To deal with this problem, a new compensation method is proposed in this paper. The method rebuilds the clutter covariance matrix based on the spatial high-resolution Minimum Variance Distortionless Response (MVDR) spectrum, and then finds a matrix to transform the covariance matrix of a short-range gate to that of the referred far-range gate. The method compensates the clutter range dependence well, and the simulation results show its validity.

  2. Randomized Terminal Linker-dependent PCR: A Versatile and Sensitive Method for Detection of DNA Damage

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    Objective: To design and develop a novel, sensitive and versatile method for in vivo footprinting and studies of DNA damage, such as DNA adducts and strand breaks. Methods: Starting with mammalian genomic DNA, single-stranded products were made by repeated primer extension; these products were ligated to a double-stranded linker having a randomized 3′ overhang and used for PCR. DNA breaks in the p53 gene produced by the restriction endonuclease AfaI were detected using this new method followed by Southern hybridization with a DIG-labeled probe. Results: This randomized terminal linker-dependent PCR (RDPCR) method could generate band signals many-fold stronger than conventional ligation-mediated PCR (LMPCR), and it was more rapid, convenient and accurate than terminal transferase-dependent PCR (TDPCR). Conclusion: DNA strand breakage can be detected sensitively at the gene level by RDPCR. Any lesion that blocks primer extension should be detectable.

  3. Time-dependent dielectric breakdown of MgO magnetic tunnel junctions and novel test method

    Science.gov (United States)

    Kim, Kyungjun; Choi, Chulmin; Oh, Youngtaek; Sukegawa, Hiroaki; Mitani, Seiji; Song, Yunheub

    2017-04-01

    Time-dependent dielectric breakdown (TDDB), which is used to measure reliability, depends on both the thickness of the tunnel barrier and the bias voltage. In addition, the heat generated by self-heating in a magnetic tunnel junction (MTJ) affects TDDB. Therefore, we investigated TDDB with the self-heating effect for MgO tunnel barriers with thicknesses of 1.1 and 1.2 nm by the constant voltage stress (CVS) method. Using the results of this experiment, we predicted the TDDB of a 1.0 nm tunnel barrier. We also suggest the use not only of the CVS method, which is the common way of determining TDDB, but also of the constant current stress (CCS) method, which compensates for the disadvantages of the CVS method.

  4. A method to diagnose opioid dependence resulting from heroin versus prescription opioids using the Composite International Diagnostic Interview.

    Science.gov (United States)

    Potter, Jennifer S; Prather, Kristi; Kropp, Frankie; Byrne, Mimmie; Sullivan, C Rollynn; Mohamedi, Nadia; Copersino, Marc L; Weiss, Roger D

    2010-03-01

    Treatment research with opioid-dependent populations has not traditionally distinguished between those dependent on prescription opioids and those dependent on heroin. Evidence suggests there is a substantial subpopulation of individuals with opioid dependence resulting largely or exclusively from prescription opioid use. Because this subpopulation may respond to treatment differently from heroin users, a method for discriminating DSM-IV opioid dependence due to prescription opioid use would provide more precision when examining this population. This paper describes an innovative method, using a currently available diagnostic instrument, to diagnose DSM-IV opioid dependence and distinguish between dependence resulting from prescription opioids and dependence upon heroin.

  5. BODY COMPOSITION ASSESSMENT WITH SEGMENTAL MULTIFREQUENCY BIOIMPEDANCE METHOD

    Directory of Open Access Journals (Sweden)

    Jukka A. Salmi

    2003-12-01

    Body composition assessment is an important factor in weight management, exercise science and clinical health care. Bioelectrical impedance analysis (BIA) is a widely used method for estimating body composition. The purpose of this study was to evaluate the segmental multi-frequency bioimpedance method (SMFBIA) for body composition assessment against underwater weighing (UWW) and whole-body dual energy x-ray absorptiometry (DXA) in healthy obese middle-aged male subjects. The measurements were carried out at the UKK Institute for Health Promotion Research in Tampere, Finland, according to the standard procedures of BIA, UWW and DXA. Fifty-eight (n=58) male subjects, aged 36-53 years, body mass index (BMI) 24.9-40.7, were studied. Of them, forty (n=40) also underwent DXA measurement. Fat mass (FM), fat percentage (F%) and fat-free mass (FFM) were the primary outcome variables. The mean whole-body FM (±SD) was 31.5 kg (±7.3) by UWW, 29.9 kg (±8.1) by DXA and 25.5 kg (±7.6) by SMFBIA. The Pearson correlation coefficients (r) were 0.91 between UWW and SMFBIA, 0.94 between DXA and SMFBIA, and 0.91 between UWW and DXA. The mean segmental FFM (±SD) from DXA was 7.7 kg (±1.0) for the arms, 41.7 kg (±4.6) for the trunk and 21.9 kg (±2.2) for the legs; by SMFBIA, it was 8.5 kg (±0.9), 31.7 kg (±2.5) and 20.3 kg (±1.6), respectively. Pearson correlation coefficients were 0.75 for the arms, 0.72 for the legs and 0.77 for the trunk. This study demonstrates that SMFBIA is a useful method to evaluate fat mass (FM), fat-free mass (FFM) and fat percentage (F%) for the whole body. Moreover, SMFBIA is a suitable method for assessing the segmental distribution of fat-free mass compared to whole-body DXA. The results of this study indicate that the SMFBIA method may be particularly advantageous in large epidemiological studies, being a simple, rapid and inexpensive method for field use in whole-body and segmental body composition assessment.

  6. Assessment of rock burst hazards by means of seismic methods

    Energy Technology Data Exchange (ETDEWEB)

    Proskuryakov, V.M.

    1984-10-01

    Use of seismic methods for assessment of stress distribution in coal seams and in rock strata adjacent to coal seams is discussed. Analysis of information on stress distribution permits rock burst hazards to be forecast. Schemes of seismic logging used in coal mining are compared. Recommendations developed by the VNIMI Institute for optimization of seismic logging are analyzed: selecting a seismic method considering tectonics, stratification and rock properties, arrangement of seismic sources and seismic detectors, selecting the optimum parameters of seismic waves (wave frequency recommended for rocks ranges from 400 to 1000 Hz; recommended wave frequency for coal ranges from 200 to 600 Hz), measuring instruments (e.g. the ShchTsS-2 system), and calculation methods used for evaluations of seismic logging. A standardized procedure for seismic logging is recommended.

  7. Methods for assessing biochemical oxygen demand (BOD): a review.

    Science.gov (United States)

    Jouanneau, S; Recoules, L; Durand, M J; Boukabache, A; Picot, V; Primault, Y; Lakel, A; Sengelin, M; Barillon, B; Thouand, G

    2014-02-01

    The Biochemical Oxygen Demand (BOD) is one of the most widely used criteria for water quality assessment. It provides information about the readily biodegradable fraction of the organic load in water. However, this analytical method is time-consuming (generally 5 days, BOD5), and the results may vary between laboratories (by about 20%), primarily due to fluctuations in the microbial diversity of the inoculum used. Work performed during the last two decades has resulted in several technologies that are less time-consuming and more reliable. This review is devoted to the analysis of the technical features of the principal methods described in the literature in order to compare their performances (measuring window, reliability, robustness) and to identify the pros and cons of each method.

  8. Systematic evaluation of observational methods assessing biomechanical exposures at work

    DEFF Research Database (Denmark)

    Takala, Esa-Pekka; Irmeli, Pehkonen; Forsman, Mikael

    2009-01-01

    University of Science and Technology, Trondheim, 9 University of Gothenburg and National Research Centre for the Working Environment, Copenhagen   The aim of this project was to identify and systematically evaluate observational methods to assess workload on the musculoskeletal system. Searches...... observational methods presented in the literature, and to provide recommendations for their use.   METHODS   Search and selection of reference literature    Literature searches were conducted in the following electronic databases: PubMed, Embase, CISDOC, ScienceDirect, and Google Scholar. The searches started......): musculoskeletal, back, neck, extremities. The results were first screened by title and abstract. About 580 potential references were identified, including original scientific reports, reviews and internet sources. Full texts of these references were collated in electronic (or scanned) format for further...

  9. Gas and coal outbursts in Polish mines: causes and assessment methods

    Institute of Scientific and Technical Information of China (English)

    WIERZBICKI Miroslaw

    2011-01-01

    The paper presents information about the gas and coal outburst threat in Polish coal mines. It shows the methodology for threat identification and monitoring for gas and coal outbursts in Polish coal mines. One of the main methods of assessing this threat in the mining industry in Poland and China is the desorbometric method. The paper presents results of estimating the uncertainty of the desorption rate Ap, determined in situ using a liquid manometric desorbometer gauge. It was observed that if coal subgrains are present in the desorbometer container, the measured desorption rate may be up to 60% higher than the result obtained for the normative sample. Possible methods of uncertainty reduction are presented in the paper as well.

  10. Geomorphometry-based method of landform assessment for geodiversity

    Science.gov (United States)

    Najwer, Alicja; Zwoliński, Zbigniew

    2015-04-01

    Climate variability primarily induces variations in the intensity and frequency of surface processes and, consequently, principal changes in the landscape. As a result, abiotic heterogeneity may be threatened and key elements of natural diversity may even decay. The concept of geodiversity was created recently and has rapidly gained the approval of scientists around the world. However, problem recognition is still at an early stage, and little progress has been made concerning its assessment and geovisualisation. Geographical Information System (GIS) tools currently provide wide possibilities for studies of the Earth's surface. Very often, the main limitation in such analyses is the acquisition of geodata at an appropriate resolution. The main objective of this study was to develop an algorithm for landform geodiversity assessment using geomorphometric parameters. Furthermore, the final maps were compared to those resulting from the thematic layers method. The study area consists of two distinctive valleys, characterized by diverse landscape units and a complex geological setting: Sucha Woda in the Polish part of the Tatra Mts. and Wrzosowka in the Sudetes Mts. Both valleys are located in National Park areas. The basis for the assessment is a proper selection of geomorphometric parameters with reference to the definition of geodiversity. Seven factor maps were prepared for each valley: General Curvature, Topographic Openness, Potential Incoming Solar Radiation, Topographic Position Index, Topographic Wetness Index, Convergence Index and Relative Heights. After data integration and the necessary geoinformation analysis, the next step, with a certain degree of subjectivity, is score classification of the input maps using an expert system and geostatistical analysis. The crucial point in generating the final maps of geodiversity by multi-criteria evaluation (MCE) with the GIS-based Weighted Sum technique is to assign appropriate weights for each factor map by
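
    The GIS-based Weighted Sum step can be sketched with plain arrays standing in for raster layers. The seven factor names follow the abstract, but the grid size, class scores and weights below are hypothetical assumptions, not the study's values:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Hypothetical score-classified factor maps (classes 1-5) on a common grid;
    # in the study these would be the seven geomorphometric parameter maps.
    shape = (100, 100)
    names = ["curvature", "openness", "solar", "tpi", "twi", "convergence", "rel_height"]
    factors = {n: rng.integers(1, 6, size=shape).astype(float) for n in names}

    # Expert-assigned weights (assumed values); summing to 1 keeps the result
    # on the same 1-5 class scale as the inputs.
    weights = {"curvature": 0.10, "openness": 0.15, "solar": 0.10, "tpi": 0.20,
               "twi": 0.15, "convergence": 0.15, "rel_height": 0.15}
    assert abs(sum(weights.values()) - 1.0) < 1e-9

    # Weighted Sum: per-cell linear combination of the factor maps
    geodiversity = sum(w * factors[k] for k, w in weights.items())

    print(geodiversity.min(), geodiversity.max())  # stays within class bounds 1..5
    ```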

  11. Dependence of Growing High-Quality Gem Diamonds on Growth Rates by Temperature Gradient Method

    Institute of Scientific and Technical Information of China (English)

    ZANG Chuan-Yi; JIA Xiao-Peng; REN Guo-Zhong; WANG Xian-Cheng

    2004-01-01

    Using the temperature gradient method under high pressure and high temperature, we investigate the dependence of growing high-quality gem diamond crystals on the growth rate. It is found that the lower the growth rate, the larger the temperature range in which high-quality gem diamond crystals can be grown, and the easier the control of temperature.

  12. A simple method to discriminate between beta(2)-glycoprotein I- and prothrombin-dependent lupus anticoagulants

    NARCIS (Netherlands)

    Simmelink, MJA; Derksen, RHWM; Arnout, J; De Groot, PG

    2003-01-01

    Lupus anticoagulants (LAC) are a heterogeneous group of autoantibodies that prolong phospholipid-dependent clotting assays. The autoantibodies that cause LAC activity are predominantly directed against beta(2)-glycoprotein I (beta(2)GPI) or prothrombin. In the present study, we describe a method to

  13. Finite-difference, spectral and Galerkin methods for time-dependent problems

    Science.gov (United States)

    Tadmor, E.

    1983-01-01

    Finite difference, spectral and Galerkin methods for the approximate solution of time dependent problems are surveyed. A unified discussion on their accuracy, stability and convergence is given. In particular, the dilemma of high accuracy versus stability is studied in some detail.
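
    The accuracy-versus-stability dilemma mentioned in the survey can be illustrated with the simplest case: an explicit finite-difference scheme for the heat equation, which is stable only when dt/dx² ≤ 1/2. This is a standard textbook result, not code from the paper:

    ```python
    import numpy as np

    # Explicit (FTCS) finite differences for u_t = u_xx on [0, pi] with
    # u(x, 0) = sin(x) and u(0) = u(pi) = 0; exact solution exp(-t) sin(x).
    # The scheme is stable only for r = dt/dx**2 <= 1/2.
    nx = 51
    x = np.linspace(0.0, np.pi, nx)
    dx = x[1] - x[0]
    dt = 0.4 * dx**2              # r = 0.4 < 0.5, stable
    steps = int(round(0.5 / dt))
    dt = 0.5 / steps              # land exactly at t = 0.5 (r stays below 0.5)

    u = np.sin(x)
    for _ in range(steps):
        u[1:-1] = u[1:-1] + (dt / dx**2) * (u[2:] - 2 * u[1:-1] + u[:-2])

    exact = np.exp(-0.5) * np.sin(x)
    err = np.max(np.abs(u - exact))
    print(f"max error at t=0.5: {err:.2e}")
    ```

    Raising r above 1/2 makes the same loop blow up within a few dozen steps, while shrinking dt further buys accuracy only at quadratic cost in work, which is exactly the trade-off the survey formalizes.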

  14. A sparse collocation method for solving time-dependent HJB equations using multivariate B-splines

    NARCIS (Netherlands)

    Govindarajan, N.; De Visser, C.C.; Krishnakumar, K.

    2014-01-01

    This paper presents a sparse collocation method for solving the time-dependent Hamilton–Jacobi–Bellman (HJB) equation associated with the continuous-time optimal control problem on a fixed, finite timehorizon with integral cost functional. Through casting the problem in a recursive framework using t

  15. Solving Ratio-Dependent Predator-Prey System with Constant Effort Harvesting Using Variational Iteration Method

    DEFF Research Database (Denmark)

    Ghotbi, Abdoul R; Barari, Amin

    2009-01-01

    Due to wide range of interest in use of bio-economic models to gain insight in to the scientific management of renewable resources like fisheries and forestry, variational iteration method (VIM) is employed to approximate the solution of the ratio-dependent predator-prey system with constant effort...

  17. A time-dependent neutron transport method of characteristics formulation with time derivative propagation

    Science.gov (United States)

    Hoffman, Adam J.; Lee, John C.

    2016-02-01

    A new time-dependent Method of Characteristics (MOC) formulation for nuclear reactor kinetics was developed utilizing angular flux time-derivative propagation. This method avoids the requirement of storing the angular flux at previous points in time to represent a discretized time derivative; instead, an equation for the angular flux time derivative along 1D spatial characteristics is derived and solved concurrently with the 1D transport characteristic equation. This approach allows the angular flux time derivative to be recast principally in terms of the neutron source time derivatives, which are approximated to high-order accuracy using the backward differentiation formula (BDF). This approach, called Source Derivative Propagation (SDP), drastically reduces the memory requirements of time-dependent MOC relative to methods that require storing the angular flux. An SDP method was developed for 2D and 3D applications and implemented in the computer code DeCART in 2D. DeCART was used to model two reactor transient benchmarks: a modified TWIGL problem and a C5G7 transient. The SDP method accurately and efficiently replicated the solution of the conventional time-dependent MOC method using two orders of magnitude less memory.
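
    The backward differentiation formula (BDF) used above to approximate the source time derivatives can be sketched in isolation. The BDF2 stencil below is the standard one; the smooth test function is an arbitrary stand-in for a neutron source, not anything from DeCART:

    ```python
    import numpy as np

    def bdf2_derivative(f_n, f_nm1, f_nm2, dt):
        """Second-order backward differentiation approximation of df/dt at t_n:
        (3 f_n - 4 f_{n-1} + f_{n-2}) / (2 dt)."""
        return (1.5 * f_n - 2.0 * f_nm1 + 0.5 * f_nm2) / dt

    # Verify second-order accuracy on a smooth 'source': q(t) = exp(-t) cos(t)
    q = lambda t: np.exp(-t) * np.cos(t)
    dq = lambda t: -np.exp(-t) * (np.cos(t) + np.sin(t))

    t, errs = 1.0, []
    for dt in (1e-2, 5e-3):
        approx = bdf2_derivative(q(t), q(t - dt), q(t - 2 * dt), dt)
        errs.append(abs(approx - dq(t)))

    order = np.log2(errs[0] / errs[1])  # halving dt should quarter the error
    print(f"observed order ≈ {order:.2f}")
    ```

    Because only a few previous source values enter the stencil, the scheme needs far less storage than keeping the full angular flux history, which is the memory saving SDP exploits.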

  18. Functional Brain Networks: Does the Choice of Dependency Estimator and Binarization Method Matter?

    Science.gov (United States)

    Jalili, Mahdi

    2016-07-01

    The human brain can be modelled as a complex networked structure with brain regions as individual nodes and their anatomical/functional links as edges. Functional brain networks are constructed by first extracting weighted connectivity matrices, and then binarizing them to minimize the noise level. Different methods have been used to estimate the dependency values between the nodes and to obtain a binary network from a weighted connectivity matrix. In this work we study topological properties of EEG-based functional networks in Alzheimer’s Disease (AD). To estimate the connectivity strength between two time series, we use Pearson correlation, coherence, phase order parameter and synchronization likelihood. In order to binarize the weighted connectivity matrices, we use Minimum Spanning Tree (MST), Minimum Connected Component (MCC), uniform threshold and density-preserving methods. We find that the detected AD-related abnormalities highly depend on the methods used for dependency estimation and binarization. Topological properties of networks constructed using coherence method and MCC binarization show more significant differences between AD and healthy subjects than the other methods. These results might explain contradictory results reported in the literature for network properties specific to AD symptoms. The analysis method should be seriously taken into account in the interpretation of network-based analysis of brain signals.
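
    One combination discussed above, Pearson correlation for dependency estimation followed by MST binarization, can be sketched with SciPy on synthetic signals standing in for EEG channels. The data and the 8-channel layout are illustrative assumptions:

    ```python
    import numpy as np
    from scipy.sparse.csgraph import minimum_spanning_tree

    rng = np.random.default_rng(2)

    # Hypothetical multichannel signals standing in for EEG (8 channels x 1000 samples)
    signals = rng.standard_normal((8, 1000))
    signals[1] += 0.7 * signals[0]          # inject some dependence between channels

    # Dependency estimation: absolute Pearson correlation as the connectivity weight
    w = np.abs(np.corrcoef(signals))
    np.fill_diagonal(w, 0.0)

    # MST binarization: strong dependence = short distance; keep the spanning backbone
    dist = 1.0 - w
    np.fill_diagonal(dist, 0.0)             # zero entries are treated as absent edges
    mst = minimum_spanning_tree(dist).toarray()
    adj = ((mst + mst.T) > 0).astype(int)   # symmetric binary adjacency

    print("edges kept:", adj.sum() // 2)    # an MST on n nodes has n - 1 edges
    ```

    The MST keeps the same number of edges for every subject, which is why it is popular for group comparisons: differences in network topology cannot be confounded by differences in overall connectivity strength.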

  19. Assessing the effects of cocaine dependence and pathological gambling using group-wise sparse representation of natural stimulus FMRI data.

    Science.gov (United States)

    Ren, Yudan; Fang, Jun; Lv, Jinglei; Hu, Xintao; Guo, Cong Christine; Guo, Lei; Xu, Jiansong; Potenza, Marc N; Liu, Tianming

    2016-10-04

    Assessing functional brain activation patterns in neuropsychiatric disorders such as cocaine dependence (CD) or pathological gambling (PG) under naturalistic stimuli has received rising interest in recent years. In this paper, we propose and apply a novel group-wise sparse representation framework to assess differences in neural responses to naturalistic stimuli across multiple groups of participants (healthy control, cocaine dependence, pathological gambling). Specifically, natural stimulus fMRI (N-fMRI) signals from all three groups of subjects are aggregated into a big data matrix, which is then decomposed into a common signal basis dictionary and associated weight coefficient matrices via an effective online dictionary learning and sparse coding method. The coefficient matrices associated with each common dictionary atom are statistically assessed for each group separately. With the inter-group comparisons based on the group-wise correspondence established by the common dictionary, our experimental results demonstrated that the group-wise sparse coding and representation strategy can effectively and specifically detect brain networks/regions affected by different pathological conditions of the brain under naturalistic stimuli.
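
    The group-wise strategy of stacking all groups into one matrix, learning a common dictionary, and then comparing the coefficient rows per group can be sketched with scikit-learn's generic dictionary learning. The group sizes, signal dimensions and mean offsets are hypothetical, and this is not the authors' online dictionary learning code:

    ```python
    import numpy as np
    from sklearn.decomposition import DictionaryLearning

    rng = np.random.default_rng(3)

    # Hypothetical stand-in for N-fMRI signals: 3 groups x 20 subjects x 50 time points,
    # aggregated into one big matrix so all groups share a common signal basis.
    groups = {"control": rng.standard_normal((20, 50)),
              "CD": rng.standard_normal((20, 50)) + 0.3,
              "PG": rng.standard_normal((20, 50)) - 0.3}
    X = np.vstack(list(groups.values()))

    # Learn one common dictionary; sparse codes align rows with the stacked order
    dl = DictionaryLearning(n_components=8, alpha=0.5, max_iter=50, random_state=0)
    dl.fit(X)
    codes = dl.transform(X)

    # Group-wise comparison on the shared basis: mean loading per dictionary atom
    per_group = {g: codes[i * 20:(i + 1) * 20].mean(axis=0)
                 for i, g in enumerate(groups)}
    print({g: np.round(v[:3], 2) for g, v in per_group.items()})
    ```

    Because every group is coded against the same dictionary atoms, per-atom coefficient statistics can be compared directly across groups, which is the correspondence the paper relies on.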

  20. Assessment of a novel method for teaching veterinary parasitology.

    Science.gov (United States)

    Pereira, Mary Mauldin; Yvorchuk-St Jean, Kathleen E; Wallace, Charles E; Krecek, Rosina C

    2014-01-01

    A student-centered innovative method of teaching veterinary parasitology was launched and evaluated at the Ross University School of Veterinary Medicine (RUSVM) in St. Kitts, where Parasitology is a required course for second-semester veterinary students. A novel method, named Iron Parasitology, compared lecturer-centered teaching with student-centered teaching and assessed the retention of parasitology knowledge of students in their second semester and again when they reached their seventh semester. Members of five consecutive classes chose to participate in Iron Parasitology with the opportunity to earn an additional 10 points toward their final grade by demonstrating their knowledge, communication skills, clarity of message, and creativity in the Iron Parasitology exercise. The participants and nonparticipants were assessed using seven parameters. The initial short-term study parameters used to evaluate lecturer- versus student-centered teaching were age, gender, final Parasitology course grade without Iron Parasitology, RUSVM overall grade point average (GPA), RUSVM second-semester GPA, overall GPA before RUSVM, and prerequisite GPA before RUSVM. The long-term reassessment study assessed retention of parasitology knowledge in members of the seventh-semester class who had Iron Parasitology as a tool in their second semester. These students were invited to complete a parasitology final examination during their seventh semester. There were no statistically significant differences for the parameters measured in the initial study. In addition, Iron Parasitology did not have an effect on the retention scores in the reassessment study.

  1. Methods for assessing risks of dermal exposures in the workplace.

    Science.gov (United States)

    McDougal, James N; Boeniger, Mark F

    2002-07-01

    The skin as a route of entry for toxic chemicals has caused increasing concern over the last decade. The assessment of systemic hazards from dermal exposures has evolved over time, often limited by the amount of experimental data available. The result is that there are many methods being used to assess safety of chemicals in the workplace. The process of assessing hazards of skin contact includes estimating the amount of substance that may end up on the skin and estimating the amount that might reach internal organs. Most times, toxicology studies by the dermal route are not available and extrapolations from other exposure routes are necessary. The hazards of particular chemicals can be expressed as "skin notations", actual exposure levels, or safe exposure times. Characterizing the risk of a specific procedure in the workplace involves determining the ratio of exposure standards to an expected exposure. The purpose of this review is to address each of the steps in the process and describe the assumptions that are part of the process. Methods are compared by describing their strengths and weaknesses. Recommendations for research in this area are also included.

  2. Comparison of four methods to assess hydraulic conductivity

    Energy Technology Data Exchange (ETDEWEB)

    Benson, C.H. [Univ. of Wisconsin, Madison, WI (United States). Dept. of Civil and Environmental Engineering; Gunter, J.A. [Gunter (John A.), Round Rock, TX (United States); Boutwell, G.P. [STE, Inc., Baton Rouge, LA (United States); Trautwein, S.J. [Trautwein Soil Testing Equipment Co., Houston, TX (United States); Berzanskis, P.H. [Hoechst-Celanese, Inc., Pampa, TX (United States)

    1997-10-01

    A hydraulic conductivity assessment that was conducted on four test pads constructed to the same specifications with soil from the same source by four different contractors is described. The test pads had distinctly different field hydraulic conductivities, even though they were constructed with similar soil, to similar compaction conditions, and with similar machinery. Adequate hydration time was key in achieving low field hydraulic conductivity. More extensive processing was another factor responsible for low field hydraulic conductivity. Four different test methods were used to assess the hydraulic conductivity of each test pad: (1) sealed double-ring infiltrometers (SDRIs); (2) two-stage borehole permeameters; (3) laboratory hydraulic conductivity tests on large block specimens; and (4) laboratory hydraulic conductivity tests on small specimens collected in thin-wall sampling tubes. The tests were conducted independently by each of the writers. After the tests were completed, the results were submitted and compared. Analysis of the test results show that the three large-scale test methods generally yield similar hydraulic conductivities. For two of the test pads, however, the hydraulic conductivities of the specimens collected in sampling tubes were significantly lower than the field hydraulic conductivities. Both of these test pads had high field hydraulic conductivity. Thus, there is little value in using small specimens to assess field hydraulic conductivity.

  3. A GIS-based method for flood risk assessment

    Science.gov (United States)

    Kalogeropoulos, Kleomenis; Stathopoulos, Nikos; Psarogiannis, Athanasios; Penteris, Dimitris; Tsiakos, Chrisovalantis; Karagiannopoulou, Aikaterini; Krikigianni, Eleni; Karymbalis, Efthimios; Chalkias, Christos

    2016-04-01

    Floods are global physical hazards with negative environmental and socio-economic impacts on local and regional scales. The technological evolution of recent decades, especially in the field of geoinformatics, has offered new advantages for hydrological modelling. This study uses this technology to quantify flood risk. The study area is an ungauged catchment; using mostly GIS-based hydrological and geomorphological analysis together with a GIS-based distributed Unit Hydrograph model, a series of outcomes was obtained. More specifically, this paper examined the behaviour of the Kladeos basin (Peloponnese, Greece) using real rainfall data as well as hypothetical storms. The hydrological analysis used a Digital Elevation Model of 5x5 m pixel size, while the quantitative drainage basin characteristics were calculated and studied in terms of stream order and its contribution to flooding. Unit Hydrographs are, as is known, useful when there is a lack of data, and in this work, based on the time-area method, a sequence of flood risk assessments was made using GIS technology. Essentially, the proposed methodology estimates parameters such as discharge and flow velocity in order to quantify flood risk. Keywords: flood risk assessment quantification; GIS; hydrological analysis; geomorphological analysis.
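
    The time-area method at the core of a distributed Unit Hydrograph model can be sketched as a convolution of effective rainfall with an isochrone-area histogram. All areas and rainfall depths below are hypothetical, not Kladeos basin data:

    ```python
    import numpy as np

    # Time-area method sketch: the catchment is split into isochrone zones; the
    # area draining to the outlet within each time step forms a histogram that
    # acts as the unit hydrograph ordinates.
    area_km2 = np.array([2.0, 5.0, 8.0, 4.0, 1.0])   # area between isochrones
    rain_mm = np.array([0.0, 10.0, 20.0, 5.0])        # effective rainfall per hour
    dt_h = 1.0

    # 1 mm of rain over 1 km^2 = 1000 m^3; spread over the time step in seconds
    uh = area_km2 * 1000.0 / (dt_h * 3600.0)          # m^3/s per mm of rain

    # Discharge hydrograph: convolution of effective rainfall with the unit hydrograph
    q = np.convolve(rain_mm, uh)
    print("peak discharge: %.1f m^3/s" % q.max())
    ```

    The convolution conserves volume (total runoff equals total effective rain times total area), a quick sanity check for any unit-hydrograph implementation.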

  4. Method of the Material Stress Assessment Using the Infrared Thermography

    Directory of Open Access Journals (Sweden)

    V V Seredin

    2015-09-01

    Full Text Available The solution of a number of geological and engineering problems requires knowledge of the stress state of structural member materials. Non-destructive testing methods are used to evaluate the stress state of materials. They are based on criteria such as sound pressure, temperature and ultrasound wave characteristics, and are complemented by methods that evaluate the stress state of the material after failure. The development of this group of methods is motivated by the insufficient reliability of theoretical modelling, which in some cases does not ensure the stability of engineering facilities in practice. To avoid such emergencies, it is important to obtain information on the actual load (stress) at which the structural failure occurred. These methods are especially important as a tool for experts identifying the causes of accidents at engineering facilities. Additionally, this information allows correction of calculation models, improving the safe exploitation of facilities. The aim of this study was to develop a method for assessing the stress state of materials using infrared thermography data. Experimental studies showed that there is a relationship between the temperature on the failure surface of the material and the normal stress acting on the area of failure: the surface temperature increases with increasing normal stress in the fracture area. On the basis of this relationship, a method for determining the material stress state from infrared thermography data was worked out.

  5. Method of ecological assessment of oil-contaminated soils

    Directory of Open Access Journals (Sweden)

    O. I. Romaniuk

    2016-06-01

    Full Text Available A method for determining the ecological condition of oil-contaminated soils was developed. The method is suitable for use over a wide range of oil concentrations in soil (0-20%) and provides a quantitative assessment of phytotoxicity (effective toxicity). The method involves germinating seeds of the test objects Linum usitatissimum L., Helianthus annuus L. and Fagopyrum vulgare St. on the investigated soil (moisture 33.3%) in closed Petri dishes in the dark at +24°C. For biotesting we used the initial growth parameters of the test objects during 5 days of growth, when the toxic effect of oil is already quite evident but other damaging factors have not yet become apparent. For each test object, an optimal oil concentration range is suggested. At oil concentrations in the soil above 15.0%, phytotoxicity is >4.0 and the level of pollution is catastrophic. The method was tested on an industrial area, the dumps of the Borislav Ozokerite Mine. Environmental maps of toxicity drawn up using the different test objects (L. usitatissimum, H. annuus, F. vulgare) were similar, which additionally confirms the correctness of the method. We recommend the proposed method for identifying sites in a threatening, pre-crisis or crisis state, on which further physical-chemical studies can be conducted.

  6. Impact of Different Obesity Assessment Methods after Acute Coronary Syndromes

    Energy Technology Data Exchange (ETDEWEB)

    Nunes, Caroline N. M.; Minicucci, Marcos F.; Farah, Elaine; Fusco, Daniéliso; Azevedo, Paula S.; Paiva, Sergio A. R.; Zornoff, Leonardo A. M., E-mail: lzornoff@cardiol.br [Faculdade de Medicina de Botucatu, Botucatu, SP (Brazil)

    2014-07-15

    Abdominal obesity is an important cardiovascular risk factor. Therefore, identifying the best method for measuring waist circumference (WC) is a priority. The objective was to evaluate eight methods of measuring WC in patients with acute coronary syndrome (ACS) as predictors of cardiovascular complications during hospitalization. This was a prospective study of patients with ACS. WC was measured by eight known methods: midpoint between the last rib and the iliac crest (1); point of minimum circumference (2); immediately above the iliac crest (3); umbilicus (4); one inch above the umbilicus (5); one centimeter above the umbilicus (6); smallest rib (7); and the point of greatest circumference around the waist (8). Complications included angina, arrhythmia, heart failure, cardiogenic shock, hypotension, pericarditis and death. Logistic regression was used to identify predictive factors. A total of 55 patients were evaluated. During the hospitalization period, which averaged seven days, 37 (67%) patients had complications, with the exception of death, which was not observed in any case. Of these complications, the only one associated with WC was angina: with every 1 cm increase in WC, the risk of angina increased by 7.5 to 9.9%, depending on the measurement site. It is noteworthy that there was no difference between the eight methods of measuring WC as predictors of angina. All eight methods of measuring WC are thus comparable predictors of recurrent angina after acute coronary syndromes.

  7. Fourth-Order Splitting Methods for Time-Dependent Differential Equations

    Institute of Scientific and Technical Information of China (English)

    Jürgen Geiser

    2008-01-01

    This study was suggested by previous work on the simulation of evolution equations with scale-dependent processes, e.g., wave propagation or heat transfer, that are modeled by wave equations or heat equations. Here, we study both parabolic and hyperbolic equations. We focus on ADI (alternating direction implicit) methods and LOD (locally one-dimensional) methods, which are standard splitting methods of lower order, e.g. second order. Our aim is to develop higher-order ADI methods, obtained by Richardson extrapolation, Crank-Nicolson methods and higher-order LOD methods, based on locally higher-order methods. We discuss new theoretical results on the stability and consistency of the ADI methods. The main idea is to apply a higher-order time discretization and combine it with the ADI methods. We also discuss the discretization and splitting methods for first-order and second-order evolution equations. The stability analysis is given for the ADI method for first-order time derivatives and for the LOD (locally one-dimensional) methods for second-order time derivatives. The higher-order methods are unconditionally stable. Some numerical experiments verify our results.
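
    The Richardson extrapolation idea, combining one full step and two half steps of a low-order scheme to cancel the leading error term, can be sketched on a scalar ODE. The first-order base step below stands in for a generic low-order splitting step and is not the paper's ADI scheme:

    ```python
    import numpy as np

    def step_euler(y, t, dt, f):
        """One explicit first-order step, standing in for a low-order splitting step."""
        return y + dt * f(t, y)

    def richardson(y, t, dt, f):
        """Richardson extrapolation: combine one dt step with two dt/2 steps to
        cancel the leading O(dt) error term, giving a second-order step."""
        big = step_euler(y, t, dt, f)
        half = step_euler(step_euler(y, t, dt / 2, f), t + dt / 2, dt / 2, f)
        return 2.0 * half - big

    # Test problem y' = -y, y(0) = 1; exact solution exp(-t)
    f = lambda t, y: -y
    errs = []
    for n in (100, 200):
        dt, y = 1.0 / n, 1.0
        for i in range(n):
            y = richardson(y, i * dt, dt, f)
        errs.append(abs(y - np.exp(-1.0)))

    order = np.log2(errs[0] / errs[1])  # halving dt should quarter the error
    print(f"observed order ≈ {order:.2f}")
    ```

    The same extrapolation pattern applied to a second-order splitting step raises it toward fourth order, which is the route to the higher-order ADI methods discussed above.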

  8. Time-dependent probabilistic seismic hazard assessment and its application to Hualien City, Taiwan

    Directory of Open Access Journals (Sweden)

    C.-H. Chan

    2013-05-01

    Full Text Available Here, we propose a time-dependent probabilistic seismic hazard assessment and apply it to Hualien City, Taiwan. A declustered catalog from 1940 to 2005 was used to build a long-term seismicity rate model using a smoothing kernel function. We also evaluated short-term seismicity rate perturbations according to the rate-and-state friction model and the Coulomb stress changes imparted by earthquakes from 2006 to 2010. We assessed both long-term and short-term probabilistic seismic hazards by considering ground motion prediction equations for crustal and subduction earthquakes. The long-term seismic hazard in Hualien City gave a PGA (peak ground acceleration) of 0.46 g for the 2.1‰ annual exceedance probability. The result is similar to the levels determined in previous studies. Seismic hazards were significantly elevated following the 2007 ML = 5.8 earthquake that occurred approximately 10 km from Hualien City. This work presents a suitable mechanism for time-dependent probabilistic seismic hazard determination using an updated earthquake catalog. Requiring only minor model assumptions, our approach provides a suitable basis for rapid re-evaluations and will benefit decision-makers and public officials regarding seismic hazard mitigation.
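
    Under the Poisson occurrence model that is standard in probabilistic seismic hazard analysis (an assumption here, not stated in the abstract), the quoted 2.1‰ annual exceedance probability converts directly to a return period and to an exceedance probability over a design life:

    ```python
    import math

    # Annual exceedance probability quoted for the PGA of 0.46 g
    p_annual = 2.1e-3

    # Poisson occurrence model: convert the annual probability to an
    # equivalent annual rate, then integrate over a 50-year design life.
    return_period = 1.0 / p_annual                   # mean return period in years
    rate = -math.log(1.0 - p_annual)                 # equivalent annual rate
    p_50yr = 1.0 - math.exp(-rate * 50.0)            # exceedance in 50 years

    print(f"return period ≈ {return_period:.0f} yr, "
          f"P(exceedance in 50 yr) ≈ {p_50yr:.1%}")
    ```

    The result, roughly a 10% chance of exceedance in 50 years, shows that the 2.1‰ annual level corresponds to the design level commonly used in building codes.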

  9. How to apply the dependence structure analysis to extreme temperature and precipitation for disaster risk assessment

    Science.gov (United States)

    Feng, Jieling; Li, Ning; Zhang, Zhengtao; Chen, Xi

    2017-06-01

    IPCC reports that a changing climate can affect the frequency and intensity of extreme events. However, extremes appear in the tail of the probability distribution. To understand the relationship between extreme temperature and precipitation events in the tails, an important but previously unobserved dependence structure is analyzed in this paper. We examine the dependence structure by building a bivariate joint Gumbel copula model for temperature and precipitation, using monthly average temperature (T) and monthly precipitation (P) data from the Beijing station in China covering the period 1951-2015, and find that the dependence structure can be divided into two sections: the middle part and the upper tail. We show that T and P have a strong positive correlation in the upper-tail section (T > 25.85 °C and P > 171.1 mm) (0.66, p < 0.01), while they do not demonstrate the same relation in the other section, which suggests that identifying a strong influence of T on extreme P requires dependence structure analysis. We also find that in the upper-tail section, every 1 °C increase in T is associated with a 73.45 mm increase in P. Our results suggest that including the dependence structure between temperature and extreme precipitation will improve disaster risk assessment under future climate change scenarios. A copula-based bivariate joint probability distribution is useful for dependence structure analysis.
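
    The bivariate Gumbel copula used above has a closed form, which makes its upper-tail dependence, the property exploited for joint T/P extremes, easy to sketch. The θ value below is illustrative, not the fitted parameter from the paper:

    ```python
    import numpy as np

    def gumbel_copula_cdf(u, v, theta):
        """Bivariate Gumbel copula C(u, v) = exp(-((-ln u)^θ + (-ln v)^θ)^(1/θ)),
        valid for θ >= 1; θ = 1 is independence, and larger θ gives stronger
        upper-tail dependence -- the regime relevant to joint T/P extremes."""
        if theta < 1.0:
            raise ValueError("Gumbel copula requires theta >= 1")
        t = (-np.log(u)) ** theta + (-np.log(v)) ** theta
        return np.exp(-t ** (1.0 / theta))

    # Independence check: theta = 1 recovers C(u, v) = u * v
    u, v = 0.7, 0.4
    assert np.isclose(gumbel_copula_cdf(u, v, 1.0), u * v)

    # Upper-tail dependence coefficient lambda_U = 2 - 2^(1/theta)
    theta = 2.0
    lam_upper = 2.0 - 2.0 ** (1.0 / theta)
    print(f"C(0.7, 0.4; θ=2) = {gumbel_copula_cdf(u, v, theta):.3f}, "
          f"upper-tail dependence = {lam_upper:.3f}")
    ```

    A nonzero λ_U means joint extremes remain correlated arbitrarily far into the tail, which is exactly why a Gumbel copula, rather than a Gaussian one, suits the high-tail T/P section.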

  10. Screening-Level Ecological Risk Assessment Methods, Revision 3

    Energy Technology Data Exchange (ETDEWEB)

    Mirenda, Richard J. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2012-08-16

    This document provides guidance for screening-level assessments of potential adverse impacts to ecological resources from releases of environmental contaminants at Los Alamos National Laboratory (LANL or the Laboratory). The methods presented are based on two objectives: to provide a basis for reaching consensus with regulators, managers, and other interested parties on how to conduct screening-level ecological risk investigations at the Laboratory; and to provide guidance for ecological risk assessors under the Environmental Programs (EP) Directorate. This guidance promotes consistency, rigor, and defensibility in ecological screening investigations and in reporting investigation results. The purpose of the screening assessment is to provide information to risk managers so informed risk-management decisions can be made. This document provides examples of recommendations and possible risk-management strategies.

  11. Methods for assessing autophagy and autophagic cell death.

    Science.gov (United States)

    Tasdemir, Ezgi; Galluzzi, Lorenzo; Maiuri, M Chiara; Criollo, Alfredo; Vitale, Ilio; Hangen, Emilie; Modjtahedi, Nazanine; Kroemer, Guido

    2008-01-01

    Autophagic (or type 2) cell death is characterized by the massive accumulation of autophagic vacuoles (autophagosomes) in the cytoplasm of cells that lack signs of apoptosis (type 1 cell death). Here we detail and critically assess a series of methods to promote and inhibit autophagy via pharmacological and genetic manipulations. We also review the techniques currently available to detect autophagy, including transmission electron microscopy, half-life assessments of long-lived proteins, detection of LC3 maturation/aggregation, fluorescence microscopy, and colocalization of mitochondrion- or endoplasmic reticulum-specific markers with lysosomal proteins. Massive autophagic vacuolization may cause cellular stress and represent a frustrated attempt of adaptation. In this case, cell death occurs with (or in spite of) autophagy. When cell death occurs through autophagy, on the contrary, the inhibition of the autophagic process should prevent cellular demise. Accordingly, we describe a strategy for discriminating cell death with autophagy from cell death through autophagy.

  12. Cognitive assessment in mathematics with the least squares distance method.

    Science.gov (United States)

    Ma, Lin; Çetin, Emre; Green, Kathy E

    2012-01-01

    This study investigated the validation of comprehensive cognitive attributes of an eighth-grade mathematics test using the least squares distance method and compared performance on attributes by gender and region. A sample of 5,000 students was randomly selected from the data of the 2005 Turkish national mathematics assessment of eighth-grade students. Twenty-five math items were assessed for the presence or absence of 20 cognitive attributes (content, cognitive processes, and skill). Four attributes were found to be misspecified or nonpredictive; however, results demonstrated the validity of the revised set of 17 attributes. Girls performed similarly to boys on the attributes. Students from the two eastern regions significantly underperformed on most attributes.

  13. Assessment of Wind Turbine for Site-Specific Conditions using Probabilistic Methods

    DEFF Research Database (Denmark)

    Heras, Enrique Gómez de las; Gutiérrez, Roberto; Azagra, Elena

    2013-01-01

    This paper describes a new approach to assess the structural integrity of wind turbines for site-specific conditions using probabilistic methods, taking into account the particular uncertainties associated with each site. This new approach intends to improve the site suitability analysis of wind turbines, helping the decision making during the site assessment phase of wind farm designs. First, the design equation for the failure mode of interest is defined, where the loads associated with the site-specific wind conditions are compared with the design limits of the structural component. A limit...... be very dependent on the site. The uncertainties on the wind properties depend on issues like the available wind data, the quality of the measurement sensors, the type of terrain, and the accuracy of the engineering models for horizontal and vertical spatial extrapolation. An example is included showing two......

  14. Total System Performance Assessment - License Application Methods and Approach

    Energy Technology Data Exchange (ETDEWEB)

    J. McNeish

    2003-12-08

    "Total System Performance Assessment-License Application (TSPA-LA) Methods and Approach" provides the top-level method and approach for conducting the TSPA-LA model development and analyses. The method and approach are responsive to the criteria set forth in Total System Performance Assessment Integration (TSPAI) Key Technical Issues (KTIs) identified in agreements with the U.S. Nuclear Regulatory Commission, the "Yucca Mountain Review Plan" (YMRP), "Final Report" (NRC 2003 [163274]), and the NRC final rule 10 CFR Part 63 (NRC 2002 [156605]). This introductory section provides an overview of the TSPA-LA, the projected TSPA-LA documentation structure, and the goals of the document. It also provides a brief discussion of the regulatory framework, the approach to risk management in the development and analysis of the model, and the overall organization of the document. The section closes with some important conventions used in this document.

  15. New actigraphic assessment method for periodic leg movements (PLM).

    Science.gov (United States)

    Kazenwadel, J; Pollmächer, T; Trenkwalder, C; Oertel, W H; Kohnen, R; Künzel, M; Krüger, H P

    1995-10-01

    A new actigraphic method by which periodic leg movements (PLM) can be measured is presented. Data acquisition and analysis were brought into line to distinguish short-lasting repetitive leg movements from random motor restlessness. The definition of PLM follows the generally accepted criteria for PLM scoring. Thirty restless legs patients, all also suffering from PLM, were investigated three times by polysomnography, including tibialis anterior surface electromyography, and by actigraphy. A high correlation (reliability) was found between the two methods for the number of PLM per hour spent in bed. Furthermore, the actigraph records PLM specifically: an index of random motor restlessness is not sufficient for reliable PLM recording. In addition, periodic movements in sleep (PMS) and PLM show comparable variability in general. The actigraphic assessment of PLM, however, gives a better measure because PMS recordings may result in a substantial underestimation of PLM when sleep efficiency is reduced. This method is an ambulatory assessment tool that can also be used for screening purposes.

  16. Antioxidant activity of wine assessed by different in vitro methods

    Directory of Open Access Journals (Sweden)

    Di Lorenzo Chiara

    2017-01-01

    Full Text Available Epidemiological studies have suggested that a diet rich in antioxidant compounds could help counteract the effects of reactive oxygen species, reducing the risk factors for chronic diseases. Moderate consumption of wine, especially red wine, has been associated with reduced mortality from cardiovascular diseases. One possible reason for the protective effect of wine is its high content of polyphenols (mainly flavonoids), which have significant antioxidant activity. Even though several in vitro tests have been developed to measure antioxidant properties, no method has shown a satisfactory correlation with the in vivo situation. On this basis, the aim of this study was the application and comparison of different in vitro methods to assess the antioxidant activity of red, rosé and white wines. The methods were: (1) the Folin-Ciocalteu assay for quantification of total polyphenol content; (2) the DPPH (1,1-diphenyl-2-picrylhydrazyl) spectrophotometric assay and the Trolox Equivalent Antioxidant Capacity (TEAC) spectrophotometric assay for measuring the antioxidant activity of samples; (3) high-performance thin-layer chromatography for separation of phenolic substances and assessment of the associated antioxidant activity; and (4) electrochemical detection using a biosensor. Although all the approaches show some limitations, this battery of tests offers a more reliable body of data on the antioxidant activity of vine derivatives.
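
    For the DPPH assay listed above, antioxidant activity is conventionally reported as percent inhibition of the radical's absorbance (read at about 517 nm). A minimal sketch of the standard calculation; the absorbance values in the usage note are illustrative, not from this study:

```python
def dpph_inhibition(a_control, a_sample):
    """Percent inhibition of the DPPH radical: the drop in absorbance of
    the radical solution after adding the sample, relative to a control
    without antioxidant. Higher values mean stronger scavenging."""
    return (a_control - a_sample) / a_control * 100.0
```

    For example, a control absorbance of 1.0 and a sample absorbance of 0.4 corresponds to 60% inhibition.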

  17. Electromechanical impedance method to assess dental implant stability

    Science.gov (United States)

    Tabrizi, Aydin; Rizzo, Piervincenzo; Ochs, Mark W.

    2012-11-01

    The stability of a dental implant is a prerequisite for supporting a load-bearing prosthesis and establishment of a functional bone-implant system. Reliable and noninvasive methods able to assess the bone interface of dental and orthopedic implants (osseointegration) are increasingly demanded for clinical diagnosis and direct prognosis. In this paper, we propose the electromechanical impedance method as a novel approach for the assessment of dental implant stability. Nobel Biocare® implants with a size of 4.3 mm diameter ×13 mm length were placed inside bovine bones that were then immersed in a solution of nitric acid to allow material degradation. The degradation simulated the inverse process of bone healing. The implant-bone systems were monitored by bonding a piezoceramic transducer (PZT) to the implants’ abutment and measuring the admittance of the PZT over time. It was found that the PZT’s admittance and the statistical features associated with its analysis are sensitive to the degradation of the bones and can be correlated to the loss of calcium measured by means of the atomic absorption spectroscopy method. The present study shows promising results and may pave the road towards an innovative approach for the noninvasive monitoring of dental implant stability and integrity.

  18. The commission errors search and assessment (CESA) method

    Energy Technology Data Exchange (ETDEWEB)

    Reer, B.; Dang, V. N

    2007-05-15

    Errors of Commission (EOCs) refer to the performance of inappropriate actions that aggravate a situation. In Probabilistic Safety Assessment (PSA) terms, they are human failure events that result from the performance of an action. This report presents the Commission Errors Search and Assessment (CESA) method and describes the method in the form of user guidance. The purpose of the method is to identify risk-significant situations with a potential for EOCs in a predictive analysis. The main idea underlying the CESA method is to catalog the key actions that are required in the procedural response to plant events and to identify specific scenarios in which these candidate actions could erroneously appear to be required. The catalog of required actions provides a basis for a systematic search of context-action combinations. To focus the search towards risk-significant scenarios, the actions that are examined in the CESA search are prioritized according to the importance of the systems and functions that are affected by these actions. The existing PSA provides this importance information; the Risk Achievement Worth or Risk Increase Factor values indicate the systems/functions for which an EOC contribution would be more significant. In addition, the contexts, i.e. PSA scenarios, for which the EOC opportunities are reviewed are also prioritized according to their importance (top sequences or cut sets). The search through these context-action combinations results in a set of EOC situations to be examined in detail. CESA has been applied in a plant-specific pilot study, which showed the method to be feasible and effective in identifying plausible EOC opportunities. This experience, as well as the experience with other EOC analyses, showed that the quantification of EOCs remains an issue. The quantification difficulties and the outlook for their resolution conclude the report. (author)

  19. Element stacking method for topology optimization with material-dependent boundary and loading conditions

    DEFF Research Database (Denmark)

    Yoon, Gil Ho; Park, Y.K.; Kim, Y.Y.

    2007-01-01

    A new topology optimization scheme, called the element stacking method, is developed to better handle design optimization involving material-dependent boundary conditions and selection of elements of different types. If these problems are solved by existing standard approaches, complicated finite element models or topology optimization reformulations may be necessary. The key idea of the proposed method is to stack multiple elements on the same discretization pixel and select a single element or no element. In this method, stacked elements on the same pixel have the same coordinates but may have...

  20. Dynamic Assessment in Iranian EFL Classrooms: A Post-method Enquiry

    Directory of Open Access Journals (Sweden)

    Seyed Javad Es-hagi Sardrood

    2011-11-01

    Full Text Available Derived from the emerging paradigm shift in English language teaching and assessment, there has been a renewal of interest in dynamic assessment (DA) as an alternative to traditional static testing in language classrooms. However, to date, DA practice has been mostly limited to clinical treatment of children with learning disabilities, and it has not been widely incorporated into EFL contexts. In order to find out the reasons behind this slow uptake of DA, this research adopted a framework based on post-method pedagogical principles and recommendations to delve into the prospect of methodological realization of DA approaches in Iranian EFL classrooms. To this end, two instruments, a questionnaire and an interview, were developed to explore the practicality of DA by seeking 51 Iranian EFL teachers' perceptions of DA practice in their classrooms. The results indicated that most of the teachers were negative about the practice of DA in their classrooms and believed that a full-fledged implementation of DA in Iranian EFL classrooms is too demanding. The feasibility of DA in Iranian EFL classrooms, where teachers are deprived of DA training, guidelines, and technological resources, is seriously questioned due to factors such as the time-constrained nature of DA procedures, the large number of students in EFL classrooms, the common practice of static tests as the mainstream, and overreliance on teachers' teaching and assessment abilities. The paper suggests that the framework of inquiry in this study, derived from post-method pedagogy, be utilized as a blueprint for a critical appraisal of any alternative method or theory introduced into ELT contexts.

  1. A solution quality assessment method for swarm intelligence optimization algorithms.

    Science.gov (United States)

    Zhang, Zhaojun; Wang, Gai-Ge; Zou, Kuansheng; Zhang, Jianhua

    2014-01-01

    Nowadays, swarm intelligence optimization has become an important optimization tool widely used in many fields of application. In contrast to its many successful applications, the theoretical foundation is rather weak, and many problems remain to be solved. One problem is how to quantify the performance of an algorithm in finite time, that is, how to evaluate the solution quality obtained by an algorithm for practical problems. This greatly limits application to practical problems. A solution quality assessment method for intelligent optimization is proposed in this paper. It is an experimental analysis method based on analysis of the search space and the characteristics of the algorithm itself. Instead of "value performance," "ordinal performance" is used as the evaluation criterion in this method. The feasible solutions are clustered according to distance to divide the solution samples into several parts; then the solution space and the "good enough" set can be decomposed based on the clustering results. Finally, using statistical knowledge, the evaluation result can be obtained. To validate the proposed method, intelligent algorithms such as ant colony optimization (ACO), particle swarm optimization (PSO), and the artificial fish swarm algorithm (AFS) were applied to the traveling salesman problem. Computational results indicate the feasibility of the proposed method.

  2. Epidemiological designs for vaccine safety assessment: methods and pitfalls.

    Science.gov (United States)

    Andrews, Nick

    2012-09-01

    Three commonly used designs for vaccine safety assessment post licensure are cohort, case-control and self-controlled case series. These methods are often used with routine health databases and immunisation registries. This paper considers the issues that may arise when designing an epidemiological study, such as understanding the vaccine safety question, case definition and finding, limitations of data sources, uncontrolled confounding, and pitfalls that apply to the individual designs. The example of MMR and autism, where all three designs have been used, is presented to help consider these issues.

  3. [Assessing the quality of French language web sites pertaining to alcohol dependency].

    Science.gov (United States)

    Coquard, Olivier; Fernandez, Sebastien; Khazaal, Yasser

    2008-01-01

    The objective of this article is to systematically assess the quality of French-language web-based information on alcohol dependence. The authors analysed, using a standardised pro forma, the 20 most highly ranked pages identified by three common internet search engines using two keywords. A total of 45 sites were analysed. The authors conclude that the overall quality of the sites was relatively poor, especially for the description of possible treatments, though with wide variability. Content quality was not correlated with other aspects of quality such as interactivity, aesthetics, or accountability.

  4. A Method Based on Intuitionistic Fuzzy Dependent Aggregation Operators for Supplier Selection

    Directory of Open Access Journals (Sweden)

    Fen Wang

    2013-01-01

    Full Text Available Recently, resolving the decision-making problem of evaluating and ranking potential suppliers has become a key strategic factor for business firms. In this paper, two new intuitionistic fuzzy aggregation operators are developed: the dependent intuitionistic fuzzy ordered weighted averaging (DIFOWA) operator and the dependent intuitionistic fuzzy hybrid weighted aggregation (DIFHWA) operator. Some of their main properties are studied. A method based on the DIFHWA operator for intuitionistic fuzzy multiple attribute decision making is presented. Finally, an illustrative example concerning supplier selection is given.
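
    The "dependent" idea behind DIFOWA/DIFHWA is that the aggregation weights are computed from the arguments themselves rather than fixed in advance. A crisp (non-fuzzy) sketch of one common dependent-OWA formulation, in which arguments closer to the mean receive larger weights; this is a simplification for illustration, not the paper's intuitionistic fuzzy operators:

```python
def dependent_owa(values):
    """Dependent OWA aggregation (crisp sketch): each argument's weight is
    proportional to its similarity to the mean, so outliers are
    automatically down-weighted. Returns (aggregate, weights)."""
    n = len(values)
    mu = sum(values) / n
    dev = [abs(v - mu) for v in values]
    total_dev = sum(dev)
    if total_dev == 0:                       # all arguments equal
        weights = [1.0 / n] * n
    else:
        sim = [1.0 - d / total_dev for d in dev]
        s = sum(sim)
        weights = [x / s for x in sim]       # normalize to sum to 1
    return sum(w * v for w, v in zip(weights, values)), weights
```

    For instance, aggregating [1, 5, 6] gives the outlying value 1 the smallest weight, pulling the result above the plain mean of 4.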

  5. Singular boundary method using time-dependent fundamental solution for scalar wave equations

    Science.gov (United States)

    Chen, Wen; Li, Junpu; Fu, Zhuojia

    2016-11-01

    This study makes the first attempt to extend the meshless boundary-discretization singular boundary method (SBM) with a time-dependent fundamental solution to two-dimensional and three-dimensional scalar wave equations under Dirichlet boundary conditions. Two empirical formulas are also proposed to determine the source intensity factors. In 2D problems, the fundamental solution integrated over time is applied. In 3D problems, a time-successive evaluation approach without complicated mathematical transforms is proposed. Numerical investigations show that the present SBM methodology produces accurate results for 2D and 3D time-dependent wave problems with varied velocities c and wave numbers k.

  6. Sequential sampling: a novel method in farm animal welfare assessment.

    Science.gov (United States)

    Heath, C A E; Main, D C J; Mullan, S; Haskell, M J; Browne, W J

    2016-02-01

    Lameness in dairy cows is an important welfare issue. As part of a welfare assessment, herd level lameness prevalence can be estimated from scoring a sample of animals, where higher levels of accuracy are associated with larger sample sizes. As the financial cost is related to the number of cows sampled, smaller samples are preferred. Sequential sampling schemes have been used for informing decision making in clinical trials. Sequential sampling involves taking samples in stages, where sampling can stop early depending on the estimated lameness prevalence. When welfare assessment is used for a pass/fail decision, a similar approach could be applied to reduce the overall sample size. The sampling schemes proposed here apply the principles of sequential sampling within a diagnostic testing framework. This study develops three sequential sampling schemes of increasing complexity to classify 80 fully assessed UK dairy farms, each with known lameness prevalence. Using the Welfare Quality herd-size-based sampling scheme, the first 'basic' scheme involves two sampling events. At the first sampling event half the Welfare Quality sample size is drawn, and then depending on the outcome, sampling either stops or is continued and the same number of animals is sampled again. In the second 'cautious' scheme, an adaptation is made to ensure that correctly classifying a farm as 'bad' is done with greater certainty. The third scheme is the only scheme to go beyond lameness as a binary measure and investigates the potential for increasing accuracy by incorporating the number of severely lame cows into the decision. The three schemes are evaluated with respect to accuracy and average sample size by running 100 000 simulations for each scheme, and a comparison is made with the fixed size Welfare Quality herd-size-based sampling scheme. All three schemes performed almost as well as the fixed size scheme but with much smaller average sample sizes. For the third scheme, an overall
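
    The 'basic' two-stage scheme described above can be sketched as a simple decision rule: score half the usual sample, stop early if the prevalence estimate is clearly on one side of the pass/fail threshold, and otherwise score the second half. The threshold and stopping margin below are illustrative assumptions, not the study's calibrated rules:

```python
import random

def two_stage_classify(herd, full_n, threshold, margin=0.05, rng=None):
    """Two-stage sequential pass/fail classification sketch.
    `herd` is a list of per-cow lameness indicators (1 = lame, 0 = not);
    `full_n` plays the role of the fixed Welfare Quality sample size.
    Returns (decision, number_of_cows_actually_scored)."""
    rng = rng or random.Random(0)
    cows = rng.sample(herd, full_n)    # draw full sample, reveal in stages
    half = full_n // 2
    p1 = sum(cows[:half]) / half       # stage-1 prevalence estimate
    if abs(p1 - threshold) >= margin:  # clear-cut: stop after half the sample
        return ("fail" if p1 > threshold else "pass"), half
    p2 = sum(cows) / full_n            # borderline: score the second half too
    return ("fail" if p2 > threshold else "pass"), full_n
```

    In clear-cut herds the decision is reached with half the fixed sample size, which is the source of the average-sample-size savings reported above.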

  7. A New Method for Spatial Health Risk Assessment of Pollutants

    Directory of Open Access Journals (Sweden)

    Mohamad Sakizadeh

    2017-03-01

    Full Text Available Background: The extent of contaminated land exposed to the health risk of environmental pollutants is a matter of debate. In this study, a new method was developed to estimate the area exposed to higher than normal levels of Cr, Mn, and V. Methods: Overall, 170 soil samples were collected from the upper 10 cm of soil in an arid area of Semnan Province in the central part of Iran. The concentrations of Cr, Mn, and V were determined by the ICP-OES technique. A geostatistical method known as sequential Gaussian co-simulation was applied to assess the spatial risk of these toxic elements. Results: The moderate spatial dependence of Cr indicates the contribution of both intrinsic and extrinsic factors to the levels of this heavy metal in the study area, whereas Mn and V can be attributed to intrinsic factors (such as lithology). There was no significant influence of agricultural practices on Cr values in the region. The contaminated surface area for manganese, produced by the risk-curve-on-surface method, was larger than that for chromium and vanadium. Conclusion: The risk curves presented in this study can be adopted in similar studies to help managers estimate the total area requiring cleanup action.

  8. Compatibility assessment of methods used for soil hydrophobicity determination

    Science.gov (United States)

    Papierowska, Ewa; Szatyłowicz, Jan; Kalisz, Barbara; Łachacz, Andrzej; Matysiak, Wojciech; Debaene, Guillaume

    2016-04-01

    Soil hydrophobicity is a global problem. The effect of hydrophobicity on the soil environment is very important because it can cause irreversible changes in ecosystems, leading to their complete degradation. The choice of method used to determine soil hydrophobicity is not simple because there are no obvious criteria for selection. The results obtained by various methods may not be coherent and may indicate different degrees of hydrophobicity within the same soil sample. The objective of the study was to assess the compatibility between methods used to determine the hydrophobicity of selected organic and mineral-organic soils. Two groups of soil materials were examined: hydrogenic soils (87 soil samples) and autogenic soils (19 soil samples) collected from 41 soil profiles located in north-eastern Poland. Air-dry soil samples were used. Hydrophobicity was determined using two different methods, i.e. on the basis of wetting contact angle measurements between water and the solid phase of soils, and with water drop penetration time tests. The value of the wetting contact angle was measured using the sessile drop method with a CAM 100 optical goniometer (KSV Instruments). The wetting contact angles were determined at room temperature (20 °C) within 10 min after sample preparation using a standard procedure. In addition, water drop penetration time was measured. In order to compare the methods used for the assessment of soil hydrophobicity, an agreement-between-observers model was applied. In this model, five categories of soil hydrophobicity were proposed according to the classes used in the soil hydrophobicity classification based on the water drop penetration time test. Based on this classification, weighted kappa coefficients were calculated using SAS 9.4 (SAS Institute, 2013, Cary NC) to evaluate relationships between the different investigated methods. The results of agreement were presented in the form of agreement charts. Research results indicated good
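
    The weighted kappa coefficient used to compare the two hydrophobicity classifications can be computed as follows. This is the standard linearly weighted form for ordinal categories, shown as a self-contained sketch (the study itself used SAS 9.4):

```python
def weighted_kappa(ratings_a, ratings_b, n_cat):
    """Linearly weighted kappa between two methods assigning ordinal
    categories 0..n_cat-1 to the same samples. Near-miss disagreements
    are penalized less than distant ones; 1 = perfect agreement,
    0 = chance-level, negative = worse than chance."""
    n = len(ratings_a)
    # observed joint proportions
    obs = [[0.0] * n_cat for _ in range(n_cat)]
    for a, b in zip(ratings_a, ratings_b):
        obs[a][b] += 1.0 / n
    # marginal proportions for each method
    pa = [sum(obs[i][j] for j in range(n_cat)) for i in range(n_cat)]
    pb = [sum(obs[i][j] for i in range(n_cat)) for j in range(n_cat)]
    w = lambda i, j: 1.0 - abs(i - j) / (n_cat - 1)   # linear weights
    po = sum(w(i, j) * obs[i][j] for i in range(n_cat) for j in range(n_cat))
    pe = sum(w(i, j) * pa[i] * pb[j] for i in range(n_cat) for j in range(n_cat))
    return (po - pe) / (1.0 - pe)
```

    With five hydrophobicity classes, n_cat = 5; identical classifications yield kappa = 1, and systematic reversals yield a negative value.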

  9. Ozone risk assessment for plants: Central role of metabolism-dependent changes in reducing power

    Energy Technology Data Exchange (ETDEWEB)

    Dizengremel, Pierre [Faculte des Sciences et Techniques, UMR1137 Ecologie et Ecophysiologie Forestieres, Nancy-Universite, BP239, F-54506 Vandoeuvre-les-Nancy Cedex (France)], E-mail: pierre.dizengremel@scbiol.uhp-nancy.fr; Le Thiec, Didier [INRA, UMR1137 Ecologie et Ecophysiologie Forestieres, Centre INRA de Nancy, F-54280 Champenoux (France)], E-mail: le_thiec@nancy.inra.fr; Bagard, Matthieu [Faculte des Sciences et Techniques, UMR1137 Ecologie et Ecophysiologie Forestieres, Nancy-Universite, BP239, F-54506 Vandoeuvre-les-Nancy Cedex (France)], E-mail: matthieu.bagard@scbiol.uhp-nancy.fr; Jolivet, Yves [Faculte des Sciences et Techniques, UMR1137 Ecologie et Ecophysiologie Forestieres, Nancy-Universite, BP239, F-54506 Vandoeuvre-les-Nancy Cedex (France)], E-mail: yves.jolivet@scbiol.uhp-nancy.fr

    2008-11-15

    The combination of stomatal-dependent ozone flux and total ascorbate level is currently presented as a correct indicator for determining the degree of sensitivity of plants to ozone. However, the large changes in carbon metabolism could play a central role in the strategy of the foliar cells in response to chronic ozone exposure, participating in the supply of reducing power and carbon skeletons for repair and detoxification, and modifying the stomatal mode of functioning. To reinforce the accuracy of the definition of the threshold for ozone risk assessment, it is proposed to also consider the redox pool (NAD(P)H), the ratio between carboxylases and the water use efficiency as indicators of the differential ozone tolerance of plants. - We propose reducing power, Rubisco/PEPc ratio and water use efficiency as additional indicators in ozone risk assessment for plants.

  10. Vulnerability assessment of groundwater-dependent ecosystems based on integrated groundwater flow modell construction

    Science.gov (United States)

    Tóth, Ádám; Simon, Szilvia; Galsa, Attila; Havril, Timea; Monteiro Santos, Fernando A.; Müller, Imre; Mádl-Szőnyi, Judit

    2017-04-01

    Groundwater-dependent ecosystems (GDEs) are highly influenced by the amount of groundwater, the seasonal variation of precipitation and consequent water table fluctuation, and anthropogenic activities. They can be regarded as natural surface manifestations of flowing groundwater. The preservation of the environment and biodiversity of these GDEs is an important issue worldwide; however, water management policy and action plans cannot be constructed in the absence of proper hydrogeological knowledge. The concept of gravity-driven regional groundwater flow can aid the understanding of flow patterns and the interpretation of environmental processes and conditions. When the required well data are unavailable, the geological-hydrogeological numerical model of a study area cannot be constructed from borehole information alone. In this case, spatially continuous geophysical data can support groundwater flow model building: systematically combined geophysical methods can provide model input. Integration of lithostratigraphic, electrostratigraphic and hydrostratigraphic information can aid groundwater flow model construction: hydrostratigraphic units and their hydraulic behaviour, boundaries and geometry can be obtained. Groundwater-related natural manifestations, such as GDEs, can be explained with the help of the revealed flow pattern and field mapping of features. Integrated groundwater flow model construction for assessing the vulnerability of GDEs is presented via a case study of the geologically complex area of the Tihany Peninsula, Hungary, with the aims of understanding the background and occurrence of groundwater-related environmental phenomena and surface water-groundwater interaction, and revealing the potential effects of anthropogenic activity and climate change. In spite of its important and protected status, a fluid flow model of the area, which could support water management and nature protection policy, had not been constructed previously. The 3D

  11. Assessment of the effectiveness of yoga therapy as an adjunct in patients with alcohol dependence syndrome

    Directory of Open Access Journals (Sweden)

    Dipesh Bhagabati

    2017-01-01

    Full Text Available Introduction: Substance use disorders, alcohol use in particular, are among the most prevalent disorders in psychiatry. They place a heavy burden on health as well as on the family, society, and economic status of the patient. What is more challenging is that such patients often suffer from comorbid anxiety and depression, which can perpetuate the alcohol use. Yoga is an alternative and complementary therapy widely practiced in India; however, its effectiveness in alcohol use disorders has not been tested systematically. Aims and objectives: To study the effectiveness of yoga as an adjunctive therapy in patients with alcohol use disorders and to evaluate its ability to reduce comorbid depression, anxiety, and craving. Materials and methods: One hundred patients with alcohol use disorders, diagnosed as per the tenth revision of the International Classification of Diseases and Related Health Problems (ICD-10), were selected and divided into two groups of 50 patients each. The case group received structured yoga sessions in addition to standard pharmacotherapy, while the control group received only pharmacotherapy. Assessment of depression (Hamilton depression rating scale [HAM-D]), anxiety (Hamilton anxiety rating scale [HAM-A]), and craving (Obsessive-Compulsive Drinking Scale [OCDS]) was done at baseline, two weeks, and one month. Results were compared between the two groups and statistical analysis was done. Results: The case and control groups were similar in HAM-D (p=0.9634), HAM-A (p=0.7744), and OCDS (p=0.8626) scores at baseline. There was significant reduction in HAM-A score at one month (p=0.0091), and in OCDS score at two weeks (p=0.0428) and one month (p<0.0001), in the yoga group as compared to the control group.
    Within the case group, reductions in HAM-A (p<0.001 and p<0.01) and OCDS (p<0.0001 and p<0.0001) scores were progressively better at two weeks and one month, while reduction in HAM-D score

  12. Recent Research Advances in the Risk Assessment Method of an Underground Pressure Pipeline

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

    This paper reviews risk assessment methods for underground pressure pipelines, introducing the expert grading, fuzzy integrative assessment, probabilistic risk assessment, and extenics assessment methods. Moreover, it puts forward future directions for the development of risk assessment.

  13. A Systems-Level Approach to Building Sustainable Assessment Cultures: Moderation, Quality Task Design and Dependability of Judgement

    Science.gov (United States)

    Colbert, Peta; Wyatt-Smith, Claire; Klenowski, Val

    2012-01-01

    This article considers the conditions that are necessary at system and local levels for teacher assessment to be valid, reliable and rigorous. With sustainable assessment cultures as a goal, the article examines how education systems can support local-level efforts for quality learning and dependable teacher assessment. This is achieved through…

  15. Cumulative Risk Assessment Toolbox: Methods and Approaches for the Practitioner

    Directory of Open Access Journals (Sweden)

    Margaret M. MacDonell

    2013-01-01

    Full Text Available The historical approach to assessing health risks of environmental chemicals has been to evaluate them one at a time. In fact, we are exposed every day to a wide variety of chemicals and are increasingly aware of potential health implications. Although considerable progress has been made in the science underlying risk assessments for real-world exposures, implementation has lagged because many practitioners are unaware of methods and tools available to support these analyses. To address this issue, the US Environmental Protection Agency developed a toolbox of cumulative risk resources for contaminated sites, as part of a resource document that was published in 2007. This paper highlights information for nearly 80 resources from the toolbox and provides selected updates, with practical notes for cumulative risk applications. Resources are organized according to the main elements of the assessment process: (1) planning, scoping, and problem formulation; (2) environmental fate and transport; (3) exposure analysis extending to human factors; (4) toxicity analysis; and (5) risk and uncertainty characterization, including presentation of results. In addition to providing online access, plans for the toolbox include addressing nonchemical stressors and applications beyond contaminated sites and further strengthening resource accessibility to support evolving analyses for cumulative risk and sustainable communities.

  16. Cumulative Risk Assessment Toolbox: Methods and Approaches for the Practitioner

    Science.gov (United States)

    MacDonell, Margaret M.; Haroun, Lynne A.; Teuschler, Linda K.; Rice, Glenn E.; Hertzberg, Richard C.; Butler, James P.; Chang, Young-Soo; Clark, Shanna L.; Johns, Alan P.; Perry, Camarie S.; Garcia, Shannon S.; Jacobi, John H.; Scofield, Marcienne A.

    2013-01-01

    The historical approach to assessing health risks of environmental chemicals has been to evaluate them one at a time. In fact, we are exposed every day to a wide variety of chemicals and are increasingly aware of potential health implications. Although considerable progress has been made in the science underlying risk assessments for real-world exposures, implementation has lagged because many practitioners are unaware of methods and tools available to support these analyses. To address this issue, the US Environmental Protection Agency developed a toolbox of cumulative risk resources for contaminated sites, as part of a resource document that was published in 2007. This paper highlights information for nearly 80 resources from the toolbox and provides selected updates, with practical notes for cumulative risk applications. Resources are organized according to the main elements of the assessment process: (1) planning, scoping, and problem formulation; (2) environmental fate and transport; (3) exposure analysis extending to human factors; (4) toxicity analysis; and (5) risk and uncertainty characterization, including presentation of results. In addition to providing online access, plans for the toolbox include addressing nonchemical stressors and applications beyond contaminated sites and further strengthening resource accessibility to support evolving analyses for cumulative risk and sustainable communities. PMID:23762048
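
    The simplest cumulative screening tool in this family is the additive hazard index for a chemical mixture. A sketch with hypothetical intakes and reference doses (not values from the toolbox):

```python
def hazard_index(exposures, reference_doses):
    """Additive hazard index: HI = sum(exposure_i / RfD_i).
    HI > 1 flags potential concern for the mixture as a whole,
    even when every individual hazard quotient is below 1."""
    return sum(e / rfd for e, rfd in zip(exposures, reference_doses))

# Hypothetical intakes (mg/kg-day) and reference doses for three
# co-occurring chemicals at a contaminated site
hi = hazard_index([0.002, 0.01, 0.0005], [0.005, 0.02, 0.003])
```

In this made-up example each individual quotient is below 1, yet the mixture's HI exceeds 1, which is exactly the situation cumulative assessment is meant to catch.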

  17. Using different methods to assess the discomfort during car driving.

    Science.gov (United States)

    Ravnik, David; Otáhal, Stanislav; Dodic Fikfak, Metoda

    2008-03-01

    This study investigated the discomfort caused by car driving. Discomfort estimates were obtained by self-administered questionnaire, measured by different testing methods, and through goniometry of the principal joint angles. Data from a total of 200 non-professional drivers who completed the questionnaire were analysed; 118 subjects were analysed by goniometry, and 30 drivers were assessed using the OWAS (Ovako Working Posture Analysis System), RULA (Rapid Upper Limb Assessment), and CORLETT tests. The aim of this paper was to assess the occurrence of discomfort and to find correlations with drivers' postures. Results suggest that different levels of discomfort are perceived in different body regions when driving cars, and that discomfort differs mostly between the genders. With the questionnaire and the different estimation techniques, it is possible to identify 'at risk' drivers and ensure urgent attention when necessary. It can be concluded that the questionnaire and the CORLETT test are good at predicting the location of discomfort. The Borg CR10 scale is a good indicator of the level of discomfort, while OWAS and RULA can appraise body posture to predict the appearance of discomfort. According to the goniometry data, the driver's posture could be one of the contributing factors in the appearance of discomfort.

  18. Comparative Assessment of Environmental Flow Estimation Methods in a Mediterranean Mountain River

    Science.gov (United States)

    Papadaki, Christina; Soulis, Konstantinos; Ntoanidis, Lazaros; Zogaris, Stamatis; Dercas, Nicholas; Dimitriou, Elias

    2017-08-01

    The ecological integrity of rivers ultimately depends on the flow regime. Flow degradation is especially prominent in Mediterranean systems, and assessing environmental flows in modified rivers is difficult, especially in environments with poor hydrologic monitoring and data availability. In many Mediterranean countries, which are characterized by pronounced natural variability and low summer flows, water management actions usually focus on prescribing minimum acceptable flows estimated by hydrologic methods. In this study, a comparative assessment of environmental flow estimation methods is developed for a river with poorly monitored flows and limited understanding of past reference conditions. The assessment incorporates both a hydrologic effort and a fish habitat simulation effort that takes hydrologic seasonality into consideration in a Greek mountain river. The results indicate that, especially in data-scarce regions, the use of biotic indicators through habitat models may provide valuable information, beyond that achievable with hydrologic methods, for developing regional environmental flow criteria. Despite the widespread use of the method, challenges in the transferability of fish habitat simulation introduce undefined levels of uncertainty and may require the concurrent use of different assessment tools and site-specific study.
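
    Hydrologic minimum-flow prescriptions of the kind mentioned above are often read off the flow duration curve, e.g. the flow exceeded 95% of the time (Q95). A sketch with hypothetical daily flows:

```python
def flow_percentile(daily_flows, exceedance):
    """Flow exceeded `exceedance` fraction of the time (e.g. 0.95 -> Q95),
    read from the flow duration curve with linear interpolation."""
    flows = sorted(daily_flows, reverse=True)  # duration curve: high to low
    pos = exceedance * (len(flows) - 1)
    lo, hi = int(pos), min(int(pos) + 1, len(flows) - 1)
    frac = pos - lo
    return flows[lo] + frac * (flows[hi] - flows[lo])

# Hypothetical daily flows (m3/s); a real record would span years
flows = [5.0, 4.0, 3.5, 3.0, 2.5, 2.0, 1.5, 1.2, 1.0, 0.8, 0.6, 0.5]
q95 = flow_percentile(flows, 0.95)
```

Habitat-based methods go beyond this by asking how much usable fish habitat remains at a candidate flow, which a single percentile cannot capture.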

  19. Towards a Quality Assessment Method for Learning Preference Profiles in Negotiation

    Science.gov (United States)

    Hindriks, Koen V.; Tykhonov, Dmytro

    In automated negotiation, information gained about an opponent's preference profile by means of learning techniques may significantly improve an agent's negotiation performance. It is therefore useful to gain a better understanding of how various negotiation factors influence the quality of learning. The quality of learning techniques in negotiation is typically assessed indirectly, by comparing the utility levels of agreed outcomes and other, more global negotiation parameters. An evaluation of learning based on such general criteria, however, does not provide any insight into the influence of various aspects of negotiation on the quality of the learned model itself. The quality may depend on such aspects as the domain of negotiation, the structure of the preference profiles, the negotiation strategies used by the parties, and others. To gain a better understanding of the performance of proposed learning techniques in the context of negotiation, and to be able to assess the potential to improve them, a more systematic assessment method is needed. In this paper we propose such a systematic method for analysing the quality of the information gained about opponent preferences by learning in single-instance negotiations. The method includes measures to assess the quality of a learned preference profile and proposes an experimental setup for analysing the influence of various negotiation aspects on the quality of learning. We apply the method to a Bayesian learning approach for learning an opponent's preference profile and discuss our findings.
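
    One concrete quality measure for a learned preference profile, in the spirit of the direct assessment proposed here (though not necessarily the authors' exact measure), is the pairwise disagreement between the learned and the true outcome ranking:

```python
def kendall_distance(true_rank, learned_rank):
    """Fraction of outcome pairs ordered differently by the true and learned
    preference profiles (0 = identical ranking, 1 = fully reversed)."""
    items = list(true_rank)
    n = len(items)
    discordant = 0
    for i in range(n):
        for j in range(i + 1, n):
            a, b = items[i], items[j]
            # a pair is discordant if the two profiles order it oppositely
            if (true_rank[a] - true_rank[b]) * (learned_rank[a] - learned_rank[b]) < 0:
                discordant += 1
    return discordant / (n * (n - 1) / 2)

# Hypothetical ranks (1 = most preferred) over four negotiation outcomes
true_profile    = {"o1": 1, "o2": 2, "o3": 3, "o4": 4}
learned_profile = {"o1": 1, "o2": 3, "o3": 2, "o4": 4}
d = kendall_distance(true_profile, learned_profile)
```

Unlike agreement-utility comparisons, this scores the learned model itself against the ground-truth profile available in an experimental setup.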

  20. Isospin-dependent relativistic microscopic optical potential in the Dirac Brueckner-Hartree-Fock method

    Institute of Scientific and Technical Information of China (English)

    RONG; Jian; MA; Zhongyu

    2004-01-01

    The relativistic microscopic optical potential in asymmetric nuclear matter is studied in the framework of the Dirac Brueckner-Hartree-Fock method. A new decomposition of the Dirac structure of the nuclear self-energy in nuclear matter is adopted. The self-energy of a nucleon with E > 0 in nuclear matter is calculated with the G matrix in the Hartree-Fock approach, and the optical potential of a nucleon in the nuclear medium is identified with the nucleon self-energy. The energy and asymmetry-parameter dependence of the relativistic optical potentials for protons and neutrons is discussed. The resulting Schroedinger-equivalent potentials show reasonable energy dependence, and the asymmetry-parameter dependence of the relativistic optical potentials and Schroedinger potentials is emphasized.

  1. Sweepless time-dependent transport calculations using the staggered block Jacobi method

    Energy Technology Data Exchange (ETDEWEB)

    Davidson, G [Los Alamos National Laboratory; Larsen, E W [Los Alamos National Laboratory

    2009-01-01

    The Staggered-Block Jacobi (SBJ) method is a new numerical SN transport method for solving time-dependent problems without sweeps or low-order acceleration. Because it is a Jacobi-type method, it is trivial to parallelize and scales linearly with the number of processors. It is highly accurate in thick, diffusive problems and unconditionally stable when combined with the lumped linear discontinuous finite element spatial discretization. In this way, the SBJ method is complementary to sweep-based methods, which are accurate and efficient in thin, streaming regions but inefficient in thick, diffusive problems without acceleration. We have extended previous work by demonstrating how sweep-based methods and the SBJ method may be combined to produce a method that is accurate and efficient without acceleration at all optical thicknesses while retaining good parallel efficiency. Furthermore, iterations may also be added to the SBJ method, which is particularly useful for improving its accuracy in intermediate-thickness problems.

  2. A hierarchical network modeling method for railway tunnels safety assessment

    Science.gov (United States)

    Zhou, Jin; Xu, Weixiang; Guo, Xin; Liu, Xumin

    2017-02-01

    Using network theory to model risk-related knowledge on accidents is regarded as potentially very helpful in risk management. A large amount of defect detection data for railway tunnels is collected every autumn in China, and it is extremely important to discover the regularities hidden in this database. In this paper, based on network theory and data mining techniques, a new method is proposed for mining risk-related regularities to support risk management in railway tunnel projects. A hierarchical network (HN) model is established that takes into account tunnel structures, tunnel defects, potential failures, and accidents. An improved Apriori algorithm is designed to rapidly and effectively mine correlations between tunnel structures and tunnel defects, and a further algorithm is presented to mine the risk-related regularities table (RRT) from the frequent patterns. Finally, a safety assessment method is proposed that considers both actual defects and the possible defect risks gained from the RRT. This method not only generates quantitative risk results but also reveals the key defects and critical defect risks. The paper thus develops accident causation network modeling methods further and can provide guidance for specific maintenance measures.
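
    The core Apriori step of growing frequent itemsets from defect records can be sketched as follows (a generic Apriori, not the paper's improved variant; the structure and defect names are hypothetical):

```python
from itertools import combinations

def apriori(transactions, min_support):
    """Minimal Apriori: all itemsets whose support >= min_support.
    Candidates of size k+1 are only grown from frequent size-k sets."""
    n = len(transactions)
    items = sorted({i for t in transactions for i in t})
    frequent = {}
    k_sets = [frozenset([i]) for i in items]
    while k_sets:
        counts = {s: sum(1 for t in transactions if s <= t) for s in k_sets}
        survivors = {s: c / n for s, c in counts.items() if c / n >= min_support}
        frequent.update(survivors)
        keys = list(survivors)
        # join surviving k-sets into candidate (k+1)-sets
        k_sets = list({a | b for a, b in combinations(keys, 2)
                       if len(a | b) == len(a) + 1})
    return frequent

# Hypothetical defect records: each transaction pairs a structure with defects
records = [frozenset(t) for t in
           [("lining", "crack"), ("lining", "crack", "leak"),
            ("drain", "leak"), ("lining", "crack")]]
freq = apriori(records, min_support=0.5)
```

From the frequent patterns (here, "lining" co-occurring with "crack"), structure-to-defect regularities would then be extracted into the RRT.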

  3. Signal Processing Methods for Liquid Rocket Engine Combustion Stability Assessments

    Science.gov (United States)

    Kenny, R. Jeremy; Lee, Erik; Hulka, James R.; Casiano, Matthew

    2011-01-01

    The J2X Gas Generator engine design specifications include dynamic, spontaneous, and broadband combustion stability requirements. These requirements are verified empirically, based on high-frequency chamber pressure measurements and analyses. Dynamic stability is determined from the dynamic pressure response to an artificial perturbation of the combustion chamber pressure (bomb testing), while spontaneous and broadband stability are determined from the dynamic pressure responses during steady operation starting at specified power levels. J2X Workhorse Gas Generator testing included bomb tests with multiple hardware configurations and operating conditions, including a configuration used explicitly for the engine verification test series. This work covers signal processing techniques developed at Marshall Space Flight Center (MSFC) to help assess engine design stability requirements. Dynamic stability assessments were performed following both the CPIA 655 guidelines and an MSFC in-house statistical approach. The statistical approach was developed to better verify when the dynamic pressure amplitudes corresponding to a particular frequency had returned to pre-bomb characteristics. This was accomplished by first determining the statistical characteristics of the pre-bomb dynamic levels. The pre-bomb statistical characterization provided 95% coverage bounds; these bounds were used as a quantitative measure of when the post-bomb signal had returned to pre-bomb conditions. The time for post-bomb levels to return acceptably to pre-bomb levels was compared to the dominant frequency-dependent time recommended by CPIA 655. Results for multiple test configurations, including stable and unstable configurations, were reviewed. Spontaneous stability was assessed using two processes: 1) characterization of the ratio of the peak response amplitudes to the excited chamber acoustic mode amplitudes and 2) characterization of the variability of the peak response
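
    The statistical recovery criterion described above amounts to: characterize the pre-bomb amplitudes, form coverage bounds, and find the first time from which the post-bomb signal stays within them. A simplified stand-in with made-up amplitude samples:

```python
import math

def recovery_time(pre_bomb, post_bomb, dt, coverage=1.96):
    """First time after the pulse at which the signal re-enters, and stays
    within, the pre-bomb mean +/- coverage * std bounds.
    A simplified stand-in for the 95%-coverage approach described above."""
    m = sum(pre_bomb) / len(pre_bomb)
    s = math.sqrt(sum((x - m) ** 2 for x in pre_bomb) / (len(pre_bomb) - 1))
    lo, hi = m - coverage * s, m + coverage * s
    for k in range(len(post_bomb)):
        if all(lo <= x <= hi for x in post_bomb[k:]):
            return k * dt  # first index from which the tail stays in-bounds
    return None  # never recovered within the record

# Made-up amplitude samples (arbitrary units), dt in seconds
pre  = [1.0, 1.1, 0.9, 1.0, 1.05, 0.95]
post = [6.0, 3.0, 1.8, 1.05, 1.0, 0.98, 1.02]
t_rec = recovery_time(pre, post, dt=0.001)
```

The recovered time would then be compared against the frequency-dependent damp time recommended by CPIA 655.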

  4. Thermography as a quantitative imaging method for assessing postoperative inflammation

    Science.gov (United States)

    Christensen, J; Matzen, LH; Vaeth, M; Schou, S; Wenzel, A

    2012-01-01

    Objective To assess differences in skin temperature between the operated and control sides of the face after mandibular third molar surgery using thermography. Methods 127 patients had one mandibular third molar removed. Before surgery, standardized thermograms were taken of both sides of the patient's face using a Flir ThermaCam™ E320 (Precisions Teknik AB, Halmstad, Sweden). The imaging procedure was repeated 2 days and 7 days after surgery. A region of interest including the third molar region was marked on each image, and the mean temperature within each region of interest was calculated. Differences between sides and over time were assessed using paired t-tests. Results No significant difference was found between the operated side and the control side either before or 7 days after surgery (p > 0.3). The temperature of the operated side (mean: 32.39 °C, range: 28.9–35.3 °C) was higher than that of the control side (mean: 32.06 °C, range: 28.5–35.0 °C) 2 days after surgery [0.33 °C, 95% confidence interval (CI): 0.22–0.44 °C, p < 0.001]. After 2 days, the operated side was not significantly different from its pre-operative temperature (p = 0.12), whereas the control side had a lower temperature (0.57 °C, 95% CI: 0.29–0.86 °C, p < 0.001). Conclusions Thermography seems useful for quantitative assessment of inflammation between the intervention side and the control side after surgical removal of mandibular third molars. However, thermography cannot be used to assess absolute temperature changes due to normal variations in skin temperature over time. PMID:22752326

  5. A quantitative assessment method for Ascaris eggs on hands.

    Directory of Open Access Journals (Sweden)

    Aurelie Jeandron

    Full Text Available The importance of hands in the transmission of soil transmitted helminths, especially Ascaris and Trichuris infections, is under-researched. This is partly because of the absence of a reliable method to quantify the number of eggs on hands. Therefore, the aim of this study was to develop a method to assess the number of Ascaris eggs on hands and to determine the egg recovery rate of the method. Under laboratory conditions, hands were seeded with a known number of Ascaris eggs, air dried and washed in a plastic bag retaining the washing water, in order to determine egg recovery rates for four different detergents (cationic [benzethonium chloride 0.1% and cetylpyridinium chloride CPC 0.1%], anionic [7X 1% - quadrafos, glycol ether, and dioctyl sulfosuccinate sodium salt], and non-ionic [Tween80 0.1% - polyethylene glycol sorbitan monooleate]) and two egg detection methods (McMaster technique and FLOTAC). A modified concentration McMaster technique showed the highest egg recovery rate from bags. Two of the four diluted detergents (benzethonium chloride 0.1% and 7X 1%) also showed a higher egg recovery rate and were then compared with de-ionized water for recovery of helminth eggs from hands. The highest recovery rate (95.6%) was achieved with a hand rinse performed with 7X 1%. Washing hands with de-ionized water resulted in an egg recovery rate of 82.7%. This washing method performed with a low concentration of detergent offers potential for quantitative investigation of the contamination of hands with Ascaris eggs and of their role in human infection. Follow-up studies are needed to validate the hand washing method under field conditions, e.g. including people of different ages, lower levels of contamination, and various levels of hand cleanliness.

  6. A Deterministic-Monte Carlo Hybrid Method for Time-Dependent Neutron Transport Problems

    Energy Technology Data Exchange (ETDEWEB)

    Justin Pounders; Farzad Rahnema

    2001-10-01

    A new deterministic-Monte Carlo hybrid solution technique is derived for the time-dependent transport equation. This new approach is based on dividing the time domain into a number of coarse intervals and expanding the transport solution in a series of polynomials within each interval. The solutions within each interval can be represented in terms of arbitrary source terms by using precomputed response functions. In the current work, the time-dependent response function computations are performed using the Monte Carlo method, while the global time-step march is performed deterministically. This work extends previous work by coupling the time-dependent expansions to space- and angle-dependent expansions to fully characterize the 1D transport response/solution. More generally, this approach represents an incremental extension of the steady-state coarse-mesh transport method that is based on global-local decompositions of large neutron transport problems. A homogeneous slab problem is discussed as an example of the new developments.
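
    The deterministic global time-step march over precomputed response functions amounts to a linear recursion per coarse interval. A toy sketch with made-up response matrices (in the hybrid method these would come from Monte Carlo response-function calculations):

```python
def march(R, S, x0, sources):
    """Coarse time-step march: within each interval, the response to the
    incoming state and the interval's source is precomputed (matrices R
    and S here), so the global solve is the cheap recursion
        x_{n+1} = R x_n + S q_n."""
    def matvec(M, v):
        return [sum(m * vi for m, vi in zip(row, v)) for row in M]
    x = list(x0)
    history = [x]
    for q in sources:
        x = [a + b for a, b in zip(matvec(R, x), matvec(S, q))]
        history.append(x)
    return history

# Hypothetical 2-cell problem with made-up response matrices
R = [[0.6, 0.1],
     [0.1, 0.6]]
S = [[0.5, 0.0],
     [0.0, 0.5]]
hist = march(R, S, x0=[1.0, 0.0], sources=[[1.0, 1.0]] * 3)
```

The expense of the Monte Carlo stage is amortized: once R and S are tabulated, each additional coarse step is a pair of matrix-vector products.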

  7. Dissociation and dissociative ionization of H2+ using the time-dependent surface flux method

    CERN Document Server

    Yue, Lun

    2014-01-01

    The time-dependent surface flux method developed for the description of electronic spectra [L. Tao and A. Scrinzi, New J. Phys. 14, 013021 (2012); A. Scrinzi, New J. Phys. 14, 085008 (2012)] is extended to treat dissociation and dissociative ionization processes of H2+ interacting with strong laser pulses. By dividing the simulation volume into proper spatial regions associated with the individual reaction channels and monitoring the probability flux, the joint energy spectrum for the dissociative ionization process and the energy spectrum for dissociation are obtained. The methodology is illustrated by solving the time-dependent Schrödinger equation (TDSE) for a collinear one-dimensional model of H2+ with electronic and nuclear motions treated exactly, and validated by comparison with published results for dissociative ionization. The results for dissociation are qualitatively explained by an analysis based on dressed diabatic Floquet potential energy curves, and the method is used to investigate the breakdow...

  8. Assessment of shock capturing schemes for discontinuous Galerkin method

    Institute of Scientific and Technical Information of China (English)

    于剑; 阎超; 赵瑞

    2014-01-01

    This paper carries out systematic investigations of the performance of several typical shock-capturing schemes for the discontinuous Galerkin (DG) method, including the total variation bounded (TVB) limiter and three artificial diffusivity schemes (the basis function-based (BF) scheme, the face residual-based (FR) scheme, and the element residual-based (ER) scheme). Shock-dominated flows (the Sod problem, the Shu-Osher problem, the double Mach reflection problem, and the transonic NACA0012 flow) are considered, addressing the issues of accuracy, the non-oscillatory property, dependence on user-specified constants, resolution of discontinuities, and capability for steady solutions. Numerical results indicate that the TVB limiter is more efficient and robust, while the artificial diffusivity schemes preserve small-scale flow structures better. In high-order cases, the artificial diffusivity schemes demonstrate superior performance over the TVB limiter.
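
    The TVB limiter modifies the classical minmod limiter so that slopes below an M·h² threshold are left untouched, which preserves accuracy at smooth extrema; otherwise it falls back to the minmod of the local slope and the neighbour differences. A sketch of that logic:

```python
def minmod(a, b, c):
    """Classical three-argument minmod: smallest magnitude if all arguments
    share a sign, zero otherwise."""
    if a > 0 and b > 0 and c > 0:
        return min(a, b, c)
    if a < 0 and b < 0 and c < 0:
        return max(a, b, c)
    return 0.0

def tvb_limit(slope, fwd_diff, bwd_diff, M, h):
    """TVB-modified minmod: keep the local DG slope when it is below the
    M*h^2 threshold (smooth extremum), else apply minmod limiting."""
    if abs(slope) <= M * h * h:
        return slope
    return minmod(slope, fwd_diff, bwd_diff)
```

The user-specified constant M is exactly the kind of tuning parameter whose influence the paper's comparisons address.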

  9. Photodissociation of NaH using time-dependent Fourier grid method

    Indian Academy of Sciences (India)

    Anindita Bhattacharjee; Krishna Rai Dastidar

    2002-03-01

    We have solved the time-dependent Schrödinger equation using the Chebyshev polynomial scheme and the Fourier grid Hamiltonian method to calculate the dissociation cross section of the NaH molecule by 1-photon absorption from the 1+ state to the 1 state. We have found that the results differ significantly from an earlier calculation [1], although we used the same set of potential energy curves [2].

  10. An analytical method for determining the temperature dependent moisture diffusivities of pumpkin seeds during drying process

    Energy Technology Data Exchange (ETDEWEB)

    Can, Ahmet [Department of Mechanical Engineering, University of Trakya, 22030 Edirne (Turkey)

    2007-02-15

    This paper presents an analytical method that determines the moisture diffusion coefficients for the natural and forced convection hot-air drying of pumpkin seeds and their temperature dependence. In order to obtain scientific data, the pumpkin seed drying process was investigated under both natural and forced hot-air convection regimes. The experimental results presented were obtained with drying air heated by solar energy. (author)
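
    Temperature-dependent moisture diffusivities of this kind are commonly summarized with an Arrhenius law fitted to per-temperature diffusion coefficients. A generic sketch (not the paper's exact analytical method) on synthetic data:

```python
import math

def arrhenius_fit(temps_K, diffusivities):
    """Least-squares fit of ln D = ln D0 - Ea/(R*T); returns (D0, Ea).
    Linear regression of ln D against 1/T."""
    R = 8.314  # J/(mol K)
    xs = [1.0 / t for t in temps_K]
    ys = [math.log(d) for d in diffusivities]
    n = len(xs)
    xbar, ybar = sum(xs) / n, sum(ys) / n
    slope = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / \
            sum((x - xbar) ** 2 for x in xs)
    intercept = ybar - slope * xbar
    return math.exp(intercept), -slope * R  # pre-factor D0, activation energy Ea

# Synthetic diffusivities generated from D0 = 1e-4 m2/s, Ea = 30 kJ/mol
temps = [313.0, 323.0, 333.0, 343.0]
Ds = [1e-4 * math.exp(-30000.0 / (8.314 * T)) for T in temps]
D0, Ea = arrhenius_fit(temps, Ds)
```

With real drying data, each D would first be extracted from the slope of the log moisture-ratio curve at its drying temperature.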

  11. A Novel Absorbing Boundary Condition for the Frequency-Dependent Finite-Difference Time-Domain Method

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    A new absorbing boundary condition (ABC) for the frequency-dependent finite-difference time-domain algorithm for arbitrary dispersive media is presented. Concepts from digital systems are introduced into the (FD)2TD method. On the basis of digital filter design and vector algebra, the absorbing boundary condition under an arbitrary angle of incidence is derived. Transient electromagnetic problems in two and three dimensions are calculated, and the validity of the ABC is verified.

  12. Culture-dependent and culture-independent methods reveal diverse methylotrophic communities in terrestrial environments

    OpenAIRE

    Eyice, Özge; Schäfer, Hendrik

    2016-01-01

    One-carbon compounds such as methanol, dimethylsulfide (DMS) and dimethylsulfoxide (DMSO) are significant intermediates in biogeochemical cycles. They are suggested to affect atmospheric chemistry and global climate. Methylotrophic microorganisms are considered as a significant sink for these compounds; therefore, we analyzed the diversity of terrestrial bacteria that utilize methanol, DMS and DMSO as carbon and energy source using culture-dependent and culture-independent methods. The effect...

  13. Method of Assessment of Hard Rock Workability using Bucket Wheel Excavators

    Science.gov (United States)

    Machniak, Łukasz; Kozioł, Wiesław

    2017-03-01

    A new hypothesis is proposed concerning the process of mining solid rocks with bucket wheel excavators (BWE): destruction of the rock mass structure is the result of breaking and not, as previously accepted, of cutting. This approach rules out, for describing solid rock workability by bucket wheel excavators, the classifications used so far, which are based on individual linear or surface cutting resistances. The possibility of a replacement mechanism for determining workability by bucket wheel excavators using rippers was assumed. On this basis, an innovative method for assessing the workability of solid rocks was developed, combining an empirical relationship for the breaking energy LSE of tractor rippers, derived from the compressive strength, seismic wave velocity, and density of the solid rock, with the modified classification of workability by bucket wheel excavators according to Bulukbasi (1991). The proposed method allows a multi-parameter assessment of the workability class based on parameters that are independent variables in the specified dependencies.

  14. Mathematical economics methods in assessing the effects of institutional factors on foreign trade

    Science.gov (United States)

    Kazantseva, M. A.; Nepp, A. N.

    2016-12-01

    Foreign trade activity (FT) is an essential driver of economic development; therefore, the factors affecting its efficiency should be analysed. Along with the conventional economic factors affecting FT development, attention should be given to institutional factors, whose role cannot be neglected. Recent studies show institutional factors to produce both qualitative and quantitative effects on a country's economic development, and various criteria and assessment approaches have been developed for their estimation. This paper classifies the mathematical methods used to assess the effect of institutional factors on FT efficiency and provides an analysis of conventional mathematical models describing the relationship between institutional factors and FT indicators. Mathematical methods are currently the major instrument for the analysis of FT parameters and their dependence on various external factors.

  15. Comparison of culture-dependent and -independent methods for bacterial community monitoring during Montasio cheese manufacturing.

    Science.gov (United States)

    Carraro, Lisa; Maifreni, Michela; Bartolomeoli, Ingrid; Martino, Maria Elena; Novelli, Enrico; Frigo, Francesca; Marino, Marilena; Cardazzo, Barbara

    2011-04-01

    The microbial community in milk is of great importance in the manufacture of traditional cheeses produced using raw milk and natural cultures. During milk curdling and cheese ripening, complex interactions occur in the microbial community, and accurate identification of the microorganisms involved provides essential information for understanding their role in these processes and in flavor production. Recent improvements in molecular biological methods have led to their application to food matrices, and thereby opened new perspectives for the study of microbial communities in fermented foods. This study describes microbial community composition during the manufacture and ripening of Montasio cheese, using a combined approach of culture-dependent and -independent methods. Culture-dependent identification was compared with 16S clone library sequencing data obtained from both DNA and reverse-transcribed RNA (cDNA) amplification, and with real-time quantitative PCR (qPCR) assays developed to detect and quantify specific bacterial species/genera (Streptococcus thermophilus, Lactobacillus casei, Pediococcus pentosaceus, Enterococcus spp., Pseudomonas spp.). S. thermophilus was the predominant LAB species throughout the entire ripening period of Montasio cheese. The culture-independent method demonstrated a substantial presence of Pseudomonas spp. and Lactococcus piscium at the beginning of ripening. The culture-dependent approach and the two culture-independent approaches produced complementary information, together generating a general view of cheese microbial ecology.

  16. Systems and Methods for Parameter Dependent Riccati Equation Approaches to Adaptive Control

    Science.gov (United States)

    Kim, Kilsoo (Inventor); Yucelen, Tansel (Inventor); Calise, Anthony J. (Inventor)

    2015-01-01

    Systems and methods for adaptive control are disclosed. The systems and methods can control uncertain dynamic systems. The control system can comprise a controller that employs a parameter dependent Riccati equation. The controller can produce a response that causes the state of the system to remain bounded. The control system can control both minimum phase and non-minimum phase systems. The control system can augment an existing, non-adaptive control design without modifying the gains employed in that design. The control system can also avoid the use of high gains in both the observer design and the adaptive control law.

  17. The adaptive CCCG(η) method for efficient solution of time-dependent partial differential equations

    Energy Technology Data Exchange (ETDEWEB)

    Campos, F.F. [Universidade Federal de Minas Gerais, Belo Horizonte (Brazil); Birkett, N.R.C. [Oxford Univ. Computing Lab. (United Kingdom)

    1996-12-31

    The Controlled Cholesky factorisation has been shown to be a robust preconditioner for the conjugate gradient method. In this scheme the amount of fill-in is defined in terms of a parameter η, the number of extra elements allowed per column. It is demonstrated how an optimum value of η can be determined automatically when solving time-dependent PDEs with an implicit time-step method. A comparison between CCCG(η) and standard ICCG solving parabolic problems on general grids shows CCCG(η) to be an efficient general-purpose solver.
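
    The surrounding iteration is the standard preconditioned conjugate gradient recursion. A sketch with a simple Jacobi (diagonal) preconditioner standing in for the Controlled Cholesky factors:

```python
def pcg(A, b, precond, tol=1e-10, max_iter=200):
    """Preconditioned conjugate gradients for SPD systems; `precond`
    applies M^-1 to a vector (here Jacobi, as a stand-in for CCCG(eta))."""
    n = len(b)
    matvec = lambda M, v: [sum(M[i][j] * v[j] for j in range(n)) for i in range(n)]
    x = [0.0] * n
    r = b[:]                      # residual for the zero initial guess
    z = precond(r)
    p = z[:]
    rz = sum(ri * zi for ri, zi in zip(r, z))
    for _ in range(max_iter):
        Ap = matvec(A, p)
        alpha = rz / sum(pi * api for pi, api in zip(p, Ap))
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        r = [ri - alpha * api for ri, api in zip(r, Ap)]
        if sum(ri * ri for ri in r) ** 0.5 < tol:
            break
        z = precond(r)
        rz_new = sum(ri * zi for ri, zi in zip(r, z))
        p = [zi + (rz_new / rz) * pi for zi, pi in zip(z, p)]
        rz = rz_new
    return x

A = [[4.0, 1.0], [1.0, 3.0]]
b = [1.0, 2.0]
jacobi = lambda r: [r[i] / A[i][i] for i in range(len(r))]
x = pcg(A, b, jacobi)
```

A better preconditioner (such as the controlled-fill incomplete Cholesky with a well-chosen η) reduces the iteration count at the cost of a more expensive apply, which is the trade-off the adaptive scheme tunes.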

  18. Numerical method for solving the three-dimensional time-dependent neutron diffusion equation

    Energy Technology Data Exchange (ETDEWEB)

    Khaled, S.M. [Institute of Nuclear Techniques, Budapest University of Technology and Economics, Budapest (Hungary)]. E-mail: K_S_MAHMOUD@hotmail.com; Szatmary, Z. [Institute of Nuclear Techniques, Budapest University of Technology and Economics, Budapest (Hungary)]. E-mail: szatmary@reak.bme.hu

    2005-07-01

    A numerical time-implicit method has been developed for solving the coupled three-dimensional time-dependent multi-group neutron diffusion and delayed neutron precursor equations. The numerical stability of the implicit computation scheme and the convergence of the associated iterative processes have been evaluated. The computational scheme requires the solution of large linear systems at each time step. For this purpose, the point over-relaxation Gauss-Seidel method was chosen, and a new scheme was introduced instead of the usual source iteration scheme. (author)
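
    The point over-relaxation Gauss-Seidel (SOR) solver used for the large linear systems at each time step can be sketched as follows, on a small dense system for clarity (practical diffusion codes exploit the sparse banded structure):

```python
def sor(A, b, omega=1.2, tol=1e-10, max_sweeps=500):
    """Point over-relaxation Gauss-Seidel (SOR): each unknown is updated in
    place using the latest values, then over-relaxed by the factor omega."""
    n = len(b)
    x = [0.0] * n
    for _ in range(max_sweeps):
        delta = 0.0
        for i in range(n):
            sigma = sum(A[i][j] * x[j] for j in range(n) if j != i)
            new = (1 - omega) * x[i] + omega * (b[i] - sigma) / A[i][i]
            delta = max(delta, abs(new - x[i]))
            x[i] = new
        if delta < tol:  # sweep changed nothing appreciable: converged
            break
    return x

# Hypothetical small diffusion-like (diagonally dominant) system
A = [[4.0, -1.0, 0.0],
     [-1.0, 4.0, -1.0],
     [0.0, -1.0, 4.0]]
b = [2.0, 4.0, 10.0]
x = sor(A, b)
```

For 0 < omega < 2 on such symmetric positive definite systems the sweeps converge; the optimal omega depends on the mesh and is usually tuned.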

  19. Interactive Rapid Dose Assessment Model (IRDAM): reactor-accident assessment methods. Vol. 2

    Energy Technology Data Exchange (ETDEWEB)

    Poeton, R.W.; Moeller, M.P.; Laughlin, G.J.; Desrosiers, A.E.

    1983-05-01

    As part of the continuing emphasis on emergency preparedness, the US Nuclear Regulatory Commission (NRC) sponsored the development of a rapid dose assessment system by Pacific Northwest Laboratory (PNL). This system, the Interactive Rapid Dose Assessment Model (IRDAM), is a microcomputer-based program for rapidly assessing the radiological impact of accidents at nuclear power plants. This document describes the technical bases for IRDAM, including the methods, models and assumptions used in calculations. IRDAM calculates whole body (5-cm depth) and infant thyroid doses at six fixed downwind distances between 500 and 20,000 meters. Radionuclides considered consist primarily of noble gases and radioiodines. In order to provide a rapid assessment capability consistent with the capacity of the Osborne-1 computer, certain simplifying approximations and assumptions are made. These are described, along with default values (assumptions used in the absence of specific input), in the text of this document. Two companion volumes provide additional information on IRDAM. The User's Guide (NUREG/CR-3012, Volume 1) describes the setup and operation of equipment necessary to run IRDAM. Scenarios for Comparing Dose Assessment Models (NUREG/CR-3012, Volume 3) provides the results of calculations made by IRDAM and other models for specific accident scenarios.

  20. A proposed assessment method for image of regional educational institutions

    Directory of Open Access Journals (Sweden)

    Kataeva Natalya

    2017-01-01

    Full Text Available The market of educational services under current Russian economic conditions comprises a huge variety of educational institutions, and it is already experiencing a significant influence of the demographic situation in Russia. Higher education institutions are therefore forced to compete fiercely for high school students. Increased competition in the educational market forces universities to find new methods of non-price competition, both in attracting potential students and throughout their own educational and economic activities. The commercialization of education places universities in a single plane with commercial companies, which treat a positive perception of image and reputation as a competitive advantage; this is quite acceptable for use in the strategic and current activities of higher education institutions to ensure the competitiveness of educational services and of the institution as a whole. Nevertheless, due to the lack of evidence-based proposals in this area, there is a need for scientific research justifying the organizational and methodological aspects of using image as a factor in the competitiveness of a higher education institution. In theory and in practice there are different methods and ways of evaluating a company's image. The article provides a comparative assessment of existing valuation methods for corporate image and the author's method of estimating the image of higher education institutions based on the key influencing factors. The method has been tested on the Vyatka State Agricultural Academy (Russia). The results also indicate the strengths and weaknesses of the institution, highlight ways of improving, and help adjust efforts for image improvement.

  1. A Survey of Insulin-Dependent Diabetes—Part II: Control Methods

    Directory of Open Access Journals (Sweden)

    Daisuke Takahashi

    2008-01-01

    Full Text Available We survey blood glucose control schemes for insulin-dependent diabetes therapies and systems. These schemes largely rely on mathematical models of the insulin-glucose relation, and these models are typically derived in either an empirical or a fundamental way. In the empirical way, experimental insulin inputs and the resulting blood-glucose outputs are used to generate a mathematical model, which comprises a few equations approximating a very complex system. On the other hand, the insulin-glucose relation can also be explained from well-known facts about other biological mechanisms. Since these mechanisms are more or less related to each other, a mathematical model of the insulin-glucose system can be derived from these surrounding relations; this kind of derivation is called a fundamental method. Along with such mathematical models, researchers develop autonomous systems, with or without medical devices, to compensate for metabolic disorders, and these autonomous systems employ their own control methods. Basically, in insulin-dependent diabetes therapies, control methods are classified into three categories: open-loop, closed-loop, and partially closed-loop control. The main difference among these methods is the degree to which the system remains open to outside intervention.
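
    The closed-loop category can be illustrated with a deliberately toy simulation: a proportional feedback law drives the insulin infusion from the measured glucose excess. The linear dynamics, gains, and units below are invented for illustration and are not any of the surveyed physiological models.

```python
# Toy closed-loop glucose control: proportional feedback on measured
# glucose drives the insulin infusion. All coefficients are invented
# for illustration; this is not a physiological model.

def simulate_closed_loop(g0=180.0, target=100.0, steps=300, dt=1.0):
    g, effect = g0, 0.0                # glucose level and insulin action state
    for _ in range(steps):
        infusion = max(0.0, 0.02 * (g - target))   # closed-loop feedback law
        effect += dt * (infusion - 0.1 * effect)   # insulin action builds, then decays
        g += dt * (1.0 - 0.5 * effect)             # production minus insulin-driven uptake
    return g

final_glucose = simulate_closed_loop()
```

    The run settles somewhat above the target rather than on it: pure proportional feedback leaves a steady-state offset, one reason practical closed-loop schemes add integral action or model-based prediction.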

  2. Comparison of methods for assessing thyroid function in nonthyroidal illness

    Energy Technology Data Exchange (ETDEWEB)

    Melmed, S.; Geola, F.L.; Reed, A.W.; Pekary, A.E.; Park, J.; Hershman, J.M.

    1982-02-01

    Various tests of thyroid function were studied in sick patients with nonthyroidal illness (NTI) in order to determine the utility of each test for differentiating these patients from a group with hypothyroidism. We evaluated each test in 22 healthy volunteers who served as controls, 20 patients with hypothyroidism, 14 patients admitted to a medical intensive care unit whose serum T4 was less than 5 µg/dl, 13 patients with chronic liver disease, 32 patients on chronic hemodialysis for renal failure, 13 ambulatory oncology patients receiving chemotherapy, 16 pregnant women, 7 women on estrogens, and 20 hyperthyroid patients. On all samples, we measured serum T4, the free T4 index by several methods, free T4 by equilibrium dialysis, free T4 calculated from thyronine-binding globulin (TBG) RIA, free T4 by three commercial kits (Gammacoat, Immophase, and Liquisol), T3, rT3, and TSH (by three different RIAs). Although all of the methods used for measuring free T4 (including the free T4 index, free T4 by dialysis, free T4 assessed by TBG, and free T4 assessed by the three commercial kits) were excellent for the diagnosis of hypothyroidism, hyperthyroidism, and euthyroidism in the presence of high TBG, none of these methods showed that free T4 was consistently normal in patients with NTI; with each method, a number of NTI patients had subnormal values. In the NTI groups, free T4 measured by dialysis and the free T4 index generally correlated significantly with the commercial free T4 methods. Serum rT3 was elevated or normal in NTI patients and low in hypothyroid subjects. Serum TSH provided the most reliable differentiation between patients with primary hypothyroidism and those with NTI and low serum T4 levels.

  3. Proposed SPAR Modeling Method for Quantifying Time Dependent Station Blackout Cut Sets

    Energy Technology Data Exchange (ETDEWEB)

    John A. Schroeder

    2010-06-01

    The U.S. Nuclear Regulatory Commission’s (USNRC’s) Standardized Plant Analysis Risk (SPAR) models and industry risk models take similar approaches to analyzing the risk associated with loss of offsite power and station blackout (LOOP/SBO) events at nuclear reactor plants. In both SPAR models and industry models, core damage risk resulting from a LOOP/SBO event is analyzed using a combination of event trees and fault trees that produce cut sets that are, in turn, quantified to obtain a numerical estimate of the resulting core damage risk. A proposed SPAR method for quantifying the time-dependent cut sets is sometimes referred to as a convolution method. The SPAR method reflects assumptions about the timing of emergency diesel failures, the timing of subsequent attempts at emergency diesel repair, and the timing of core damage that may differ from those often used in industry models. This paper describes the proposed SPAR method.
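
    The timing competition that a convolution method treats analytically can be illustrated with a small Monte Carlo stand-in: core damage requires the diesel to fail during the mission and the repair to outlast the station batteries. The exponential rates, 4 h battery life, and 24 h mission below are illustrative assumptions, not values from the SPAR models.

```python
import random

# Monte Carlo stand-in for the timing convolution: core damage occurs
# only if the diesel fails during the mission AND repair takes longer
# than the battery depletion time. All parameter values are illustrative.

def sbo_core_damage_prob(mission_h=24.0, battery_h=4.0,
                         edg_fail_rate=1e-3, repair_mttr=8.0,
                         trials=100_000, seed=1):
    rng = random.Random(seed)
    damage = 0
    for _ in range(trials):
        t_fail = rng.expovariate(edg_fail_rate)       # diesel run-failure time
        if t_fail > mission_h:
            continue                                   # diesel ran the full mission
        t_repair = rng.expovariate(1.0 / repair_mttr)  # repair duration
        if t_repair > battery_h:                       # repair lost the race
            damage += 1
    return damage / trials

p = sbo_core_damage_prob()
```

    With these assumed numbers the analytic answer is (1 - e^(-0.024)) · e^(-0.5) ≈ 0.014, which the simulation approaches; the convolution method computes such race integrals directly instead of sampling them.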

  4. Fin efficiency analysis of convective straight fins with temperature dependent thermal conductivity using variational iteration method

    Energy Technology Data Exchange (ETDEWEB)

    Coskun, Safa Bozkurt [Department of Civil Engineering, Nigde University, 51245 Nigde (Turkey)], E-mail: sbcoskun@nigde.edu.tr; Atay, Mehmet Tarik [Department of Mathematics, Nigde University, 51245 Nigde (Turkey)

    2008-12-15

    For enhancing heat transfer between a primary surface and the environment, the utilization of radiating extended surfaces is common. Especially for large temperature differences, variable thermal conductivity has a strong effect on the performance of such a surface. In this paper, the variational iteration method is used to analyze the efficiency of convective straight fins with temperature-dependent thermal conductivity. VIM produces analytical expressions for the solution of nonlinear differential equations. In order to show the effectiveness of the variational iteration method (VIM), the results obtained from the VIM analysis are compared with available solutions obtained using the Adomian decomposition method (ADM) and with results from finite element analysis. This work shows that VIM is a promising method for the efficiency analysis of convective straight fin problems.

  5. Assessment of substitution model adequacy using frequentist and Bayesian methods.

    Science.gov (United States)

    Ripplinger, Jennifer; Sullivan, Jack

    2010-12-01

    In order to have confidence in model-based phylogenetic methods, such as maximum likelihood (ML) and Bayesian analyses, one must use an appropriate model of molecular evolution identified using statistically rigorous criteria. Although model selection methods such as the likelihood ratio test and Akaike information criterion are widely used in the phylogenetic literature, they lack the ability to reject all candidate models if none provides an adequate fit to the data. There are two methods, however, that assess absolute model adequacy, the frequentist Goldman-Cox (GC) test and Bayesian posterior predictive simulations (PPSs), which are commonly used in conjunction with the multinomial log likelihood test statistic. In this study, we use empirical and simulated data to evaluate the adequacy of common substitution models using both frequentist and Bayesian methods and compare the results with those obtained with model selection methods. In addition, we investigate the relationship between model adequacy and performance in ML and Bayesian analyses in terms of topology, branch lengths, and bipartition support. We show that tests of model adequacy based on the multinomial likelihood often fail to reject simple substitution models, especially when the models incorporate among-site rate variation (ASRV), and normally fail to reject less complex models than those chosen by model selection methods. In addition, we find that PPSs often fail to reject simpler models than the GC test. Use of the simplest substitution models not rejected based on fit normally results in similar but not identical estimates of tree topology and branch lengths. In addition, use of the simplest adequate substitution models can affect estimates of bipartition support, although these differences are often small, with the largest differences confined to poorly supported nodes. We also find that alternative assumptions about ASRV can affect tree topology, tree length, and bipartition support.
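
    The multinomial log likelihood test statistic that both the GC test and PPSs build on is simple to compute: it is the unconstrained log likelihood of the observed site-pattern frequencies. The toy three-taxon alignment below is an illustrative assumption.

```python
import math
from collections import Counter

# Unconstrained multinomial log likelihood of an alignment's site
# patterns: sum over patterns of n_p * ln(n_p / N), where n_p is the
# count of pattern p and N is the number of sites.

def multinomial_log_lik(alignment):
    n_sites = len(alignment[0])
    # each column of the alignment is one site pattern
    patterns = Counter(tuple(seq[i] for seq in alignment)
                       for i in range(n_sites))
    return sum(n * math.log(n / n_sites) for n in patterns.values())

aln = ["ACGTACGT",
       "ACGTACGA",
       "ACGAACGT"]
stat = multinomial_log_lik(aln)
```

    Both adequacy tests compare this observed statistic with its null distribution: the GC test against a parametric-bootstrap distribution under the fitted model, PPSs against values computed on data simulated from the posterior.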

  6. Assessment of Proper Bonding Methods and Mechanical Characterization FPGA CQFPs

    Science.gov (United States)

    Davis, Milton C.

    2008-01-01

    This presentation discusses fractured leads on a field-programmable gate array (FPGA) during flight vibration. Actions taken to determine the root cause and resolution of the failure include finite element analysis (FEA), vibration testing, and scanning electron microscopy with X-ray microanalysis and energy dispersive spectrometry (SEM/EDS) failure assessment. Bonding methods for surface-mount parts are assessed, including critical analysis and assessment of random fatigue damage. Regarding the ceramic quad flat pack (CQFP) lead fracture: after disassembling the attitude control electronics (ACE) configuration, photographs showed six leads cracked on the FPGA RTSX72SU-1 CQ208B package located on the RWIC card. An identical package (FPGA RTSX32SU-1 CQ208B) mounted on the RWIC did not exhibit cracked pins due to vibration. FPGA lead failure theories include workmanship issues in the lead-forming, a material defect in the leads of the FPGA packages, and insecure mounting of the board in the card guides, among others. Studies were conducted using simple calculations to determine the response and fatigue life of the package. Shorter packages exhibited more response when loaded by out-of-plane displacement of the PCB, while taller packages exhibited more response when loaded by in-plane acceleration of the PCB. Additionally, under-fill did not contribute to reducing stress in the leads due to out-of-plane PCB loading or component twisting as much as corner bonding did. The combination of corner bond and under-fill best addresses the mechanical and thermal spacecraft environment. Test results of bonded parts showed reduced (damped) amplitude and slightly shifted peaks at the un-bonded natural frequency, plus an additional response at the bonded frequency. Stress due to PCB out-of-plane loading was decreased in the corners when only a corner bond was used.
    Future work may address CQFP fatigue assessment, including investigation of the discrepancy in predicted fatigue damage.

  7. A simple method for assessing intestinal inflammation in Crohn's disease

    Science.gov (United States)

    Tibble, J; Teahon, K; Thjodleifsson, B; Roseth, A; Sigthorsson, G; Bridger, S; Foster, R; Sherwood, R; Fagerhol, M; Bjarnason, I

    2000-01-01

    BACKGROUND AND AIMS—Assessing the presence and degree of intestinal inflammation objectively, simply, and reliably is a significant problem in gastroenterology. We assessed faecal excretion of calprotectin, a stable neutrophil specific marker, as an index of intestinal inflammation and its potential use as a screening test to discriminate between patients with Crohn's disease and those with irritable bowel syndrome.
METHODS—The validity of faecal calprotectin as a marker of intestinal inflammation was assessed in 22 patients with Crohn's disease (35 studies) by comparing faecal excretions and concentrations using four day faecal excretion of 111indium white cells. A cross sectional study assessed the sensitivity of faecal calprotectin concentration for the detection of established Crohn's disease (n=116). A prospective study assessed the value of faecal calprotectin in discriminating between patients with Crohn's disease and irritable bowel syndrome in 220 patients referred to a gastroenterology clinic.
RESULTS—Four day faecal excretion of 111indium (median 8.7%; 95% confidence interval (CI) 7-17%; normal <1.0%) correlated significantly (p<0.0001) with daily (median ranged from 39 to 47 mg; normal <3 mg; r=0.76-0.82) and four day faecal calprotectin excretion (median 101 mg; 95% CI 45-168 mg; normal <11 mg; r=0.80) and single stool calprotectin concentrations (median 118 mg/l; 95% CI 36-175 mg/l; normal <10 mg/l; r=0.70) in patients with Crohn's disease. The cross sectional study showed a sensitivity of 96% for calprotectin in discriminating between normal subjects (2 mg/l; 95% CI 2-3 mg/l) and those with Crohn's disease (91 mg/l; 95% CI 59-105 mg/l). With a cut off point of 30 mg/l faecal calprotectin has 100% sensitivity and 97% specificity in discriminating between active Crohn's disease and irritable bowel syndrome.
CONCLUSION—The calprotectin method may be a useful adjuvant for discriminating between patients with Crohn's disease and those with irritable bowel syndrome.
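
    The diagnostic figures quoted above come from scoring a single cut-off against known diagnoses, which can be sketched directly. The value lists below are illustrative, not the study data.

```python
# Sensitivity and specificity of a diagnostic cut-off, as used for the
# 30 mg/l faecal calprotectin threshold. The value lists are invented
# for illustration, not the study's measurements.

def sens_spec(diseased, healthy, cutoff):
    tp = sum(v >= cutoff for v in diseased)   # true positives
    tn = sum(v < cutoff for v in healthy)     # true negatives
    return tp / len(diseased), tn / len(healthy)

crohns = [118, 91, 45, 160, 59]   # faecal calprotectin, mg/l
ibs = [5, 12, 25, 8, 31]
sensitivity, specificity = sens_spec(crohns, ibs, cutoff=30)
```

    In practice the cut-off is chosen by sweeping it over the observed range and trading sensitivity against specificity, e.g. on an ROC curve.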

  8. BISAP: A NOVEL METHOD FOR ASSESSING SEVERITY OF ACUTE PANCREATITIS

    Directory of Open Access Journals (Sweden)

    Ramalingeshwara

    2014-09-01

    Full Text Available BACKGROUND: Many multifactorial scoring systems, radiological scores, and biochemical markers are available for early prediction of severity and mortality in patients with acute pancreatitis (AP). The bedside index for severity in acute pancreatitis (BISAP) has been considered an accurate method for risk stratification in patients with acute pancreatitis. OBJECTIVE: This study aimed to evaluate the usefulness of the BISAP as a predictor of severe pancreatitis. METHODS AND MATERIAL: We analyzed 100 patients diagnosed with acute pancreatitis at our hospital between October 2012 and April 2013. We used the BISAP score in all such patients within 24 hours of admission. Patients were assessed for organ failure and followed throughout the period of hospitalization for complications. Statistical analysis was made using the Student t test and chi-square test, and statistical significance was analyzed. RESULTS: Out of 100 patients, 20% had severe pancreatitis. Acute pancreatitis was seen predominantly in males (87%) and in the 4th decade (70%); alcohol was the most common etiology (60%), followed by biliary pancreatitis (25%), with the remainder idiopathic (15%). A BISAP score >= 3 was associated with transient or persistent organ failure and pancreatic necrosis. CONCLUSION: BISAP scoring is a simple clinical method to identify patients at risk of increased mortality within 24 hours of presentation in patients with acute pancreatitis.
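
    The score itself is a five-point checklist evaluated within 24 hours of admission: one point each for BUN > 25 mg/dL, impaired mental status, SIRS, age > 60 years, and pleural effusion. A minimal sketch (the argument names are assumptions, not from the paper):

```python
# BISAP: one point per criterion, evaluated within 24 h of admission.
# Argument names are illustrative; clinical inputs (mental status, SIRS,
# pleural effusion) are passed in as already-evaluated booleans.

def bisap(bun_mg_dl, impaired_mental_status, sirs, age, pleural_effusion):
    return sum([
        bun_mg_dl > 25,          # blood urea nitrogen > 25 mg/dL
        impaired_mental_status,  # e.g. Glasgow Coma Scale < 15
        sirs,                    # >= 2 SIRS criteria present
        age > 60,                # age in years
        pleural_effusion,        # on imaging
    ])

score = bisap(bun_mg_dl=32, impaired_mental_status=False,
              sirs=True, age=67, pleural_effusion=False)
high_risk = score >= 3           # the abstract's threshold for organ failure risk
```

    Because every input is available at the bedside within the first day, the score can be computed without waiting for the 48 h data some older systems require.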

  9. Object relations and real life relationships: a cross method assessment.

    Science.gov (United States)

    Handelzalts, Jonathan E; Fisher, Shimrit; Naot, Rachel

    2014-04-01

    This study examines the relationship between the psychoanalytic concept of object relations and the real-life behavior of being in an intimate relationship among heterosexual women. In a multi-method approach we used two different measures, the self-report Bell Object Relations and Reality Testing Inventory (BORRTI; Bell, Billington & Becker, 1986) and the performance-based Thematic Apperception Test (TAT) Social Cognition and Object Relations Scale-Global Rating Method (SCORS-G; Westen, 1995), to measure the object relations of 60 women. The Alienation subscale of the BORRTI and the understanding of social causality subscale of the SCORS-G explained 34.8% of the variance of the intimate relationship variable. Thus, women involved in a romantic relationship reported lower rates of alienation on the BORRTI and produced TAT narratives that were more adaptive with regard to understanding of social causality, as measured by the SCORS-G, than those not currently in a relationship. Results are discussed with reference to the relationship between object relations and real-life measures in healthy individuals and in light of the need for a multi-method approach to assessment.

  10. Probabilistic seismic hazard assessment of Italy using kernel estimation methods

    Science.gov (United States)

    Zuccolo, Elisa; Corigliano, Mirko; Lai, Carlo G.

    2013-07-01

    A representation of seismic hazard is proposed for Italy based on the zone-free approach developed by Woo (BSSA 86(2):353-362, 1996a), which relies on a kernel estimation method governed by concepts of fractal geometry and self-organized seismicity and does not require the definition of seismogenic zoning. The purpose is to assess the influence of seismogenic zoning on the results obtained for the probabilistic seismic hazard analysis (PSHA) of Italy using the standard Cornell method. The hazard has been estimated for outcropping rock site conditions in terms of maps and uniform hazard spectra for a selected site, with 10% probability of exceedance in 50 years. Both spectral acceleration and spectral displacement have been considered as ground motion parameters. Differences in the results of PSHA between the two methods are compared and discussed. The analysis shows that, in areas such as Italy, characterized by a reliable earthquake catalog and in which faults are generally not easily identifiable, a zone-free approach can be considered a valuable tool to address epistemic uncertainty within a logic tree framework.
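
    The zone-free idea can be sketched in a few lines: the activity rate at a site is a sum of kernels centred on past epicentres, with no zone boundaries anywhere. A fixed Gaussian bandwidth below stands in for the magnitude-dependent fractal kernel of Woo's method; the coordinates, bandwidth, and catalog length are illustrative.

```python
import math

# Zone-free activity-rate estimate: sum a smoothing kernel centred on
# each catalog epicentre. A fixed isotropic Gaussian stands in for the
# magnitude-dependent kernel of the actual method.

def kernel_rate(site, epicentres, bandwidth_km=30.0, catalog_years=100.0):
    x0, y0 = site
    h2 = bandwidth_km ** 2
    total = 0.0
    for x, y in epicentres:
        d2 = (x - x0) ** 2 + (y - y0) ** 2
        total += math.exp(-d2 / (2 * h2)) / (2 * math.pi * h2)
    return total / catalog_years          # events per km^2 per year

quakes = [(10.0, 5.0), (12.0, 7.0), (80.0, 60.0)]   # toy catalog, km coordinates
rate_near = kernel_rate((11.0, 6.0), quakes)
rate_far = kernel_rate((200.0, 200.0), quakes)
```

    The estimate is highest near clustered epicentres and decays smoothly away from them, so the spatial rate model follows the catalog itself rather than analyst-drawn zone boundaries.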

  11. A reliability assessment method using system dynamics and application

    Energy Technology Data Exchange (ETDEWEB)

    Kyung, Min Kang; Moosung, Jae [Hanyang Univ., Dept. of Nuclear Engineering, Seoul (Korea, Republic of)]; Sangman, Kwak [Systemix, Inc, Seoul (Korea, Republic of)]

    2005-07-01

    An advanced method for assessing the dynamic safety of nuclear power plants is introduced and applied. A commercial software package, the VENtana SIMulation environment (VENSIM), is used to develop a system dynamics model for an example system. In this study an 18-month refuel cycle is simulated for the dynamic analysis. The failure rate when the plant is at zero power (during maintenance, test, and refueling processes, which are not properly modeled in the conventional event/fault-tree method) is higher than at full power. This also means the human failure rate during standby and shutdown operation is higher than during normal operation. Various time steps are applied for the different failure cases. The simulation results show that common cause failure is strongly affected by the choice of time step. The results also include dynamic simulations for the standby-running and shutdown-running cases. The graphical presentation is easily built with the graphic modeling facilities incorporated in VENSIM. Diagrams readily understood by operators or system analysts are constructed and evaluated quantitatively using system dynamics. (authors)

  12. Blood oxygen-level dependent functional assessment of cerebrovascular reactivity: Feasibility for intraoperative 3 Tesla MRI.

    Science.gov (United States)

    Fierstra, Jorn; Burkhardt, Jan-Karl; van Niftrik, Christiaan Hendrik Bas; Piccirelli, Marco; Pangalu, Athina; Kocian, Roman; Neidert, Marian Christoph; Valavanis, Antonios; Regli, Luca; Bozinov, Oliver

    2017-02-01

    To assess the feasibility of functional blood oxygen-level dependent (BOLD) MRI to evaluate intraoperative cerebrovascular reactivity (CVR) at 3 Tesla field strength, ten consecutive neurosurgical subjects scheduled for a clinical intraoperative MRI examination were enrolled in this study. In addition to the clinical protocol, a BOLD sequence was implemented with three cycles of 44 s apnea to calculate CVR values on a voxel-by-voxel basis throughout the brain. The CVR range was then color-coded and superimposed on an anatomical volume to create high-spatial-resolution CVR maps. Ten subjects (mean age 34.8 ± 13.4; 2 females) uneventfully underwent the intraoperative BOLD protocol, with no complications occurring. Whole-brain CVR for all subjects was (mean ± SD) 0.69 ± 0.42, whereas CVR was markedly higher for tumor subjects as compared to vascular subjects: 0.81 ± 0.44 versus 0.33 ± 0.10, respectively. Furthermore, the color-coded functional maps could be robustly interpreted for a whole-brain assessment of CVR. We demonstrate that intraoperative BOLD MRI is feasible for creating functional maps to assess cerebrovascular reactivity throughout the brain in subjects undergoing a neurosurgical procedure. Magn Reson Med 77:806-813, 2017. © 2016 International Society for Magnetic Resonance in Medicine.

  13. Using experiential learning and OSCEs to teach and assess tobacco dependence education with first-year dental students.

    Science.gov (United States)

    Romito, Laura; Schrader, Stuart; Zahl, David

    2014-05-01

    Previous research has indicated that dentists do not routinely engage in tobacco cessation interventions with their patients due, in part, to a lack of training in the predoctoral curriculum. From 2010 to 2012, this study at one U.S. dental school evaluated the effectiveness of experiential learning and objective structured clinical examinations (OSCEs) to improve first-year dental students' knowledge and beliefs about tobacco dependence and cessation interventions. Analysis indicated acceptable reliability and student performance for the OSCE. In all three years, there were statistically significant increases in student knowledge (p<0.001). In each year, there were also statistically significant shifts in student perceptions of preparedness (p<0.001 to p=0.034) and willingness (p<0.001 to p=0.005) to provide tobacco dependence treatment to patients. Results suggest that OSCEs utilizing standardized patients may be an effective method for assessing tobacco dependence education. Preparing for and participating in an OSCE with a standardized patient may help increase student knowledge and shape the beliefs of early dental students about engaging in patient tobacco cessation interventions. Findings were mixed on the impact of experiential learning on OSCE performance, suggesting further research is needed.

  14. Reduced-reference image quality assessment using moment method

    Science.gov (United States)

    Yang, Diwei; Shen, Yuantong; Shen, Yongluo; Li, Hongwei

    2016-10-01

    Reduced-reference image quality assessment (RR IQA) aims to evaluate the perceptual quality of a distorted image through partial information about the corresponding reference image. In this paper, a novel RR IQA metric is proposed using the moment method. We claim that the first and second moments of the wavelet coefficients of natural images change in an approximately regular way, that this regularity is disturbed by different types of distortion, and that the disturbance is relevant to human perception of quality. We measure the difference of these statistical parameters between the reference and distorted image to predict the visual quality degradation. The introduced IQA metric is suitable for implementation and has relatively low computational complexity. Experimental results on the Laboratory for Image and Video Engineering (LIVE) and Tampere Image Database (TID) image databases indicate that the proposed metric has good predictive performance.
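
    The reduced-reference mechanism is easy to sketch: only the moments of the reference are transmitted, and the distorted image is scored by its moment discrepancy. The plain lists below stand in for actual wavelet subband coefficients, and the absolute-difference pooling is an illustrative choice, not the paper's exact metric.

```python
# Reduced-reference scoring by moment discrepancy: only (m1, m2) of the
# reference need be transmitted, not the image itself. The coefficient
# lists stand in for wavelet subband coefficients.

def moments(coeffs):
    n = len(coeffs)
    m1 = sum(coeffs) / n                    # first moment (mean)
    m2 = sum(c * c for c in coeffs) / n     # second moment
    return m1, m2

def moment_distance(ref_coeffs, dist_coeffs):
    r1, r2 = moments(ref_coeffs)
    d1, d2 = moments(dist_coeffs)
    return abs(r1 - d1) + abs(r2 - d2)      # illustrative pooling

ref = [0.1, -0.2, 0.4, 0.0, -0.1]
mild = [0.1, -0.2, 0.5, 0.0, -0.1]          # lightly distorted
severe = [0.9, -0.8, 1.4, 0.7, -0.9]        # heavily distorted
```

    A stronger distortion perturbs the moments more, so the distance grows with degradation, which is the monotonic behaviour an RR metric needs before it is calibrated against human opinion scores.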

  15. Risk Assessment Techniques and Survey Method for COTS Components

    CERN Document Server

    Gupta, Rashmi

    2012-01-01

    The Rational Unified Process (RUP), a software engineering process, is gaining popularity. RUP delivers best software practices for the component software development life cycle and supports component-based software development. Risk is involved in every component development phase; neglecting those risks can hamper software growth and lead to negative outcomes. In order to provide appropriate security and protection levels, identifying the various risks is vital, so risk identification plays a crucial role in component-based software development. This report addresses the incorporation of the component-based software development cycle into the RUP phases and assesses several categories of risk encountered in component-based software. It also describes a survey method to identify risk factors and to evaluate the overall severity of the component software development in terms of those risks. Formulas for determining risk prevention cost and for finding risk probability are also included. The overall go...

  16. A QUALITY ASSESSMENT METHOD FOR 3D ROAD POLYGON OBJECTS

    Directory of Open Access Journals (Sweden)

    L. Gao

    2015-08-01

    Full Text Available With the development of the economy, the fast and accurate extraction of city roads is significant for GIS data collection and updating, remote sensing image interpretation, mapping, spatial database updating, etc. 3D GIS has attracted more and more attention from academia, industry and government with the increasing requirements for interoperability and integration of different sources of data. The quality of 3D geographic objects is very important for spatial analysis and decision-making. This paper presents a method for the quality assessment of 3D road polygon objects created by integrating 2D road polygon data with LiDAR point clouds and other height information, such as spot height data, on Hong Kong Island. The quality of the created 3D road polygon data set is evaluated in terms of vertical accuracy, geometric and attribute accuracy, connectivity error, undulation error and completeness error, and the final results are presented.

  17. Performance Assessment Method for a Forged Fingerprint Detection Algorithm

    Science.gov (United States)

    Shin, Yong Nyuo; Jun, In-Kyung; Kim, Hyun; Shin, Woochang

    The threat of invasion of privacy and of the illegal appropriation of information both increase with the expansion of the biometrics service environment to open systems. However, while certificates or smart cards can easily be cancelled and reissued if found to be missing, there is no way to recover the unique biometric information of an individual following a security breach. With the recognition that this threat factor may disrupt the large-scale civil service operations approaching implementation, such as electronic ID cards and e-Government systems, many agencies and vendors around the world continue to develop forged fingerprint detection technology, but no objective performance assessment method has, to date, been reported. Therefore, in this paper, we propose a methodology designed to evaluate the objective performance of the forged fingerprint detection technology that is currently attracting a great deal of attention.

  18. Discourses and Practices in Teaching Methods and Assessment

    Directory of Open Access Journals (Sweden)

    Deepak Gopinath

    2015-02-01

    Full Text Available Translating the purposes of education into practice is particularly challenging for those who are new or have recently entered academia. By reflecting on my first years of teaching in higher education, I discuss two key aspects of my teaching practice: shifts in choice of teaching methods and a critique of different forms of assessment. Through the discussion, I argue that a teacher needs to be reflective on both these aspects and that such reflection needs to be carried out so that the student develops into a “self-directing,” “self-monitoring,” and “self-correcting” individual. At the end of the discussion, the relevance of a “project-based learning” approach starts to become significant in taking my pedagogical practice forward.

  19. Ultrasonic Apparatus and Method to Assess Compartment Syndrome

    Science.gov (United States)

    Yost, William T. (Inventor); Ueno, Toshiaki (Inventor); Hargens, Alan R. (Inventor)

    2009-01-01

    A process and apparatus for measuring pressure build-up in a body compartment that encases muscular tissue. The method includes assessing the body compartment configuration and identifying the effect of pulsatile components on compartment dimensions and muscle tissue characteristics. This process is used in preventing tissue necrosis, and in decisions on whether to perform surgery on the body compartment for prevention of compartment syndrome. An apparatus is used for measuring pressure build-up in the body compartment, having components for imparting ultrasonic waves, such as a transducer, placing the transducer to impart the ultrasonic waves, capturing the imparted ultrasonic waves, mathematically manipulating the captured ultrasonic waves, and categorizing pressure build-up in the body compartment from the mathematical manipulations.

  20. Experimental Methods for UAV Aerodynamic and Propulsion Performance Assessment

    Directory of Open Access Journals (Sweden)

    Stefan ANTON

    2015-06-01

    Full Text Available This paper presents an experimental method for assessing the performance and propulsion power of a UAV at several points based on telemetry. The points at which we make the estimations are chosen based on several criteria, and the following parameters are measured: airspeed, time-to-climb, altitude and horizontal distance. With the estimated propulsion power and the known shaft motor power, the propeller efficiency is determined at several speed values. The shaft motor power was measured in the lab using the propeller as a brake. Many flights, using the same UAV configuration, were performed before extracting flight data, in order to reduce instrumental or statistical errors. This paper highlights both the methodology of processing the data and the validation of the theoretical results.
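
    The power bookkeeping behind such an estimate can be sketched at a single telemetry point: the required propulsion power is the rate of potential-energy gain plus the power spent against drag. The mass, drag coefficient, reference area, and measured shaft power below are illustrative assumptions, not values from the paper.

```python
# Required propulsion power at one telemetry point: climb power
# (rate of potential-energy gain) plus parasitic drag power.
# All vehicle parameters are invented for illustration.

G = 9.81      # m/s^2
RHO = 1.225   # kg/m^3, sea-level air density

def propulsion_power(mass_kg, airspeed, climb_rate, cd0=0.04, area_m2=0.3):
    climb_power = mass_kg * G * climb_rate             # m*g*dh/dt, watts
    drag = 0.5 * RHO * airspeed ** 2 * area_m2 * cd0   # parasitic drag only, newtons
    return climb_power + drag * airspeed               # watts

# telemetry point: 4 kg UAV climbing at 2 m/s with 15 m/s airspeed
p_required = propulsion_power(4.0, 15.0, 2.0)
prop_efficiency = p_required / 120.0   # 120 W: measured shaft power, illustrative
```

    Dividing the flight-derived power by the bench-measured shaft power gives the propeller efficiency at that speed, which is the quantity the paper maps over several speed values.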

  1. Comparative Assessment of Advanced Gas Hydrate Production Methods

    Energy Technology Data Exchange (ETDEWEB)

    M. D. White; B. P. McGrail; S. K. Wurstner

    2009-06-30

    Displacing natural gas and petroleum with carbon dioxide is a proven technology for producing conventional geologic hydrocarbon reservoirs, and producing additional yields from abandoned or partially produced petroleum reservoirs. Extending this concept to natural gas hydrate production offers the potential to enhance gas hydrate recovery with concomitant permanent geologic sequestration. Numerical simulation was used to assess a suite of carbon dioxide injection techniques for producing gas hydrates from a variety of geologic deposit types. Secondary hydrate formation was found to inhibit contact of the injected CO{sub 2} regardless of injectate phase state, thus diminishing the exchange rate due to pore clogging and hydrate zone bypass of the injected fluids. Additional work is needed to develop methods of artificially introducing high-permeability pathways in gas hydrate zones if injection of CO{sub 2} in either gas, liquid, or micro-emulsion form is to be more effective in enhancing gas hydrate production rates.

  2. Rapid assessment methods of resilience for natural and agricultural systems.

    Science.gov (United States)

    Torrico, Juan C; Janssens, Marc J J

    2010-12-01

The resilience, ecological function and quality of both agricultural and natural systems were evaluated in the mountainous region of the Atlantic Rain Forest of Rio de Janeiro through Rapid Assessment Methods. For this goal, new indicators were proposed, such as eco-volume, eco-height, bio-volume, volume efficiency, and a resilience index. The following agricultural and natural systems were compared: (i) vegetables (leaf, fruit and mixed); (ii) citrus; (iii) ecological system; (iv) cattle; (v) silvo-pastoral system; (vi) forest fragment; and (vii) forest in regeneration stage (1, 2 and 3 years old). An alternative measure (index) of resilience was proposed by considering the actual bio-volume as a function of the potential eco-volume. The objectives and hypotheses were fulfilled; it is shown that there exists a high positive correlation between the resilience index, biomass, energy efficiency and biodiversity. Cattle and vegetable systems have the lowest resilience, whilst ecological and silvo-pastoral systems have the greatest resilience. This new approach offers a rapid yet valuable assessment tool for ecological studies, agricultural development and landscape planning, particularly in tropical countries.

  3. Assessing phytase activity–methods, definitions and pitfalls

    Directory of Open Access Journals (Sweden)

    Linnea Qvirist

    2015-02-01

Full Text Available Phytases are nutritionally important for increased bioavailability of dietary minerals and phosphate for monogastric animals, including humans. Release of minerals and phosphate is accomplished by the enzymatic stepwise degradation of phytate (inositol hexaphosphate, IP6). Activity determinations of phytase are often based on analysis of total released phosphate (Pi), but phytase activity in its purest form represents released product per time from IP6 only. Microbial and plant preparations often also contain mixtures of phosphatases and organic phosphate compounds; hence some released phosphate in enzymatic assays may originate from non-phytase phosphatases degrading non-phytate molecules. Moreover, even purified enzyme extracts assessed via Pi release may yield errors, since commercial IP6 is commonly contaminated with lower inositol phosphates, and further, the products of phytase IP6 hydrolysis are themselves substrates for the phytase. These facts motivate a quantitative comparative study. We compared enzyme activity determinations in phytase assay samples at four different time points, based on analyzing the substrate IP6 versus the product Pi using different selected methods. The calculated activities varied substantially. For example, at 15 min into the enzymatic assay, variations from 152 mU/ml (by IP6 analysis on HPIC) to 275-586 mU/ml (by Pi analysis using several methods) were detected. Our work emphasizes the importance of defining the type of activity assessed, showing that phytase activity based on released Pi may yield false positives and/or overestimations. We propose to differentiate between phytase activity, being the activity by which IP6 is degraded, and total inositol phosphatase activity, corresponding to the total phosphate released during the enzymatic reaction.

  4. Assessment of metal artifact reduction methods in pelvic CT

    Energy Technology Data Exchange (ETDEWEB)

    Abdoli, Mehrsima [Department of Radiation Oncology, The Netherlands Cancer Institute, Plesmanlaan 121, Amsterdam 1066 CX (Netherlands); Mehranian, Abolfazl [Division of Nuclear Medicine and Molecular Imaging, Geneva University Hospital, Geneva CH-1211 (Switzerland); Ailianou, Angeliki; Becker, Minerva [Division of Radiology, Geneva University Hospital, Geneva CH-1211 (Switzerland); Zaidi, Habib, E-mail: habib.zaidi@hcuge.ch [Division of Nuclear Medicine and Molecular Imaging, Geneva University Hospital, Geneva CH-1211 (Switzerland); Geneva Neuroscience Center, Geneva University, Geneva CH-1205 (Switzerland); Department of Nuclear Medicine and Molecular Imaging, University Medical Center Groningen, University of Groningen, Hanzeplein 1, Groningen 9700 RB (Netherlands)

    2016-04-15

    Purpose: Metal artifact reduction (MAR) produces images with improved quality potentially leading to confident and reliable clinical diagnosis and therapy planning. In this work, the authors evaluate the performance of five MAR techniques for the assessment of computed tomography images of patients with hip prostheses. Methods: Five MAR algorithms were evaluated using simulation and clinical studies. The algorithms included one-dimensional linear interpolation (LI) of the corrupted projection bins in the sinogram, two-dimensional interpolation (2D), a normalized metal artifact reduction (NMAR) technique, a metal deletion technique, and a maximum a posteriori completion (MAPC) approach. The algorithms were applied to ten simulated datasets as well as 30 clinical studies of patients with metallic hip implants. Qualitative evaluations were performed by two blinded experienced radiologists who ranked overall artifact severity and pelvic organ recognition for each algorithm by assigning scores from zero to five (zero indicating totally obscured organs with no structures identifiable and five indicating recognition with high confidence). Results: Simulation studies revealed that 2D, NMAR, and MAPC techniques performed almost equally well in all regions. LI falls behind the other approaches in terms of reducing dark streaking artifacts as well as preserving unaffected regions (p < 0.05). Visual assessment of clinical datasets revealed the superiority of NMAR and MAPC in the evaluated pelvic organs and in terms of overall image quality. Conclusions: Overall, all methods, except LI, performed equally well in artifact-free regions. Considering both clinical and simulation studies, 2D, NMAR, and MAPC seem to outperform the other techniques.

  5. Expanding the Aperture of Psychological Assessment: Introduction to the Special Section on Innovative Clinical Assessment Technologies and Methods

    Science.gov (United States)

    Trull, Timothy J.

    2007-01-01

    Contemporary psychological assessment is dominated by tried-and-true methods like clinical interviewing, self-report questionnaires, intellectual assessment, and behavioral observation. These approaches have served as the mainstays of psychological assessment for decades. To be sure, these methods have survived over the years because clinicians…

  6. Direct Measurement of the Pressure Dependence of the Glass Transition Temperature: A Comparison of Methods

    Science.gov (United States)

    Oliver, William, III; Ransom, Timothy; Cooper, James, III

    2013-03-01

Two methods for the direct measurement of the pressure dependence of the glass-transition temperature Tg are presented and compared. These methods involve the use of the diamond anvil cell (DAC), and hence enable measurement of Tg(P) to record high pressures of several GPa. Such studies are increasingly relevant as new methods have pushed other high-pressure experimental investigations of glass-forming systems into the same pressure regime. Both methods use careful ruby fluorescence measurements in the DAC as temperature is increased from the glass (T < Tg) into the supercooled liquid (T > Tg). Method 1 observes the disappearance of pressure gradients as the viscous liquid region is entered, whereas method 2 involves observation of slope changes in the P-T curve during temperature ramps. Such slope changes are associated with the significant change in the volume expansion coefficient between the highly viscous, metastable, supercooled liquid state and the solid glassy state. In most cases, the two methods yield good agreement in the Tg(P) curve. Data will be presented for more than one glass-forming system, including the intermediate-strength glass former glycerol and the fragile glass former salol. We acknowledge support from the NSF under DMR-0552944.
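The slope-change detection of method 2 can be illustrated with a small sketch: fit two straight lines to a P-T ramp and take the break point that minimizes the combined residual. The synthetic data, slopes and break location below are invented for illustration and are not the authors' measurements.

```python
def fit_sse(ts, ps):
    """Sum of squared residuals of the least-squares line through (ts, ps)."""
    n = len(ts)
    mt, mp = sum(ts) / n, sum(ps) / n
    sxy = sum((t - mt) * (p - mp) for t, p in zip(ts, ps))
    sxx = sum((t - mt) ** 2 for t in ts)
    b = sxy / sxx
    a = mp - b * mt
    return sum((p - (a + b * t)) ** 2 for t, p in zip(ts, ps))

temps = [200 + 5 * i for i in range(21)]  # heating ramp, K
# Pressure rises at 0.010 GPa/K in the glass and 0.025 GPa/K above Tg = 250 K.
press = [2.0 + (0.010 * (t - 200) if t <= 250 else
                0.010 * 50 + 0.025 * (t - 250)) for t in temps]

# Try every interior break point; keep the one with the smallest total residual.
best = min(range(2, len(temps) - 2),
           key=lambda i: fit_sse(temps[:i + 1], press[:i + 1]) +
                         fit_sse(temps[i:], press[i:]))
tg_estimate = temps[best]
```

With noiseless piecewise-linear data the recovered break point coincides exactly with the imposed Tg.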

  7. A Self-Optimization Method for System Service Dependability based on Autonomic Computing

    Directory of Open Access Journals (Sweden)

    Qingtao Wu

    2012-11-01

Full Text Available Under intrusion or abnormal attacks, how to maintain system service dependability autonomously, without degradation, is an essential requirement for network system services. Autonomic computing can overcome the heterogeneity and complexity of computing systems and has been regarded as a novel and effective approach to implementing autonomous systems that address system security issues. To cope with the decline in network service dependability caused by safety threats, we propose an autonomic method for optimizing system service performance based on Q-learning, from the perspective of autonomic computing. First, we obtain the candidate operations by utilizing the nonlinear mapping relations of a feedforward neural network. Then, we obtain the executive action by perceiving changes in the state parameters of the network system's service performance. Third, we calculate the environment reward function value by integrating the changes in system service performance and service availability. Finally, we use the self-learning characteristics and prediction ability of Q-learning to drive the system service toward optimal performance. Simulation results show that this method is effective for optimizing the overall dependability and service utility of a system.
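The Q-learning component can be illustrated with a generic tabular sketch. The toy states, actions and reward below stand in for the paper's service-performance model and are purely illustrative assumptions.

```python
import random

ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.1   # learning rate, discount, exploration
states, actions = range(3), range(2)
Q = {(s, a): 0.0 for s in states for a in actions}

def step(state, action):
    """Toy environment: action 1 keeps the service healthy (reward 1)."""
    reward = 1.0 if action == 1 else 0.0
    return (state + 1) % 3, reward

random.seed(0)
s = 0
for _ in range(2000):
    # Epsilon-greedy action selection.
    if random.random() < EPSILON:
        a = random.choice(list(actions))
    else:
        a = max(actions, key=lambda x: Q[(s, x)])
    s2, r = step(s, a)
    # Q-learning update: move Q(s, a) toward r + gamma * max_a' Q(s', a').
    Q[(s, a)] += ALPHA * (r + GAMMA * max(Q[(s2, x)] for x in actions) - Q[(s, a)])
    s = s2
```

After training, the greedy policy selects the rewarded action in every state; in the paper this role is played by the action that best restores service performance and availability.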

  8. Assessment Methods in Statistical Education An International Perspective

    CERN Document Server

    Bidgood, Penelope; Jolliffe, Flavia

    2010-01-01

This book is a collaboration from leading figures in statistical education and is designed primarily for academic audiences involved in teaching statistics and mathematics. The book is divided into four sections: (1) Assessment using real-world problems, (2) Assessing statistical thinking, (3) Individual assessment, and (4) Successful assessment strategies.

  9. Assessing composition gradients in multifilamentary superconductors by means of magnetometry methods

    Science.gov (United States)

    Baumgartner, T.; Hecher, J.; Bernardi, J.; Pfeiffer, S.; Senatore, C.; Eisterer, M.

    2017-01-01

    We present two magnetometry-based methods suitable for assessing gradients in the critical temperature and hence the composition of multifilamentary superconductors: AC magnetometry and scanning Hall probe microscopy. The novelty of the former technique lies in the iterative evaluation procedure we developed, whereas the strength of the latter is the direct visualization of the temperature dependent penetration of a magnetic field into the superconductor. Using the example of a PIT Nb3Sn wire, we demonstrate the application of these techniques, and compare the respective results to each other and to EDX measurements of the Sn distribution within the sub-elements of the wire.

  10. A Hybrid Nodal Method for Time-Dependent Incompressible Flow in Two-Dimensional Arbitrary Geometries

    Energy Technology Data Exchange (ETDEWEB)

    Toreja, A J; Uddin, R

    2002-10-21

    A hybrid nodal-integral/finite-analytic method (NI-FAM) is developed for time-dependent, incompressible flow in two-dimensional arbitrary geometries. In this hybrid approach, the computational domain is divided into parallelepiped and wedge-shaped space-time nodes (cells). The conventional nodal integral method (NIM) is applied to the interfaces between adjacent parallelepiped nodes (cells), while a finite analytic approach is applied to the interfaces between parallelepiped and wedge-shaped nodes (cells). In this paper, the hybrid method is formally developed and an application of the NI-FAM to fluid flow in an enclosed cavity is presented. Results are compared with those obtained using a commercial computational fluid dynamics code.

  11. Solving time-dependent problems by an RBF-PS method with an optimal shape parameter

    Energy Technology Data Exchange (ETDEWEB)

    Neves, A M A; Roque, C M C; Ferreira, A J M; Jorge, R M N [Departamento de Engenharia Mecanica e Gestao Industrial, Faculdade de Engenharia da Universidade do Porto, Rua Dr. Roberto Frias, 4200-465 Porto (Portugal); Soares, C M M, E-mail: ana.m.neves@fe.up.p, E-mail: croque@fe.up.p, E-mail: ferreira@fe.up.p, E-mail: cristovao.mota.soares@dem.ist.utl.p, E-mail: rnatal@fe.up.p [IDMEC - Instituto de Engenharia Mecanica - Instituto Superior Tecnico, Av. Rovisco Pais, 1096 Lisboa Codex (Portugal)

    2009-08-01

A hybrid technique is used for the solution of static and time-dependent problems. The idea is to combine the radial basis function (RBF) collocation method and the pseudospectral (PS) method, yielding the RBF-PS method. The approach presented in this paper includes a shape parameter optimization and produces highly accurate results. Different examples of the procedure are presented and different radial basis functions are used. One- and two-dimensional problems are considered, with various boundary and initial conditions. We consider generic problems, but also present results for beams and plates. The displacement and stress analyses are conducted for static and transient dynamic situations. The results obtained are in good agreement with exact solutions or the references considered.
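As an illustration of shape-parameter optimization, the sketch below interpolates a 1-D test function with multiquadric RBFs and selects the shape parameter by Rippa's leave-one-out rule, a common stand-in for the optimization step; the target function, grids and candidate range are invented, not the paper's setup.

```python
import numpy as np

def mq(r, c):
    """Multiquadric RBF."""
    return np.sqrt(r**2 + c**2)

x = np.linspace(0.0, 1.0, 15)          # collocation points
f = np.sin(2.0 * np.pi * x)            # test function sampled at the points
r = np.abs(x[:, None] - x[None, :])    # pairwise distance matrix

def loo_cost(c):
    """Rippa's leave-one-out error estimate: e_i = coeff_i / (A^-1)_ii."""
    A = mq(r, c)
    coeffs = np.linalg.solve(A, f)
    return np.max(np.abs(coeffs / np.diag(np.linalg.inv(A))))

candidates = np.linspace(0.05, 1.0, 20)
best_c = min(candidates, key=loo_cost)

# Interpolate on a finer grid with the selected shape parameter.
coeffs = np.linalg.solve(mq(r, best_c), f)
xe = np.linspace(0.0, 1.0, 101)
interp = mq(np.abs(xe[:, None] - x[None, :]), best_c) @ coeffs
max_err = np.max(np.abs(interp - np.sin(2.0 * np.pi * xe)))
```

The leave-one-out cost avoids refitting n separate subproblems, which is why it is a popular surrogate objective for tuning the shape parameter.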

  12. A new method for the fast simulation of models of highly dependable Markov system

    Institute of Scientific and Technical Information of China (English)

    XIAO Gang; LI Zhizhong

    2005-01-01

To rapidly evaluate the small probability that, starting from the all-components-up state, the system hits the failed set before returning to the all-components-up state, Importance Sampling or Importance Splitting is commonly used. In this paper, a new approach, distinct from Importance Sampling and Importance Splitting, is presented to estimate this small probability for highly dependable Markov systems. The new approach achieves variance reduction by improving the estimator itself. The new estimator is derived from the integral equation describing the state transitions of the Markov system. It is proved theoretically that the variance of this estimator is always less than that of naive simulation. Two example reliability models with deferred repair are used to compare the RB, IGBS and SB-RBS methods, naive simulation, and the method presented in this paper. Results show that our method has the lowest relative error.

  13. A modified next reaction method for simulating chemical systems with time dependent propensities and delays.

    Science.gov (United States)

    Anderson, David F

    2007-12-01

    Chemical reaction systems with a low to moderate number of molecules are typically modeled as discrete jump Markov processes. These systems are oftentimes simulated with methods that produce statistically exact sample paths such as the Gillespie algorithm or the next reaction method. In this paper we make explicit use of the fact that the initiation times of the reactions can be represented as the firing times of independent, unit rate Poisson processes with internal times given by integrated propensity functions. Using this representation we derive a modified next reaction method and, in a way that achieves efficiency over existing approaches for exact simulation, extend it to systems with time dependent propensities as well as to systems with delays.
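The Poisson-process representation underlying the next reaction method can be sketched for the constant-propensity special case: each reaction keeps an internal clock T_k and a next firing time P_k drawn from a unit-rate exponential. The birth-death rates below are illustrative assumptions, and the paper's time-dependent and delay extensions are omitted for brevity.

```python
import math
import random

random.seed(1)
LAM, MU, T_END = 1.0, 0.1, 5000.0   # birth rate, per-capita death rate, horizon

def propensities(x):
    return [LAM, MU * x]  # reaction 0: x -> x + 1, reaction 1: x -> x - 1

x, t = 0, 0.0
T = [0.0, 0.0]                                   # internal time consumed so far
P = [random.expovariate(1.0) for _ in range(2)]  # next internal firing times
area = 0.0                                       # for the time-averaged state

while t < T_END:
    a = propensities(x)
    # Physical time until each reaction's internal clock reaches its firing time.
    dts = [(P[k] - T[k]) / a[k] if a[k] > 0 else math.inf for k in range(2)]
    k_min = min(range(2), key=lambda k: dts[k])
    dt = dts[k_min]
    area += x * dt
    t += dt
    for k in range(2):
        T[k] += a[k] * dt                # advance every internal clock
    P[k_min] += random.expovariate(1.0)  # draw the fired reaction's next time
    x += 1 if k_min == 0 else -1

avg = area / t
```

For this birth-death process the stationary mean is LAM/MU = 10, and the long-run time average of the simulated state settles near that value.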

  14. Model and Method for Multiobjective Time-Dependent Hazardous Material Transportation

    Directory of Open Access Journals (Sweden)

    Zhen Zhou

    2014-01-01

    Full Text Available In most of the hazardous material transportation problems, risk factors are assumed to be constant, which ignores the fact that they can vary with time throughout the day. In this paper, we deal with a novel time-dependent hazardous material transportation problem via lane reservation, in which the dynamic nature of transportation risk in the real-life traffic environment is taken into account. We first develop a multiobjective mixed integer programming (MIP model with two conflicting objectives: minimizing the impact on the normal traffic resulting from lane reservation and minimizing the total transportation risk. We then present a cut-and-solve based ε-constraint method to solve this model. Computational results indicate that our method outperforms the ε-constraint method based on optimization software package CPLEX.
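The ε-constraint idea can be illustrated on a toy biobjective selection problem, with brute-force enumeration standing in for a MIP solver such as CPLEX; the item data and constraint are invented, not the paper's lane-reservation model.

```python
import itertools

# Each item: (objective-1 cost, objective-2 risk). Purely illustrative data.
items = [(4, 7), (6, 3), (5, 5), (3, 8)]

def solve(eps):
    """Minimize total cost subject to total risk <= eps; pick >= 2 items."""
    best = None
    for r in range(2, len(items) + 1):
        for combo in itertools.combinations(items, r):
            cost = sum(c for c, _ in combo)
            risk = sum(k for _, k in combo)
            if risk <= eps and (best is None or cost < best[0]):
                best = (cost, risk)
    return best

# Sweep epsilon over the second objective to trace out trade-off solutions.
pareto = []
for eps in range(8, 24, 2):
    sol = solve(eps)
    if sol and sol not in pareto:
        pareto.append(sol)
```

Note that the plain ε-constraint method can return weakly dominated points; a lexicographic tie-break on the constrained objective (as in many cut-and-solve implementations) removes them.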

  15. A Classification Method for Seed Viability Assessment with Infrared Thermography

    Directory of Open Access Journals (Sweden)

    Sen Men

    2017-04-01

Full Text Available This paper presents a viability assessment method for Pisum sativum L. seeds based on the infrared thermography technique. In this work, different artificial treatments were conducted to prepare seed samples with different viability. Thermal images and visible images were recorded every five minutes during the standard five-day germination test. After the test, the root length of each sample was measured and used as the viability index of that seed. Each individual seed area in the visible images was segmented with an edge detection method, and the average temperature of the corresponding area in the infrared images was calculated as the representative temperature for that seed at that time. The temperature curve of each seed during germination was plotted. Thirteen characteristic parameters extracted from the temperature curve were analyzed to show the difference in temperature fluctuations between seed samples with different viability. Using the above parameters, a support vector machine (SVM) was used to classify the seed samples into three categories according to root length: viable, aged and dead; the classification accuracy was 95%. On this basis, with the temperature data of only the first three hours of germination, another SVM model was proposed to classify the seed samples, and the accuracy was about 91.67%. These experimental results show that infrared thermography, combined with the SVM algorithm, can be applied to the prediction of seed viability.

  16. Current Status of Methods to Assess Cancer Drug Resistance

    Directory of Open Access Journals (Sweden)

    Theodor H. Lippert, Hans-Jörg Ruoff, Manfred Volm

    2011-01-01

Full Text Available Drug resistance is the main cause of the failure of chemotherapy of malignant tumors, resistance being either preexisting (intrinsic resistance) or induced by the drugs (acquired resistance). At present, resistance is usually diagnosed during treatment, after a long period of drug administration. In the present paper, methods for a rapid assessment of drug resistance are described. Three main classes of test procedures can be found in the literature, i.e. fresh tumor cell culture tests, cancer biomarker tests and positron emission tomography (PET) tests. The methods are based on the evaluation of molecular processes, i.e. metabolic activities of cancer cells. Drug resistance can be diagnosed before treatment in vitro with fresh tumor cell culture tests, and after a short time of treatment in vivo with PET tests. Cancer biomarker tests, for which great potential has been predicted, are largely still in the development stage. Individual resistance surveillance with tests delivering rapid results signifies progress in cancer therapy management, by providing the possibility to avoid drug therapies that are ineffective and only harmful.

  17. Organisational impact: Definition and assessment methods for medical devices.

    Science.gov (United States)

    Roussel, Christophe; Carbonneil, Cédric; Audry, Antoine

    2016-02-01

Health technology assessment (HTA) is a rapidly developing area, and the value of taking non-clinical fields into consideration is growing. Although the health-economic aspect is commonly recognised, evaluating organisational impact has not been studied nearly as much. The goal of this work was to provide a definition of organisational impact in the sector of medical devices, by defining its contours and exploring the evaluation methods specific to this field. Following an analysis of the literature concerning the impact of technologies on organisations as well as the medical literature, and after reviewing the relevant regulatory texts, the group of experts identified 12 types of organisational impact. A number of medical devices were carefully screened using the criteria grid, which proved to be operational and to differentiate properly. From the analysis of practice and of the methods described, the group was then able to derive a few guidelines for successfully evaluating organisational impact. This work shows that taking organisational impact into consideration may be critical alongside the other criteria currently in favour (clinical and economic). What remains is to give this factor a role in the decision-making process, one that respects the economic efficiency principle.

  18. An Assessment of Mean Areal Precipitation Methods on Simulated Stream Flow: A SWAT Model Performance Assessment

    Directory of Open Access Journals (Sweden)

    Sean Zeiger

    2017-06-01

Full Text Available Accurate mean areal precipitation (MAP) estimates are essential input forcings for hydrologic models. However, the selection of the most accurate method to estimate MAP can be daunting because there are numerous methods to choose from (e.g., proximate gauge, direct weighted average, surface-fitting, and remotely sensed methods). Multiple methods (n = 19) were used to estimate MAP with precipitation data from 11 distributed monitoring sites and 4 remotely sensed data sets. Each method was validated against the hydrologic model simulated stream flow using the Soil and Water Assessment Tool (SWAT). SWAT was validated using a split-site method and the observed stream flow data from five nested-scale gauging sites in a mixed-land-use watershed of the central USA. Cross-validation results showed the error associated with surface-fitting and remotely sensed methods ranging from −4.5 to −5.1% and −9.8 to −14.7%, respectively. Split-site validation results showed percent bias (PBIAS) values that ranged from −4.5 to −160%. Second-order polynomial functions especially overestimated precipitation and subsequent stream flow simulations (PBIAS = −160% in the headwaters). The results indicated that using an inverse-distance weighted, linear polynomial interpolation or multiquadric function method to estimate MAP may improve SWAT model simulations. Collectively, the results highlight the importance of spatially distributed observed hydroclimate data for precipitation and subsequent stream flow estimations. The MAP methods demonstrated in the current work can be used to reduce hydrologic model uncertainty caused by watershed physiographic differences.
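One of the simpler MAP estimators mentioned above, inverse-distance weighting, can be sketched as follows; the gauge coordinates and precipitation depths are invented for illustration.

```python
import math

def idw(target, gauges, power=2.0):
    """Estimate precipitation at `target` from (x, y, value) gauge tuples."""
    num = den = 0.0
    for x, y, value in gauges:
        d = math.hypot(target[0] - x, target[1] - y)
        if d == 0.0:
            return value  # target coincides with a gauge
        w = 1.0 / d**power
        num += w * value
        den += w
    return num / den

# Hypothetical gauges: (x, y, precipitation depth in mm).
gauges = [(0.0, 0.0, 10.0), (1.0, 0.0, 20.0), (0.0, 1.0, 30.0)]
estimate = idw((0.25, 0.25), gauges)
```

Because the weights are positive and sum to one, the estimate always stays within the range of the gauge values; the `power` parameter controls how quickly distant gauges lose influence.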

  19. A Global Reliability Assessment Method on Aging Offshore Platforms with Corrosion and Cracks

    Institute of Scientific and Technical Information of China (English)

    JI Chun-yan; LI Shan-shan; CHEN Ming-lu

    2009-01-01

Corrosion and fatigue cracks are major threats to the structural integrity of aging offshore platforms. For a rational estimation of the safety levels of aging platforms, a global reliability assessment approach for aging offshore platforms with corrosion and fatigue cracks is presented in this paper. The base shear capacity is taken as the global ultimate strength of the offshore platforms; it is modeled as a random process that decreases with time in the presence of corrosion and fatigue crack propagation. The corrosion and fatigue crack growth rates in the main members and key joints are modeled as random variables. A simulation method for the extreme wave loads applied to the structures of offshore platforms is proposed as well. Furthermore, the statistics of global base shear capacity and extreme wave loads are obtained by the Monte Carlo simulation method. On the basis of the limit state equation of the global failure mode, both instantaneous and time-dependent reliability assessment methods are presented in this paper. Finally, the instantaneous reliability index and time-dependent failure probability of a jacket platform are estimated at different ages in a demonstration example.

  20. Risk assessment of groundwater level variability using variable Kriging methods

    Science.gov (United States)

    Spanoudaki, Katerina; Kampanis, Nikolaos A.

    2015-04-01

Assessment of the water table level spatial variability in aquifers provides useful information regarding optimal groundwater management. This information becomes more important in basins where the water table level has fallen significantly. The spatial variability of the water table level in this work is estimated based on hydraulic head measured during the wet period of the hydrological year 2007-2008, in a sparsely monitored basin in Crete, Greece, which is of high socioeconomic and agricultural interest. Three Kriging-based methodologies are elaborated in the Matlab environment to estimate the spatial variability of the water table level in the basin. The first methodology is based on the Ordinary Kriging approach, the second involves auxiliary information from a Digital Elevation Model in terms of Residual Kriging, and the third methodology calculates the probability of the groundwater level falling below a predefined minimum value that could cause significant problems in groundwater resources availability, by means of Indicator Kriging. The Box-Cox methodology is applied to normalize both the data and the residuals for improved prediction results. In addition, various classical variogram models are applied to determine the spatial dependence of the measurements. The Matérn model proves to be optimal; in combination with the Kriging methodologies, it provides the most accurate cross-validation estimations. Groundwater level and probability maps are constructed to examine the spatial variability of the groundwater level in the basin and the associated risk that certain locations exhibit regarding a predefined minimum value that has been set for the sustainability of the basin's groundwater resources. Acknowledgement The work presented in this paper has been funded by the Greek State Scholarships Foundation (IKY), Fellowships of Excellence for Postdoctoral Studies (Siemens Program), 'A simulation-optimization model for assessing the best practices for the
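The Ordinary Kriging step can be sketched as follows, using an exponential variogram (the Matérn model with ν = 0.5); the well locations, head values and variogram parameters are invented assumptions, not the study's fitted model.

```python
import numpy as np

def variogram(h, sill=1.0, rng=2.0):
    """Exponential variogram (Matern with nu = 0.5); parameters assumed."""
    return sill * (1.0 - np.exp(-h / rng))

pts = np.array([[0.0, 0.0], [3.0, 0.0], [0.0, 3.0], [2.0, 2.0]])  # well x, y
heads = np.array([50.0, 46.0, 44.0, 45.0])                        # head values

def krige(target):
    """Ordinary Kriging estimate at `target` from the wells above."""
    n = len(pts)
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=2)
    A = np.zeros((n + 1, n + 1))
    A[:n, :n] = variogram(d)
    A[:n, n] = A[n, :n] = 1.0   # Lagrange row/column: weights sum to one
    b = np.append(variogram(np.linalg.norm(pts - target, axis=1)), 1.0)
    w = np.linalg.solve(A, b)
    return float(w[:n] @ heads)

estimate = krige(np.array([1.0, 1.0]))
```

Without a nugget term the estimator reproduces the data exactly at the wells; an Indicator Kriging probability map would apply the same machinery to 0/1 threshold indicators instead of the heads themselves.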

  1. A new non-invasive statistical method to assess the spontaneous cardiac baroreflex in humans.

    Science.gov (United States)

    Ducher, M; Fauvel, J P; Gustin, M P; Cerutti, C; Najem, R; Cuisinaud, G; Laville, M; Pozet, N; Paultre, C Z

    1995-06-01

    1. A new method was developed to evaluate cardiac baroreflex sensitivity. The association of a high systolic blood pressure with a low heart rate or the converse is considered to be under the influence of cardiac baroreflex activity. This method is based on the determination of the statistical dependence between systolic blood pressure and heart rate values obtained non-invasively by a Finapres device. Our computerized analysis selects the associations with the highest statistical dependence. A 'Z-coefficient' quantifies the strength of the statistical dependence. The slope of the linear regression, computed on these selected associations, is used to estimate baroreflex sensitivity. 2. The present study was carried out in 11 healthy resting male subjects. The results obtained by the 'Z-coefficient' method were compared with those obtained by cross-spectrum analysis, which has already been validated in humans. Furthermore, the reproducibility of both methods was checked after 1 week. 3. The results obtained by the two methods were significantly correlated (r = 0.78 for the first and r = 0.76 for the second experiment, P < 0.01). When repeated after 1 week, the average results were not significantly different. Considering individual results, test-retest correlation coefficients were higher with the Z-analysis (r = 0.79, P < 0.01) than with the cross-spectrum analysis (r = 0.61, P < 0.05). 4. In conclusion, as the Z-method gives results similar to but more reproducible than the cross-spectrum method, it might be a powerful and reliable tool to assess baroreflex sensitivity in humans.
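The general idea, selecting beat pairs whose pressure and interval deviate strongly together and regressing pulse interval on systolic pressure over them, can be sketched as below. The selection criterion here (a product of z-scores standing in for the paper's 'Z-coefficient') and the beat data are hypothetical, not the published method.

```python
import statistics

# Hypothetical beat-by-beat data: systolic pressure (mmHg), pulse interval (ms).
sbp = [118, 124, 130, 127, 121, 116, 119, 126, 132, 123]
rr = [780, 820, 870, 850, 800, 760, 790, 830, 880, 810]

def zscores(xs):
    m, s = statistics.fmean(xs), statistics.stdev(xs)
    return [(x - m) / s for x in xs]

zs, zr = zscores(sbp), zscores(rr)
# Keep beats where pressure and interval deviate strongly in the same direction,
# a crude stand-in for "highest statistical dependence".
sel = [i for i in range(len(sbp)) if zs[i] * zr[i] > 0.2]
xs = [sbp[i] for i in sel]
ys = [rr[i] for i in sel]

# Slope of interval vs pressure over the selected beats, in ms/mmHg,
# serves as the baroreflex sensitivity estimate.
mx, my = statistics.fmean(xs), statistics.fmean(ys)
slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
```

With these made-up data the slope comes out near the typical physiological range of a few ms/mmHg; the published method replaces the crude threshold with its quantified 'Z-coefficient' dependence measure.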

  2. Integrated rate-dependent and dual pathway AV nodal functions: principles and assessment framework.

    Science.gov (United States)

    Billette, Jacques; Tadros, Rafik

    2014-01-15

    The atrioventricular (AV) node conducts slowly and has a long refractory period. These features sustain the filtering of atrial impulses and hence are often modulated to optimize ventricular rate during supraventricular tachyarrhythmias. The AV node is also the site of a clinically common reentrant arrhythmia. Its function is assessed for a variety of purposes from its responses to a premature protocol (S1S2, test beats introduced at different cycle lengths) repeatedly performed at different basic rates and/or to an incremental pacing protocol (increasingly faster rates). Puzzlingly, resulting data and interpretation differ with protocols as well as with chosen recovery and refractory indexes, and are further complicated by the presence of built-in fast and slow pathways. This problem applies to endocavitary investigations of arrhythmias as well as to many experimental functional studies. This review supports an integrated framework of rate-dependent and dual pathway AV nodal function that can account for these puzzling characteristics. The framework was established from AV nodal responses to S1S2S3 protocols that, compared with standard S1S2 protocols, allow for an orderly quantitative dissociation of the different factors involved in changes in AV nodal conduction and refractory indexes under rate-dependent and dual pathway function. Although largely based on data from experimental studies, the proposed framework may well apply to the human AV node. In conclusion, the rate-dependent and dual pathway properties of the AV node can be integrated within a common functional framework the contribution of which to individual responses can be quantitatively determined with properly designed protocols and analytic tools.

  3. Dental age assessment among Tunisian children using the Demirjian method

    Directory of Open Access Journals (Sweden)

    Abir Aissaoui

    2016-01-01

Full Text Available Context: Since the Demirjian system of estimating dental maturity was first described, many researchers from different countries have tested its accuracy among diverse populations. Some of these studies have pointed out a need to determine population-specific standards. Aim: The aim of this study is to evaluate the suitability of Demirjian's method for dental age assessment in Tunisian children. Materials and Methods: This is a prospective study previously approved by the Research Ethics Local Committee of the University Hospital Fattouma Bourguiba of Monastir (Tunisia). Panoramic radiographs of 280 healthy Tunisian children aged 2.8–16.5 years were examined with the Demirjian method and scored by three trained observers. Statistical Analysis Used: Dental age was compared to chronological age by using the analysis of variance (ANOVA) test. Cohen's Kappa test was performed to calculate the intra- and inter-examiner agreements. Results: Underestimation was seen in children aged between 9 and 16 years, and the range of accuracy varied from −0.02 to 3 years. The advancement in dental age as determined by the Demirjian system when compared to chronological age ranged from 0.3 to 1.32 years for young males and from 0.26 to 1.37 years for young females (age range 3 to 8 years). Conclusions: The standards provided by Demirjian for French-Canadian children may not be suitable for Tunisian children. Each population of children may need its own specific standard for an accurate estimation of chronological age.

  4. Phylogenetic and functional assessment of orthologs inference projects and methods.

    Directory of Open Access Journals (Sweden)

    Adrian M Altenhoff

    2009-01-01

    Full Text Available Accurate genome-wide identification of orthologs is a central problem in comparative genomics, a fact reflected by the numerous orthology identification projects developed in recent years. However, only a few reports have compared their accuracy, and indeed, several recent efforts have not yet been systematically evaluated. Furthermore, orthology is typically only assessed in terms of function conservation, despite the phylogeny-based original definition of Fitch. We collected and mapped the results of nine leading orthology projects and methods (COG, KOG, Inparanoid, OrthoMCL, Ensembl Compara, Homologene, RoundUp, EggNOG, and OMA and two standard methods (bidirectional best-hit and reciprocal smallest distance. We systematically compared their predictions with respect to both phylogeny and function, using six different tests. This required the mapping of millions of sequences, the handling of hundreds of millions of predicted pairs of orthologs, and the computation of tens of thousands of trees. In phylogenetic analysis or in functional analysis where high specificity is required, we find that OMA and Homologene perform best. At lower functional specificity but higher coverage level, OrthoMCL outperforms Ensembl Compara, and to a lesser extent Inparanoid. Lastly, the large coverage of the recent EggNOG can be of interest to build broad functional grouping, but the method is not specific enough for phylogenetic or detailed function analyses. In terms of general methodology, we observe that the more sophisticated tree reconstruction/reconciliation approach of Ensembl Compara was at times outperformed by pairwise comparison approaches, even in phylogenetic tests. Furthermore, we show that standard bidirectional best-hit often outperforms projects with more complex algorithms. First, the present study provides guidance for the broad community of orthology data users as to which database best suits their needs. 
Second, it introduces new methodology

  5. Assessment of disease burden among army personnel and dependents in Lucknow city

    Directory of Open Access Journals (Sweden)

    Anil Ahuja

    2015-01-01

    Full Text Available Introduction: Oral health is a valuable asset for an individual. The oral cavity has a significant role to play in providing a satisfactory lifestyle, including proper mastication, phonetics, esthetics, appearance, communication abilities and an overall emotional well-being. Very few studies have been carried out in the past on the disease burden of army personnel and their dependents. Materials and Methods: This study was carried out on 2160 army personnel and their dependents reporting to the Command Military Dental Center, Lucknow. The study population was screened for caries, periodontal status, prosthetic status and treatment need, oral hygiene practice and prevalence of the tobacco habit. All relevant information was noted in a proforma. Statistical analysis was performed using SPSS 16.0 version (Chicago, Inc., USA). The results are presented as percentages and means (± standard deviation). The unpaired t-test and Chi-square test were used. P < 0.05 was considered significant. Results: Oral hygiene awareness was adequate among serving personnel and dependents, and oral hygiene practices were also adequate. A higher prevalence of the tobacco habit was found among young army personnel than among older personnel. There was a significant association between smoking and periodontal disease. Leukoplakia was a common oral mucosal lesion among smokers. Conclusion: This study will help to assess the dental disease occurrence rate and evaluate treatment needs, and also to formulate a plan for augmentation of resources. The study will also create awareness about oral hygiene practices and oral habits among army personnel and their dependents.

  6. Time-dependent radiation transport using the staggered-block Jacobi method

    Science.gov (United States)

    Davidson, Gregory Grant

    The time-dependent radiation transport equation describes the dynamics of radiation traveling through and interacting with a background medium. These dynamics are important in a diversity of fields including nuclear reactor kinetics, stellar evolution, and inertial confinement fusion. Except for trivial problems, the transport equation must be solved numerically. This research is concerned with developing a new deterministic time discretization for numerical solutions of the radiation transport equation. To preserve maximal parallelism, a deterministic transport method must maintain locality, meaning that the solution at a point in space is dependent only upon information that is locally available. Furthermore, computational efficiency requires that a method be unconditionally stable, meaning that it provides positive, physically permissible solutions for time steps of any length. Existing unconditionally stable radiation transport methods require mesh sweeps, which make the methods non-local and inhibit their parallelism, thereby reducing their efficiency on large supercomputers. We present a new Staggered-Block Jacobi (SBJ) method, which produces unconditionally stable numerical solutions while maintaining locality. The SBJ time discretization operates by forming blocks of cells. In one dimension, a block is composed of two cells. The incident information into the block is evaluated at the beginning of the time step. This decouples every block, and allows the solution in the blocks to be computed in parallel. We apply the SBJ method to the linear diffusion and transport equations, as well as the linearized thermal radiation transport equations. We find that the SBJ time discretization, applied to the linear diffusion and transport equations, produces methods that are accurate and efficient when the particle wave advances about 20% of a cell per time step, i.e., where the time steps are small or the problem is optically thick. 
In the case of the thermal radiation
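    The two-cell-block idea described above can be sketched for the simplest setting, 1D linear diffusion. The function below is one possible reading of the abstract, not the dissertation's transport discretization: information entering a block is frozen at the old time level, so each 2x2 implicit block solve is independent and could run in parallel.

```python
import numpy as np

def sbj_diffusion_step(u, r):
    """One Staggered-Block Jacobi-style step for 1D diffusion u_t = D u_xx,
    with r = D*dt/dx**2 and periodic boundaries. Cells are paired into
    two-cell blocks; values entering a block from outside are frozen at
    the old time level, so every 2x2 block solves independently (sketch
    of the idea only, assumed form)."""
    n = len(u)
    u_new = np.empty_like(u)
    for i in range(0, n, 2):
        left, right = u[(i - 1) % n], u[(i + 2) % n]   # old-time neighbours
        # Implicit 2x2 system for the block's two cells:
        #   (1+2r) a -      r b = u[i]   + r*left
        #       -r a + (1+2r) b = u[i+1] + r*right
        M = np.array([[1 + 2 * r, -r], [-r, 1 + 2 * r]])
        rhs = np.array([u[i] + r * left, u[i + 1] + r * right])
        u_new[i], u_new[i + 1] = np.linalg.solve(M, rhs)
    return u_new

u0 = np.zeros(16)
u0[8] = 1.0                                            # spike initial condition
u1 = sbj_diffusion_step(u0, r=0.4)                     # spike is smoothed
u_flat = sbj_diffusion_step(np.full(16, 3.0), r=0.4)   # uniform state is a fixed point
```

Because the incoming values are lagged, the coupling between blocks is explicit while each block is implicit, which is what keeps the update local.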

  7. Assessment of regional air quality by a concentration-dependent Pollution Permeation Index

    Science.gov (United States)

    Liang, Chun-Sheng; Liu, Huan; He, Ke-Bin; Ma, Yong-Liang

    2016-10-01

    Although air quality monitoring networks have been greatly improved, interpreting their expanding data in both simple and efficient ways remains challenging; new analytical methods are therefore needed. We developed such a method based on the comparison of pollutant concentrations between target and circum areas (circum comparison for short), and tested its applications by assessing the air pollution in Jing-Jin-Ji, the Yangtze River Delta, the Pearl River Delta and Cheng-Yu, China during 2015. We found the circum comparison can instantly judge whether a city is a pollution permeation donor or a pollution permeation receptor by a Pollution Permeation Index (PPI). Furthermore, a PPI-related estimated concentration (original concentration plus halved average concentration difference) can be used to identify some overestimations and underestimations. Besides, it can help explain pollution processes (e.g., Beijing's PM2.5 may be largely promoted by non-local SO2) though not aiming at it. Moreover, it is applicable to any region, easy to handle, and able to inspire further analytical methods. These advantages, despite its disadvantages in considering the whole process jointly influenced by complex physical and chemical factors, demonstrate that the PPI-based circum comparison can be efficiently used in assessing air pollution by yielding instructive results, without the need for complex operations.
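    The circum comparison fits in a few lines of code. The sketch below is one possible reading of the abstract: the PPI is taken as the gap between a target city's concentration and the mean over its circum areas, and the estimated concentration applies the "halved average concentration difference" correction. Function names, the sign convention, and the toy numbers are this sketch's assumptions, not the paper's definitions.

```python
import numpy as np

def pollution_permeation_index(c_target, c_circum):
    # Positive PPI: the target city exceeds its circum areas and acts as
    # a pollution permeation donor; negative PPI marks a receptor.
    # (Sign convention and exact form are assumed for illustration.)
    return c_target - np.mean(c_circum)

def estimated_concentration(c_target, c_circum):
    # "Original concentration plus halved average concentration
    # difference", read here as pulling the target value halfway toward
    # the circum mean.
    return c_target + 0.5 * (np.mean(c_circum) - c_target)

# Toy PM2.5 values (ug/m3): one city surrounded by three cleaner ones.
ppi = pollution_permeation_index(80.0, [50.0, 60.0, 55.0])   # 25.0 -> donor
est = estimated_concentration(80.0, [50.0, 60.0, 55.0])      # 67.5
```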

  8. The repeated sit-to-stand maneuver is a superior method for cardiac baroreflex assessment: a comparison with the modified Oxford method and Valsalva maneuver.

    Science.gov (United States)

    Horsman, H M; Tzeng, Y C; Galletly, D C; Peebles, K C

    2014-12-01

    Baroreflex assessment has diagnostic and prognostic utility in the clinical and research environments, and there is a need for a reliable, simple, noninvasive method of assessment. The repeated sit-to-stand method induces oscillatory changes in blood pressure (BP) at a desired frequency and is suitable for assessing dynamic baroreflex sensitivity (BRS). However, little is known about the reliability of this method and its ability to discern fundamental properties of the baroreflex. In this study we sought to: 1) evaluate the reliability of the sit-to-stand method for assessing BRS and compare its performance against two established methods (the modified Oxford method and the Valsalva maneuver), and 2) examine whether the frequency of the sit-to-stand method influences hysteresis. Sixteen healthy participants underwent three trials of each method. For the sit-to-stand method, which was performed at 0.1 and 0.05 Hz, BRS was quantified as an integrated response (BRSINT) and in response to falling and rising BP (BRSDOWN and BRSUP, respectively). Test-retest reliability was assessed using the intraclass correlation coefficient (ICC). Irrespective of frequency, the ICC for BRSINT during the sit-to-stand method was ≥0.88. The ICC for a rising BP evoked by phenylephrine (PEGAIN) in the Oxford method was 0.78 and ≤0.5 for the remaining measures. During the sit-to-stand method, hysteresis was apparent in all participants at 0.1 Hz but was absent at 0.05 Hz. These findings indicate the sit-to-stand method is a statistically reliable BRS assessment tool and suitable for the examination of baroreflex hysteresis. Using this approach we showed that baroreflex hysteresis is a frequency-dependent phenomenon.
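    The abstract does not state which ICC form was computed. As an illustration of how test-retest reliability of repeated BRS trials might be quantified, here is a one-way random-effects ICC(1,1) built from the standard ANOVA mean squares; the choice of ICC variant and the toy data are assumptions of this sketch.

```python
import numpy as np

def icc_oneway(ratings):
    """One-way random-effects ICC(1,1) for a subjects x trials array,
    computed from the one-way ANOVA mean squares. (Shown purely for
    illustration; the paper may have used a different ICC form.)"""
    ratings = np.asarray(ratings, dtype=float)
    n, k = ratings.shape
    grand = ratings.mean()
    subj_means = ratings.mean(axis=1)
    ms_between = k * np.sum((subj_means - grand) ** 2) / (n - 1)
    ms_within = np.sum((ratings - subj_means[:, None]) ** 2) / (n * (k - 1))
    return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)

# Four subjects, three trials each (hypothetical BRS values, ms/mmHg):
perfect = np.tile(np.array([[5.0], [8.0], [11.0], [20.0]]), (1, 3))
noisy = perfect + np.array([[0.0, 0.2, -0.2]] * 4)

icc_perfect = icc_oneway(perfect)   # identical trials -> ICC = 1
icc_noisy = icc_oneway(noisy)       # small trial-to-trial noise -> ICC just below 1
```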

  9. Assessment of soil microbial diversity with functional multi-endpoint methods

    DEFF Research Database (Denmark)

    Winding, Anne; Creamer, R. E.; Rutgers, M.

    Soil microbial diversity provides the cornerstone for support of soil ecosystem services through key roles in soil organic matter turnover, carbon sequestration and water infiltration. However, standardized methods to quantify the multitude of microbial functions in soils are lacking. Methods based......-substrates. These methods have been proposed to fill the gap. The techniques vary in how close they are to in situ functions; in their dependency on growth during incubation; and in whether they target only bacteria or also fungi and/or extracellular enzymes. They also vary in the functions tested and the number of functions. In addition...... techniques of assessing soil microbial functional diversity in a European transect consisting of 81 soil samples covering five Biogeographical Zones and three land-uses, and compare with the vast amount of data delivered in other projects (BISQ, RMQS-bioindicateur). Based on experimental results...

  10. Capture-recapture method for assessing publication bias

    Directory of Open Access Journals (Sweden)

    Jalal Poorolajal

    2010-01-01

    Full Text Available

    Background: Publication bias is an important factor that may result in selection bias and lead to overestimation of the intervention effect. The capture-recapture method is considered a potentially useful procedure for investigating and estimating publication bias.

    Methods: We conducted a systematic review to estimate the duration of protection provided by hepatitis B vaccine by measuring the anamnestic immune response to booster doses of vaccine, and retrieved studies from three separate sources, including (a) electronic databases, (b) reference lists of the studies, and (c) conference databases as well as contact with experts and manufacturers. The capture-recapture method and some conventional methods such as the funnel plot, Begg test, Egger test, and trim and fill method were employed for assessing publication bias.

    Results: Based on the capture-recapture method, completeness of the overall search results was 87.2% [95% CI: 84.6% to 89.0%] and a log-linear model suggested 5 [95% CI: 4.2 to 6.2] missing studies. The funnel plot was asymmetric but the Begg and Egger tests results were
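    As an illustration of the capture-recapture idea, a minimal two-source estimator can be written in a few lines. The review above combined three sources with a log-linear model; the Chapman bias-corrected two-source version below, with hypothetical counts, only shows the principle of estimating studies missed by every source.

```python
def chapman_estimate(n1, n2, m):
    """Bias-corrected two-source capture-recapture (Chapman) estimate of
    the total number of eligible studies, where n1 and n2 are the counts
    retrieved by the two sources and m is their overlap."""
    return (n1 + 1) * (n2 + 1) / (m + 1) - 1

# Hypothetical counts: databases find 30 studies, reference lists find
# 20, and 15 studies are found by both sources.
total_hat = chapman_estimate(30, 20, 15)           # estimated total pool
missing_hat = total_hat - (30 + 20 - 15)           # studies missed by both
```

A larger overlap m relative to n1 and n2 signals a more complete search and fewer estimated missing studies.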

  11. Analysis of Frequency of Use of Different Scar Assessment Scales Based on the Scar Condition and Treatment Method

    Directory of Open Access Journals (Sweden)

    Seong Hwan Bae

    2014-03-01

    Full Text Available Analysis of scars in various conditions is essential, but no consensus has been reached on which scar assessment scale to select for a given condition. We reviewed papers to determine the scar assessment scales selected depending on the scar condition and treatment method. We searched PubMed for articles published since 2000 with the contents of scar evaluation using a scar assessment scale, with a Journal Citation Reports impact factor >0.5. Among them, 96 articles that conducted a scar evaluation using a scar assessment scale were reviewed and analyzed. The scar assessment scales were identified and organized by various criteria. Among the types of scar assessment scales, the Patient and Observer Scar Assessment Scale (POSAS) was found to be the most frequently used scale. For the assessment of newly developed operative scars, the POSAS was most used. Meanwhile, for categories depending on the treatment methods for preexisting scars, the Vancouver Scar Scale (VSS) was used in 6 studies following laser treatment, the POSAS was used in 7 studies following surgical treatment, and the POSAS was used in 7 studies following conservative treatment. Within the 12 categories of scar status, the VSS showed the highest frequency in 6 categories and the POSAS showed the highest frequency in the other 6 categories. According to our review, the POSAS and VSS are the most frequently used scar assessment scales. In the future, an optimal, universal scar scoring system is needed in order to better evaluate and treat pathologic scarring.

  12. Assessment of Nuclear Fuels using Radiographic Thickness Measurement Method

    Energy Technology Data Exchange (ETDEWEB)

    Muhammad Abir; Fahima Islam; Hyoung Koo Lee; Daniel Wachs

    2014-11-01

    The Convert branch of the National Nuclear Security Administration (NNSA) Global Threat Reduction Initiative (GTRI) focuses on the development of high uranium density fuels for research and test reactors for nonproliferation. This fuel is intended to enable conversion of low density high enriched uranium (HEU) based fuel to high density low enriched uranium (LEU) based fuel for high performance research reactors (HPRR). There are five U.S. reactors that fall under the HPRR category, including: the Massachusetts Institute of Technology Reactor (MITR), the National Bureau of Standards Reactor (NBSR), the Missouri University Research Reactor (UMRR), the Advanced Test Reactor (ATR), and the High Flux Isotope Reactor (HFIR). U-Mo alloy fuel phases in the form of either monolithic or dispersion foil type fuels, such as ATR Full-size In center flux trap Position (AFIP) and Reduced Enrichment for Research and Test Reactor (RERTR) fuels, are being designed for this purpose. The fabrication process of RERTR is susceptible to introducing a variety of fuel defects. A dependable quality control method is required during fabrication of RERTR miniplates to maintain the allowable design tolerances; the fabricated miniplates must therefore be evaluated and analytically verified to maintain quality standards as well as safety. The purpose of this work is to analyze the thickness of the fabricated RERTR-12 miniplates using a non-destructive technique to meet the fuel plate specification for RERTR fuel to be used in the ATR.

  13. A real-time impedance based method to assess Rhodococcus equi virulence.

    Directory of Open Access Journals (Sweden)

    Aleksandra A Miranda-CasoLuengo

    Full Text Available Rhodococcus equi is a facultative intracellular pathogen of macrophages and the causative agent of foal pneumonia. R. equi virulence is usually assessed by analyzing intracellular growth in macrophages by enumeration of bacteria following cell lysis, which is time consuming and does not allow for high throughput analysis. This paper describes the use of an impedance based real-time method to characterize proliferation of R. equi in macrophages, using virulent and attenuated strains lacking the vapA gene or virulence plasmid. Image analysis suggested that the time-dependent cell response profile (TCRP) is governed by cell size and roundness as well as cytotoxicity of infecting R. equi strains. The amplitude and inflection point of the resulting TCRP were dependent on the multiplicity of infection as well as virulence of the infecting strain, thus distinguishing between virulent and attenuated strains.

  14. A real-time impedance based method to assess Rhodococcus equi virulence.

    Science.gov (United States)

    Miranda-CasoLuengo, Aleksandra A; Miranda-CasoLuengo, Raúl; Lieggi, Nora T; Luo, Haixia; Simpson, Jeremy C; Meijer, Wim G

    2013-01-01

    Rhodococcus equi is a facultative intracellular pathogen of macrophages and the causative agent of foal pneumonia. R. equi virulence is usually assessed by analyzing intracellular growth in macrophages by enumeration of bacteria following cell lysis, which is time consuming and does not allow for a high throughput analysis. This paper describes the use of an impedance based real-time method to characterize proliferation of R. equi in macrophages, using virulent and attenuated strains lacking the vapA gene or virulence plasmid. Image analysis suggested that the time-dependent cell response profile (TCRP) is governed by cell size and roundness as well as cytotoxicity of infecting R. equi strains. The amplitude and inflection point of the resulting TCRP were dependent on the multiplicity of infection as well as virulence of the infecting strain, thus distinguishing between virulent and attenuated strains.

  15. Monte Carlo method for photon heating using temperature-dependent optical properties.

    Science.gov (United States)

    Slade, Adam Broadbent; Aguilar, Guillermo

    2015-02-01

    The Monte Carlo method for photon transport is often used to predict the volumetric heating that an optical source will induce inside a tissue or material. This method relies on constant (with respect to temperature) optical properties, specifically the coefficients of scattering and absorption. In reality, optical coefficients are typically temperature-dependent, leading to error in simulation results. The purpose of this study is to develop a method that can incorporate variable properties and accurately simulate systems where the temperature will vary greatly, such as in the case of laser-thawing of frozen tissues. A numerical simulation was developed that utilizes the Monte Carlo method for photon transport to simulate the thermal response of a system, allowing temperature-dependent optical and thermal properties. This was done by combining traditional Monte Carlo photon transport with a heat transfer simulation to provide a feedback loop that selects local properties based on current temperatures, for each moment in time. Additionally, photon steps are segmented to accurately obtain path lengths within a homogeneous (but not isothermal) material. Validation of the simulation was done using comparisons to established Monte Carlo simulations using constant properties, and a comparison to the Beer-Lambert law for temperature-variable properties. The simulation is able to accurately predict the thermal response of a system whose properties can vary with temperature. The difference in results between variable-property and constant-property methods for the representative system of laser-heated silicon can become larger than 100 K. This simulation will return more accurate results of optical irradiation absorption in a material which undergoes a large change in temperature. This increased accuracy in simulated results leads to better thermal predictions in living tissues and can provide enhanced planning and improved experimental and procedural outcomes.
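    The feedback loop described above can be sketched with a deliberately simplified 1D model: photon packets enter a slab, each cell's absorption probability is evaluated from the current local temperature, deposited energy heats the cells, and the optical properties are re-evaluated on the next step. The absorption model, heating constant, and all numbers below are toy assumptions, not the paper's tissue data.

```python
import numpy as np

rng = np.random.default_rng(0)

def mu_a(T):
    # Hypothetical linear temperature dependence of the absorption
    # coefficient [1/cm]; real coefficients are material data not given
    # in the abstract.
    return 1.0 + 0.01 * (T - 300.0)

n_cells, dx = 50, 0.02            # 1 cm slab split into 50 cells
T = np.full(n_cells, 300.0)       # temperature field [K]

for _ in range(5):                # time loop of the feedback scheme
    # Freeze optical properties at the current temperatures...
    p_abs = 1.0 - np.exp(-mu_a(T) * dx)   # per-cell absorption probability
    absorbed = np.zeros(n_cells)
    for _ in range(2000):         # launch photon packets at x = 0
        for i in range(n_cells):
            if rng.random() < p_abs[i]:
                absorbed[i] += 1.0
                break             # packet absorbed: deposit energy, stop
    # ...then let the deposited energy update the temperatures
    # (toy heating model: 0.005 K per absorbed packet).
    T = T + 0.005 * absorbed
```

Because absorption is strongest where photons enter, heating concentrates near the illuminated face, which in turn raises the local absorption coefficient on later steps.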

  16. Dependence of the efficiency of a multicapillary column on the liquid phase loading method.

    Science.gov (United States)

    Zhdanov, V P; Sidelnikov, V N; Vlasov, A A

    2001-09-14

    One of the main approaches employed to reach fast chromatographic separation is based on using columns containing up to 1000 capillaries with diameters down to 10-100 μm. The efficiency of such columns depends on the dispersion of the capillary radius and on the liquid-film loading method. We present general equations describing these effects. Specifically, we show theoretically and experimentally that the separation efficiency can be improved by using loading methods specially designed to take into account the correlation between the film thickness and the capillary radius.

  17. A variation iteration method for isotropic velocity-dependent potentials: Scattering case

    Energy Technology Data Exchange (ETDEWEB)

    Eed, H. [Applied Science Private University, Basic Science Department, Amman (Jordan)]

    2014-12-01

    We propose a new approximation scheme to obtain analytic expressions for the Schroedinger equation with an isotropic velocity-dependent potential, to determine the scattering phase shift. In order to test the validity of our approach, we applied it to an exactly solvable model for nucleon-nucleon scattering. The results of the variation iteration method (VIM) formalism compare quite well with those of the exactly solvable model. The developed formalism can be applied to problems concerning pion-nucleon, nucleon-nucleon, and electron-atom scattering. (orig.)

  18. Dependence of the legume seeds vigour on their maturity and method of harvest

    Directory of Open Access Journals (Sweden)

    Stanisław Grzesiuk

    2014-01-01

    Full Text Available Several methods were used to study the vigour and viability of legume seeds (Pisum sativum L. cv. Hamil, Pisum arvense L. cv. Mazurska and Lupinus luteus L. cv. Tomik) harvested at three main stages of seed ripening (green, wax and full). The seeds were tested immediately after harvest (series A) and after two weeks of storage in pods (series B). It was found that: (1) the vigour of ripening legume seeds increases with maturation; (2) post-harvest storage in pods increases the degree of ripeness and, consequently, vigour; (3) seeds attain full vigour later than full viability; (4) the seed leachate conductivity method gives erroneous results in assessing the vigour of immature seeds; (5) full vigour of maturing seeds of various degrees of ripeness can be determined by simultaneous application of both biological (e.g. seedling growth analysis, VI) and biochemical (e.g. total dehydrogenase activity) methods.

  19. An assessment on epitope prediction methods for protozoa genomes

    Directory of Open Access Journals (Sweden)

    Resende Daniela M

    2012-11-01

    Full Text Available Abstract Background Epitope prediction using computational methods represents one of the most promising approaches to vaccine development. Reduction of time and cost, and the availability of completely sequenced genomes, are key points and highly motivating regarding the use of reverse vaccinology. Parasites of the genus Leishmania are widely spread and are the etiologic agents of leishmaniasis. Currently, there is no efficient vaccine against this pathogen and the drug treatment is highly toxic. The lack of sufficiently large datasets of experimentally validated parasite epitopes represents a serious limitation, especially for trypanosomatid genomes. In this work we highlight the predictive performances of several algorithms that were evaluated through the development of a MySQL database built with the purpose of: (a) evaluating individual algorithm prediction performances and their combination for CD8+ T cell epitopes, B-cell epitopes and subcellular localization, by means of AUC (Area Under Curve) performance and a threshold-dependent method that employs a confusion matrix; (b) integrating data from experimentally validated and in silico predicted epitopes; and (c) integrating the subcellular localization predictions and experimental data. NetCTL, NetMHC, BepiPred, BCPred12, and AAP12 algorithms were used for in silico epitope prediction and WoLF PSORT, Sigcleave and TargetP for in silico subcellular localization prediction against trypanosomatid genomes. Results A database-driven epitope prediction method was developed with built-in functions that were capable of: (a) removing experimental data redundancy; (b) parsing algorithm predictions and storing experimentally validated and predicted data; and (c) evaluating algorithm performances. Results show that a better performance is achieved when the combined prediction is considered.
This is particularly true for B cell epitope predictors, where the combined prediction of AAP12 and BCPred12 reached an AUC value
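    The two evaluation criteria named above, threshold-independent AUC and a threshold-dependent confusion matrix, can be sketched directly. The scores below are hypothetical predictor outputs, not data from the study; the AUC is computed via the Mann-Whitney interpretation (probability that a random positive outscores a random negative, ties counted as half).

```python
def auc(pos_scores, neg_scores):
    """Threshold-independent AUC via the Mann-Whitney U statistic."""
    wins = 0.0
    for p in pos_scores:
        for n in neg_scores:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos_scores) * len(neg_scores))

def confusion_matrix(pos_scores, neg_scores, threshold):
    """Threshold-dependent counts (TP, FN, FP, TN)."""
    tp = sum(s >= threshold for s in pos_scores)
    fn = len(pos_scores) - tp
    fp = sum(s >= threshold for s in neg_scores)
    tn = len(neg_scores) - fp
    return tp, fn, fp, tn

# Hypothetical scores for validated epitopes (pos) vs. non-epitopes (neg):
pos = [0.9, 0.8, 0.7, 0.4]
neg = [0.6, 0.3, 0.2, 0.1]
area = auc(pos, neg)                      # 15 of 16 pairs ranked correctly
cm = confusion_matrix(pos, neg, 0.5)      # (TP, FN, FP, TN) at one cutoff
```

Sweeping the threshold trades TP against FP, which is exactly what the single AUC number summarizes.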

  20. Neutral red cytotoxicity assays for assessing in vivo carbon nanotube ecotoxicity in mussels--Comparing microscope and microplate methods.

    Science.gov (United States)

    Miller, M A; Bankier, C; Al-Shaeri, M A M; Hartl, M G J

    2015-12-30

    The purpose of the present study was to compare two neutral red retention methods: the more established but very labour-intensive microscope method (NRR) and the more recently developed microplate method (NRU). The intention was to explore whether the sample volume throughput could be increased and potential operator bias avoided. Mussels Mytilus sp. were exposed in vivo to 50, 250 and 500 μg L(-1) single-walled (SWCNTs) or multi-walled carbon nanotubes (MWCNTs). Using the NRR method, SWCNTs and MWCNTs caused concentration-dependent decreases in neutral red retention time. However, a concentration-dependent decrease in optical density was not observed using the NRU method. We conclude that the NRU method is not sensitive enough to assess carbon nanotube ecotoxicity in vivo in environmentally relevant media, and recommend using the NRR method.

  1. Analysis of Radiative Radial Fin with Temperature-Dependent Thermal Conductivity Using Nonlinear Differential Transformation Methods

    Directory of Open Access Journals (Sweden)

    Mohsen Torabi

    2013-01-01

    Full Text Available A radiative radial fin with temperature-dependent thermal conductivity is analyzed. The calculations are carried out by using the differential transformation method (DTM), which is a seminumerical-analytical solution technique that can be applied to various types of differential equations, as well as the Boubaker polynomials expansion scheme (BPES). By using DTM, the nonlinear constrained governing equations are reduced to recurrence relations and the related boundary conditions are transformed into a set of algebraic equations. The principle of differential transformation is briefly introduced and then applied to the aforementioned equations. Solutions are subsequently obtained by a process of inverse transformation. The current results are then compared with previously obtained results using the variational iteration method (VIM), Adomian decomposition method (ADM), homotopy analysis method (HAM), and a numerical solution (NS) in order to verify the accuracy of the proposed method. The findings reveal that both BPES and DTM can achieve suitable results in predicting the solution of such problems. After these verifications, we analyze fin efficiency and the effects of some physically applicable parameters in this problem, such as the radiation-conduction fin parameter, radiation sink temperature, heat generation, and thermal conductivity parameters.
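    The differential transformation machinery can be seen on a toy problem. The recurrence below solves u' = u, u(0) = 1, whose transform U(k) = u^(k)(0)/k! obeys (k+1)U(k+1) = U(k), and inverts the transform as a truncated power series. The fin problem replaces this linear recurrence with nonlinear ones in which products become convolutions of transforms; this is an illustration of the principle, not the paper's equations.

```python
import math

K = 15
U = [0.0] * (K + 1)
U[0] = 1.0                      # transformed initial condition u(0) = 1
for k in range(K):
    U[k + 1] = U[k] / (k + 1)   # transformed ODE: (k+1) U(k+1) = U(k)

def u(t):
    """Inverse transform: truncated power series sum of U(k) * t**k."""
    return sum(U[k] * t**k for k in range(K + 1))

approx = u(1.0)                 # should approach e = exp(1)
```

Fifteen terms already reproduce exp(1) to near machine precision, which is why DTM solutions are typically reported as short series.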

  2. Evaluation of Current Assessment Methods in Engineering Entrepreneurship Education

    Science.gov (United States)

    Purzer, Senay; Fila, Nicholas; Nataraja, Kavin

    2016-01-01

    Quality assessment is an essential component of education that allows educators to support student learning and improve educational programs. The purpose of this study is to evaluate the current state of assessment in engineering entrepreneurship education. We identified 52 assessment instruments covered in 29 journal articles and conference…

  3. Landmine detection using ensemble discrete hidden Markov models with context dependent training methods

    Science.gov (United States)

    Hamdi, Anis; Missaoui, Oualid; Frigui, Hichem; Gader, Paul

    2010-04-01

    We propose a landmine detection algorithm that uses ensemble discrete hidden Markov models with context dependent training schemes. We hypothesize that the data are generated by K models. These different models reflect the fact that mines and clutter objects have different characteristics depending on the mine type, soil and weather conditions, and burial depth. Model identification is based on clustering in the log-likelihood space. First, one HMM is fit to each of the N individual sequences. For each fitted model, we evaluate the log-likelihood of each sequence. This results in an N x N log-likelihood distance matrix that is partitioned into K groups. In the second step, we learn the parameters of one discrete HMM per group. We propose using and optimizing various training approaches for the different K groups depending on their size and homogeneity. In particular, we investigate the maximum likelihood and the MCE-based discriminative training approaches. Results on large and diverse Ground Penetrating Radar data collections show that the proposed method can identify meaningful and coherent HMM models that describe different properties of the data. Each HMM models a group of alarm signatures that share common attributes such as clutter, mine type, and burial depth. Our initial experiments have also indicated that the proposed mixture model outperforms the baseline HMM that uses one model for the mine and one model for the background.
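    The N x N log-likelihood matrix at the heart of the model-identification step can be sketched with the scaled forward algorithm. In the paper each row's model would come from fitting an HMM to its own sequence (e.g. by Baum-Welch); here hand-specified toy models stand in for those fits, and all parameter values are assumptions of this sketch.

```python
import numpy as np

def forward_loglik(obs, A, B, pi):
    """Log-likelihood of a discrete observation sequence under an HMM
    (scaled forward algorithm; A = transitions, B = emissions as a
    states x symbols matrix, pi = initial state distribution)."""
    alpha = pi * B[:, obs[0]]
    c = alpha.sum()
    loglik = np.log(c)
    alpha = alpha / c
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
        c = alpha.sum()
        loglik += np.log(c)
        alpha = alpha / c
    return loglik

# Three alarm signatures quantized to two symbols, and one toy "fitted"
# model per signature (stand-ins for per-sequence Baum-Welch fits):
seqs = [[0, 0, 1, 0], [1, 1, 0, 1], [0, 1, 0, 0]]
A = np.array([[0.7, 0.3], [0.4, 0.6]])
pi = np.array([0.5, 0.5])
Bs = [np.array([[0.9, 0.1], [0.6, 0.4]]),
      np.array([[0.2, 0.8], [0.4, 0.6]]),
      np.array([[0.8, 0.2], [0.5, 0.5]])]

# N x N log-likelihood matrix: entry (i, j) = log P(sequence j | model i).
D = np.array([[forward_loglik(s, A, Bs[i], pi) for s in seqs]
              for i in range(3)])
```

Partitioning this matrix, for instance by hierarchical clustering in log-likelihood space, yields the K groups for which one HMM each is then retrained.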

  4. Charge transport calculations of organic semiconductors by the time-dependent wave-packet diffusion method

    Science.gov (United States)

    Ishii, Hiroyuki; Kobayashi, Nobuhiko; Hirose, Kenji

    2012-02-01

    Organic materials form crystals through relatively weak van der Waals attraction between molecules, and thus differ fundamentally from covalently bonded semiconductors. Carriers in organic semiconductors induce drastic lattice deformation, which is called the polaron state. The polaron effect on transport is a serious problem, and exactly what conduction mechanism applies to organic semiconductors has not been established. Therefore, we have investigated the transport properties using the Time-Dependent Wave-Packet Diffusion (TD-WPD) method [1]. To consider the polaron effect on transport, the methodology combines wave-packet dynamics based on quantum mechanics with molecular dynamics. As a result, we can describe the electron motion modified by (electron-phonon mediated) time-dependent structural change. We investigate the transport property from an atomistic viewpoint and evaluate the mobility of organic semiconductors. We clarify the temperature dependence of the mobility, from thermally activated behavior to power-law behavior. I will talk about these results in my presentation. [1] H. Ishii, N. Kobayashi, K. Hirose, Phys. Rev. B 82, 085435 (2010).

  5. Vulnerability assessment of skiing-dependent businesses to the effects of climate change in Banff and Jasper National Parks, Canada

    Science.gov (United States)

    Reynolds, David Michael

    This qualitative study examines the potential positive and negative socio-economic impacts that may emerge from the long-term effects of climate change on skiing-dependent businesses in Banff and Jasper National Parks, Canada. My goal was to determine whether skiing-related tourism in the parks in the 2020s and 2050s is more or less socio-economically vulnerable to the effects of climate change on snow cover, temperatures and ski season length at ski resorts in the parks. My study explored the level of awareness and personal perceptions of 60 skiing-dependent business managers about how the impact of climate change on ski resorts may influence the future socio-economics of ski tourism businesses. I employed a vulnerability assessment approach and adopted some elements of grounded theory. My primary data sources are interviews with managers and the outcome of a geographical factors index (GFI). Supporting methods include an analysis and interpretation of climate model data and an interpretation of the economic analysis of skiing in the parks. The interview data were sorted and coded to establish concepts and findings by interview question, while the GFI model rated and ranked 24 regional ski resorts in the Canadian Cordillera. The findings answered the research questions and helped me draw conclusions about the future socio-economic vulnerability of skiing-dependent businesses in the parks. The interviews revealed that managers are not well informed about climate change and have not seen any urgency to consider its effects on business. The GFI revealed that the ski resorts in the parks ranked in the top ten of 24 ski resorts in the Cordillera based on 14 common geographical factors. The economic reports suggest skiing is the foundation of the winter economy in the parks and any impact on skiing would directly impact other skiing-dependent businesses. Research indicates that the effects of climate change may have less economic impact on skiing-dependent

  6. A robust moving mesh method for spectral collocation solutions of time-dependent partial differential equations

    Science.gov (United States)

    Subich, Christopher J.

    2015-08-01

    This work extends the machinery of the moving mesh partial differential equation (MMPDE) method to the spectral collocation discretization of time-dependent partial differential equations. Unlike previous approaches which bootstrap the moving grid from a lower-order, finite-difference discretization, this work uses a consistent spectral collocation discretization for both the grid movement problem and the underlying, physical partial differential equation. Additionally, this work develops an error monitor function based on filtering in the spectral domain, which concentrates grid points in areas of locally poor resolution without relying on an assumption of locally steep gradients. This makes the MMPDE method more robust in the presence of rarefaction waves which feature rapid change in higher-order derivatives.

  7. An implicit fast Fourier transform method for integration of the time dependent Schrodinger equation

    Energy Technology Data Exchange (ETDEWEB)

    Riley, M.E. [Sandia National Labs., Albuquerque, NM (United States). Laser, Optics, and Remote Sensing Dept.; Ritchie, A.B. [Lawrence Livermore National Lab., CA (United States)

    1997-12-31

    One finds that the conventional exponentiated split operator procedure is subject to difficulties when solving the time-dependent Schrodinger equation for Coulombic systems. By rearranging the kinetic and potential energy terms in the temporal propagator of the finite difference equations, one can find a propagation algorithm for three dimensions that looks much like the Crank-Nicolson and alternating direction implicit methods for one- and two-space-dimensional partial differential equations. The authors report investigations of this novel implicit split operator procedure. The results look promising for a purely numerical approach to certain electron quantum mechanical problems. A charge exchange calculation is presented as an example of the power of the method.
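
    For orientation, the conventional exponentiated split-operator step that the abstract contrasts with the implicit scheme can be sketched as follows. This is a generic illustration (the grid, harmonic potential and time step are arbitrary choices), not the authors' implicit algorithm:

    ```python
    import numpy as np

    def split_operator_step(psi, V, dx, dt, hbar=1.0, m=1.0):
        """One Strang-split step: half kick in x, full drift in k, half kick in x."""
        n = psi.size
        k = 2.0 * np.pi * np.fft.fftfreq(n, d=dx)        # momentum grid
        half_kick = np.exp(-0.5j * V * dt / hbar)        # exp(-i V dt / 2 hbar)
        drift = np.exp(-0.5j * hbar * k**2 * dt / m)     # exp(-i T dt / hbar) in k-space
        psi = half_kick * psi
        psi = np.fft.ifft(drift * np.fft.fft(psi))
        return half_kick * psi

    # usage: propagate a Gaussian packet in a harmonic well; the unitary
    # steps should preserve the wavefunction norm to round-off error
    x = np.linspace(-10, 10, 512, endpoint=False)
    dx = x[1] - x[0]
    psi = np.exp(-x**2) / np.sqrt(np.sum(np.exp(-2 * x**2)) * dx)
    V = 0.5 * x**2
    for _ in range(100):
        psi = split_operator_step(psi, V, dx, dt=0.01)
    norm = np.sum(np.abs(psi)**2) * dx
    ```

    For a Coulomb potential the singularity at the origin is what makes this explicit scheme problematic, which motivates the implicit rearrangement the abstract describes.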

  8. On methods for assessing water-resource risks and vulnerabilities

    Science.gov (United States)

    Gleick, Peter H.

    2015-11-01

    Because of the critical role that freshwater plays in maintaining ecosystem health and supporting human development through agricultural and industrial production, there have been numerous efforts over the past few decades to develop indicators and indices of water vulnerability. Each of these efforts has tried to identify key factors that both offer insights into water-related risks and suggest strategies that might be useful for reducing those risks. These kinds of assessments have serious limitations associated with data, the complexity of water challenges, and the changing nature of climatic and hydrologic variables. This new letter by Padowski et al (2015 Environ. Res. Lett. 10 104014) adds to the field by broadening the kinds of measures that should be integrated into such tools, especially in the area of institutional characteristics, and by analyzing them in a way that provides new insights into the similarities and differences in water risks facing different countries. Much more, however, can and should be done with new data and methods to improve our understanding of water challenges.

  9. Determination of waterscape beauties through visual quality assessment method.

    Science.gov (United States)

    Bulut, Zohre; Yilmaz, Hasan

    2009-07-01

    Besides being an indispensable element of life, water ranks among the most important landscape elements with visual reserve value in both natural and cultural environments. The aim of this study was to offer suggestions for the use of waterscapes in landscape design and planning by determining which water types bear the highest visual reserve value among different types of waterscapes. The visual quality assessment method was used in this study. One hundred and twenty-eight university students ranked six waterscapes in a visual quality survey. The results showed that urban waterscape scenery [visual quality point (VQP) = 6.0391] was the most preferred category, whereas river scenery (VQP = 3.5547) was the least preferred. The second most preferred waterscape was waterfall (in rural landscape) scenery (VQP = 5.8594) and the third was standing water scenery (SWS; VQP = 5.3672). The relationships between landscape parameters and visual quality indicated that the vividness and fascination parameters had a significant relation with preference. Some suggestions were made regarding the use of the visual value of waterscapes in planning and designing the landscape.

  10. Assessment of breast cancer tumour size using six different methods

    Energy Technology Data Exchange (ETDEWEB)

    Meier-Meitinger, Martina; Uder, Michael; Schulz-Wendtland, Ruediger; Adamietz, Boris [Erlangen University Hospital, Institute of Diagnostic Radiology, Erlangen (Germany); Haeberle, Lothar; Fasching, Peter A.; Bani, Mayada R.; Heusinger, Katharina; Beckmann, Matthias W. [Erlangen University Hospital, University Breast Center, Department of Gynecology and Obstetrics, Erlangen (Germany); Wachter, David [Erlangen University Hospital, Institute of Pathology, Erlangen (Germany)

    2011-06-15

    Tumour size estimates using mammography (MG), conventional ultrasound (US), compound imaging (CI) and real-time elastography (RTE) were compared with histopathological specimen sizes. The largest diameters of 97 malignant breast lesions were measured. Two US and CI measurements were made: US1/CI1 (hypoechoic nucleus only) and US2/CI2 (hypoechoic nucleus plus hyperechoic halo). Measurements were compared with histopathological tumour sizes using linear regression and Bland-Altman plots. Size prediction was best with ultrasound (US/CI/RTE: R² 0.31-0.36); mammography was poorer (R² = 0.19). The most accurate method was US2, while US1 and CI1 were poorest. Bland-Altman plots showed better size estimation with US2, CI2 and RTE, with low variation, while mammography showed greatest variability. Smaller tumours were better assessed than larger ones. CI2 and US2 performed best for ductal tumours and RTE for lobular cancers. Tumour size prediction accuracy did not correlate significantly with breast density, but on MG tumours were more difficult to detect in high-density tissue. The size of ductal tumours is best predicted with US2 and CI2, while for lobular cancers RTE is best. Hyperechoic tumour surroundings should be included in US and CI measurements and RTE used as an additional technique in the clinical staging process. (orig.)
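
    The Bland-Altman comparison used in this record can be sketched with a minimal helper: the bias is the mean of the paired differences and the 95% limits of agreement are bias ± 1.96 standard deviations. The measurement arrays below are made-up illustrative numbers, not study data:

    ```python
    import numpy as np

    def bland_altman(a, b):
        """Return bias (mean difference) and 95% limits of agreement."""
        a, b = np.asarray(a, float), np.asarray(b, float)
        diff = a - b
        bias = diff.mean()
        sd = diff.std(ddof=1)             # sample standard deviation
        return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

    # usage: hypothetical ultrasound vs histopathology diameters in mm
    us = [12.0, 18.5, 22.0, 9.5, 30.0]
    histo = [13.0, 19.0, 21.0, 10.5, 31.5]
    bias, (lo, hi) = bland_altman(us, histo)
    ```

    A method with narrow limits of agreement around a bias near zero is the "low variation" behaviour the abstract attributes to US2, CI2 and RTE.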

  11. Estimated work ability in warm outdoor environments depends on the chosen heat stress assessment metric

    Science.gov (United States)

    Bröde, Peter; Fiala, Dusan; Lemke, Bruno; Kjellstrom, Tord

    2017-04-01

    With a view to occupational effects of climate change, we performed a simulation study on the influence of different heat stress assessment metrics on estimated workability (WA) of labour in warm outdoor environments. Whole-day shifts with varying workloads were simulated using as input meteorological records for the hottest month from four cities with prevailing hot (Dallas, New Delhi) or warm-humid conditions (Managua, Osaka), respectively. In addition, we considered the effects of adaptive strategies like shielding against solar radiation and different work-rest schedules assuming an acclimated person wearing light work clothes (0.6 clo). We assessed WA according to Wet Bulb Globe Temperature (WBGT) by means of an empirical relation of worker performance from field studies (Hothaps), and as allowed work hours using safety threshold limits proposed by the corresponding standards. Using the physiological models Predicted Heat Strain (PHS) and Universal Thermal Climate Index (UTCI)-Fiala, we calculated WA as the percentage of working hours with body core temperature and cumulated sweat loss below standard limits (38 °C and 7.5% of body weight, respectively) recommended by ISO 7933 and below conservative (38 °C; 3%) and liberal (38.2 °C; 7.5%) limits in comparison. ANOVA results showed that the different metrics, workload, time of day and climate type determined the largest part of WA variance. WBGT-based metrics were highly correlated and indicated slightly more constrained WA for moderate workload, but were less restrictive with high workload and for afternoon work hours compared to PHS and UTCI-Fiala. Though PHS showed unrealistic dynamic responses to rest from work compared to UTCI-Fiala, differences in WA assessed by the physiological models largely depended on the applied limit criteria. In conclusion, our study showed that the choice of the heat stress assessment metric impacts notably on the estimated WA. Whereas PHS and UTCI-Fiala can account for
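
    As an aside, the outdoor WBGT index underlying several of these metrics is a fixed weighting of three measured temperatures, in the form given by ISO 7243 for work with a solar load. The input values below are illustrative, not data from the study:

    ```python
    def wbgt_outdoor(t_nwb, t_globe, t_air):
        """Wet Bulb Globe Temperature (°C) for outdoor work with solar load,
        per the ISO 7243 weighting of natural wet-bulb, globe and air temperature."""
        return 0.7 * t_nwb + 0.2 * t_globe + 0.1 * t_air

    # usage: a hot, sunny afternoon (illustrative temperatures in °C)
    wbgt = wbgt_outdoor(t_nwb=25.0, t_globe=45.0, t_air=33.0)
    ```

    The empirical workability relations and safety threshold limits the abstract mentions are then applied to this index, which is why WBGT-based metrics can diverge from the physiological models PHS and UTCI-Fiala.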

  12. Reduced conductivity dependence method for increase of dipole localization accuracy in the EEG inverse problem.

    Science.gov (United States)

    Yitembe, Bertrand Russel; Crevecoeur, Guillaume; Van Keer, Roger; Dupre, Luc

    2011-05-01

    The EEG is a neurological diagnostic tool with high temporal resolution. However, when solving the EEG inverse problem, its localization accuracy is limited by measurement noise and by uncertainty in the conductivity values used in the forward model evaluations. This paper proposes the reduced conductivity dependence (RCD) method for decreasing the localization error in EEG source analysis by limiting the propagation of the uncertain conductivity values into the solutions of the inverse problem. We redefine the traditional EEG cost function and, in contrast to previous approaches, introduce a selection procedure for the EEG potentials. The selected potentials are affected as little as possible by the conductivity uncertainties when solving the inverse problem. We validate the methodology on the widely used three-shell spherical head model with a single electrical dipole and with multiple dipoles as the source model. The proposed RCD method enhances the source localization accuracy by a factor ranging between 2 and 4, depending on the dipole location and the measurement noise. © 2011 IEEE

  13. EVALUATION OF THE TIME DEPENDENT FAILURE ASSESSMENT CURVES FOR 10CrMo910 AND 316 SS AT 550℃

    Institute of Scientific and Technical Information of China (English)

    F.Z.Xuan; S.D.Tu; Z.D.Wang; C.W.Ma

    2004-01-01

    10CrMo910 and 316 stainless steel are widely adopted in high-temperature structures of power-generation plants, chemical processing plants and petroleum refineries. In this work, tensile creep tests totalling 10,000 hours were conducted on 16 specimens of these two materials at 550℃. On the basis of the experimental results, the isochronous stress-strain curves and time-dependent failure assessment curves of the two materials were obtained. Finally, formulae for the time-dependent failure assessment curves of 10CrMo910 and 316 stainless steel corresponding to long-term creep, which can be utilized in high-temperature defect assessment, were established. The procedure for defining the time-dependent failure assessment curves is also presented.

  14. HDMR methods to assess reliability in slope stability analyses

    Science.gov (United States)

    Kozubal, Janusz; Pula, Wojciech; Vessia, Giovanna

    2014-05-01

    -soil masses) resulting in sliding mechanisms have been investigated in this study. The reliability index values drawn from the HDMR method have been compared with conventional approaches such as neural networks: the efficiency of HDMR is shown in the case studied. References: Chowdhury R., Rao B.N. and Prasad A.M. 2009. High-dimensional model representation for structural reliability analysis. Commun. Numer. Meth. Engng, 25: 301-337. Chowdhury R. and Rao B. 2010. Probabilistic Stability Assessment of Slopes Using High Dimensional Model Representation. Computers and Geotechnics, 37: 876-884.

  15. Assessment of pulmonary dynamics in normal newborns: a pneumotachographic method.

    Science.gov (United States)

    Estol, P; Píriz, H; Pintos, L; Nieto, F; Simini, F

    1988-01-01

    A pneumotachographic method for assessment of pulmonary dynamics in critically ill newborns in an intensive care setting was developed in our laboratory. Before the results obtained with this method could be applied, the normal range of values was determined in 48 normal term and preterm newborns. Their body weight ranged between 1200 and 4100 g, and postnatal ages between 24 hours and 21 days. In three infants, two determinations were performed after an interval of 7 days. The studies were performed with a pneumotachograph applied to the upper airway by means of an inflatable face mask or latex nasal prongs. The air flow signal was electronically integrated with respect to time to produce a volume signal. Airway pressure was determined proximal to the pneumotachograph. Esophageal pressure was determined with a water-filled catheter placed in the lower third of the esophagus. Tidal volume (VT), minute ventilation (V), dynamic compliance (Cdyn), total pulmonary resistance (R), total pulmonary work (Wt), elastic work (We), and flow-resistive work (Wv) were determined. A significant linear correlation was found between Cdyn and body weight (r = 0.50, p less than 0.01), whereas no significant correlation was found between body weight and VT, V or R. Values for VT, V and Cdyn were corrected for body weight, and the means (X), standard deviations (SD) and 10th and 90th percentiles are shown in Table III, as are X, SD and percentiles for R. Wt, We and Wv were corrected for V, with X, SD and percentiles likewise shown in Table III. Values of VT/kg, Cdyn/kg and R are similar to those found by other authors with pneumotachography and plethysmography. The V/kg values obtained by us were higher than those reported by other authors, which, together with the lack of correlation of VT and V with body weight, questions the reliability of the V values in our study. This could be explained by: 1) excessive increase in dead space in cases in which a face mask was used; 2) nociceptive stimulus
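
    In essence, the dynamic compliance reported here is the tidal volume divided by the transpulmonary pressure swing between zero-flow points. A minimal sketch with illustrative newborn-scale numbers (not study data):

    ```python
    def dynamic_compliance(tidal_volume_ml, delta_pressure_cmH2O):
        """Cdyn in mL/cmH2O: tidal volume over the transpulmonary
        pressure change between the zero-flow points of the breath."""
        return tidal_volume_ml / delta_pressure_cmH2O

    # usage: hypothetical 3 kg newborn, VT 18 mL, pressure swing 6 cmH2O
    cdyn = dynamic_compliance(tidal_volume_ml=18.0, delta_pressure_cmH2O=6.0)
    ```

    Dividing the result by body weight gives the weight-corrected Cdyn/kg values the abstract tabulates.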

  16. Evaluating simplified methods for liquefaction assessment for loss estimation

    Science.gov (United States)

    Kongar, Indranil; Rossetto, Tiziana; Giovinazzi, Sonia

    2017-06-01

    Currently, some catastrophe models used by the insurance industry account for liquefaction by applying a simple factor to shaking-induced losses. The factor is based only on local liquefaction susceptibility and this highlights the need for a more sophisticated approach to incorporating the effects of liquefaction in loss models. This study compares 11 unique models, each based on one of three principal simplified liquefaction assessment methods: liquefaction potential index (LPI) calculated from shear-wave velocity, the HAZUS software method and a method created specifically to make use of USGS remote sensing data. Data from the September 2010 Darfield and February 2011 Christchurch earthquakes in New Zealand are used to compare observed liquefaction occurrences to forecasts from these models using binary classification performance measures. The analysis shows that the best-performing model is the LPI calculated using known shear-wave velocity profiles, which correctly forecasts 78 % of sites where liquefaction occurred and 80 % of sites where liquefaction did not occur, when the threshold is set at 7. However, these data may not always be available to insurers. The next best model is also based on LPI but uses shear-wave velocity profiles simulated from the combination of USGS VS30 data and empirical functions that relate VS30 to average shear-wave velocities at shallower depths. This model correctly forecasts 58 % of sites where liquefaction occurred and 84 % of sites where liquefaction did not occur, when the threshold is set at 4. These scores increase to 78 and 86 %, respectively, when forecasts are based on liquefaction probabilities that are empirically related to the same values of LPI. This model is potentially more useful for insurance since the input data are publicly available. HAZUS models, which are commonly used in studies where no local model is available, perform poorly and incorrectly forecast 87 % of sites where liquefaction occurred, even at
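
    The threshold-based forecast scoring described here can be sketched as follows: a site is forecast to liquefy when its LPI exceeds a threshold, and forecasts are scored against observations as the percentage of liquefied and non-liquefied sites correctly identified. The LPI values, observations and threshold below are synthetic stand-ins, not data from the study:

    ```python
    def score_forecasts(lpi_values, observed, threshold):
        """Return (true positive rate, true negative rate) of liquefaction forecasts."""
        tp = fn = tn = fp = 0
        for lpi, liquefied in zip(lpi_values, observed):
            forecast = lpi > threshold
            if liquefied:
                tp += forecast
                fn += not forecast
            else:
                tn += not forecast
                fp += forecast
        return tp / (tp + fn), tn / (tn + fp)

    # usage: LPI at six hypothetical sites with observed liquefaction flags
    tpr, tnr = score_forecasts([2, 9, 8, 3, 12, 5],
                               [False, True, False, False, True, True],
                               threshold=7)
    ```

    Raising the threshold trades true positives for true negatives, which is why the abstract quotes a separate tuned threshold for each model.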

  17. Assessing Coupled Protein Folding and Binding Through Temperature-Dependent Isothermal Titration Calorimetry.

    Science.gov (United States)

    Sahu, Debashish; Bastidas, Monique; Lawrence, Chad W; Noid, William G; Showalter, Scott A

    2016-01-01

    Broad interest in the thermodynamic driving forces of coupled macromolecular folding and binding is motivated by the prevalence of disorder-to-order transitions observed when intrinsically disordered proteins (IDPs) bind to their partners. Isothermal titration calorimetry (ITC) is one of the few methods available for completely evaluating the thermodynamic parameters describing a protein-ligand binding event. Significantly, when the effective ΔH° for the coupled folding and binding process is determined by ITC in a temperature series, the constant-pressure heat capacity change (ΔCp) associated with these coupled equilibria is experimentally accessible, offering a unique opportunity to investigate the driving forces behind them. Notably, each of these molecular-scale events is often accompanied by strongly temperature-dependent enthalpy changes, even over the narrow temperature range experimentally accessible for biomolecules, making single temperature determinations of ΔH° less informative than typically assumed. Here, we will document the procedures we have adopted in our laboratory for designing, executing, and globally analyzing temperature-dependent ITC studies of coupled folding and binding in IDP interactions. As a biologically significant example, our recent evaluation of temperature-dependent interactions between the disordered tail of FCP1 and the winged-helix domain from Rap74 will be presented. Emphasis will be placed on the use of publicly available analysis programs written in MATLAB that facilitate quantification of the thermodynamic forces governing IDP interactions. Although motivated from the perspective of IDPs, the experimental design principles and data fitting procedures presented here are general to the study of most noncooperative ligand binding equilibria. © 2016 Elsevier Inc. All rights reserved.
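
    The key quantity in such a temperature series is ΔCp, obtained as the slope of the observed binding enthalpy versus temperature. A minimal sketch with synthetic, perfectly linear numbers (not measured values):

    ```python
    import numpy as np

    # illustrative ITC results: ΔH° (kcal/mol) measured at five temperatures (K)
    temps_K = np.array([283.0, 288.0, 293.0, 298.0, 303.0])
    dH_kcal = np.array([-2.0, -4.5, -7.0, -9.5, -12.0])

    # linear model: ΔH°(T) ≈ ΔH°(T0) + ΔCp · (T − T0)
    dCp, intercept = np.polyfit(temps_K, dH_kcal, 1)   # slope in kcal/mol/K
    ```

    A large negative ΔCp of this kind is the classic signature of burial of hydrophobic surface, which is why the temperature series is more informative than a single-temperature ΔH° determination.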

  18. Informal caregivers and detection of delirium in postacute care: a correlational study of the confusion assessment method (CAM), confusion assessment method-family assessment method (CAM-FAM) and DSM-IV criteria.

    Science.gov (United States)

    Flanagan, Nina M; Spencer, Gale

    2016-09-01

    Delirium is a common, serious and potentially life-threatening syndrome affecting older adults. This syndrome continues to be under-recognised and undertreated by healthcare professionals across all care settings. Older adults who develop delirium have poorer outcomes, higher mortality and higher care costs. The purposes of this study were to correlate the confusion assessment method-family assessment method and the confusion assessment method in the detection of delirium in postacute care, to correlate the confusion assessment method-family assessment method and the diagnostic and statistical manual of mental disorders text revision criteria in the detection of delirium in postacute care, to determine the prevalence of delirium in postacute care elders, and to describe the relationship between level of cognitive impairment and delirium in the postacute care setting. Implications for practice: delirium is disturbing for patients and caregivers. Frequently, family members want to provide information about their loved one. The use of the CAM-FAM and CAM can give a more definitive determination of baseline status. Frequent observations using both instruments may lead to better recognition of delirium and implementation of interventions to prevent lasting sequelae. Descriptive statistics determined the strengths of the relationships between the confusion assessment method, confusion assessment method-family assessment method, Mini-Cog and diagnostic and statistical manual of mental disorders text revision criteria in the detection of delirium in the postacute care setting. The prevalence of delirium in this study was 35%. The confusion assessment method-family assessment method correlates highly with the confusion assessment method and the diagnostic and statistical manual of mental disorders text revision criteria for detecting delirium in older adults in the postacute care setting. Persons with cognitive impairment are more likely to develop delirium. Family members recognise symptoms of delirium when

  19. An advective-spectral-mixed method for time-dependent many-body Wigner simulations

    CERN Document Server

    Xiong, Yunfeng; Shao, Sihong

    2016-01-01

    As a phase space language for quantum mechanics, the Wigner function approach bears a close analogy to classical mechanics and has been drawing growing attention, especially in simulating quantum many-body systems. However, deterministic numerical solutions have been almost exclusively confined to one-dimensional one-body systems, and few results are reported even for one-dimensional two-body problems. This paper serves as the first attempt to solve the time-dependent many-body Wigner equation through a grid-based advective-spectral-mixed method. The main feature of the method is to resolve the linear advection in $(\\bm{x},t)$-space by an explicit three-step characteristic scheme coupled with piecewise cubic spline interpolation, while the Chebyshev spectral element method in $\\bm k$-space is adopted for accurate calculation of the nonlocal pseudo-differential term. Not only is the time step of the resulting method unrestricted by the usual CFL condition, so that a large time step is allowed, but also th...

  20. Statistical mechanics-based method to extract atomic distance-dependent potentials from protein structures.

    Science.gov (United States)

    Huang, Sheng-You; Zou, Xiaoqin

    2011-09-01

    In this study, we have developed a statistical mechanics-based iterative method to extract statistical atomic interaction potentials from known, nonredundant protein structures. Our method circumvents the long-standing reference state problem in deriving traditional knowledge-based scoring functions, by using rapid iterations through a physical, global convergence function. The rapid convergence of this physics-based method, unlike other parameter optimization methods, warrants the feasibility of deriving distance-dependent, all-atom statistical potentials to keep the scoring accuracy. The derived potentials, referred to as ITScore/Pro, have been validated using three diverse benchmarks: the high-resolution decoy set, the AMBER benchmark decoy set, and the CASP8 decoy set. Significant improvement in performance has been achieved. Finally, comparisons between the potentials of our model and potentials of a knowledge-based scoring function with a randomized reference state have revealed the reason for the better performance of our scoring function, which could provide useful insight into the development of other physical scoring functions. The potentials developed in this study are generally applicable for structural selection in protein structure prediction.
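
    For orientation, the textbook construction behind such statistical potentials is Boltzmann inversion of an observed atom-pair distance distribution against a reference state. The sketch below shows that generic idea with made-up bin probabilities; it is not the iterative ITScore/Pro scheme itself, which is precisely designed to avoid committing to a fixed reference state:

    ```python
    import math

    def knowledge_based_potential(p_observed, p_reference, kT=0.593):
        """u(r) = -kT · ln(p_obs(r) / p_ref(r)) per distance bin,
        with kT in kcal/mol at room temperature."""
        return [-kT * math.log(po / pr) for po, pr in zip(p_observed, p_reference)]

    # usage: three distance bins where contacts are enriched, neutral, depleted
    # relative to the reference distribution
    u = knowledge_based_potential([0.2, 0.1, 0.05], [0.1, 0.1, 0.1])
    ```

    Enriched bins yield favorable (negative) energies and depleted bins unfavorable ones; the abstract's point is that the quality of the reference distribution dominates the accuracy of the resulting scoring function.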

  1. Delineating species with DNA barcodes: a case of taxon dependent method performance in moths.

    Directory of Open Access Journals (Sweden)

    Mari Kekkonen

    The accelerating loss of biodiversity has created a need for more effective ways to discover species. Novel algorithmic approaches for analyzing sequence data combined with rapidly expanding DNA barcode libraries provide a potential solution. While several analytical methods are available for the delineation of operational taxonomic units (OTUs), few studies have compared their performance. This study compares the performance of one morphology-based and four DNA-based (BIN, parsimony networks, ABGD, GMYC) methods on two groups of gelechioid moths. It examines 92 species of Finnish Gelechiinae and 103 species of Australian Elachistinae which were delineated by traditional taxonomy. The results reveal a striking difference in performance between the two taxa with all four DNA-based methods. OTU counts in the Elachistinae showed a wider range and a relatively low (ca. 65%) OTU match with reference species, while OTU counts were more congruent and performance was higher (ca. 90%) in the Gelechiinae. Performance rose when only monophyletic species were compared, but the taxon-dependence remained. None of the DNA-based methods produced a correct match with non-monophyletic species, but singletons were handled well. A simulated test of morphospecies grouping performed very poorly in revealing taxon diversity in these small, dull-colored moths. Despite the strong performance of analyses based on DNA barcodes, species delineated using single-locus mtDNA data are best viewed as OTUs that require validation by subsequent integrative taxonomic work.

  2. Diversity of endophytic bacteria of Dendrobium officinale based on culture-dependent and culture-independent methods

    Directory of Open Access Journals (Sweden)

    Cong Pei

    2017-01-01

    Culture-dependent and culture-independent methods were compared and evaluated in the study of the endophytic diversity of Dendrobium officinale. The culture-independent methods consisted of polymerase chain reaction-denaturing gradient gel electrophoresis (PCR-DGGE) and metagenome methods. According to the results, differences were found between the three methods. Three phyla, namely Firmicutes, Proteobacteria, and Actinobacteria, were detected using the culture-dependent method, and two phyla, Firmicutes and Proteobacteria, were detected by the DGGE method. Using the metagenome method, four major phyla were determined, including Proteobacteria (76.54%), Actinobacteria (18.56%), Firmicutes (2.27%), and Bacteroidetes (1.56%). A distinct trend emerged at the genus level between each method and the number of genera it detected: 449 genera were obtained from the metagenome method and 16 from the DGGE method, while only 7 genera were obtained through the culture-dependent method. By comparison, all the genera from the culture-dependent and DGGE methods were contained among the members determined using the metagenome method. Overall, culture-dependent methods are limited in ‘finding’ endophytic bacteria in plants. DGGE is an alternative for investigating primary diversity patterns; however, the metagenome method is still the best choice for determining the endophytic profile in plants. It is essential to use multiphasic approaches to study cultured and uncultured microbes.

  3. From eyeballing to statistical modelling : methods for assessment of occupational exposure

    NARCIS (Netherlands)

    Kromhout, H.

    1994-01-01

    In this thesis methods for assessment of occupational exposure are evaluated and developed. These methods range from subjective methods (qualitative and semiquantitative) to more objective quantitative methods based on actual measurement of personal exposure to chemical and physical

  4. ABOUT TRACK CIRCUIT CALCULATION METHOD DEPENDENT ON FERROMAGNET PROPERTIES IN CONDITIONS OF TRACTION CURRENT NOISE INFLUENCE

    Directory of Open Access Journals (Sweden)

    A. Yu. Zhuravlev

    2016-02-01

    Purpose. The work is intended to investigate the electromagnetic processes in the impedance bond in order to improve the noise immunity of track circuits (TC) for safe railway operation. Methodology. To achieve this purpose the methods of scientific analysis, mathematical modelling, experimental study and large-scale simulation were used. Findings. The work examined the interference affecting the normal performance of track circuits. A large share of track circuit damage is accounted for by failures in track circuit equipment. Track circuit equipment is connected directly to the track line, which is susceptible to traction current interference; this causes changes in its electrical characteristics and electromagnetic properties. Normal operability and performance of the main operating modes of the track circuit are determined by prior calculation of its performance and compilation of regulatory tables. The classical method for determining track circuit parameters was analysed. The classical calculation method assumes representation of individual sections of the electrical track circuit by quadripole networks with known coefficients, usually in the A-form. Determining the coefficients of linear circuit elements creates no metrological or mathematical difficulties. However, in circuits containing nonlinear ferromagnets (FM), obtaining the coefficients over the entire range of induction change in the cores is quite a difficult task, because the classical methods of idling (I) and short circuit (SC) are not acceptable. This leads to complicated methods for determining both the modules and the arguments of the quadripole network coefficients. Instead of the classical method, the work proposed a method for calculating the track circuit that depends on the nonlinear properties of ferromagnets. Originality. The article examines a new approach to the calculation of TC taking into account the losses in ferromagnets (FM), without determination of equivalent circuit quadripole

  5. Assessment of cognitive impairment in long-term oxygen therapy-dependent COPD patients.

    Science.gov (United States)

    Karamanli, Harun; Ilik, Faik; Kayhan, Fatih; Pazarli, Ahmet Cemal

    2015-01-01

    A number of studies have shown that COPD, particularly in its later and more severe stages, is associated with various cognitive deficits. Thus, the primary goal of the present study was to elucidate the extent of cognitive impairment in patients with long-term oxygen therapy-dependent (LTOTD) COPD. In addition, this study aimed to determine the effectiveness of two cognitive screening tests, the Mini-Mental State Examination (MMSE) and the Montreal Cognitive Assessment (MoCA), for COPD patients and the ability of oxygen therapy to mitigate COPD-related deficits in cognitive function. The present study enrolled 45 subjects: 24 nonuser and 21 regular-user LTOTD-COPD patients. All subjects had a similar grade of education, and there were no significant differences regarding age or sex. The MoCA (cutoff: therapy increased the risk of cognitive impairment (MoCA, P=0.007 and MMSE, P=0.014), and the MoCA and MMSE scores significantly correlated with the number of emergency admissions and the number of hospitalizations in the last year. In the present study, the nonuser LTOTD-COPD group exhibited a significant decrease in cognitive status compared with the regular-user LTOTD-COPD group. This suggests that the assessment of cognitive function in nonuser LTOTD-COPD patients and the use of protective strategies, such as continuous supplemental oxygen treatment, should be considered during the management of COPD in this population. In addition, the MoCA score was superior to the MMSE score for the determination of cognitive impairment in the nonuser LTOTD-COPD patients.

  6. Time-dependent Variation in Life Cycle Assessment of Microalgal Biorefinery Co-products

    Science.gov (United States)

    Montazeri, Mahdokht

    Microalgae can serve as a highly productive biological feedstock for fuels and chemicals. The lipid fraction of algal seeds has been the primary target of research for biofuel production. However, numerous assessments have found that valorization of co-products is essential to achieve economic and environmental goals. The relative proportion of co-products depends on the biomolecular composition of algae at the time of harvesting. In the present study the productivity of lipid, starch, and protein fractions were shown through growth experiments to vary widely with species, feeding regime, and harvesting time. Four algae species were cultivated under nitrogen-replete and -deplete conditions and analyzed at regular harvesting intervals. Dynamic growth results were then used for life cycle assessment using the U.S. Department of Energy's GREET model to determine optimal growth scenarios that minimize life cycle greenhouse gas (GHG) emissions, eutrophication, and cumulative energy demand (CED), while aiming for an energy return on investment (EROI) greater than unity. Per kg of biodiesel produced, C. sorokiniana in N-replete conditions harvested at 12 days was most favorable for GHG emissions and CED, despite having a lipid content of <20%. N. oculata under the same conditions had the lowest life cycle eutrophication impacts, driven by efficient nutrient cycling and valorization of microalgal protein and anaerobic digester residue co-products. The results indicate that growth cycle times that maximize a single fraction do not necessarily result in the most favorable environmental performance on a life cycle basis, underscoring the importance of designing biorefinery systems that simultaneously optimize for lipid and non-lipid fractions.

  7. Current Development in Elderly Comprehensive Assessment and Research Methods

    Directory of Open Access Journals (Sweden)

    Shantong Jiang

    2016-01-01

    Comprehensive geriatric assessment (CGA) is a core and essential part of comprehensive care for the aging population. CGA uses specific tools to summarize the status of elderly patients in several domains that may influence their general health and disease outcomes, including medical, physical, psychological, mental, nutritional, cognitive, social, economic, and environmental status. In this paper, we review the assessment tools used in elderly patients with chronic diseases, discussing both comprehensive instruments and single-domain tools used within individual dimensions of CGA. CGA provides substantial insight into the comprehensive management of elderly patients, and developing concise and effective assessment instruments will help bring CGA into wide use and increase its clinical value.

  8. Assessment of motivation for treatment in alcohol dependent patients who sought treatment at a specialized medical service

    Directory of Open Access Journals (Sweden)

    Oliveira Júnior Hercílio Pereira de

    2003-01-01

    INTRODUCTION: Motivation is deemed a critical component of interventions intended to change behaviors related to the use of alcohol and other drugs. Classifying patients into 'stages of change' can be a useful tool for organizing and improving treatment programs. METHODS: This study assessed the stages of change, using the URICA and SOCRATES scales, in patients who attended two different treatment programs for alcohol dependence at a specialized medical service. We analyzed the association between stages of change and demographic characteristics. After three months of treatment, patients were reassessed to evaluate their outcome. RESULTS: In the assessments using URICA, stages of change were associated with monthly income and age, and there was no evidence that patients moved across the stages of change. Using the SOCRATES scale, we found an association between stages of change and monthly income, and at reassessment there was significant movement across the stages of change. CONCLUSION: Patients who attend two different treatment programs may have different motivational profiles. The absence of movement congruent with the stages-of-change model suggests that patients may need more than three months to achieve significant changes in their motivation.

  9. Risk assessment of power systems models, methods, and applications

    CERN Document Server

    Li, Wenyuan

    2014-01-01

    Risk Assessment of Power Systems addresses the regulations and functions of risk assessment with regard to its relevance in system planning, maintenance, and asset management. Brimming with practical examples, this edition introduces the latest risk information on renewable resources, the smart grid, voltage stability assessment, and fuzzy risk evaluation. It is a comprehensive reference of a highly pertinent topic for engineers, managers, and upper-level students who seek examples of risk theory applications in the workplace.

  10. Application of the Sakurai-Sugiura projection method to core-excited-state calculation by time-dependent density functional theory.

    Science.gov (United States)

    Tsuchimochi, Takashi; Kobayashi, Masato; Nakata, Ayako; Imamura, Yutaka; Nakai, Hiromi

    2008-11-15

    The Sakurai-Sugiura projection (SS) method was implemented and numerically assessed for diagonalization of the Hamiltonian in time-dependent density functional theory (TDDFT). Because the SS method can restrict the computation to eigenvalues within a specified range, it is an efficient tool when only eigenvalues in a particular range are of interest. In this article, the SS method is applied to core-excited-state calculations, for which the eigenvalues lie within a particular range because they are unique to the atomic species in a molecule. A numerical assessment of the formaldehyde molecule by TDDFT with the core-valence Becke three-parameter exchange (B3) plus Lee-Yang-Parr (LYP) correlation (CV-B3LYP) functional demonstrates that the SS method can selectively obtain highly accurate eigenvalues and eigenvectors. The SS method is thus a new and powerful alternative for calculating core-excitation energies without high computational costs.
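    As a rough illustration of the contour-integral idea behind the SS method (not the authors' TDDFT implementation), the Hankel-moment variant can be sketched on a small toy matrix. The quadrature count, moment count, and test matrix below are assumptions chosen for the sketch:

    ```python
    import numpy as np

    def ss_hankel_eigs(A, center, radius, n_quad=64, n_moments=3, seed=0):
        """Sakurai-Sugiura (Hankel variant): approximate the eigenvalues of A
        lying inside the circle |z - center| = radius via contour moments of
        u^T (zI - A)^{-1} v for random probe vectors u, v."""
        n = A.shape[0]
        gen = np.random.default_rng(seed)
        u, v = gen.standard_normal(n), gen.standard_normal(n)
        theta = 2.0 * np.pi * (np.arange(n_quad) + 0.5) / n_quad
        mu = np.zeros(2 * n_moments, dtype=complex)
        for t in theta:
            zj = center + radius * np.exp(1j * t)
            y = np.linalg.solve(zj * np.eye(n) - A, v)    # (zI - A)^{-1} v
            g = u @ y
            for k in range(2 * n_moments):                # scaled moments
                mu[k] += np.exp(1j * (k + 1) * t) * g
        mu /= n_quad
        # Hankel pencil Hs x = lam H x, with lam = (eigenvalue - center)/radius
        H = np.array([[mu[i + j] for j in range(n_moments)]
                      for i in range(n_moments)])
        Hs = np.array([[mu[i + j + 1] for j in range(n_moments)]
                       for i in range(n_moments)])
        lam = np.linalg.eigvals(np.linalg.solve(H, Hs))
        return np.sort((center + radius * lam).real)

    # Toy check: eigenvalues of diag(1, 2, 3, 10) inside the circle |z - 2| = 1.5
    print(ss_hankel_eigs(np.diag([1.0, 2.0, 3.0, 10.0]), center=2.0, radius=1.5))
    ```

    On this toy problem the method recovers approximately 1, 2 and 3, while the eigenvalue 10, lying outside the contour, is filtered out; this is the "eigenvalues in a specified range" property the abstract exploits for core excitations.
    
    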

  11. Application of CFD methods for advanced site assessment and micrositing

    Energy Technology Data Exchange (ETDEWEB)

    Strack, M; Riedel, V.; Dutilleux, P. [DEWI German Wind Energy Inst., Wilhelmshaven (Germany)

    2006-07-01

    The DEWI Institute in Germany is in the process of testing a computational fluid dynamics (CFD) method for the site assessment of wind farms. This presentation provided details of flow model testing conducted at the institute and at two wind farms in Austria and Spain. Wind profile verification procedures were tested at the institute using a 130-metre mast, selected because it has several years of data that have been extensively evaluated and checked. The verification procedures were then evaluated at a wind farm in Austria, a complex site with steep slopes, large height differences and important terrain structures. Mast measurements at the farm ranged between 50 and 65 m, and sonic detection and ranging (SODAR) measurements were available at 4 different locations. Wind direction at the meteorological masts was determined to be 326.7 degrees and 331.0 degrees. A comparison with SODAR measurements showed considerable variation in energy yield. Flow simulation studies showed a mean deviation of 2.3 per cent, while calculations by the Wind Atlas Analysis and Application Program (WAsP) showed a deviation of 9.1 per cent. The investigation revealed very complex flow patterns at the site, which the flow simulation was able to reproduce. The results indicated that very high wind direction resolutions were required to achieve an accurate flow simulation, and that WAsP was not applicable for extrapolating the measurements to hub height at the wind turbines. CFD simulation results at a wind farm in Spain with complex terrain showed a percentage error between 0.4 per cent and 6 per cent, and a mean absolute error between 2.3 per cent and 2.5 per cent. A verification of turbulence intensity results showed an absolute percentage error between 0 and 7 per cent, and a mean absolute error between 1.8 and 2.2 per cent. The method allowed wind and turbulence fields to be simulated over

  12. Assessment of methods for mapping snow cover from MODIS

    Science.gov (United States)

    Rittger, Karl; Painter, Thomas H.; Dozier, Jeff

    2013-01-01

    Characterization of snow is critical for understanding Earth’s water and energy cycles. Maps of snow from MODIS have seen growing use in investigations of climate, hydrology, and glaciology, but the lack of rigorous validation of different snow mapping methods compromises these studies. We examine three widely used MODIS snow products: the “binary” (i.e., snow yes/no) global snow maps that were among the initial MODIS standard products; a more recent standard MODIS fractional snow product; and another fractional snow product, MODSCAG, based on spectral mixture analysis. We compare them to maps of snow obtained from Landsat ETM+ data, whose 30 m spatial resolution provides nearly 300 samples within a 500 m MODIS nadir pixel. The assessment uses 172 images spanning a range of snow and vegetation conditions, including the Colorado Rocky Mountains, the Upper Rio Grande, California’s Sierra Nevada, and the Nepal Himalaya. MOD10A1 binary and fractional fail to retrieve snow in the transitional periods during accumulation and melt while MODSCAG consistently maintains its retrieval ability during these periods. Averaged over all regions, the RMSE for MOD10A1 fractional is 0.23, whereas the MODSCAG RMSE is 0.10. MODSCAG performs the most consistently through accumulation, mid-winter and melt, with median differences ranging from -0.16 to 0.04 while differences for MOD10A1 fractional range from -0.34 to 0.35. MODSCAG maintains its performance over all land cover classes and throughout a larger range of land surface properties. Characterizing snow cover by spectral mixing is more accurate than empirical methods based on the normalized difference snow index, both for identifying where snow is and is not and for estimating the fractional snow cover within a sensor’s instantaneous field-of-view. Determining the fractional value is particularly important during spring and summer melt in mountainous terrain, where large variations in snow, vegetation and soil occur over
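    The two retrieval approaches contrasted in this abstract can be illustrated schematically. The sketch below computes an NDSI value and a two-endmember linear-unmixing snow fraction for a synthetic mixed pixel; the band reflectances are invented for illustration and do not correspond to actual MODIS products or to the MODSCAG algorithm:

    ```python
    import numpy as np

    def ndsi(green, swir):
        """Normalized difference snow index from green and SWIR reflectances."""
        return (green - swir) / (green + swir)

    def fractional_snow(pixel, snow_em, other_em):
        """Two-endmember linear unmixing: least-squares snow fraction in [0, 1].
        All arguments are reflectance vectors over the same spectral bands."""
        d = snow_em - other_em
        f = float(d @ (pixel - other_em)) / float(d @ d)
        return min(1.0, max(0.0, f))

    # Invented reflectances for [green, NIR, SWIR] bands -- illustrative only
    snow = np.array([0.90, 0.80, 0.10])
    soil = np.array([0.15, 0.25, 0.30])
    pixel = 0.6 * snow + 0.4 * soil           # a pixel that is 60% snow-covered

    print(round(ndsi(pixel[0], pixel[2]), 3))             # → 0.538
    print(round(fractional_snow(pixel, snow, soil), 3))   # → 0.6
    ```

    The empirical NDSI approach returns an index that must be mapped to snow cover by a fitted relation, whereas the unmixing approach estimates the fractional cover directly from the mixed spectrum, which is the distinction the assessment above draws between MOD10A1 and MODSCAG.
    
    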

  13. New methods for assessing the fascinating nature of nature experiences.

    Directory of Open Access Journals (Sweden)

    Yannick Joye

    In recent years, numerous environmental psychology studies have demonstrated that contact with nature as opposed to urban settings can improve an individual's mood, can lead to increased levels of vitality, and can offer an opportunity to recover from stress. According to Attention Restoration Theory (ART), the restorative potential of natural environments is situated in the fact that nature can replenish depleted attentional resources. This replenishment takes place, in part, because nature is deemed to be a source of fascination, with fascination being described as having an "attentional", an "affective" and an "effort" dimension. However, the claim that fascination with nature involves these three dimensions is to a large extent based on intuition or derived from introspection-based measurement methods, such as self-reports. In three studies, we aimed to more objectively assess whether these three dimensions indeed applied to experiences related to natural environments, before any (attentional) depletion has taken place. The instruments that were used were: (a) the affect misattribution procedure (Study 1), (b) the dot probe paradigm (Study 2) and (c) a cognitively effortful task (Study 3). These instruments were respectively aimed at verifying the affective, attentional and effort dimensions of fascination. Overall, the results provide objective evidence for the claims made within the ART framework, that natural as opposed to urban settings are affectively positive (cfr., affective dimension) and that people have an attentional bias to natural (rather than urban) environments (cfr., attentional dimension). The results regarding the effort dimension are less straightforward, and suggest that this dimension only becomes important in sufficiently difficult cognitive tasks.

  14. Assessing groundwater quality for irrigation using indicator kriging method

    Science.gov (United States)

    Delbari, Masoomeh; Amiri, Meysam; Motlagh, Masoud Bahraini

    2016-11-01

    One of the key parameters influencing sprinkler irrigation performance is water quality. In this study, the spatial variability of groundwater quality parameters (EC, SAR, Na+, Cl-, HCO3- and pH) was investigated by geostatistical methods, and the areas most suitable for implementing sprinkler irrigation systems in terms of water quality were determined. The study was performed in the Fasa county of Fars province using 91 water samples. Results indicated that all parameters are moderately to strongly spatially correlated over the study area. The spatial distribution of pH and HCO3- was mapped using ordinary kriging. The probability that concentrations of EC, SAR, Na+ and Cl- in groundwater exceed a threshold limit was obtained using indicator kriging (IK). The experimental indicator semivariograms were fitted well by a spherical model for SAR, EC, Na+ and Cl-, while for HCO3- and pH an exponential model was fitted. Probability maps showed that the risk of EC, SAR, Na+ and Cl- exceeding the given critical threshold is higher in the lower half of the study area. The agricultural lands most suitable for sprinkler irrigation were identified by evaluating all probability maps; the suitable area amounts to 25,240 hectares, about 34 percent of the total agricultural land, located in the northern and eastern parts. Overall, the results of this study show that IK is an appropriate approach for risk assessment of groundwater pollution and is useful for proper groundwater resources management.
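    A minimal sketch of the IK workflow described above (indicator-transform the data against a threshold, then apply ordinary kriging to the indicators so the estimate reads as an exceedance probability) might look as follows. The spherical-model parameters, sample locations and values below are hypothetical, not the study's data:

    ```python
    import numpy as np

    def spherical_cov(h, sill, rng):
        """Covariance implied by a spherical variogram model with zero nugget."""
        h = np.asarray(h, dtype=float)
        return np.where(h < rng,
                        sill * (1 - 1.5 * h / rng + 0.5 * (h / rng) ** 3), 0.0)

    def ik_probability(coords, values, threshold, target, sill=0.25, rng=2.0):
        """Probability that the value at `target` exceeds `threshold`, obtained
        by ordinary kriging of the 0/1 indicator data (indicator kriging)."""
        ind = (values > threshold).astype(float)            # indicator transform
        n = len(ind)
        d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=2)
        K = np.empty((n + 1, n + 1))
        K[:n, :n] = spherical_cov(d, sill, rng)             # sample covariances
        K[n, :n] = K[:n, n] = 1.0                           # unbiasedness row/col
        K[n, n] = 0.0
        k = np.append(
            spherical_cov(np.linalg.norm(coords - target, axis=1), sill, rng), 1.0)
        w = np.linalg.solve(K, k)[:n]                       # kriging weights
        return float(np.clip(w @ ind, 0.0, 1.0))

    # Hypothetical EC samples: three exceed the threshold, one (far away) does not
    coords = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [3.0, 3.0]])
    ec = np.array([2.5, 2.8, 2.6, 1.0])
    print(ik_probability(coords, ec, threshold=2.0, target=np.array([0.5, 0.5])))
    ```

    A target surrounded by exceeding samples gets a probability near 1, while a target next to the non-exceeding sample gets a probability near 0; mapping this probability over a grid yields exactly the kind of exceedance maps the study evaluates.
    
    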

  15. New methods for assessing the fascinating nature of nature experiences.

    Science.gov (United States)

    Joye, Yannick; Pals, Roos; Steg, Linda; Evans, Ben Lewis

    2013-01-01

    In recent years, numerous environmental psychology studies have demonstrated that contact with nature as opposed to urban settings can improve an individual's mood, can lead to increased levels of vitality, and can offer an opportunity to recover from stress. According to Attention Restoration Theory (ART) the restorative potential of natural environments is situated in the fact that nature can replenish depleted attentional resources. This replenishment takes place, in part, because nature is deemed to be a source of fascination, with fascination being described as having an "attentional", an "affective" and an "effort" dimension. However, the claim that fascination with nature involves these three dimensions is to a large extent based on intuition or derived from introspection-based measurement methods, such as self-reports. In three studies, we aimed to more objectively assess whether these three dimensions indeed applied to experiences related to natural environments, before any (attentional) depletion has taken place. The instruments that were used were: (a) the affect misattribution procedure (Study 1), (b) the dot probe paradigm (Study 2) and (c) a cognitively effortful task (Study 3). These instruments were respectively aimed at verifying the affective, attentional and effort dimensions of fascination. Overall, the results provide objective evidence for the claims made within the ART framework, that natural as opposed to urban settings are affectively positive (cfr., affective dimension) and that people have an attentional bias to natural (rather than urban) environments (cfr., attentional dimension). The results regarding the effort dimension are less straightforward, and suggest that this dimension only becomes important in sufficiently difficult cognitive tasks.

  16. New Methods for Assessing the Fascinating Nature of Nature Experiences

    Science.gov (United States)

    Joye, Yannick; Pals, Roos; Steg, Linda; Evans, Ben Lewis

    2013-01-01

    In recent years, numerous environmental psychology studies have demonstrated that contact with nature as opposed to urban settings can improve an individual’s mood, can lead to increased levels of vitality, and can offer an opportunity to recover from stress. According to Attention Restoration Theory (ART) the restorative potential of natural environments is situated in the fact that nature can replenish depleted attentional resources. This replenishment takes place, in part, because nature is deemed to be a source of fascination, with fascination being described as having an “attentional”, an “affective” and an “effort” dimension. However, the claim that fascination with nature involves these three dimensions is to a large extent based on intuition or derived from introspection-based measurement methods, such as self-reports. In three studies, we aimed to more objectively assess whether these three dimensions indeed applied to experiences related to natural environments, before any (attentional) depletion has taken place. The instruments that were used were: (a) the affect misattribution procedure (Study 1), (b) the dot probe paradigm (Study 2) and (c) a cognitively effortful task (Study 3). These instruments were respectively aimed at verifying the affective, attentional and effort dimensions of fascination. Overall, the results provide objective evidence for the claims made within the ART framework, that natural as opposed to urban settings are affectively positive (cfr., affective dimension) and that people have an attentional bias to natural (rather than urban) environments (cfr., attentional dimension). The results regarding the effort dimension are less straightforward, and suggest that this dimension only becomes important in sufficiently difficult cognitive tasks. PMID:23922645

  17. Analysis of Frequency of Use of Different Scar Assessment Scales Based on the Scar Condition and Treatment Method

    OpenAIRE

    Bae, Seong Hwan; Bae, Yong Chan

    2014-01-01

    Analysis of scars in various conditions is essential, but no consensus has been reached on which scar assessment scale to select for a given condition. We reviewed papers to determine the scar assessment scales selected depending on the scar condition and treatment method. We searched PubMed for articles published since 2000 that evaluated scars using a scar assessment scale and appeared in journals with a Journal Citation Reports impact factor >0.5. Among them, 96 articles that conducted a scar evalu...

  18. Quantitative assessment of target dependence of pion fluctuation in hadronic interactions – estimation through erraticity

    Indian Academy of Sciences (India)

    Dipak Ghosh; Argha Deb; Mitali Mondal; Arindam Mondal; Sitram Pal

    2012-12-01

    Event-to-event fluctuation patterns of pions produced by proton and pion beams are studied in terms of the newly defined erraticity measures χ(p, q), χ'_q and μ'_q proposed by Cao and Hwa. The analysis reveals the erratic behaviour of the produced pions, signifying chaotic multiparticle production in high-energy hadron–nucleus interactions (π⁻–AgBr interactions at 350 GeV/c and p–AgBr interactions at 400 GeV/c). However, the chaoticity does not depend on whether the projectile is a proton or a pion. The results are compared with the VENUS-generated data for the above interactions, which suggests that the VENUS event generator is unable to reproduce the event-to-event fluctuations of the spatial patterns of final states. A comparative study of p–AgBr interactions and p–p collisions at 400 GeV/c from NA27, with the help of a quantitative parameter for the assessment of pion fluctuation, indicates conclusively that the particle production process is more chaotic for hadron–nucleus interactions than for hadron–hadron interactions.

  19. Identification of assessment methods of benefits and costs

    DEFF Research Database (Denmark)

    Kronbak, Lone Grønbæk; Roth, Eva

    This note relates to task 4.1 of the KnowSeas project. It is a guidance note giving directions for the assessment of benefits and costs related to fisheries, together with advice on the further objectives related to this assessment.

  20. Effects of Rater Characteristics and Scoring Methods on Speaking Assessment

    Science.gov (United States)

    Matsugu, Sawako

    2013-01-01

    Understanding the sources of variance in speaking assessment is important in Japan where society's high demand for English speaking skills is growing. Three challenges threaten fair assessment of speaking. First, in Japanese university speaking courses, teachers are typically the only raters, but teachers' knowledge of their students may unfairly…