WorldWideScience

Sample records for assessment ioa analysis

  1. Independent Orbiter Assessment (IOA): Analysis of the Electrical Power Distribution and Control Subsystem, Volume 2

    Science.gov (United States)

    Schmeckpeper, K. R.

    1987-01-01

    The results of the Independent Orbiter Assessment (IOA) of the Failure Modes and Effects Analysis (FMEA) and Critical Items List (CIL) are presented. The IOA approach features a top-down analysis of the hardware to determine failure modes, criticality, and potential critical items. To preserve independence, this analysis was accomplished without reliance upon the results contained within the NASA FMEA/CIL documentation. This report documents the independent analysis results corresponding to the Orbiter Electrical Power Distribution and Control (EPD and C) hardware. The EPD and C hardware performs the functions of distributing, sensing, and controlling 28 volt DC power and of inverting, distributing, sensing, and controlling 117 volt 400 Hz AC power to all Orbiter subsystems from the three fuel cells in the Electrical Power Generation (EPG) subsystem. Volume 2 continues the presentation of IOA analysis worksheets and contains the potential critical items list.

  2. Independent Orbiter Assessment (IOA): Analysis of the electrical power distribution and control subsystem, volume 1

    Science.gov (United States)

    Schmeckpeper, K. R.

    1987-01-01

    The results of the Independent Orbiter Assessment (IOA) of the Failure Modes and Effects Analysis (FMEA) and Critical Items List (CIL) are presented. The IOA approach features a top-down analysis of the hardware to determine failure modes, criticality, and potential critical items. To preserve independence, this analysis was accomplished without reliance upon the results contained within the NASA FMEA/CIL documentation. This report documents the independent analysis results corresponding to the Orbiter Electrical Power Distribution and Control (EPD and C) hardware. The EPD and C hardware performs the functions of distributing, sensing, and controlling 28 volt DC power and of inverting, distributing, sensing, and controlling 117 volt 400 Hz AC power to all Orbiter subsystems from the three fuel cells in the Electrical Power Generation (EPG) subsystem. Each level of hardware was evaluated and analyzed for possible failure modes and effects. Criticality was assigned based upon the severity of the effect for each failure mode. Of the 1671 failure modes analyzed, 9 single failures were determined to result in loss of crew or vehicle. Three single failures unique to intact abort were determined to result in possible loss of the crew or vehicle. A possible loss of mission could result if any of 136 single failures occurred. Six of the criticality 1/1 failures are in two rotary and two pushbutton switches that control External Tank and Solid Rocket Booster separation. The other 6 criticality 1/1 failures are fuses, one each per Aft Power Control Assembly (APCA) 4, 5, and 6 and one each per Forward Power Control Assembly (FPCA) 1, 2, and 3, that supply power to certain Main Propulsion System (MPS) valves and Forward Reaction Control System (RCS) circuits.
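
    The criticality categories cited in these abstracts (e.g., "criticality 1/1") follow the NSTS 22206 scheme, in which category 1 denotes loss of crew or vehicle, 2 loss of mission, and 3 all other effects, with an "R" suffix for redundant hardware; the paired notation such as 1/1 reports, as we read it, criticality at the hardware and functional levels. A minimal sketch of such an assignment rule, assuming simplified category definitions summarized from the abstract rather than the report's full ground rules:

```python
def assign_criticality(worst_case_effect: str, redundant: bool) -> str:
    """Assign an FMEA criticality category from the worst-case failure effect.

    Simplified NSTS 22206-style categories (assumed, per the abstract):
      1 - loss of crew or vehicle
      2 - loss of mission
      3 - all other effects
    An 'R' suffix marks hardware whose worst-case effect applies only
    after all redundant elements have failed.
    """
    if worst_case_effect == "loss of crew/vehicle":
        category = "1"
    elif worst_case_effect == "loss of mission":
        category = "2"
    else:
        category = "3"
    return category + ("R" if redundant and category in ("1", "2") else "")

# Example: a single fuse failure cutting power to MPS valves, no redundancy
print(assign_criticality("loss of crew/vehicle", redundant=False))  # -> "1"
```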

  3. Independent Orbiter Assessment (IOA): Analysis of the electrical power distribution and control/electrical power generation subsystem

    Science.gov (United States)

    Patton, Jeff A.

    1986-01-01

    The results of the Independent Orbiter Assessment (IOA) of the Failure Modes and Effects Analysis (FMEA) and Critical Items List (CIL) are presented. The IOA approach features a top-down analysis of the hardware to determine failure modes, criticality, and potential critical items. To preserve independence, this analysis was accomplished without reliance upon the results contained within the NASA FMEA/CIL documentation. This report documents the independent analysis results corresponding to the Orbiter Electrical Power Distribution and Control (EPD and C)/Electrical Power Generation (EPG) hardware. The EPD and C/EPG hardware is required for performing critical functions of cryogenic reactant storage, electrical power generation and product water distribution in the Orbiter. Specifically, the EPD and C/EPG hardware consists of the following components: Power Section Assembly (PSA); Reactant Control Subsystem (RCS); Thermal Control Subsystem (TCS); Water Removal Subsystem (WRS); and Power Reactant Storage and Distribution System (PRSDS). The IOA analysis process utilized available EPD and C/EPG hardware drawings and schematics for defining hardware assemblies, components, and hardware items. Each level of hardware was evaluated and analyzed for possible failure modes and effects. Criticality was assigned based upon the severity of the effect for each failure mode.

  4. Independent Orbiter Assessment (IOA): FMEA/CIL assessment

    Science.gov (United States)

    Hinsdale, L. W.; Swain, L. J.; Barnes, J. E.

    1988-01-01

    The McDonnell Douglas Astronautics Company (MDAC) was selected to perform an Independent Orbiter Assessment (IOA) of the Failure Modes and Effects Analysis (FMEA) and Critical Items List (CIL). Direction was given by the Orbiter and GFE Projects Office to perform the hardware analysis and assessment using the instructions and ground rules defined in NSTS 22206. The IOA analysis featured a top-down approach to determine hardware failure modes, criticality, and potential critical items. To preserve independence, the analysis was accomplished without reliance upon the results contained within the NASA and Prime Contractor FMEA/CIL documentation. The assessment process compared the independently derived failure modes and criticality assignments to the proposed NASA post 51-L FMEA/CIL documentation. When possible, assessment issues were discussed and resolved with the NASA subsystem managers. Unresolved issues were elevated to the Orbiter and GFE Projects Office manager, Configuration Control Board (CCB), or Program Requirements Control Board (PRCB) for further resolution. The most important Orbiter assessment finding was the previously unknown stuck autopilot push-button criticality 1/1 failure mode, whose worst-case effect could cause loss of crew/vehicle when the microwave landing system is not active. The NASA and Prime Contractor post 51-L FMEA/CIL documentation assessed by the IOA is judged to be technically accurate and complete. All CIL issues were resolved. No FMEA issues remain that have safety implications. Consideration should be given, however, to upgrading NSTS 22206 with definitive ground rules that more clearly spell out the limits of redundancy.

  5. Independent Orbiter Assessment (IOA): Assessment of the electrical power distribution and control subsystem, volume 1

    Science.gov (United States)

    Schmeckpeper, K. R.

    1988-01-01

    The results of the Independent Orbiter Assessment (IOA) of the Failure Modes and Effects Analysis (FMEA) and Critical Items List (CIL) are presented. The IOA first completed an analysis of the Electrical Power Distribution and Control (EPD and C) hardware, generating draft failure modes and potential critical items. To preserve independence, this analysis was accomplished without reliance upon the results contained within the NASA FMEA/CIL documentation. The IOA results were then compared to the NASA FMEA/CIL baseline with proposed Post 51-L updates included. A resolution of each discrepancy from the comparison is provided through additional analysis as required. This report documents the results of that comparison for the Orbiter EPD and C hardware. The IOA product for the EPD and C analysis consisted of 1671 failure mode analysis worksheets that resulted in 468 potential critical items being identified. Comparison was made to the proposed NASA Post 51-L baseline which consisted of FMEAs and 158 CIL items. Volume 1 contains the EPD and C subsystem description, analysis results, ground rules and assumptions, and some of the IOA worksheets.

  6. Independent Orbiter Assessment (IOA): Assessment of the EPD and C/remote manipulator system FMEA/CIL

    Science.gov (United States)

    Robinson, W. W.

    1988-01-01

    The results of the Independent Orbiter Assessment (IOA) of the Failure Modes and Effects Analysis (FMEA) and Critical Items List (CIL) are presented. The IOA effort first completed an analysis of the Electrical Power Distribution and Control (EPD and C)/Remote Manipulator System (RMS) hardware, generating draft failure modes and potential critical items. To preserve independence, this analysis was accomplished without reliance upon the results contained within the NASA FMEA/CIL documentation. The IOA analysis of the EPD and C/RMS hardware initially generated 345 failure mode worksheets and identified 117 Potential Critical Items (PCIs) before starting the assessment process. These analysis results were compared to the proposed NASA Post 51-L baseline of 132 FMEAs and 66 CIL items.

  7. Independent Orbiter Assessment (IOA): Assessment of the landing/deceleration (LDG/DEC) subsystem FMEA/CIL

    Science.gov (United States)

    Odonnell, R. A.; Weissinger, D.

    1988-01-01

    The results of the Independent Orbiter Assessment (IOA) of the Failure Modes and Effects Analysis (FMEA) and Critical Items List (CIL) are presented. The IOA effort first completed an analysis of the Landing/Deceleration (LDG/DEC) hardware, generating draft failure modes and potential critical items. To preserve independence, this analysis was accomplished without reliance upon the results contained within the NASA FMEA/CIL documentation. The IOA results were then compared to the NASA FMEA/CIL baseline with proposed Post 51-L updates included. A resolution of each discrepancy from the comparison is provided through additional analysis as required. This report documents the results of that comparison for the Orbiter LDG/DEC hardware. The IOA product for the LDG/DEC analysis consisted of 259 failure mode worksheets that resulted in 124 potential critical items being identified. Comparison was made to the NASA baseline which consisted of 267 FMEAs and 120 CIL items. This comparison produced agreement on all but 75 FMEAs, which caused differences in 51 CIL items.

  8. The PCA and IOA approaches for life-cycle analysis of greenhouse gas emissions from Thai commodities and energy consumption

    Directory of Open Access Journals (Sweden)

    Pawinee Suksuntornsiri

    2005-01-01

    The use of more substitutable commodities, selected by their total greenhouse gas (GHG) emissions, would contribute greatly to mitigating the effects of global warming. Life-cycle analysis (LCA) can evaluate the total emissions from the lifetime production of a commodity. It is widely applied to reveal actual environmental emissions in many countries; however, such data cannot simply be transferred between countries because of differences in emission and energy consumption structures. LCA emission factors within the same country also differ owing to different assumptions about the boundary and lifetime of the production process considered. Process chain analysis (PCA), the conventional LCA approach most often applied in Thailand, is accurate for the direct production process, but its analysis of higher-order production processes is usually truncated by lack of data, and gathering data for the whole production process is laborious and time consuming. This article presents the pros and cons of PCA and input-output analysis (IOA) and introduces an appropriate approach in the Thai context. The energy-related GHG emission findings of the revised 1996 Intergovernmental Panel on Climate Change (IPCC) guidelines for national GHG inventories are adapted for a commodity emission factor and traced through the whole production chain by IOA. In conclusion, IOA gives emissions as average values and derives them from the historical economic structure; however, emissions from every link of the production life cycle can be taken into account. A combined PCA and IOA is recommended for an LCA of GHG emissions in Thailand.
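
    In IOA, total life-cycle emission intensities come from the Leontief inverse: with A the matrix of inter-sector input coefficients and f the vector of direct GHG emission intensities, total intensities are m = f(I - A)^(-1), which sums emissions over all upstream production orders. A minimal numpy sketch with an invented two-sector economy, purely for illustration:

```python
import numpy as np

# Hypothetical 2-sector economy (all values invented for illustration):
# A[i, j] = input from sector i needed per unit output of sector j.
A = np.array([[0.10, 0.30],
              [0.20, 0.05]])

# Direct GHG emission intensity of each sector (kg CO2-eq per unit output).
f_direct = np.array([0.8, 2.5])

# Total intensity: direct plus all upstream orders, m = f (I - A)^(-1),
# which sums the series f(I + A + A^2 + ...).
m_total = f_direct @ np.linalg.inv(np.eye(2) - A)

print(m_total)  # total emission factors per unit of final demand, ~[1.59, 3.13]
```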

  9. Linkage mapping of the locus for inherited ovine arthrogryposis (IOA) to sheep chromosome 5.

    Science.gov (United States)

    Murphy, Angela M; MacHugh, David E; Park, Stephen D E; Scraggs, Erik; Haley, Chris S; Lynn, David J; Boland, Maurice P; Doherty, Michael L

    2007-01-01

    Arthrogryposis is a congenital malformation affecting the limbs of newborn animals and infants. Previous work has demonstrated that inherited ovine arthrogryposis (IOA) has an autosomal recessive mode of inheritance. Two affected homozygous recessive (art/art) Suffolk rams were used as founders for a backcross pedigree of half-sib families segregating the IOA trait. A genome scan was performed using 187 microsatellite genetic markers and all backcross animals were phenotyped at birth for the presence and severity of arthrogryposis. Pairwise LOD scores of 1.86, 1.35, and 1.32 were detected for three microsatellites, BM741, JAZ, and RM006, that are located on sheep Chr 5 (OAR5). Additional markers in the region were identified from the genetic linkage map of BTA7 and by in silico analyses of the draft bovine genome sequence, three of which were informative. Interval mapping of all autosomes produced an F value of 21.97 (p < 0.01) for a causative locus in the region of OAR5 previously flagged by pairwise linkage analysis. Inspection of the orthologous region of HSA5 highlighted a previously fine-mapped locus for human arthrogryposis multiplex congenita neurogenic type (AMCN). A survey of the HSA5 genome sequence identified plausible candidate genes for both IOA and human AMCN.
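
    A pairwise LOD score of the kind reported here compares the likelihood of the observed recombination fraction against the null of free recombination (theta = 0.5); values above 3 are conventionally taken as evidence of linkage. A minimal sketch for a backcross, where each offspring is scorable as recombinant or not (the counts below are invented, not the study's data):

```python
from math import log10

def pairwise_lod(n_offspring: int, n_recombinants: int) -> float:
    """LOD score for a backcross: log10 likelihood ratio of the observed
    recombination fraction theta-hat = r/n against theta = 0.5."""
    r, n = n_recombinants, n_offspring
    theta = r / n
    theta = min(max(theta, 1e-9), 1 - 1e-9)  # avoid log(0) at the extremes
    log_l_theta = r * log10(theta) + (n - r) * log10(1 - theta)
    log_l_null = n * log10(0.5)
    return log_l_theta - log_l_null

# Invented example: 40 offspring, 12 recombinants between marker and trait.
print(round(pairwise_lod(40, 12), 2))  # ~1.43
```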

  10. Water resources and environmental input-output analysis and its key study issues: a review

    Science.gov (United States)

    YANG, Z.; Xu, X.

    2013-12-01

    ...inland water resources IOA. Recent studies related to environmental input-output tables, pollution discharge analysis and environmental impact assessment have taken the leading position. Pollution discharge analysis, mainly aimed at CO2 discharge, has become a new hotspot of environmental IOA, and environmental impact assessment has been an important direction of inland environmental IOA in recent years. Key study issues, including the Domestic Technology Assumption (DTA) and Sectoral Aggregation (SA), are discussed in particular. It is pointed out that multi-region input-output analysis (MIOA) may help address the DTA. Because few studies have used effective analysis tools to quantify the bias introduced by SA, and exploration of the appropriate degree of sectoral aggregation is scarce, research dedicated to these two key issues is urgently needed. Several points of outlook are proposed in the end.

  11. Data Analysis and Assessment Center

    Data.gov (United States)

    Federal Laboratory Consortium — The DoD Supercomputing Resource Center (DSRC) Data Analysis and Assessment Center (DAAC) provides classified facilities to enhance customer interactions with the ARL...

  12. Approaches to Assessment in Multivariate Analysis.

    Science.gov (United States)

    O'Connell, Ann A.

    This paper reviews trends in assessment in quantitative courses and illustrates several options and approaches to assessment for advanced courses at the graduate level, especially in multivariate analysis. The paper provides a summary of how a researcher has used alternatives to traditional methods of assessment in a course on multivariate…

  13. Change point analysis and assessment

    DEFF Research Database (Denmark)

    Müller, Sabine; Neergaard, Helle; Ulhøi, John Parm

    2011-01-01

    The aim of this article is to develop an analytical framework for studying processes such as continuous innovation and business development in high-tech SME clusters that transcends the traditional qualitative-quantitative divide. It integrates four existing and well-recognized approaches to studying events, processes and change, namely change-point analysis, event-history analysis, critical-incident technique and sequence analysis.

  14. Asbestos Workshop: Sampling, Analysis, and Risk Assessment

    Science.gov (United States)

    2012-03-01

    Asbestos-containing products discussed include coatings, vinyl/asbestos floor tile, automatic transmission components, clutch facings, disc brake pads, drum brake linings, and brake blocks. Presented March 2012 by Paul Black, PhD, and Ralph Perona, DABT, of Neptune and Company.

  15. Assessing Analysis and Reasoning in Bioethics

    Science.gov (United States)

    Pearce, Roger S.

    2008-01-01

    Developing critical thinking is a perceived weakness in current education. Analysis and reasoning are core skills in bioethics making bioethics a useful vehicle to address this weakness. Assessment is widely considered to be the most influential factor on learning (Brown and Glasner, 1999) and this piece describes how analysis and reasoning in…

  16. Economic impact assessment in pest risk analysis

    NARCIS (Netherlands)

    Soliman, T.A.A.; Mourits, M.C.M.; Oude Lansink, A.G.J.M.; Werf, van der W.

    2010-01-01

    According to international treaties, phytosanitary measures against introduction and spread of invasive plant pests must be justified by a science-based pest risk analysis (PRA). Part of the PRA consists of an assessment of potential economic consequences. This paper evaluates the main available techniques.

  17. Supplemental Hazard Analysis and Risk Assessment - Hydrotreater

    Energy Technology Data Exchange (ETDEWEB)

    Lowry, Peter P. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Wagner, Katie A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2015-04-01

    A supplemental hazard analysis was conducted and quantitative risk assessment performed in response to an independent review comment received by the Pacific Northwest National Laboratory (PNNL) from the U.S. Department of Energy Pacific Northwest Field Office (PNSO) against the Hydrotreater/Distillation Column Hazard Analysis Report issued in April 2013. The supplemental analysis used the hazardous conditions documented by the previous April 2013 report as a basis. The conditions were screened and grouped for the purpose of identifying whether additional prudent, practical hazard controls could be identified, using a quantitative risk evaluation to assess the adequacy of the controls and establish a lower level of concern for the likelihood of potential serious accidents. Calculations were performed to support conclusions where necessary.
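
    A quantitative screen of this kind typically ranks each hazardous condition by estimated frequency and consequence and flags those above a threshold for additional controls. A generic sketch (the categories, values and threshold below are invented for illustration, not PNNL's methodology):

```python
# Generic risk screening: rank hazardous conditions by frequency x consequence.
FREQ = {"anticipated": 1e-1, "unlikely": 1e-3, "extremely_unlikely": 1e-5}  # /yr
CONSEQ = {"low": 1, "moderate": 10, "high": 100}   # relative severity units

conditions = [  # invented examples, not from the hydrotreater report
    ("pump seal leak, flammable vapor", "unlikely", "moderate"),
    ("column overpressure", "extremely_unlikely", "high"),
    ("small spill during sampling", "anticipated", "low"),
]

THRESHOLD = 1e-2   # level of concern (illustrative)
for name, f, c in conditions:
    risk = FREQ[f] * CONSEQ[c]
    flag = "evaluate additional controls" if risk >= THRESHOLD else "acceptable"
    print(f"{name}: risk={risk:.0e} -> {flag}")
```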

  18. Quality Assessment of Urinary Stone Analysis

    DEFF Research Database (Denmark)

    Siener, Roswitha; Buchholz, Noor; Daudon, Michel

    2016-01-01

    After stone removal, accurate analysis of urinary stone composition is the most crucial laboratory diagnostic procedure for the treatment and recurrence prevention in the stone-forming patient. The most common techniques for routine analysis of stones are infrared spectroscopy, X-ray diffraction and chemical analysis. The aim of the present study was to assess the quality of urinary stone analysis of laboratories in Europe. Nine laboratories from eight European countries participated in six quality control surveys for urinary calculi analyses of the Reference Institute for Bioanalytics, Bonn, Germany, between 2010 and 2014. Each participant received the same blinded test samples for stone analysis. A total of 24 samples, comprising pure substances and mixtures of two or three components, were analysed. The evaluation of the quality of the laboratory in the present study was based on the attainment ... spectra and qualification of the staff for an accurate analysis of stone composition. Regular quality control is essential in carrying out routine stone analysis.

  20. Dynamic analysis and assessment for sustainable development

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    The assessment of sustainable development is crucial for formulating sustainable development strategies. Existing assessment methods usually rely only on an indicator system for making sustainability judgements, and these indicators rarely reflect dynamic characteristics. However, sustainable development is influenced by changes in the socio-economic system and in the eco-environmental system over time. Besides its spatial character, sustainable development has a temporal character that cannot be neglected; therefore the research system should also be dynamic. This paper focuses on this dynamic trait, so that the assessment results obtained provide more information for judgements in decision-making processes. First, the dynamic characteristics of sustainable development are analyzed, pointing to a track of sustainable development that is an upward undulating curve. According to the dynamic character and the development rules of a social, economic and ecological system, a flexible assessment approach based on tendency analysis, restrictive conditions and a feedback system is then proposed for sustainable development.

  1. HANFORD SAFETY ANALYSIS & RISK ASSESSMENT HANDBOOK (SARAH)

    Energy Technology Data Exchange (ETDEWEB)

    EVANS, C B

    2004-12-21

    The purpose of the Hanford Safety Analysis and Risk Assessment Handbook (SARAH) is to support the development of safety basis documentation for Hazard Category 2 and 3 (HC-2 and 3) U.S. Department of Energy (DOE) nuclear facilities to meet the requirements of 10 CFR 830, "Nuclear Safety Management," Subpart B, "Safety Basis Requirements." Consistent with DOE-STD-3009-94, Change Notice 2, "Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Documented Safety Analyses" (STD-3009), and DOE-STD-3011-2002, "Guidance for Preparation of Basis for Interim Operation (BIO) Documents" (STD-3011), the Hanford SARAH describes methodology for performing a safety analysis leading to development of a Documented Safety Analysis (DSA) and derivation of Technical Safety Requirements (TSR), and provides the information necessary to ensure a consistently rigorous approach that meets DOE expectations. The DSA and TSR documents, together with the DOE-issued Safety Evaluation Report (SER), are the basic components of facility safety basis documentation. For HC-2 or 3 nuclear facilities in long-term surveillance and maintenance (S&M), for decommissioning activities where the source term has been eliminated to the point that only low-level, residual fixed contamination is present, or for environmental remediation activities outside of a facility structure, DOE-STD-1120-98, "Integration of Environment, Safety, and Health into Facility Disposition Activities" (STD-1120), may serve as the basis for the DSA. HC-2 and 3 environmental remediation sites also are subject to the hazard analysis methodologies of this standard.

  2. Geochemical and Geochronologic Investigations of Zircon-hosted Melt Inclusions in Rhyolites from the Mesoproterozoic Pea Ridge IOA-REE Deposit, St. Francois Mountains, Missouri

    Science.gov (United States)

    Watts, K. E.; Mercer, C. N.; Vazquez, J. A.

    2015-12-01

    Silicic volcanic and plutonic rocks of an eroded Mesoproterozoic caldera complex were intruded and replaced by iron ore, and cross-cut by REE-enriched breccia pipes (~12% total REO) to form the Pea Ridge iron-oxide-apatite-REE (IOA-REE) deposit. Igneous activity, iron ore formation, and REE mineralization overlapped in space and time, however the source of REEs and other metals (Fe, Cu, Au) integral to these economically important deposits remains unclear. Melt inclusions (MI) hosted in refractory zircon phenocrysts are used to constrain magmatic components and processes in the formation of the Pea Ridge deposit. Homogenized (1.4 kbar, 1000°C, 1 hr) MI in zircons from rhyolites ~600 ft (PR-91) and ~1200 ft (PR-12) laterally from the ore body were analyzed for major elements by EPMA and volatiles and trace elements (H2O, S, F, Cl, REEs, Rb, Sr, Y, Zr, Nb, U, Th) by SHRIMP-RG. Metals (including Cu, Au) will be measured in an upcoming SHRIMP-RG session. U-Pb ages, Ti and REE were determined by SHRIMP-RG for a subset of zircon spots adjacent to MI (1458 ± 18 Ma (PR-12); 1480 ± 45 Ma (PR-91)). MI glasses range from fresh and homogeneous dacite-rhyolite (65-75 wt% SiO2) to heterogeneous, patchy mixtures of K-spar and quartz (PR-12, 91), and more rarely mica, albite and/or anorthoclase (PR-91). MI are commonly attached to monazite and xenotime, particularly along re-entrants and zircon rims (PR-91). Fresh dacite-rhyolite glasses (PR-12) have moderate H2O (~2-2.5 wt%), Rb/Sr ratios (~8) and U (~5-7 ppm), and negative (chondrite-normalized) Eu anomalies (Eu ~0.4-0.7 ppm) (typical of rhyolites), whereas HREEs (Tb, Ho, Tm) are elevated (~2-3 ppm). Patchy K-spar and quartz inclusions (PR-12, 91) have flat LREE patterns, and positive anomalies in Tb, Ho, and Tm. One K-spar inclusion (PR-91) has a ~5-50 fold increase in HREEs (Tb, Dy, Ho, Er, Tm) and U (35 ppm) relative to other MI. U-Pb and REE analyses of its zircon host are not unusual (1484 ± 21 Ma); its irregular shape

  3. Qualitative Analysis for Maintenance Process Assessment

    Science.gov (United States)

    Brand, Lionel; Kim, Yong-Mi; Melo, Walcelio; Seaman, Carolyn; Basili, Victor

    1996-01-01

    In order to improve software maintenance processes, we first need to be able to characterize and assess them. These tasks must be performed in depth and with objectivity since the problems are complex. One approach is to set up a measurement-based software process improvement program specifically aimed at maintenance. However, establishing a measurement program requires that one understands the problems to be addressed by the measurement program and is able to characterize the maintenance environment and processes in order to collect suitable and cost-effective data. Also, enacting such a program and getting usable data sets takes time. A short term substitute is therefore needed. We propose in this paper a characterization process aimed specifically at maintenance and based on a general qualitative analysis methodology. This process is rigorously defined in order to be repeatable and usable by people who are not acquainted with such analysis procedures. A basic feature of our approach is that actual implemented software changes are analyzed in order to understand the flaws in the maintenance process. Guidelines are provided and a case study is shown that demonstrates the usefulness of the approach.

  4. Acoustic analysis assessment in speech pathology detection

    Directory of Open Access Journals (Sweden)

    Panek Daria

    2015-09-01

    Automatic detection of voice pathologies enables non-invasive, low-cost and objective assessment of the presence of disorders, as well as accelerating and improving the process of diagnosis and clinical treatment given to patients. In this work, a vector made up of 28 acoustic parameters is evaluated using principal component analysis (PCA), kernel principal component analysis (kPCA) and an auto-associative neural network (NLPCA) in four kinds of pathology detection (hyperfunctional dysphonia, functional dysphonia, laryngitis, vocal cord paralysis) using the a, i and u vowels, spoken at a high, low and normal pitch. The results indicate that the kPCA and NLPCA methods can be considered a step towards pathology detection of the vocal folds. The results show that such an approach provides acceptable results for this purpose, with the best efficiency levels of around 100%. The study brings together the most commonly used approaches to speech signal processing and leads to a comparison of the machine learning methods determining the health status of the patient.
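
    A sketch of the dimensionality-reduction step described above, using scikit-learn's PCA and KernelPCA on a stand-in matrix of 28 acoustic parameters per recording; the data and the downstream use are placeholders, not the study's pipeline:

```python
import numpy as np
from sklearn.decomposition import PCA, KernelPCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 28))  # placeholder: 200 recordings x 28 acoustic parameters

X_std = StandardScaler().fit_transform(X)

# Linear PCA: keep enough components to explain 95% of the variance.
pca = PCA(n_components=0.95)
X_pca = pca.fit_transform(X_std)

# Kernel PCA with an RBF kernel as a nonlinear counterpart (kPCA).
kpca = KernelPCA(n_components=10, kernel="rbf", gamma=0.05)
X_kpca = kpca.fit_transform(X_std)

print(X_pca.shape, X_kpca.shape)
# The reduced features would then feed a classifier separating healthy
# from pathological voices (dysphonia, laryngitis, vocal cord paralysis).
```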

  5. Seismic vulnerability assessments in risk analysis

    Science.gov (United States)

    Frolova, Nina; Larionov, Valery; Bonnin, Jean; Ugarov, Alexander

    2013-04-01

    The assessment of seismic vulnerability is a critical issue within natural and technological risk analysis. In general, three common types of methods are used to develop vulnerability functions for different elements at risk: empirical, analytical and expert estimation. The paper addresses the empirical methods for seismic vulnerability estimation for residential buildings and industrial facilities. The results of engineering analysis of past earthquake consequences, as well as the statistical data on building behavior during strong earthquakes presented in the different seismic intensity scales, are used to verify the regional parameters of mathematical models in order to simulate physical and economic vulnerability for different building types classified according to seismic scale MMSK-86. The verified procedure has been used to estimate the physical and economic vulnerability of buildings and constructions against earthquakes for the Northern Caucasus Federal region of the Russian Federation and the Krasnodar area, which are characterized by a rather high level of seismic activity and high population density. In order to estimate expected damage states to buildings and constructions in the case of earthquakes according to the OSR-97B (return period T=1,000 years) within big cities and towns, the cities were divided into unit sites whose coordinates were represented as dots located at the centers of the unit sites; the indexes obtained for each unit site were then summed up. The maps of physical vulnerability zoning for the Northern Caucasus Federal region and the Krasnodar area include two elements: the percentage of different damage states for settlements with fewer than 1,000 inhabitants, and the vulnerability for cities and towns with more than 1,000 inhabitants. The hypsometric scale is used to represent both elements on the maps. Taking into account the size of oil pipeline systems located in the highly active seismic zones in

  6. Assessment of Available Numerical Tools for Dynamic Mooring Analysis

    DEFF Research Database (Denmark)

    Thomsen, Jonas Bjerg; Eskilsson, Claes; Ferri, Francesco

    This report covers a preliminary assessment of available numerical tools to be used in the upcoming full dynamic analysis of the mooring systems assessed in the project _Mooring Solutions for Large Wave Energy Converters_. The assessment tends to cover potential candidate software and subsequently c...

  7. A Content Analysis of Intimate Partner Violence Assessments

    Science.gov (United States)

    Hays, Danica G.; Emelianchik, Kelly

    2009-01-01

    With approximately 30% of individuals of various cultural identities experiencing intimate partner violence (IPV) in their lifetimes, it is imperative that professional counselors engage in effective assessment practices and be aware of the limitations of available IPV assessments. A content analysis of 38 IPV assessments was conducted, yielding…

  8. Regional analysis and environmental impact assessment

    Energy Technology Data Exchange (ETDEWEB)

    Parzyck, D.C.; Brocksen, R.W.; Emanuel, W.R.

    1976-01-01

    This paper presents a number of techniques that can be used to assess environmental impacts on a regional scale. Regional methodologies have been developed which examine impacts upon aquatic and terrestrial biota in regions through consideration of changes in land use, land cover, air quality, water resource use, and water quality. Techniques used to assess long-range atmospheric transport, water resources, effects on sensitive forest and animal species, and impacts on man are presented in this paper, along with an optimization approach which serves to integrate the analytical techniques in an overall assessment framework. A brief review of the research approach and certain modeling techniques used within one regional studies program is provided. While it is not an all inclusive report on regional analyses, it does present an illustration of the types of analyses that can be performed on a regional scale.

  9. Data Analysis and Next Generation Assessments

    Science.gov (United States)

    Pon, Kathy

    2013-01-01

    For the last decade, much of the work of California school administrators has been shaped by the accountability of the No Child Left Behind Act. Now as they stand at the precipice of Common Core Standards and next generation assessments, it is important to reflect on the proficiency educators have attained in using data to improve instruction and…

  10. Emerging frontier technologies for food safety analysis and risk assessment

    Institute of Scientific and Technical Information of China (English)

    DONG Yi-yang; LIU Jia-hui; WANG Sai; CHEN Qi-long; GUO Tian-yang; ZHANG Li-ya; JIN Yong; SU Hai-jia; TAN Tian-wei

    2015-01-01

    Access to secure and safe food is a basic human necessity and essential for a sustainable world. Performing high-end food safety analysis and risk assessment with state-of-the-art technologies is therefore of utmost importance. With applications exemplified by microfluidic immunoassay, aptasensors, direct analysis in real time, high-resolution mass spectrometry, benchmark dose and chemical-specific adjustment factor, this review presents frontier food safety analysis and risk assessment technologies, from which both food quality and public health will undoubtedly benefit in the foreseeable future.

  11. PIE Nacelle Flow Analysis and TCA Inlet Flow Quality Assessment

    Science.gov (United States)

    Shieh, C. F.; Arslan, Alan; Sundaran, P.; Kim, Suk; Won, Mark J.

    1999-01-01

    This presentation includes three topics: (1) Analysis of isolated boattail drag; (2) Computation of Technology Concept Airplane (TCA)-installed nacelle effects on aerodynamic performance; and (3) Assessment of TCA inlet flow quality.

  12. Self-Assessment in Second Language Testing: A Meta-Analysis and Analysis of Experiential Factors.

    Science.gov (United States)

    Ross, Steven

    1998-01-01

    Summarizes research on self-assessment in second-language testing using a meta-analysis on 60 correlations reported in second-language-testing literature. Self-assessments and teacher assessments of recently instructed English-as-a-Second-Language learners' functional English skills revealed differential validities for self-assessment and teacher…

  13. Dimensionality Assessment of Ordered Polytomous Items With Parallel Analysis

    NARCIS (Netherlands)

    Timmerman, Marieke E.; Lorenzo-Seva, Urbano

    2011-01-01

    Parallel analysis (PA) is an often-recommended approach for assessment of the dimensionality of a variable set. PA is known in different variants, which may yield different dimensionality indications. In this article, the authors considered the most appropriate PA procedure to assess the number of common factors underlying ordered polytomous items.
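
    The basic PA logic, retaining a component only if its eigenvalue exceeds what random data of the same size would produce, fits in a few lines; the variants the article compares differ in using means vs. percentiles of the random eigenvalues and in how ordinal (polytomous) data are correlated. A minimal Horn-style sketch on Pearson correlations, with an invented two-factor example:

```python
import numpy as np

def parallel_analysis(X: np.ndarray, n_iter: int = 100, seed: int = 0) -> int:
    """Horn's parallel analysis: count components whose observed eigenvalue
    exceeds the mean eigenvalue of random normal data of the same shape.
    (Variants use percentiles, or polychoric correlations for ordinal items.)"""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    obs_eig = np.sort(np.linalg.eigvalsh(np.corrcoef(X, rowvar=False)))[::-1]
    rand_eig = np.zeros((n_iter, p))
    for i in range(n_iter):
        R = rng.normal(size=(n, p))
        rand_eig[i] = np.sort(np.linalg.eigvalsh(np.corrcoef(R, rowvar=False)))[::-1]
    return int(np.sum(obs_eig > rand_eig.mean(axis=0)))

# Invented example: two correlated blocks of items -> expect 2 dimensions.
rng = np.random.default_rng(1)
f = rng.normal(size=(500, 2))
X = np.hstack([f[:, [0]] + 0.5 * rng.normal(size=(500, 3)),
               f[:, [1]] + 0.5 * rng.normal(size=(500, 3))])
print(parallel_analysis(X))  # typically 2
```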

  14. Analysis of a Rubric for Assessing Depth of Classroom Reflections

    Science.gov (United States)

    Dalal, Dev K.; Hakel, Milton D.; Sliter, Michael T.; Kirkendall, Sarah R.

    2012-01-01

    Writing reflections is recommended for enhancing retention and transfer of learned material. The benefits of student reflections have been well documented, but the methods for collecting and assessing reflections can be difficult. This study presents the development and analysis of a new, straightforward rubric for assessing depth of student…

  15. Dimensionality Assessment of Ordered Polytomous Items with Parallel Analysis

    Science.gov (United States)

    Timmerman, Marieke E.; Lorenzo-Seva, Urbano

    2011-01-01

    Parallel analysis (PA) is an often-recommended approach for assessment of the dimensionality of a variable set. PA is known in different variants, which may yield different dimensionality indications. In this article, the authors considered the most appropriate PA procedure to assess the number of common factors underlying ordered polytomously…

  16. Material Analysis for a Fire Assessment.

    Energy Technology Data Exchange (ETDEWEB)

    Brown, Alexander; Nemer, Martin B.

    2014-08-01

    This report consolidates technical information on several materials and material classes for a fire assessment. The materials include three polymeric materials, wood, and hydraulic oil. The polymers are polystyrene, polyurethane, and melamine-formaldehyde foams. Samples of two of the specific materials were tested for their behavior in a fire-like environment. Test data and the methods used to test the materials are presented. Much of the remaining data are taken from a literature survey. This report serves as a reference source of properties necessary to predict the behavior of these materials in a fire.

  17. Environmental risk assessment in GMO analysis.

    Science.gov (United States)

    Pirondini, Andrea; Marmiroli, Nelson

    2010-01-01

    Genetically modified or engineered organisms (GMOs, GEOs) are utilised in agriculture, expressing traits of interest, such as insect or herbicide resistance. Soybean, maize, cotton and oilseed rape are the GM crops with the largest acreage in the world. The distribution of GM acreage among countries is related to their different positions concerning labelling of GMO products: based on the principle of substantial equivalence, or rather on the precautionary principle. The paper provides an overview of how the risks associated with release of GMOs into the environment can be analysed and predicted, in view of a possible coexistence of GM and non-GM organisms in agriculture. Risk assessment procedures, both qualitative and quantitative, are compared in the context of application to GMOs, considering also legislative requirements (Directive 2001/18/EC). Criteria and measurable properties to assess harm to human health and environmental safety are listed, and the possible consequences are evaluated in terms of significance. Finally, a mapping of the possible risks deriving from GMO release is reported, focusing on gene transfer to related species, horizontal gene transfer, direct and indirect effects on non-target organisms, development of resistance in target organisms, and effects on biodiversity.

  18. Beam-propagation method - Analysis and assessment

    Science.gov (United States)

    van Roey, J.; van der Donk, J.; Lagasse, P. E.

    1981-07-01

    A method for the calculation of the propagation of a light beam through an inhomogeneous medium is presented. A theoretical analysis of this beam-propagation method is given, and a set of conditions necessary for the accurate application of the method is derived. The method is illustrated by the study of a number of integrated-optic structures, such as thin-film waveguides and gratings.
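
    The beam-propagation method referred to here advances the field over small steps in z, alternating a diffraction step applied in the spatial-frequency domain with a phase correction for the local refractive-index profile. A minimal 1-D split-step sketch under the paraxial assumption; the grid, wavelength and the step-index "thin-film waveguide" profile are invented for illustration:

```python
import numpy as np

# Transverse grid and optical parameters (illustrative values).
N, width = 1024, 200e-6                    # samples, window size (m)
x = np.linspace(-width / 2, width / 2, N)
lam, n0 = 1.0e-6, 1.5                      # wavelength (m), background index
k0 = 2 * np.pi / lam
kx = 2 * np.pi * np.fft.fftfreq(N, d=width / N)

field = np.exp(-(x / 10e-6) ** 2)          # Gaussian input beam
dn = 0.01 * (np.abs(x) < 5e-6)             # step-index profile: a simple film guide

dz, steps = 0.5e-6, 2000                   # step size (m), number of steps
diffraction = np.exp(-1j * kx ** 2 * dz / (2 * k0 * n0))  # paraxial propagator
refraction = np.exp(-1j * k0 * dn * dz)                   # local index correction

for _ in range(steps):
    field = np.fft.ifft(np.fft.fft(field) * diffraction)  # diffraction sub-step
    field = field * refraction                            # refraction sub-step

print(np.sum(np.abs(field) ** 2))  # norm is conserved (both operators are pure phase)
```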

  19. Analysis of Alternatives for Risk Assessment Methodologies and Tools

    Energy Technology Data Exchange (ETDEWEB)

    Nachtigal, Noel M. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). System Analytics; Fruetel, Julia A. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Gleason, Nathaniel J. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Helms, Jovana [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Imbro, Dennis Raymond [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Sumner, Matthew C. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis

    2013-10-01

    The purpose of this document is to provide a basic overview and understanding of risk assessment methodologies and tools from the literature and to assess the suitability of these methodologies and tools for cyber risk assessment. Sandia National Laboratories (SNL) performed this review in support of risk modeling activities performed for the Stakeholder Engagement and Cyber Infrastructure Resilience (SECIR) division of the Department of Homeland Security (DHS) Office of Cybersecurity and Communications (CS&C). The set of methodologies and tools covered in this document is not intended to be exhaustive; instead, it focuses on those that are commonly used in the risk assessment community. The classification of methodologies and tools was performed by a group of analysts with experience in risk analysis and cybersecurity, and the resulting analysis of alternatives has been tailored to address the needs of a cyber risk assessment.

  20. Uncertainty analysis in integrated assessment: the users' perspective

    NARCIS (Netherlands)

    Gabbert, S.G.M.; Ittersum, van M.K.; Kroeze, C.; Stalpers, S.I.P.; Ewert, F.; Alkan Olsson, J.

    2010-01-01

    Integrated Assessment (IA) models aim at providing information- and decision-support to complex problems. This paper argues that uncertainty analysis in IA models should be user-driven in order to strengthen science–policy interaction. We suggest an approach to uncertainty analysis that starts with

  1. TEXTS SENTIMENT-ANALYSIS APPLICATION FOR PUBLIC OPINION ASSESSMENT

    Directory of Open Access Journals (Sweden)

    I. A. Bessmertny

    2015-01-01

    The paper describes an approach to assessing the emotional tonality of natural language texts based on special dictionaries. A method for automatic assessment of public opinion by means of sentiment analysis of the reviews and discussions that follow published Web documents is proposed. The method is based on word statistics in the documents. A pilot version of a software system implementing sentiment analysis of natural language text in Russian, based on a linear assessment scale, is developed. Syntactic analysis and word lemmatization are used to identify terms more correctly. The tonality dictionaries are provided in an editable format and are open for enhancement. A program system implementing sentiment analysis of Russian texts based on open tonality dictionaries is presented for the first time.
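
    A dictionary-based tonality score of the kind described, summing word valences from an open lexicon over lemmatized tokens onto a linear scale, can be sketched as follows; the tiny English lexicon and the trivial "lemmatizer" are stand-ins for the Russian dictionaries and morphological analysis the paper uses:

```python
# Stand-in tonality lexicon: lemma -> valence on a linear scale in [-1, 1].
TONALITY = {"good": 0.7, "excellent": 1.0, "bad": -0.7, "awful": -1.0}

def lemmatize(token: str) -> str:
    # Placeholder: the paper uses proper lemmatization (and syntax) for Russian.
    return token.lower().strip(".,!?")

def tonality_score(text: str) -> float:
    """Mean valence of lexicon words found in the text; 0.0 if none occur."""
    values = [TONALITY[lem] for lem in map(lemmatize, text.split())
              if lem in TONALITY]
    return sum(values) / len(values) if values else 0.0

print(tonality_score("The service was good, the food excellent!"))  # ~0.85
```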

  2. Metallic Mineral Resources Assessment and Analysis System Design

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    This paper presents the aim and the design structure of the metallic mineral resources assessment and analysis system. The system adopts an integrated data warehouse technique composed of an affairs-processing layer and an analysis-application layer. The affairs-processing layer includes multiple databases (such as geological, geophysical and geochemical databases), while the analysis-application layer includes the data warehouse, online analytical processing and data mining. This paper also presents in detail the data warehouse of the present system and the appropriate spatial analysis methods and models. Finally, it presents the prospects of the system.

  3. Task Analysis Assessment on Intrastate Bus Traffic Controllers

    Science.gov (United States)

    Yen Bin, Teo; Azlis-Sani, Jalil; Nur Annuar Mohd Yunos, Muhammad; Ismail, S. M. Sabri S. M.; Tajedi, Noor Aqilah Ahmad

    2016-11-01

    Public transportation acts as social mobility and caters to the daily needs of society, allowing passengers to travel from one place to another. This is true for a country like Malaysia, where international trade has grown significantly over the past few decades. Task analysis assessment was conducted with consideration of the cognitive ergonomic view of problems related to human factors. Research on the task analysis of bus traffic controllers allowed a better understanding of the nature of the work and the overall monitoring activities of the bus services. This paper studies task analysis assessment of intrastate bus traffic controllers; the assessment was developed via Hierarchical Task Analysis (HTA). There are a total of five subsidiary tasks on level one, and only two could be broken down further at level two. Development of the HTA allows a better understanding of the work and eases the evaluation of the tasks conducted by the bus traffic controllers. Thus, human error could be reduced for the safety of all passengers, increasing the overall efficiency of the system. It could also assist in improving the operation of the bus traffic controllers by modelling or synthesizing the existing tasks where necessary.

  4. No-Reference Video Quality Assessment using MPEG Analysis

    DEFF Research Database (Denmark)

    Søgaard, Jacob; Forchhammer, Søren; Korhonen, Jari

    2013-01-01

    We present a method for No-Reference (NR) Video Quality Assessment (VQA) for decoded video without access to the bitstream. This is achieved by extracting and pooling features from a NR image quality assessment method used frame by frame. We also present methods to identify the video coding and estimate the video coding parameters for MPEG-2 and H.264/AVC, which can be used to improve the VQA. The analysis differs from most other video coding analysis methods since it is done without access to the bitstream. The results show that our proposed method is competitive with other recent NR VQA methods...

  5. Cable Hot Shorts and Circuit Analysis in Fire Risk Assessment

    Energy Technology Data Exchange (ETDEWEB)

    LaChance, Jeffrey; Nowlen, Steven P.; Wyant, Frank

    1999-05-19

    Under existing methods of probabilistic risk assessment (PRA), the analysis of fire-induced circuit faults has typically been conducted on a simplistic basis. In particular, those hot-short methodologies that have been applied remain controversial in regards to the scope of the assessments, the underlying methods, and the assumptions employed. To address weaknesses in fire PRA methodologies, the USNRC has initiated a fire risk analysis research program that includes a task for improving the tools for performing circuit analysis. The objective of this task is to obtain a better understanding of the mechanisms linking fire-induced cable damage to potentially risk-significant failure modes of power, control, and instrumentation cables. This paper discusses the current status of the circuit analysis task.

  6. Formative Assessment and Writing: A Meta-Analysis

    Science.gov (United States)

    Graham, Steve; Hebert, Michael; Harris, Karen R.

    2015-01-01

    To determine whether formative writing assessments that are directly tied to everyday classroom teaching and learning enhance students' writing performance, we conducted a meta-analysis of true and quasi-experiments conducted with students in grades 1 to 8. We found that feedback to students about writing from adults, peers, self, and computers…
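
    A meta-analysis of correlations such as the 60 reported here typically averages Fisher z-transformed coefficients weighted by n - 3 and back-transforms the result. A minimal fixed-effect sketch with invented (r, n) pairs standing in for the reported studies:

```python
from math import atanh, tanh

def meta_correlation(studies):
    """Fixed-effect mean of correlations via Fisher's z transform.
    `studies` is a list of (r, n) pairs; each study is weighted by n - 3."""
    num = sum((n - 3) * atanh(r) for r, n in studies)
    den = sum(n - 3 for _, n in studies)
    return tanh(num / den)

# Invented (r, n) pairs, not the review's actual data.
print(round(meta_correlation([(0.45, 80), (0.60, 120), (0.30, 50)]), 3))  # ~0.5
```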

  7. Using conversation analysis to assess and treat people with aphasia.

    Science.gov (United States)

    Beeke, Suzanne; Maxim, Jane; Wilkinson, Ray

    2007-05-01

    This article gives an overview of the application to aphasia of conversation analysis (CA), a qualitative methodology for the analysis of recorded, naturally occurring talk produced in everyday human interaction. CA, like pragmatics, considers language use in context, but it differs from other analytical frameworks because the clinician is not making interpretations about how an aspect of language should be coded or judging whether an utterance is successful or adequate in terms of communication. We first outline the CA methodology before discussing its application to the assessment of aphasia, principally through the use of two published assessment tools. We then move on to illustrate applications of CA in the field of aphasia therapy by discussing two single case study interventions. Key conversation behaviors are illustrated with transcripts from interactions recorded by the person with aphasia and the person's habitual conversation partner in the home environment. Finally, we explore the implications of using CA as a tool for assessment and treatment in aphasia.

  8. [Ecological security assessment of Tangshan City based on emergy analysis].

    Science.gov (United States)

    Cao, Ming-lan; Li, Ya-dong

    2009-09-01

    Based on the 'pressure-state-response' model and using the emergy analysis method, an urban ecological security assessment system and urban ecological security index (EUESI) were constructed, and the variation of the ecological security level of Tangshan City in 1995-2005 was evaluated. During this period, the ecological security level of the city first increased and then decreased: the EUESI rose from 0.017 in 1995 to 0.022 in 1996, then dropped yearly, becoming insecure in 2003. The urban ecological security assessment method based on emergy analysis overcame the disadvantages of conventional assessment systems, e.g., numerous and repetitive indicators, non-uniform units, and poor comparability, and reflected the urban ecological security state more objectively, being able to provide a scientific basis for urban ecological environment management and decision-making.

  9. Assessing environmental performance by combining life cycle assessment, multi-criteria analysis and environmental performance indicators

    NARCIS (Netherlands)

    Hermann, B.G.; Kroeze, C.; Jawjit, W.

    2007-01-01

    We present a new analytical tool, called COMPLIMENT, which can be used to provide detailed information on the overall environmental impact of a business. COMPLIMENT integrates parts of tools such as life cycle assessment, multi-criteria analysis and environmental performance indicators. It avoids di

  10. Assessing Group Interaction with Social Language Network Analysis

    Science.gov (United States)

    Scholand, Andrew J.; Tausczik, Yla R.; Pennebaker, James W.

    In this paper we discuss a new methodology, social language network analysis (SLNA), that combines tools from social language processing and network analysis to assess socially situated working relationships within a group. Specifically, SLNA aims to identify and characterize the nature of working relationships by processing artifacts generated with computer-mediated communication systems, such as instant message texts or emails. Because social language processing is able to identify psychological, social, and emotional processes that individuals are not able to fully mask, social language network analysis can clarify and highlight complex interdependencies between group members, even when these relationships are latent or unrecognized.
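
    The pipeline the authors describe, scoring each message with word-category counts and aggregating the scores over sender-receiver pairs into a weighted graph, might be sketched as below; the two-category lexicon and the messages are invented stand-ins for a full social-language lexicon and a real chat corpus:

```python
from collections import defaultdict

# Stand-in word categories (real SLNA work uses full lexicons of
# psychological, social and emotional word classes).
CATEGORIES = {
    "positive_emotion": {"great", "thanks", "happy"},
    "we_words": {"we", "our", "us"},
}

messages = [  # (sender, receiver, text) triples, invented
    ("alice", "bob", "Thanks, we made great progress on our draft"),
    ("bob", "alice", "Happy to help, send us the next section"),
    ("carol", "bob", "The deadline moved again"),
]

# edges[(sender, receiver)][category] = count of category words exchanged;
# these weighted, directed edges form the social language network.
edges = defaultdict(lambda: defaultdict(int))
for sender, receiver, text in messages:
    for token in text.lower().split():
        for cat, words in CATEGORIES.items():
            if token.strip(".,!?") in words:
                edges[(sender, receiver)][cat] += 1

for pair, counts in edges.items():
    print(pair, dict(counts))
```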

  11. Life cycle assessment analysis of supercritical coal power units

    Science.gov (United States)

    Ziębik, Andrzej; Hoinka, Krzysztof; Liszka, Marcin

    2010-09-01

    This paper presents the Life Cycle Assessment (LCA) analysis concerning the selected options of supercritical coal power units. The investigation covers a pulverized power unit without a CCS (Carbon Capture and Storage) installation, a pulverized unit with a "post-combustion" installation (MEA type) and a pulverized power unit working in the "oxy-combustion" mode. For each variant the net electric power amounts to 600 MW. The energy component of the LCA analysis has been determined. It describes the depletion of non-renewable natural resources. The energy component is determined by the coefficient of cumulative energy consumption in the life cycle. For the calculation of the ecological component of the LCA analysis the cumulative CO2 emission has been applied. At present it is the basic emission factor for the LCA analysis of power plants. The work also presents the sensitivity analysis of calculated energy and ecological factors.

  12. Ecological food web analysis for chemical risk assessment.

    Science.gov (United States)

    Preziosi, Damian V; Pastorok, Robert A

    2008-12-01

    Food web analysis can be a critical component of ecological risk assessment, yet it has received relatively little attention among risk assessors. Food web data are currently used in modeling bioaccumulation of toxic chemicals and, to a limited extent, in the determination of the ecological significance of risks. Achieving more realism in ecological risk assessments requires new analysis tools and models that incorporate accurate information on key receptors in a food web paradigm. Application of food web analysis in risk assessments demands consideration of: 1) different kinds of food webs; 2) definition of trophic guilds; 3) variation in food webs with habitat, space, and time; and 4) issues for basic sampling design and collection of dietary data. The different kinds of food webs include connectance webs, materials flow webs, and functional (or interaction) webs. These three kinds of webs play different roles throughout various phases of an ecological risk assessment, but risk assessors have failed to distinguish among web types. When modeling food webs, choices must be made regarding the level of complexity for the web, assignment of species to trophic guilds, selection of representative species for guilds, use of average diets, the characterization of variation among individuals or guild members within a web, and the spatial and temporal scales/dynamics of webs. Integrating exposure and effects data in ecological models for risk assessment of toxic chemicals relies on coupling food web analysis with bioaccumulation models (e.g., Gobas-type models for fish and their food webs), wildlife exposure models, dose-response models, and population dynamics models.
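
    A materials-flow view of the web can be coupled to a simple diet-weighted bioaccumulation step: each guild's tissue concentration is its uptake from water plus a trophic-transfer term summed over its prey, weighted by diet fractions. A deliberately simplified sketch, not the Gobas model (whose kinetic terms are more involved), with all values invented:

```python
# Hypothetical food web: guild -> {prey: diet fraction}, fractions sum to 1.
DIET = {
    "zooplankton": {},                      # base guild: uptake from water only
    "forage_fish": {"zooplankton": 1.0},
    "predator_fish": {"forage_fish": 0.8, "zooplankton": 0.2},
}

C_WATER = 1e-3        # chemical concentration in water (mg/L), invented
BCF = {"zooplankton": 500, "forage_fish": 800, "predator_fish": 1000}
TTF = 0.9             # trophic transfer factor per feeding link (assumed)

def tissue_concentration(guild: str, cache={}) -> float:
    """Water uptake (BCF * Cw) plus diet-weighted transfer from prey;
    the mutable-default dict memoizes guilds already computed."""
    if guild not in cache:
        diet_term = sum(frac * tissue_concentration(prey, cache)
                        for prey, frac in DIET[guild].items())
        cache[guild] = BCF[guild] * C_WATER + TTF * diet_term
    return cache[guild]

for g in DIET:
    print(g, round(tissue_concentration(g), 3))  # mg/kg, illustrative only
```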

  13. Web-Based Instruction and Learning: Analysis and Needs Assessment

    Science.gov (United States)

    Grabowski, Barbara; McCarthy, Marianne; Koszalka, Tiffany

    1998-01-01

    An analysis and needs assessment was conducted to identify kindergarten through grade 14 (K-14) customer needs with regard to using the World Wide Web (WWW) for instruction and to identify obstacles K-14 teachers face in utilizing NASA Learning Technologies products in the classroom. The needs assessment was conducted as part of the Dryden Learning Technologies Project which is a collaboration between Dryden Flight Research Center (DFRC), Edwards, California and Tne Pennsylvania State University (PSU), University Park, Pennsylvania. The overall project is a multiyear effort to conduct research in the development of teacher training and tools for Web-based science, mathematics and technology instruction and learning.

  14. System of gait analysis based on ground reaction force assessment

    Directory of Open Access Journals (Sweden)

    František Vaverka

    2015-12-01

    Background: Biomechanical analysis of gait employs various methods used in kinematic and kinetic analysis, EMG, and others. One of the most frequently used methods is kinetic analysis based on the assessment of the ground reaction forces (GRF) recorded on two force plates. Objective: The aim of the study was to present a method of gait analysis based on the assessment of the GRF recorded during the stance phase of two steps. Methods: The GRF recorded with a force plate on one leg during the stance phase has three components acting in directions: Fx - mediolateral, Fy - anteroposterior, and Fz - vertical. A custom-written MATLAB script was used for gait analysis in this study. This software displays instantaneous force data for both legs as Fx(t), Fy(t) and Fz(t) curves, automatically determines the extremes of the functions and sets the visual markers defining the individual points of interest. Positions of these markers can be easily adjusted by the rater, which may be necessary if the GRF has an atypical pattern. The analysis is fully automated and analyzing one trial takes only 1-2 minutes. Results: The method allows quantification of temporal variables of the extremes of the Fx(t), Fy(t), Fz(t) functions, durations of the braking and propulsive phases, duration of the double support phase, the magnitudes of reaction forces at the extremes of the measured functions, impulses of force, and indices of symmetry. The analysis results in a standardized set of 78 variables (temporal, force, indices of symmetry) which can serve as a basis for further research and diagnostics. Conclusions: The resulting set of variables offers a wide choice for selecting a specific group of variables with consideration to a particular research topic. The advantages of this method are the standardization of the GRF analysis, low time requirements allowing rapid analysis of a large number of trials, and comparability of the variables obtained during different research measurements.
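
    The extraction of force extremes, impulses and symmetry indices described above is straightforward to reproduce: locate the extrema of each GRF component during stance, integrate Fy(t) over its negative (braking) and positive (propulsive) portions, and compare left and right steps. A minimal sketch for the vertical and anteroposterior components, using a synthetic signal rather than the authors' MATLAB script:

```python
import numpy as np

fs = 1000                                  # sampling rate (Hz)
t = np.arange(0, 0.6, 1 / fs)              # one stance phase, ~0.6 s

# Synthetic stand-ins for force-plate data (body weight ~700 N):
fz = 700 * (np.sin(np.pi * t / 0.6) + 0.25 * np.sin(3 * np.pi * t / 0.6))
fy = -120 * np.sin(2 * np.pi * t / 0.6)    # braking (-) then propulsion (+)

fz_peak = fz.max()                         # force extreme of the vertical component
t_fz_peak = t[fz.argmax()]                 # its temporal variable
braking_impulse = np.minimum(fy, 0).sum() / fs     # N*s over the braking phase
propulsive_impulse = np.maximum(fy, 0).sum() / fs  # N*s over the propulsive phase

def symmetry_index(left: float, right: float) -> float:
    """Classic symmetry index: 0% means perfect left/right symmetry."""
    return 100 * abs(left - right) / (0.5 * (abs(left) + abs(right)))

print(fz_peak, t_fz_peak, braking_impulse, propulsive_impulse)
print(symmetry_index(propulsive_impulse, 0.93 * propulsive_impulse))  # invented right step
```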

  15. Secondary Analysis of Large-Scale Assessment Data: An Alternative to Variable-Centred Analysis

    Science.gov (United States)

    Chow, Kui Foon; Kennedy, Kerry John

    2014-01-01

    International large-scale assessments are now part of the educational landscape in many countries and often feed into major policy decisions. Yet, such assessments also provide data sets for secondary analysis that can address key issues of concern to educators and policymakers alike. Traditionally, such secondary analyses have been based on a…

  16. Assessment of Transport Projects: Risk Analysis and Decision Support

    DEFF Research Database (Denmark)

    Salling, Kim Bang

    2008-01-01

    The subject of this thesis is risk analysis and decision support in the context of transport infrastructure assessment. During my research I have observed a tendency in studies assessing transport projects to overlook the substantial uncertainties within the decision-making process. ... Even though vast amounts of money are spent upon preliminary models, environmental investigations, public hearings, etc., the resulting outcome is given by point estimates, i.e. in terms of net present values or benefit-cost rates. This thesis highlights the perspective of risks when assessing transport projects, namely by moving from point estimates to interval results. The main focus of this Ph.D. study has been to develop a valid, flexible and functional decision support tool in which risk-oriented aspects of project evaluation are implemented. Throughout the study six papers have been produced...
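
    The thesis's own decision support tool is not shown in the record; the sketch below illustrates the general move from point estimates to interval results with a generic Monte Carlo simulation (the distributions and figures are invented).

        import numpy as np

        rng = np.random.default_rng(42)
        n = 100_000

        # Hypothetical appraisal inputs: uncertain benefits and costs (million EUR).
        benefits = rng.lognormal(mean=np.log(120), sigma=0.25, size=n)
        costs = rng.normal(loc=100, scale=10, size=n)

        bcr = benefits / costs                          # benefit-cost rate per draw
        lo, hi = np.percentile(bcr, [5, 95])

        print(f"point estimate (means): {120 / 100:.2f}")
        print(f"90% interval for BCR:   [{lo:.2f}, {hi:.2f}]")
        print(f"P(BCR < 1) = {np.mean(bcr < 1):.1%}")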

  17. Individuals' stress assessment using human-smartphone interaction analysis

    DEFF Research Database (Denmark)

    Ciman, Matteo; Wac, Katarzyna

    2017-01-01

    The increasing presence of stress in people's lives has motivated much research focusing on continuous stress assessment methods for individuals, leveraging smartphones and wearable devices. These methods have several drawbacks, i.e., they use invasive external devices, thus increasing entry costs and reducing user acceptance, or they use privacy-sensitive information. This paper presents an approach for stress assessment that leverages data extracted from smartphone sensors, and that is not invasive concerning privacy. Two different approaches are presented. One is based on smartphone gesture analysis, e.g., 'tap', 'scroll', 'swipe' and 'text writing', and evaluated in laboratory settings with 13 participants (F-measure 79-85% within-subject model, 70-80% global model); the second is based on smartphone usage analysis and tested in-the-wild with 25 participants (F-measure 77...

  18. Assessment report on NRP sub-theme `Risk Analysis`

    Energy Technology Data Exchange (ETDEWEB)

    Biesiot, W.; Hendrickx, L. [eds.] [University of Groningen, Center for Energy and Environmental Studies, Groningen (Netherlands); Van Ham, J. [TNO Institute for Environmental Sciences, Delft (Netherlands); Olsthoorn, A.A. [VUA, Free University of Amsterdam, Amsterdam (Netherlands)

    1995-12-31

    An overview and assessment are presented of the three research projects carried out under NRP funding that concern risk-related topics: (1) The risks of nonlinear climate changes, (2) Socio-economic and policy aspects of changes in incidence and intensity of extreme (weather) events, and (3) Characterizing the risks: a comparative analysis of the risks of global warming and of relevant policy strategies. 1 tab., 6 refs.

  19. A Protocol for the Global Sensitivity Analysis of Impact Assessment Models in Life Cycle Assessment.

    Science.gov (United States)

    Cucurachi, S; Borgonovo, E; Heijungs, R

    2016-02-01

    The life cycle assessment (LCA) framework has established itself as the leading tool for the assessment of the environmental impact of products. Several works have established the need to integrate the LCA and risk analysis methodologies, owing to their several common aspects. One way to reach such integration is to guarantee that uncertainties in LCA modeling are carefully treated. It has been claimed that more attention should be paid to quantifying the uncertainties present in the various phases of LCA. Though the topic has been attracting the increasing attention of practitioners and experts in LCA, there is still a lack of understanding and a limited use of the available statistical tools. In this work, we introduce a protocol to conduct global sensitivity analysis in LCA. The article focuses on life cycle impact assessment (LCIA), and particularly on the relevance of global techniques for the development of trustworthy impact assessment models. We use a novel characterization model developed for the quantification of the impacts of noise on humans as a test case. We show that global SA is fundamental to guarantee that the modeler has a complete understanding of: (i) the structure of the model and (ii) the importance of uncertain model inputs and the interactions among them.
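
    The paper's protocol itself is not reproduced here; as a generic illustration of global sensitivity analysis, this sketch computes first-order and total Sobol' indices for a toy characterization model with the SALib package (the model, input names and bounds are invented; the classic Saltelli sampling interface is used).

        import numpy as np
        from SALib.sample import saltelli
        from SALib.analyze import sobol

        # Toy "characterization model": a nonlinear function of three uncertain inputs.
        problem = {
            "num_vars": 3,
            "names": ["emission", "fate_factor", "effect_factor"],
            "bounds": [[0.5, 1.5], [0.1, 2.0], [0.0, 1.0]],
        }

        X = saltelli.sample(problem, 1024)                 # Saltelli sampling design
        Y = X[:, 0] * X[:, 1] + np.sin(np.pi * X[:, 2])    # model evaluations

        Si = sobol.analyze(problem, Y)
        for name, s1, st in zip(problem["names"], Si["S1"], Si["ST"]):
            print(f"{name}: first-order S1 = {s1:.2f}, total ST = {st:.2f}")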

  20. Modular risk analysis for assessing multiple waste sites

    Energy Technology Data Exchange (ETDEWEB)

    Whelan, G.; Buck, J.W. [Pacific Northwest Lab., Richland, WA (United States); Nazarali, A. [Advanced Sciences, Inc., Richland, WA (United States)

    1994-06-01

    Human-health impacts, especially to the surrounding public, are extremely difficult to assess at installations that contain multiple waste sites and a variety of mixed-waste constituents (e.g., organic, inorganic, and radioactive). These assessments must address different constituents, multiple waste sites, multiple release patterns, different transport pathways (i.e., groundwater, surface water, air, and overland soil), different receptor types and locations, various times of interest, population distributions, land-use patterns, baseline assessments, a variety of exposure scenarios, etc. Although the process is complex, two of the most important difficulties to overcome are associated with (1) establishing an approach that allows for modifying the source term, transport, or exposure component as an individual module without having to re-evaluate the entire installation-wide assessment (i.e., all modules simultaneously), and (2) displaying and communicating the results in an understandable and usable manner to interested parties. An integrated, physics-based, compartmentalized approach, which is coupled to a Geographical Information System (GIS), captures the regional health impacts associated with multiple waste sites (e.g., hundreds to thousands of waste sites) at locations within and surrounding the installation. Utilizing a modular/GIS-based approach overcomes difficulties in (1) analyzing a wide variety of scenarios for multiple waste sites, and (2) communicating results from a complex human-health-impact analysis by capturing the essence of the assessment in a relatively elegant manner, so the meaning of the results can be quickly conveyed to all who review them.

  1. Spatial risk assessment for critical network infrastructure using sensitivity analysis

    Institute of Scientific and Technical Information of China (English)

    Michael Möderl; Wolfgang Rauch

    2011-01-01

    The presented spatial risk assessment method allows for managing critical network infrastructure in urban areas under abnormal and future conditions caused, e.g., by terrorist attacks, infrastructure deterioration or climate change. For the spatial risk assessment, vulnerability maps for critical network infrastructure are merged with hazard maps for an interfering process. Vulnerability maps are generated using a spatial sensitivity analysis of network transport models to evaluate performance decrease under the investigated threat scenarios. Thereby parameters are varied according to the specific impact of a particular threat scenario. Hazard maps are generated with a geographical information system using raster data of the same threat scenario derived from structured interviews and cluster analysis of past events. The application of the spatial risk assessment is exemplified by means of a case study for a water supply system, but the principal concept is applicable likewise to other critical network infrastructure. The aim of the approach is to help decision makers in choosing zones for preventive measures.
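
    As a toy illustration of the map-merging step (all raster values invented), risk can be taken cell-wise as the product of a vulnerability raster and a hazard raster:

        import numpy as np

        # Invented 4x4 rasters covering the same urban area.
        vulnerability = np.array([[0.1, 0.3, 0.6, 0.2],
                                  [0.2, 0.7, 0.9, 0.4],
                                  [0.1, 0.4, 0.8, 0.5],
                                  [0.0, 0.2, 0.3, 0.1]])   # performance decrease (0-1)

        hazard = np.array([[0.2, 0.2, 0.5, 0.5],
                           [0.2, 0.3, 0.6, 0.5],
                           [0.1, 0.3, 0.7, 0.6],
                           [0.1, 0.1, 0.4, 0.4]])          # threat probability (0-1)

        risk = vulnerability * hazard                      # cell-wise merge
        hotspots = np.argwhere(risk > 0.3)                 # candidate zones for measures
        print(risk.round(2))
        print("priority cells (row, col):", hotspots.tolist())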

  2. Quantitative Computed Tomography and image analysis for advanced muscle assessment

    Directory of Open Access Journals (Sweden)

    Kyle Joseph Edmunds

    2016-06-01

    Medical imaging is of particular interest in the field of translational myology, as the extant literature describes the utilization of a wide variety of techniques to non-invasively recapitulate and quantify various internal and external tissue morphologies. In the clinical context, medical imaging remains a vital tool for diagnostics and investigative assessment. This review outlines the results from several investigations on the use of computed tomography (CT) and image analysis techniques to assess muscle conditions and degenerative processes due to aging or pathological conditions. Herein, we detail the acquisition of spiral CT images and the use of advanced image analysis tools to characterize muscles in 2D and 3D. Results from these studies recapitulate changes in tissue composition within muscles, as visualized by the association of tissue types to specified Hounsfield Unit (HU) values for fat, loose connective tissue or atrophic muscle, and normal muscle, including fascia and tendon. We show how results from these analyses can be presented as both average HU values and compositions with respect to total muscle volumes, demonstrating the reliability of these tools to monitor, assess and characterize muscle degeneration.
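
    As a rough sketch of the HU-based tissue association described above (the cut-off values below are illustrative assumptions, not the study's exact windows):

        import numpy as np

        # Hypothetical HU values sampled from voxels inside a muscle contour.
        hu = np.array([-150, -40, 5, 25, 55, 70, 48, -12, 90, 33])

        # Illustrative HU windows (the study's exact cut-offs may differ):
        # fat < -9, loose connective / atrophic muscle -9..40, normal muscle 41..200.
        bins = np.array([-9, 41])
        labels = np.array(["fat", "connective/atrophic", "normal muscle"])
        tissue = labels[np.digitize(hu, bins)]

        for name in labels:
            share = np.mean(tissue == name)
            print(f"{name}: {share:.0%} of sampled voxels")
        print(f"mean HU: {hu.mean():.1f}")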

  3. Quantitative analysis in outcome assessment of instrumented lumbosacral arthrodesis.

    Science.gov (United States)

    Champain, Sabina; Mazel, Christian; Mitulescu, Anca; Skalli, Wafa

    2007-08-01

    The outcome assessment in instrumented lumbosacral fusion mostly focuses on clinical criteria, complications and scores, with a high variability of imaging means, methods of fusion grading and parameters describing degenerative changes, making comparisons between studies difficult. The aim of this retrospective evaluation was to assess the value of quantified radiographic analysis of the lumbar spine in global outcome assessment and to highlight the key biomechanical factors involved. Clinical data and Beaujon-Lassale scores were collected for 49 patients who underwent lumbosacral arthrodesis after prior lumbar discectomy (mean follow-up: 5 years). Sagittal standing and lumbar flexion-extension X-ray films allowed quantification of vertebral, lumbar, pelvic and kinematic parameters of the lumbar spine, which were compared to reference values. Statistical analyses assessed the evolution of all variables. At long-term follow-up, 90% of patients presented satisfactory clinical outcomes, associated with normal sagittal alignment; vertebral parameters objectified adjacent-level degeneration in four cases (8%). Clinical outcome was correlated (r = 0.8) with fusion, which was confirmed in 80% of cases and doubtful in 16%, while pseudarthrosis seemed to occur in 4% (2) of cases. In addition to clinical data (outcomes comparable to the literature), quantitative analysis accurately described lumbar spine geometry and kinematics, highlighting parameters related to adjacent-level degeneration and a significant correlation between clinical outcome and fusion. Furthermore, the criteria proposed to quantitatively evaluate fusion from lumbar dynamic radiographs seem to be appropriate and in agreement with the surgeon's qualitative grading in 87% of cases.

  4. NASA Langley Systems Analysis & Concepts Directorate Technology Assessment/Portfolio Analysis

    Science.gov (United States)

    Cavanaugh, Stephen; Chytka, Trina; Arcara, Phil; Jones, Sharon; Stanley, Doug; Wilhite, Alan W.

    2006-01-01

    Systems analysis develops and documents candidate missions and architectures, associated system concepts, enabling capabilities and investment strategies to achieve NASA's strategic objectives. The technology assessment process connects the missions and architectures to the investment strategies. In order to successfully implement a technology assessment, there is a need to collect, manipulate, analyze, document, and disseminate technology-related information. Information must be collected and organized on the wide variety of potentially applicable technologies, including: previous research results, key technical parameters and characteristics, technology readiness levels, relationships to other technologies, costs, and potential barriers and risks. This information must be manipulated to facilitate planning and documentation. An assessment is included of the programmatic and technical risks associated with each technology task as well as potential risk mitigation plans. Risks are assessed and tracked in terms of the likelihood of the risk occurring and the consequences of the risk if it does occur. The risk assessments take into account cost, schedule, and technical risk dimensions. Assessment data must be simplified for presentation to decision makers. The Systems Analysis and Concepts Directorate (SACD) at NASA Langley Research Center has a wealth of experience in performing Technology Assessment and Portfolio Analysis, as this has been a business line since 1978.

  5. Assessing population exposure for landslide risk analysis using dasymetric cartography

    Science.gov (United States)

    Garcia, Ricardo A. C.; Oliveira, Sérgio C.; Zêzere, José L.

    2016-12-01

    Assessing the number and locations of exposed people is a crucial step in landslide risk management and emergency planning. The available population statistical data frequently have insufficient detail for an accurate assessment of the people potentially exposed to hazardous events, mainly when these occur at the local scale, as landslides do. The present study aims to apply dasymetric cartography to improve the spatial resolution of population data and to assess the potentially exposed population. An additional objective is to compare the results with those obtained with a more common approach that uses, as spatial units, basic census units, which are the best spatial data disaggregation and the most detailed information available for regional studies in Portugal. Considering the Portuguese census data and a layer of residential building footprints, which was used as ancillary information, the number of exposed inhabitants differs significantly according to the approach used. When the census unit approach is used, considering the three highest landslide susceptibility classes, the number of exposed inhabitants is in general overestimated. Despite the uncertainties associated with a general cost-benefit analysis, the presented methodology seems to be a reliable approach for obtaining a first, more detailed approximation of the number of exposed people. The approach based on dasymetric cartography allows the spatial resolution of population data over large areas to be increased and enables the use of detailed landslide susceptibility maps, which are valuable for improving the exposed population assessment.
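
    A minimal sketch of dasymetric areal weighting with invented census and building data (not the study's actual datasets): the population of each census unit is redistributed to residential buildings in proportion to footprint area.

        import pandas as pd

        # Invented census units and residential building footprints (m^2).
        census = pd.DataFrame({"unit": ["A", "B"], "population": [1200, 300]})
        buildings = pd.DataFrame({
            "building": ["A1", "A2", "A3", "B1"],
            "unit":     ["A",  "A",  "A",  "B"],
            "footprint_m2": [500, 300, 200, 400],
        })

        # Areal weighting: each building gets population proportional to its footprint.
        buildings["weight"] = buildings.groupby("unit")["footprint_m2"].transform(
            lambda a: a / a.sum())
        buildings = buildings.merge(census, on="unit")
        buildings["pop_est"] = buildings["weight"] * buildings["population"]
        print(buildings[["building", "pop_est"]])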

  6. A Monte Carlo based spent fuel analysis safeguards strategy assessment

    Energy Technology Data Exchange (ETDEWEB)

    Fensin, Michael L [Los Alamos National Laboratory; Tobin, Stephen J [Los Alamos National Laboratory; Swinhoe, Martyn T [Los Alamos National Laboratory; Menlove, Howard O [Los Alamos National Laboratory; Sandoval, Nathan P [Los Alamos National Laboratory

    2009-01-01

    assessment process, the techniques employed to automate the coupled facets of the assessment process, and the standard burnup/enrichment/cooling-time-dependent spent fuel assembly library. We also clearly define the diversion scenarios that will be analyzed during the standardized assessments. Though this study is currently limited to generic PWR assemblies, it is expected that the results of the assessment will yield adequate knowledge of spent fuel analysis strategies to help the down-selection process for other reactor types.

  7. Development and assessment of best estimate integrated safety analysis code

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Bub Dong; Lee, Young Jin; Hwang, Moon Kyu (and others)

    2007-03-15

    Improvement of the integrated safety analysis code MARS3.0 has been carried out and a multi-D safety analysis application system has been established. Iterative matrix solver and parallel processing algorithm have been introduced, and a LINUX version has been generated to enable MARS to run in cluster PCs. MARS variables and sub-routines have been reformed and modularised to simplify code maintenance. Model uncertainty analyses have been performed for THTF, FLECHT, NEPTUN, and LOFT experiments as well as APR1400 plant. Participations in international cooperation research projects such as OECD BEMUSE, SETH, PKL, BFBT, and TMI-2 have been actively pursued as part of code assessment efforts. The assessment, evaluation and experimental data obtained through international cooperation projects have been registered and maintained in the T/H Databank. Multi-D analyses of APR1400 LBLOCA, DVI Break, SLB, and SGTR have been carried out as a part of application efforts in multi-D safety analysis. GUI based 3D input generator has been developed for user convenience. Operation of the MARS Users Group (MUG) was continued and through MUG, the technology has been transferred to 24 organisations. A set of 4 volumes of user manuals has been compiled and the correction reports for the code errors reported during MARS development have been published.

  8. Analysis and Comparison of Objective Methods for Image Quality Assessment

    Directory of Open Access Journals (Sweden)

    P. S. Babkin

    2014-01-01

    The purpose of this work is research and modification of the reference objective methods for image quality assessment. The ultimate goal is to obtain a modification of the formal assessments that corresponds more closely to the subjective expert estimates (MOS). In considering the formal reference objective methods for image quality assessment we used the results of other authors, who offer results and comparative analyses of the most effective algorithms. Based on these investigations we chose the two most successful algorithms, PQS and MSSSIM, which were analyzed further in MATLAB 7.8 (R2009a). The publication focuses on the features of the algorithms which have great importance in practical implementation but are insufficiently covered in the publications of other authors. In the implemented modification of the PQS algorithm, the Kirsch boundary detector was replaced by the Canny boundary detector. Further experiments were carried out according to the method of ITU-R BT.500-13 (01/2012) using monochrome images treated with different types of filters (it should be emphasized that the PQS objective image quality assessment is applicable only to monochrome images). The images were obtained with a thermal imaging surveillance system. The experimental results proved the effectiveness of this modification. In the specialized literature on formal image quality evaluation methods, this type of modification has not been mentioned. The method described in the publication can be applied to various practical implementations of digital image processing. The advisability and effectiveness of using the modified PQS method to assess structural differences between images are shown in the article, and this will be used in solving problems of identification and automatic control.
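
    MSSSIM is the multi-scale extension of SSIM; as a self-contained illustration, the sketch below computes single-scale SSIM with scikit-image on an invented monochrome image pair (it stands in for, and is not, the authors' modified PQS pipeline).

        import numpy as np
        from skimage.metrics import structural_similarity as ssim

        rng = np.random.default_rng(0)

        # Stand-in monochrome image and a noise-degraded version of it.
        reference = rng.uniform(0, 1, size=(128, 128))
        degraded = np.clip(reference + rng.normal(0, 0.1, size=reference.shape), 0, 1)

        # Single-scale SSIM (scikit-image); MSSSIM averages this over several scales.
        score, ssim_map = ssim(reference, degraded, data_range=1.0, full=True)
        print(f"SSIM = {score:.3f}")           # 1.0 means structurally identical
        print(f"worst local region: {ssim_map.min():.3f}")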

  9. Computational geometry assessment for morphometric analysis of the mandible.

    Science.gov (United States)

    Raith, Stefan; Varga, Viktoria; Steiner, Timm; Hölzle, Frank; Fischer, Horst

    2017-01-01

    This paper presents a fully automated algorithm for geometry assessment of the mandible. Anatomical landmarks could be reliably detected, and distances were statistically evaluated with principal component analysis. The method allows, for the first time, the generation of a mean mandible shape with statistically valid geometric variations based on a large set of 497 CT scans of human mandibles. The data may be used in bioengineering for designing novel oral implants, for planning computer-guided surgery, and for improving biomechanical models, as it is shown that commercially available mandible replicas differ significantly from the mean of the investigated population.

  10. TXRF analysis of soils and sediments to assess environmental contamination.

    Science.gov (United States)

    Bilo, Fabjola; Borgese, Laura; Cazzago, Davide; Zacco, Annalisa; Bontempi, Elza; Guarneri, Rita; Bernardello, Marco; Attuati, Silvia; Lazo, Pranvera; Depero, Laura E

    2014-12-01

    Total reflection x-ray fluorescence spectroscopy (TXRF) is proposed for the elemental chemical analysis of crustal environmental samples, such as sediments and soils. A comparative study of TXRF with respect to flame atomic absorption spectroscopy and inductively coupled plasma optical emission spectroscopy was performed. Microwave acid digestion and suspension preparation methods are evaluated. A good agreement was found among the results obtained with different spectroscopic techniques and sample preparation methods for Cr, Mn, Fe, Ni, Cu, and Zn. We demonstrated that TXRF is suitable for the assessment of environmental contamination phenomena, even if the errors for Pb, As, V, and Ba are considerable.

  11. Supporting analysis and assessments quality metrics: Utility market sector

    Energy Technology Data Exchange (ETDEWEB)

    Ohi, J. [National Renewable Energy Lab., Golden, CO (United States)

    1996-10-01

    In FY96, NREL was asked to coordinate all analysis tasks so that in FY97 these tasks will be part of an integrated analysis agenda that will begin to define a 5-15 year R&D roadmap and portfolio for the DOE Hydrogen Program. The purpose of the Supporting Analysis and Assessments task at NREL is to provide this coordination and conduct specific analysis tasks. One of these tasks is to prepare the Quality Metrics (QM) for the Program as part of the overall QM effort at DOE/EERE. The Hydrogen Program is one of 39 program planning units conducting QM, a process begun in FY94 to assess the benefits/costs of DOE/EERE programs. The purpose of QM is to inform decision-making during the budget formulation process by describing the expected outcomes of programs during the budget request process. QM is expected to establish a first step toward merit-based budget formulation and allow DOE/EERE to get the "most bang for its (R&D) buck." In FY96, NREL coordinated a QM team that prepared a preliminary QM for the utility market sector. In the electricity supply sector, the QM analysis shows hydrogen fuel cells capturing 5% (or 22 GW) of the total market of 390 GW of new capacity additions through 2020. Hydrogen consumption in the utility sector increases from 0.009 Quads in 2005 to 0.4 Quads in 2020. Hydrogen fuel cells are projected to displace over 0.6 Quads of primary energy in 2020. In future work, NREL will assess the market for decentralized, on-site generation; develop cost credits for distributed generation benefits (such as deferral of transmission and distribution investments, and uninterruptible power service), for by-products such as heat and potable water, and for environmental benefits (reduction of criteria air pollutants and greenhouse gas emissions); compete different fuel cell technologies against each other for market share; and begin to address economic benefits, especially employment.

  12. Heart rate variability analysis for newborn infants prolonged pain assessment.

    Science.gov (United States)

    De Jonckheere, J; Rakza, T; Logier, R; Jeanne, M; Jounwaz, R; Storme, L

    2011-01-01

    Pain management is a general concern for healthcare quality. In the particular context of neonatal care, it is well known that efficient pain management decreases the mortality and morbidity of newborn infants. Furthermore, the plasticity of the developing brain is vulnerable to pain and/or stress, which in turn may cause long-term neurodevelopmental changes, including altered pain sensitivity and neuroanatomic and behavioural abnormalities. During a neonatal intensive care stay, a large number of painful procedures are performed, the majority of which are not accompanied by adequate analgesia. Optimal management requires competent pain assessment, which can be especially difficult to perform in this non-verbal population. We have developed an instantaneous heart rate variability (HRV) analysis method, non-intrusive and user-friendly, based on ECG signal acquisition. This analysis method enabled us to design parameters related to the influence of pain on Autonomic Nervous System (ANS) activity. This paper presents the application of this method, previously validated for adults under general anesthesia, to the domain of prolonged pain assessment in newborn infants.
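
    The authors' instantaneous HRV index is not specified in the record; the sketch below computes two standard time-domain HRV metrics (SDNN and RMSSD) from invented RR intervals, the kind of quantities on which ANS-based pain assessment builds.

        import numpy as np

        # Hypothetical RR intervals (ms) extracted from a newborn's ECG.
        rr = np.array([420, 415, 430, 425, 410, 405, 435, 440, 420, 415], float)

        mean_rr = rr.mean()
        sdnn = rr.std(ddof=1)                               # overall HRV
        rmssd = np.sqrt(np.mean(np.diff(rr) ** 2))          # beat-to-beat (vagal) HRV

        print(f"mean RR = {mean_rr:.0f} ms (~{60000 / mean_rr:.0f} bpm)")
        print(f"SDNN = {sdnn:.1f} ms, RMSSD = {rmssd:.1f} ms")
        # A sustained drop in such indices may accompany pain or stress via reduced
        # parasympathetic (ANS) activity, the physiological basis the paper exploits.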

  13. Assessing microstructures of pyrrhotites in basalts by multifractal analysis

    Directory of Open Access Journals (Sweden)

    S. Xie

    2010-07-01

    Understanding and describing spatial arrangements of mineral particles and determining the mineral distribution structure are important to model the rock-forming process. Geometric properties of individual mineral particles can be estimated from thin sections, and different models have been proposed to quantify the spatial complexity of mineral arrangement. The Gejiu tin-polymetallic ore-forming district, located in Yunnan province, southwestern China, is chosen as the study area. The aim of this paper is to apply fractal and multifractal analysis to quantify distribution patterns of pyrrhotite particles from twenty-eight binary images obtained from seven basalt segments and then to discern the possible petrological formation environments of the basalts based on concentrations of trace elements. The areas and perimeters of pyrrhotite particles were measured for each image. Perimeter-area fractal analysis shows that the perimeter and area of pyrrhotite particles follow a power-law relationship, which implies the scale-invariance of the shapes of the pyrrhotites. Furthermore, the spatial variation of the pyrrhotite particles in space was characterized by multifractal analysis using the method of moments. The results show that the average values of the area-perimeter exponent (DAP), the widths of the multifractal spectra (Δ(D(0)−D(2)) and Δ(D(qmin)−D(qmax))) and the multifractality index (τ″(1)) for the pyrrhotite particles reach their minimum in the second basalt segment, which implies that the spatial arrangement of pyrrhotite particles in Segment 2 is less heterogeneous. Geochemical trace element analysis results distinguish the second basalt segment sample from other basalt samples. In this aspect, the fractal and multifractal analysis may provide new insights into the quantitative assessment of mineral microstructures which may be closely associated with the petrogenesis as shown by the
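
    A minimal sketch of the perimeter-area step (synthetic particle data with an assumed dimension of 1.4): since P ∝ A^(DAP/2), the slope of log P against log A estimates DAP/2.

        import numpy as np

        rng = np.random.default_rng(1)

        # Synthetic particle measurements obeying P ~ A**(D/2) with D = 1.4 plus noise,
        # standing in for areas/perimeters digitized from thin-section images.
        area = rng.uniform(10, 1000, size=200)                # pixel^2
        perim = area ** (1.4 / 2) * np.exp(rng.normal(0, 0.05, 200))

        # Perimeter-area fractal analysis: slope of log P vs log A equals DAP / 2.
        slope, intercept = np.polyfit(np.log(area), np.log(perim), 1)
        d_ap = 2 * slope
        print(f"estimated DAP = {d_ap:.2f}")                  # ~1.4 for this sample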

  14. Using miscue analysis to assess comprehension in deaf college readers.

    Science.gov (United States)

    Albertini, John; Mayer, Connie

    2011-01-01

    For over 30 years, teachers have used miscue analysis as a tool to assess and evaluate the reading abilities of hearing students in elementary and middle schools and to design effective literacy programs. More recently, teachers of deaf and hard-of-hearing students have also reported its usefulness for diagnosing word- and phrase-level reading difficulties and for planning instruction. To our knowledge, miscue analysis has not been used with older, college-age deaf students who might also be having difficulty decoding and understanding text at the word level. The goal of this study was to determine whether such an analysis would be helpful in identifying the source of college students' reading comprehension difficulties. After analyzing the miscues of 10 college-age readers and the results of other comprehension-related tasks, we concluded that comprehension of basic grade school-level passages depended on the ability to recognize and comprehend key words and phrases in these texts. We also concluded that these diagnostic procedures provided useful information about the reading abilities and strategies of each reader that had implications for designing more effective interventions.

  15. Assessment of Stability of Craniofacial Implants by Resonant Frequency Analysis.

    Science.gov (United States)

    Ivanjac, Filip; Konstantinović, Vitomir S; Lazić, Vojkan; Dordević, Igor; Ihde, Stefan

    2016-03-01

    Implant stability is a principal precondition for the success of implant therapy. Extraoral (EO) implants are mainly used for anchoring maxillofacial epitheses. However, assessment of implant stability is mostly based on principles derived from oral implants. The aim of this study was to investigate the clinical stability of EO craniofacial disk implants (single, double, and triple) by resonance frequency analysis at different stages of bone healing. Twenty patients with orbital (11), nasal (5), and auricular (4) defects, with 50 EO implants placed for epithesis anchorage, were included. Implant stability was measured three times: after implant placement, at 3 months, and at least 6 months after placement. A significant increase in implant stability values was noted between all of the measurements, except for triple-disk implants between the third and sixth months, and screw implants between placement and the third month. Disk implants showed lower implant stability quotient (ISQ) values compared with screw implants. Triple-disk implants showed better stability than single- and double-disk implants. Based on resonance frequency analysis values, disk implants could be safely loaded when their ISQ values are 38 (single disks), 47 (double disks), and 48 (triple disks). According to resonance frequency analysis, disk implant stability increased over time, which indicates good osseointegration and increasing mineralization. Although EO screw implants showed higher ISQ values than disk implants, disk-type implants can be safely loaded even if lower stability values are measured.

  16. Assessing student teachers' reflective writing through quantitative content analysis

    NARCIS (Netherlands)

    Poldner, Eric; Van der Schaaf, Marieke; Simons, P. Robert Jan; Van Tartwijk, Jan; Wijngaards, Guus

    2014-01-01

    Students' reflective essay writing can be stimulated by the formative assessments provided to them by their teachers. Such assessments contain information about the quality of students' reflective writings and offer suggestions for improvement. Despite the importance of formatively assessing student

  17. The October 1973 NASA mission model analysis and economic assessment

    Science.gov (United States)

    1974-01-01

    Results are presented of the 1973 NASA Mission Model Analysis. The purpose was to obtain an economic assessment of using the Shuttle to accommodate the payloads and requirements as identified by the NASA Program Offices and the DoD. The 1973 Payload Model represents a baseline candidate set of future payloads which can be used as a reference base for planning purposes. The cost of implementing these payload programs utilizing the capabilities of the shuttle system is analyzed and compared with the cost of conducting the same payload effort using expendable launch vehicles. There is a net benefit of 14.1 billion dollars as a result of using the shuttle during the 12-year period as compared to using an expendable launch vehicle fleet.

  18. Time-dependent reliability analysis and condition assessment of structures

    Energy Technology Data Exchange (ETDEWEB)

    Ellingwood, B.R. [Johns Hopkins Univ., Baltimore, MD (United States)

    1997-01-01

    Structures generally play a passive role in assurance of safety in nuclear plant operation, but are important if the plant is to withstand the effect of extreme environmental or abnormal events. Relative to mechanical and electrical components, structural systems and components would be difficult and costly to replace. While the performance of steel or reinforced concrete structures in service generally has been very good, their strengths may deteriorate during an extended service life as a result of changes brought on by an aggressive environment, excessive loading, or accidental loading. Quantitative tools for condition assessment of aging structures can be developed using time-dependent structural reliability analysis methods. Such methods provide a framework for addressing the uncertainties attendant to aging in the decision process.

  19. Current perceptions and applicability of ecosystem analysis to impact assessment

    Energy Technology Data Exchange (ETDEWEB)

    Auerbach, S.I.

    1977-01-01

    The concept of cost-benefit analysis in relation to the assessment of various factors causing stress on natural ecosystems is discussed. It is pointed out that if stress is considered in the context of a deviation from some homeostatic condition, we face a number of technical and socially related questions. The technical questions concern the need to define in rigorous scientific terms the meaning of ecosystem homeostasis, the significance, both temporally and spatially, of a deviation from such homeostasis, and the elucidation in quantitative terms of the acceptability or nonacceptability of such a deviation. The latter, of course, puts us into our role as scientist-citizens. There we enter the realm of value judgment, where we provide only one of many inputs which need to be considered by an institutional decision-maker.

  20. Origin assessment of EV olive oils by esterified sterols analysis.

    Science.gov (United States)

    Giacalone, Rosa; Giuliano, Salvatore; Gulotta, Eleonora; Monfreda, Maria; Presti, Giovanni

    2015-12-01

    In this study extra virgin olive oils of Italian and non-Italian origin (from Spain, Tunisia and blends of EU origin) were differentiated by GC-FID analysis of sterols and esterified sterols followed by chemometric tools. PCA highlighted the high significance of esterified sterols for characterising extra virgin olive oils in relation to their origin. SIMCA provided a sensitivity and specificity of 94.39% and 91.59%, respectively; furthermore, an external set of 54 extra virgin olive oils bearing a designation of Italian origin on the labelling was tested by SIMCA. Prediction results were also compared with organoleptic assessment. Finally, the poor correlation found between ethyl esters and esterified sterols suggested the hypothesis, worthy of further investigation, that esterified sterols may prove to be promising in studies of geographical discrimination: indeed, they appear to be independent of the factors causing the formation of ethyl esters and related to olive oil production.

  1. New challenges on uncertainty propagation assessment of flood risk analysis

    Science.gov (United States)

    Martins, Luciano; Aroca-Jiménez, Estefanía; Bodoque, José M.; Díez-Herrero, Andrés

    2016-04-01

    Natural hazards, such as floods, cause considerable damage to human life and to material and functional assets every year around the world. Risk assessment procedures carry a set of uncertainties of mainly two types: natural, derived from the stochastic character inherent in flood process dynamics; and epistemic, associated with lack of knowledge or with imperfect procedures employed in the study of these processes. There is abundant scientific and technical literature on uncertainty estimation at each step of flood risk analysis (e.g. rainfall estimates, hydraulic modelling variables), but very little experience with the propagation of uncertainties along the full flood risk assessment. Epistemic uncertainties are therefore the main goal of this work: in particular, to understand the extent to which uncertainties propagate throughout the process, from inundation studies to risk analysis, and how much a proper flood risk analysis can vary as a result. Methodologies such as Polynomial Chaos Theory (PCT), the Method of Moments, and Monte Carlo simulation are used to evaluate different sources of error, such as data records (precipitation gauges, flow gauges...), hydrologic and hydraulic modelling (inundation estimation), and socio-demographic data (damage estimation), and to evaluate the uncertainty propagation (UP) in design flood risk estimation, in both numerical and cartographic form. In order to consider the total uncertainty and understand which factors contribute most to the final uncertainty, we used Polynomial Chaos Theory (PCT). It represents an interesting way to handle the inclusion of uncertainty in the modelling and simulation process. PCT allows the development of a probabilistic model of the system in a deterministic setting, by using random variables and polynomials to handle the effects of uncertainty. Results of applying the method are more robust than those of a traditional analysis.
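
    As a small illustration of moment-based uncertainty propagation (the damage function and all numbers are invented; this is first-order second-moment propagation, a simple member of the method-of-moments family named above):

        import numpy as np

        # Hypothetical damage model: D(q, v) = a * q**1.5 * v, with uncertain
        # peak discharge q (m^3/s) and exposed asset value v (million EUR).
        a = 0.001
        q_mu, q_sd = 150.0, 30.0          # from flood frequency analysis
        v_mu, v_sd = 40.0, 8.0            # from socio-demographic data

        def damage(q, v):
            return a * q ** 1.5 * v

        # FOSM propagation: Var(D) ~ sum over inputs of (dD/dx_i)^2 * Var(x_i).
        dD_dq = a * 1.5 * q_mu ** 0.5 * v_mu
        dD_dv = a * q_mu ** 1.5
        var_d = dD_dq ** 2 * q_sd ** 2 + dD_dv ** 2 * v_sd ** 2

        print(f"damage at mean inputs: {damage(q_mu, v_mu):.1f} MEUR")
        print(f"FOSM std of damage:    {np.sqrt(var_d):.1f} MEUR")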

  2. Analysis of environmental impact assessment (EIA) system in Turkey.

    Science.gov (United States)

    Coşkun, Aynur Aydın; Turker, Ozhan

    2011-04-01

    The Environmental Impact Assessment (EIA) system, which embodies the "prevention principle" of environmental law, is an important tool for environmental protection. This tool has particular importance for Turkey, a developing country; it entered Turkish law in 1983 with the Environmental Law, and the EIA Regulation, which sets out the application principles, became effective in 1993. Because Turkey is a candidate for the European Union (EU), the EIA Regulation has been changed in line with the EU compliance procedure, and its latest version became valid in 2008. This study examines the EIA system in Turkey in order to evaluate the efficiency of the procedure and indicate its level of success. In the introduction, the general EIA concept, its importance, and some notations are discussed. Following that, the legislation that constitutes the EIA system is analyzed, starting from the 1982 Turkish Constitution. The legislative rules are then explained in relation to the basic steps of the EIA procedure. To shed light on practical application, the final EIA decisions issued to date, their outcomes, and their distribution across industries are assessed. In the final part of the study, a SWOT analysis identifies the weaknesses, strengths, opportunities, and threats of the EIA system in Turkey.

  3. Assessing farming eco-efficiency: a Data Envelopment Analysis approach.

    Science.gov (United States)

    Picazo-Tadeo, Andrés J; Gómez-Limón, José A; Reig-Martínez, Ernest

    2011-04-01

    This paper assesses farming eco-efficiency using Data Envelopment Analysis (DEA) techniques. Eco-efficiency scores at both farm and environmental pressure-specific levels are computed for a sample of Spanish farmers operating in the rain-fed agricultural system of Campos County. The determinants of eco-efficiency are then studied using truncated regression and bootstrapping techniques. We contribute to previous literature in this field of research by including information on slacks in the assessment of the potential environmental pressure reductions in a DEA framework. Our results reveal that farmers are quite eco-inefficient, with very few differences emerging among specific environmental pressures. Moreover, eco-inefficiency is closely related to technical inefficiencies in the management of inputs. Regarding the determinants of eco-efficiency, farmers benefiting from agri-environmental programs as well as those with university education are found to be more eco-efficient. Concerning the policy implications of these results, public expenditure in agricultural extension and farmer training could be of some help to promote integration between farming and the environment. Furthermore, Common Agricultural Policy agri-environmental programs are an effective policy to improve eco-efficiency, although some doubts arise regarding their cost-benefit balance.
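
    DEA eco-efficiency scores are obtained by solving one linear program per decision-making unit; below is a minimal sketch of the input-oriented CCR envelopment model with scipy (five invented farms with two inputs and one output, not the Campos County data).

        import numpy as np
        from scipy.optimize import linprog

        # Invented sample: 5 farms, 2 inputs (e.g. fertilizer, energy), 1 output (crop).
        X = np.array([[2.0, 3.0], [4.0, 1.5], [3.0, 3.0],
                      [5.0, 4.0], [2.5, 2.0]]).T            # inputs, shape (m, n)
        Y = np.array([[10.0, 9.0, 11.0, 12.0, 9.5]])        # outputs, shape (s, n)
        m, n = X.shape
        s = Y.shape[0]

        for j0 in range(n):
            # Input-oriented CCR: min theta s.t. X @ lam <= theta * x0, Y @ lam >= y0.
            c = np.r_[1.0, np.zeros(n)]                     # variables: [theta, lam]
            A_in = np.c_[-X[:, [j0]], X]                    # X lam - theta x0 <= 0
            A_out = np.c_[np.zeros((s, 1)), -Y]             # -Y lam <= -y0
            A_ub = np.vstack([A_in, A_out])
            b_ub = np.r_[np.zeros(m), -Y[:, j0]]
            res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                          bounds=[(None, None)] + [(0, None)] * n)
            print(f"farm {j0 + 1}: efficiency = {res.x[0]:.3f}")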

  4. Cyber threat impact assessment and analysis for space vehicle architectures

    Science.gov (United States)

    McGraw, Robert M.; Fowler, Mark J.; Umphress, David; MacDonald, Richard A.

    2014-06-01

    This paper covers research into an assessment of potential impacts and techniques to detect and mitigate cyber attacks that affect the networks and control systems of space vehicles. Such systems, if subverted by malicious insiders, external hackers and/or supply chain threats, can be controlled in a manner to cause physical damage to the space platforms. Similar attacks on Earth-borne cyber physical systems include the Shamoon, Duqu, Flame and Stuxnet exploits. These have been used to bring down foreign power generation and refining systems. This paper discusses the potential impacts of similar cyber attacks on space-based platforms through the use of simulation models, including custom models developed in Python using SimPy and commercial SATCOM analysis tools such as STK/SOLIS. The paper discusses the architecture and fidelity of the simulation model that has been developed for performing the impact assessment. The paper walks through the application of an attack vector at the subsystem level and how it affects the control and orientation of the space vehicle. SimPy is used to model and extract raw impact data at the bus level, while STK/SOLIS is used to extract raw impact data at the subsystem level and to visually display the effect on the physical plant of the space vehicle.
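
    The authors' SimPy models are not reproduced in the record; as a toy illustration of the discrete-event approach, this sketch injects a hypothetical attack into a simple attitude-control loop and shows the pointing error drifting once the subsystem is compromised (all names and dynamics invented).

        import simpy

        def attitude_control(env, state):
            # Nominal control loop: corrects pointing error every cycle.
            while True:
                yield env.timeout(1.0)                      # 1 s control cycle
                if not state["compromised"]:
                    state["error_deg"] = max(0.0, state["error_deg"] - 0.5)
                else:                                       # subverted controller
                    state["error_deg"] += 0.8               # drifts off-pointing
                print(f"t={env.now:4.1f}s pointing error = {state['error_deg']:.1f} deg")

        def cyber_attack(env, state, at):
            # Attack vector injected at the subsystem level at time `at`.
            yield env.timeout(at)
            state["compromised"] = True
            print(f"t={env.now:4.1f}s *** bus command injection: ADCS compromised ***")

        env = simpy.Environment()
        state = {"error_deg": 2.0, "compromised": False}
        env.process(attitude_control(env, state))
        env.process(cyber_attack(env, state, at=5.0))
        env.run(until=10.0)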

  5. Assessing computer waste generation in Chile using material flow analysis.

    Science.gov (United States)

    Steubing, Bernhard; Böni, Heinz; Schluep, Mathias; Silva, Uca; Ludwig, Christian

    2010-03-01

    The quantities of e-waste are expected to increase sharply in Chile. The purpose of this paper is to provide a quantitative data basis on generated e-waste quantities. A material flow analysis was carried out assessing the generation of e-waste from computer equipment (desktop and laptop PCs as well as CRT and LCD-monitors). Import and sales data were collected from the Chilean Customs database as well as from publications by the International Data Corporation. A survey was conducted to determine consumers' choices with respect to storage, re-use and disposal of computer equipment. The generation of e-waste was assessed in a baseline as well as upper and lower scenarios until 2020. The results for the baseline scenario show that about 10,000 and 20,000 tons of computer waste may be generated in the years 2010 and 2020, respectively. The cumulative e-waste generation will be four to five times higher in the upcoming decade (2010-2019) than during the current decade (2000-2009). By 2020, the shares of LCD-monitors and laptops will increase more rapidly replacing other e-waste including the CRT-monitors. The model also shows the principal flows of computer equipment from production and sale to recycling and disposal. The re-use of computer equipment plays an important role in Chile. An appropriate recycling scheme will have to be introduced to provide adequate solutions for the growing rate of e-waste generation.

  6. Acrylamide analysis: assessment of results from six rounds of Food Analysis Performance Assessment Scheme (FAPAS) proficiency testing.

    Science.gov (United States)

    Owen, Linda M; Castle, Laurence; Kelly, Janet; Wilson, Lesley; Lloyd, Antony S

    2005-01-01

    Six proficiency tests have now been completed in an ongoing program of the UK Food Analysis Performance Assessment Scheme (FAPAS) for the analysis of acrylamide in a range of food matrixes. Homogeneous test material samples were requested by laboratories throughout the world, with 29 to 45 submitting results for each test. Results were analyzed by appropriate statistical procedures, and z-scores were awarded for reported values. In the absence of both legislation and collaborative trial data, the target standard deviation was derived from the Horwitz equation, although it is acknowledged that there is a need to establish a "fit for purpose" target standard deviation specifically for acrylamide analysis. Participants were encouraged to use the analytical method routinely used in their own laboratory and to provide details of their procedure. Close examination of the data submitted indicates that performance is generally acceptable in terms of accuracy. There is no significant difference between results submitted by gas chromatography and liquid chromatography (GC and LC) methods, and no method dependency on the use of internal standards or sample size. However, choice of extraction solvent may be important, with indications that plain water is an acceptable extraction method. There is evidence from the most recent test that direct (underivatized) GC methodology may present problems, but more data are required and this aspect will be monitored in the continuing proficiency testing program.
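
    A minimal sketch of the scoring arithmetic described above, with invented participant values: the z-score uses a target standard deviation derived from the Horwitz equation.

        # FAPAS-style scoring: z = (x - assigned) / sigma_target, with sigma_target
        # from the Horwitz equation sigma = 0.02 * c**0.8495 (c as a mass fraction).
        assigned = 250e-9          # assigned acrylamide value: 250 ug/kg as mass fraction
        reported = 310e-9          # a participant's reported value (invented)

        sigma_target = 0.02 * assigned ** 0.8495
        z = (reported - assigned) / sigma_target

        print(f"target SD = {sigma_target * 1e9:.1f} ug/kg")
        verdict = "acceptable" if abs(z) <= 2 else "questionable/unacceptable"
        print(f"z-score   = {z:.2f}  ({verdict})")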

  7. Enhancing e-waste estimates: Improving data quality by multivariate Input–Output Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Feng, E-mail: fwang@unu.edu [Institute for Sustainability and Peace, United Nations University, Hermann-Ehler-Str. 10, 53113 Bonn (Germany); Design for Sustainability Lab, Faculty of Industrial Design Engineering, Delft University of Technology, Landbergstraat 15, 2628CE Delft (Netherlands); Huisman, Jaco [Institute for Sustainability and Peace, United Nations University, Hermann-Ehler-Str. 10, 53113 Bonn (Germany); Design for Sustainability Lab, Faculty of Industrial Design Engineering, Delft University of Technology, Landbergstraat 15, 2628CE Delft (Netherlands); Stevels, Ab [Design for Sustainability Lab, Faculty of Industrial Design Engineering, Delft University of Technology, Landbergstraat 15, 2628CE Delft (Netherlands); Baldé, Cornelis Peter [Institute for Sustainability and Peace, United Nations University, Hermann-Ehler-Str. 10, 53113 Bonn (Germany); Statistics Netherlands, Henri Faasdreef 312, 2492 JP Den Haag (Netherlands)

    2013-11-15

    Highlights: • A multivariate Input–Output Analysis method for e-waste estimates is proposed. • Applying multivariate analysis to consolidate data can enhance e-waste estimates. • We examine the influence of model selection and data quality on e-waste estimates. • Datasets of all e-waste related variables in a Dutch case study have been provided. • Accurate modeling of time-variant lifespan distributions is critical for estimates. - Abstract: Waste electrical and electronic equipment (or e-waste) is one of the fastest growing waste streams, which encompasses a wide and increasing spectrum of products. Accurate estimation of e-waste generation is difficult, mainly due to lack of high quality data on market and socio-economic dynamics. This paper addresses how to enhance e-waste estimates by providing techniques to increase data quality. An advanced, flexible and multivariate Input–Output Analysis (IOA) method is proposed. It links all three pillars in IOA (product sales, stock and lifespan profiles) to construct mathematical relationships between various data points. By applying this method, the data consolidation steps can generate more accurate time-series datasets from the available data pool. This can consequently increase the reliability of e-waste estimates compared to the approach without data processing. A case study in the Netherlands is used to apply the advanced IOA model. As a result, for the first time ever, complete datasets of all three variables for estimating all types of e-waste have been obtained. The results of this study also demonstrate significant disparity between various estimation models, arising from the use of data under different conditions. This shows the importance of applying a multivariate approach and multiple sources to improve data quality for modelling, specifically using appropriate time-varying lifespan parameters. Following the case study, a roadmap with a procedural guideline is provided to enhance e
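
    A rough sketch of the sales-stock-lifespan logic behind IOA-based e-waste estimation (invented sales series; a fixed Weibull lifespan is assumed here, whereas the paper stresses time-varying lifespan parameters): e-waste generated in year t is the sum over past sales weighted by the probability of discard at the corresponding age.

        import numpy as np
        from scipy.stats import weibull_min

        # Invented annual laptop sales (units), 2000-2013.
        years = np.arange(2000, 2014)
        sales = np.array([50, 60, 75, 90, 110, 130, 150, 165, 175, 180,
                          185, 190, 195, 200], dtype=float) * 1e3

        # Assumed Weibull lifespan profile (shape, scale in years).
        shape, scale = 1.8, 5.0
        ages = np.arange(0, 20)
        discard_prob = (weibull_min.cdf(ages + 1, shape, scale=scale) -
                        weibull_min.cdf(ages, shape, scale=scale))

        waste = np.zeros_like(sales)
        for i in range(len(years)):
            for age, p in enumerate(discard_prob):        # units sold `age` years ago
                if i - age >= 0:                          # that are discarded in year i
                    waste[i] += sales[i - age] * p

        for y, w in zip(years[-3:], waste[-3:]):
            print(f"{y}: ~{w / 1e3:.0f}k units of e-waste")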

  8. Analysis of Student Knowledge Evaluation Applying Self-Assessment Methodology: Criteria, Problems and Results

    OpenAIRE

    Agnė Matuliauskaitė; Edmundas Žvirblis

    2011-01-01

    The article analyses research done by a number of authors on problems related to knowledge evaluation based on self-assessment. Self-assessment problems, self-assessment criteria, self-assessment methods, and the integration of self-assessment data into final results are considered. This analysis attempts to reveal whether self-assessment corresponds to traditional knowledge evaluation and what kinds of problems occur during such evaluation.

  9. Reading Grade Levels and Mathematics Assessment: An Analysis of Texas Mathematics Assessment Items and Their Reading Difficulty

    Science.gov (United States)

    Lamb, John H.

    2010-01-01

    Increased reading difficulty of mathematics assessment items has been shown to negatively affect student performance. The advent of high-stakes testing, which has serious ramifications for students' futures and teachers' careers, necessitates analysis of reading difficulty on state assessment items and student performance on those items. Using…

  10. Integrated analysis environment for the Movement Assessment Battery for Children

    Directory of Open Access Journals (Sweden)

    Carlos Norberto Fischer

    2013-12-01

    Developmental Coordination Disorder (DCD), a chronic and usually permanent condition found in children, is characterized by motor impairment that interferes with a child's activities of daily living and with academic achievement. One of the most popular tests for the quantitative diagnosis of DCD is the Movement Assessment Battery for Children (MABC). Based on the Battery's standardized scores, it is possible to identify children with typical development, children at risk of developing DCD, and children with DCD. This article describes a computational system we developed to assist with the analysis of results obtained in the MABC test. The tool was developed for the web environment and its database provides integration of MABC data. Thus, researchers around the world can share data and develop collaborative work in the DCD field. In order to help analysis processes, our system provides services for filtering data to show more specific sets of information and presents the results in textual, table, and graphic formats, allowing easier and more comprehensive evaluation of the results.

  11. Lake Michigan Wind Assessment Analysis, 2012 and 2013

    Directory of Open Access Journals (Sweden)

    Charles R Standridge

    2017-03-01

    A study was conducted to address the wind energy potential over Lake Michigan to support a commercial wind farm. Lake Michigan is an inland sea in the upper mid-western United States. A laser wind sensor mounted on a floating platform was located at the mid-lake plateau in 2012 and about 10.5 kilometers from the eastern shoreline near Muskegon, Michigan, in 2013. Range gate heights for the laser wind sensor were centered at 75, 90, 105, 125, 150, and 175 meters. Wind speed and direction were measured once each second and aggregated into 10-minute averages. The two-sample t-test and the paired-t method were used to perform the analysis. Average wind speed stopped increasing between 105 m and 150 m, depending on location. Thus, the collected data are inconsistent with the idea that average wind speed increases with height. This result implies that measuring wind speed at wind turbine hub height is essential, as opposed to using the wind energy power law to project the wind speed from lower heights. Average speed at the mid-lake plateau is no more than 10% greater than at the location near Muskegon. Thus, it may be possible to harvest much of the available wind energy at a lower height and closer to the shoreline than previously thought. At both locations, the predominant wind direction is from the south-southwest. The ability of the laser wind sensor to measure wind speed appears to be affected by a lack of particulate matter at greater heights. Keywords: wind assessment, Lake Michigan, LIDAR wind sensor, statistical analysis.
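
    The power-law projection that the study cautions against looks like this (the measured speed and shear exponent below are assumptions for illustration only):

        # Wind power law extrapolation: v2 = v1 * (h2 / h1) ** alpha.
        # The study's data suggest this can overestimate speed aloft over Lake Michigan,
        # since measured profiles flattened between 105 m and 150 m.
        v1, h1 = 7.5, 75.0        # measured speed (m/s) at 75 m (invented value)
        alpha = 1 / 7             # common neutral-stability exponent (an assumption)

        for h2 in (90, 105, 125, 150, 175):
            v2 = v1 * (h2 / h1) ** alpha
            print(f"{h2:3d} m: projected {v2:.2f} m/s")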

  12. Assessing population exposure for landslide risk analysis using dasymetric cartography

    Science.gov (United States)

    Garcia, Ricardo A. C.; Oliveira, Sergio C.; Zezere, Jose L.

    2015-04-01

    Exposed population is a major topic that needs to be taken into account in a full landslide risk analysis. Usually, risk analysis is based on a count of inhabitants or inhabitant density, applied over statistical or administrative terrain units, such as NUTS or parishes. However, this kind of approach may skew the results, underestimating the importance of population, mainly in territorial units where rural occupation predominates. Furthermore, the landslide susceptibility scores calculated for each terrain unit are frequently more detailed and accurate than the location of the exposed population inside each territorial unit based on census data. These drawbacks are far from ideal when landslide risk analysis is performed for urban management and emergency planning. Dasymetric cartography, which uses a parameter or set of parameters to restrict the spatial distribution of a particular phenomenon, is a methodology that may help to enhance the resolution of census data and therefore give a more realistic representation of the population distribution. Therefore, this work aims to map and compare the population distribution based on a traditional approach (population per administrative terrain unit) and based on dasymetric cartography (population per building). The study is developed in the region north of Lisbon using 2011 population data and follows three main steps: i) landslide susceptibility assessment based on independently validated statistical models; ii) evaluation of the population distribution (absolute and density) for different administrative territorial units (parishes and BGRI - the basic statistical unit in the Portuguese census); and iii) dasymetric cartography of the population based on building areal weighting. Preliminary results show that in sparsely populated administrative units, population density differs by more than a factor of two depending on the application of the traditional approach or the dasymetric

  13. Quality assessment of cortex cinnamomi by HPLC chemical fingerprint, principal component analysis and cluster analysis.

    Science.gov (United States)

    Yang, Jie; Chen, Li-Hong; Zhang, Qin; Lai, Mao-Xiang; Wang, Qiang

    2007-06-01

    HPLC fingerprint analysis, principal component analysis (PCA), and cluster analysis were introduced for the quality assessment of Cortex cinnamomi (CC). The fingerprint of CC was developed and validated by analyzing 30 samples of CC from different species and geographic locations. Seventeen chromatographic peaks were selected as characteristic peaks and their relative peak areas (RPA) were calculated for quantitative expression of the HPLC fingerprints. The correlation coefficients of similarity in chromatograms were higher than 0.95 for the same species but much lower than 0.6 for different species. In addition, two principal components (PCs) were extracted by PCA. PC1 separated Cinnamomum cassia from other species, capturing 56.75% of the variance, while PC2 contributed to their further separation, capturing 19.08% of the variance. The scores of the samples showed that they could be clustered reasonably into different groups corresponding to different species and regions. The scores and loading plots together clearly revealed the different chemical properties of each group. The cluster analysis confirmed the results of the PCA. Therefore, HPLC fingerprinting in combination with chemometric techniques provides a very flexible and reliable method for quality assessment of traditional Chinese medicines.
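
    A minimal sketch of the chemometric step with invented relative peak areas (not the 30 CC samples): PCA scores for a PC1/PC2 plot, plus Ward hierarchical clustering.

        import numpy as np
        from sklearn.decomposition import PCA
        from scipy.cluster.hierarchy import linkage, fcluster

        rng = np.random.default_rng(7)

        # Invented relative peak areas (6 samples x 17 characteristic peaks): two
        # "species" with different mean fingerprints, standing in for real samples.
        species_a = rng.normal(1.0, 0.05, size=(3, 17))
        species_b = rng.normal(1.6, 0.05, size=(3, 17))
        rpa = np.vstack([species_a, species_b])

        scores = PCA(n_components=2).fit_transform(rpa)      # PC1/PC2 score plot data
        clusters = fcluster(linkage(rpa, method="ward"), t=2, criterion="maxclust")

        for i, (pc, cl) in enumerate(zip(scores, clusters)):
            print(f"sample {i + 1}: PC1 = {pc[0]:6.2f}, PC2 = {pc[1]:6.2f}, cluster {cl}")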

  14. Imminent Cardiac Risk Assessment via Optical Intravascular Biochemical Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Wetzel, D.; Wetzel, L; Wetzel, M; Lodder, R

    2009-01-01

    Heart disease is by far the biggest killer in the United States, and type II diabetes, which affects 8% of the U.S. population, is on the rise. In many cases, the acute coronary syndrome and/or sudden cardiac death occurs without warning. Atherosclerosis has known behavioral, genetic and dietary risk factors. However, our laboratory studies with animal models and human post-mortem tissue using FT-IR microspectroscopy reveal the chemical microstructure within arteries and in the arterial walls themselves. These include spectra obtained from the aortas of ApoE-/- knockout mice on sucrose and normal diets showing lipid deposition in the former case. Pre-aneurysm chemical images of knockout mouse aorta walls and spectra of plaque excised from a living human patient are also shown for comparison. In keeping with the theme of the SPEC 2008 conference, Spectroscopic Diagnosis of Disease, this paper describes the background and potential value of a new catheter-based system to provide in vivo biochemical analysis of plaque in human coronary arteries. We report the following: (1) results of FT-IR microspectroscopy on animal models of vascular disease to illustrate the localized chemical distinctions between pathological and normal tissue, (2) current diagnostic techniques used for risk assessment of patients with potential unstable coronary syndromes, and (3) the advantages and limitations of each of these techniques illustrated with patient care histories, related in the first person, by the physician coauthors. Note that the physician comments clarify the contribution of each diagnostic technique to imminent cardiac risk assessment in a clinical setting, leading to the appreciation of what localized intravascular chemical analysis can contribute as an add-on diagnostic tool. The quality of medical imaging has improved dramatically since the turn of the century. Among clinical non-invasive diagnostic tools, laboratory tests of body fluids, EKG, and physical examination are

  15. Assessing temporal variations in connectivity through suspended sediment hysteresis analysis

    Science.gov (United States)

    Sherriff, Sophie; Rowan, John; Fenton, Owen; Jordan, Phil; Melland, Alice; Mellander, Per-Erik; hUallacháin, Daire Ó.

    2016-04-01

    Connectivity provides a valuable concept for understanding catchment-scale sediment dynamics. In intensive agricultural catchments, land management through tillage, high livestock densities and extensive land drainage practices significantly change hydromorphological behaviour and alter sediment supply and downstream delivery. Analysis of suspended sediment-discharge hysteresis has offered insights into sediment dynamics but typically on a limited selection of events. Greater availability of continuous high-resolution discharge and turbidity data and qualitative hysteresis metrics enables assessment of sediment dynamics during more events and over time. This paper assesses the utility of this approach to explore seasonal variations in connectivity. Data were collected from three small (c. 10 km2) intensive agricultural catchments in Ireland with contrasting morphologies, soil types, land use patterns and management practices, and are broadly defined as low-permeability supporting grassland, moderate-permeability supporting arable and high-permeability supporting arable. Suspended sediment concentration (using calibrated turbidity measurements) and discharge data were collected at 10-min resolution from each catchment outlet and precipitation data were collected from a weather station within each catchment. Event databases (67-90 events per catchment) collated information on sediment export metrics, hysteresis category (e.g., clockwise, anti-clockwise, no hysteresis), numeric hysteresis index, and potential hydro-meteorological controls on sediment transport including precipitation amount, duration, intensity, stream flow and antecedent soil moisture and rainfall. Statistical analysis of potential controls on sediment export was undertaken using Pearson's correlation coefficient on separate hysteresis categories in each catchment. Sediment hysteresis fluctuations through time were subsequently assessed using the hysteresis index. Results showed the numeric
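
    A possible form of the numeric hysteresis index mentioned above compares normalized sediment concentration on the rising and falling limbs at matched normalized discharges; the sign then separates clockwise from anti-clockwise loops. The sketch below illustrates this generic construction on a synthetic event and is not necessarily the exact metric used by the authors.

```python
# Minimal sketch of a numeric hysteresis index for one storm event,
# under the common convention that positive values indicate clockwise
# hysteresis (sediment peaks before discharge).

import numpy as np

def hysteresis_index(q, ssc, levels=np.linspace(0.1, 0.9, 9)):
    q = np.asarray(q, dtype=float)
    ssc = np.asarray(ssc, dtype=float)
    qn = (q - q.min()) / (q.max() - q.min())          # normalized discharge
    cn = (ssc - ssc.min()) / (ssc.max() - ssc.min())  # normalized SSC
    peak = qn.argmax()
    # Interpolate normalized SSC at each discharge level on both limbs.
    rise = np.interp(levels, qn[: peak + 1], cn[: peak + 1])
    fall = np.interp(levels, qn[peak:][::-1], cn[peak:][::-1])
    return float(np.mean(rise - fall))

# Synthetic clockwise event: SSC peaks before discharge.
t = np.linspace(0, 1, 50)
q = np.exp(-((t - 0.5) ** 2) / 0.02)
ssc = np.exp(-((t - 0.4) ** 2) / 0.02)
print(hysteresis_index(q, ssc))  # > 0 -> clockwise
```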

  16. Risk assessment for benefits analysis: framework for analysis of a thyroid-disrupting chemical.

    Science.gov (United States)

    Axelrad, Daniel A; Baetcke, Karl; Dockins, Chris; Griffiths, Charles W; Hill, Richard N; Murphy, Patricia A; Owens, Nicole; Simon, Nathalie B; Teuschler, Linda K

    Benefit-cost analysis is of growing importance in developing policies to reduce exposures to environmental contaminants. To quantify health benefits of reduced exposures, economists generally rely on dose-response relationships estimated by risk assessors. Further, to be useful for benefits analysis, the endpoints that are quantified must be expressed as changes in incidence of illnesses or symptoms that are readily understood by and perceptible to the layperson. For most noncancer health effects and for nonlinear carcinogens, risk assessments generally do not provide the dose-response functions necessary for economic benefits analysis. This article presents the framework for a case study that addresses these issues through a combination of toxicology, epidemiology, statistics, and economics. The case study assesses a chemical that disrupts proper functioning of the thyroid gland, and considers the benefits of reducing exposures in terms of both noncancer health effects (hypothyroidism) and thyroid cancers. The effects are presumed to be due to a mode of action involving interference with thyroid-pituitary functioning that would lead to nonlinear dose response. The framework integrates data from animal testing, statistical modeling, human data from the medical and epidemiological literature, and economic methodologies and valuation studies. This interdisciplinary collaboration differs from the more typical approach in which risk assessments and economic analyses are prepared independently of one another. This framework illustrates particular approaches that may be useful for expanded quantification of adverse health effects, and demonstrates the potential of such interdisciplinary approaches. Detailed implementation of the case study framework will be presented in future publications.

  17. Concepts of Causality in Psychopathology: Applications in Clinical Assessment, Clinical Case Formulation and Functional Analysis

    NARCIS (Netherlands)

    Haynes, S.H.; O'Brien, W.H.; Kaholokula, J.K.; Witteman, C.L.M.

    2012-01-01

    This paper discusses and integrates concepts of causality in psychopathology, clinical assessment, clinical case formulation and the functional analysis. We propose that identifying causal variables, relations and mechanisms in psychopathology and clinical assessment can lead to more powerful and e

  18. Analysis of existing risk assessments, and list of suggestions

    CERN Document Server

    Heimsch, Laura

    2016-01-01

    The scope of this project was to analyse risk assessments made at CERN, extracting key information about the different methodologies used and the profiles of the people who perform the assessments, and establishing whether a risk matrix was used and whether an acceptable level of risk was defined. The second step of the project was to trigger discussion inside HSE about risk assessment by suggesting a risk matrix and a risk assessment template.
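
    A risk matrix of the kind suggested to HSE typically maps likelihood and severity scores to an acceptability band; the sketch below shows one conventional 5x5 arrangement with illustrative band boundaries, not CERN HSE values.

```python
# A minimal sketch of a 5x5 risk matrix with invented band boundaries.

def risk_rating(likelihood: int, severity: int) -> str:
    """Classify a risk from 1-5 likelihood and severity scores."""
    score = likelihood * severity
    if score >= 15:
        return "unacceptable"   # act immediately
    if score >= 8:
        return "tolerable"      # mitigate where practicable
    return "acceptable"         # monitor

print(risk_rating(4, 5))  # unacceptable
print(risk_rating(2, 3))  # acceptable
```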

  19. Portfolio Assessment on Chemical Reactor Analysis and Process Design Courses

    Science.gov (United States)

    Alha, Katariina

    2004-01-01

    Assessment determines what students regard as important: if a teacher wants to change students' learning, he/she should change the methods of assessment. This article describes the use of portfolio assessment on five courses dealing with chemical reactor and process design during the years 1999-2001. Although the use of portfolio was a new…

  20. Risk Assessment and Integration Team (RAIT) Portfolio Risk Analysis Strategy

    Science.gov (United States)

    Edwards, Michelle

    2010-01-01

    Impact at management level: Qualitative assessment of risk criticality, in conjunction with risk consequence, likelihood, and severity, enables development of an "investment policy" towards managing a portfolio of risks. Impact at research level: Quantitative risk assessments enable researchers to develop risk mitigation strategies with meaningful risk reduction results. The quantitative assessment approach provides useful risk mitigation information.

  1. Assessing Student Teachers' Reflective Writing through Quantitative Content Analysis

    Science.gov (United States)

    Poldner, Eric; Van der Schaaf, Marieke; Simons, P. Robert-Jan; Van Tartwijk, Jan; Wijngaards, Guus

    2014-01-01

    Students' reflective essay writing can be stimulated by the formative assessments provided to them by their teachers. Such assessments contain information about the quality of students' reflective writings and offer suggestions for improvement. Despite the importance of formatively assessing students' reflective writings in teacher education…

  2. Flood Risk Analysis and Flood Potential Losses Assessment

    Institute of Scientific and Technical Information of China (English)

    2003-01-01

    The heavy floods in the Taihu Basin showed an increasing trend in recent years. In this work, a typical area in the northern Taihu Basin was selected for flood risk analysis and potential flood losses assessment. Human activities have a strong impact on the study area's flood situation (as affected by the polders built, deforestation, population increase, urbanization, etc.), and have made water levels higher, flood durations shorter, and flood peaks sharper. Five years of different flood return periods [(1970), 5 (1962), 10 (1987), 20 (1954), 50 (1991)] were used to calculate the potential flood risk area and its losses. The potential flood risk map, economic losses, and flood-impacted population were also calculated. The study's main conclusions are: 1) Human activities have strongly changed the natural flood situation in the study area, increasing runoff and flooding; 2) The flood risk area is closely related to the precipitation center; 3) Polder construction has successfully protected land from flood, shortened the flood duration, and elevated water levels in rivers outside the polders; 4) Economic and social development have caused flood losses to increase in recent years.

  3. Assessment of the Assessment Tool: Analysis of Items in a Non-MCQ Mathematics Exam

    Science.gov (United States)

    Khoshaim, Heba Bakr; Rashid, Saima

    2016-01-01

    Assessment is one of the vital steps in the teaching and learning process. The reported action research examines the effectiveness of an assessment process and inspects the validity of exam questions used for the assessment purpose. The instructors of a college-level mathematics course studied questions used in the final exams during the academic…

  4. Analysis Strategy for Fracture Assessment of Defects in Ductile Materials

    Energy Technology Data Exchange (ETDEWEB)

    Dillstroem, Peter; Andersson, Magnus; Sattari-Far, Iradj; Weilin Zang (Inspecta Technology AB, Stockholm (Sweden))

    2009-06-15

    The main purpose of this work is to investigate the significance of the residual stresses for defects (cracks) in ductile materials with nuclear applications, when the applied primary (mechanical) loads are high. The treatment of weld-induced stresses as expressed in the SACC/ProSACC handbook and other fracture assessment procedures such as the ASME XI code and the R6-method is believed to be conservative for ductile materials. This is because of the general approach not to account for the improved fracture resistance caused by ductile tearing. Furthermore, there is experimental evidence that the contribution of residual stresses to fracture diminishes as the degree of yielding increases to a high level. However, neglecting weld-induced stresses in general is doubtful for loads that are mostly secondary (e.g. thermal shocks) and for materials which are not ductile enough to be limit load controlled. Both thin-walled and thick-walled pipes containing surface cracks are studied here. This is done by calculating the relative contribution from the weld residual stresses to CTOD and the J-integral. Both circumferential and axial cracks are analysed. Three different crack geometries are studied here by using the finite element method (FEM): (i) 2D axisymmetric modelling of a V-joint weld in a thin-walled pipe; (ii) 2D axisymmetric modelling of a V-joint weld in a thick-walled pipe; (iii) 3D modelling of an X-joint weld in a thick-walled pipe. Each crack configuration is analysed for two load cases: (1) only primary (mechanical) loading is applied to the model; (2) both secondary stresses and primary loading are applied to the model. Also presented in this report are some published experimental investigations conducted on cracked components of ductile materials subjected to both primary and secondary stresses. Based on the outcome of this study, an analysis strategy for fracture assessment of defects in ductile materials of nuclear components is proposed. A new

  5. Breast Image Analysis for Risk Assessment, Detection, Diagnosis, and Treatment of Cancer

    NARCIS (Netherlands)

    Giger, M.L.; Karssemeijer, N.; Schnabel, J.A.

    2013-01-01

    The role of breast image analysis in radiologists' interpretation tasks in cancer risk assessment, detection, diagnosis, and treatment continues to expand. Breast image analysis methods include segmentation, feature extraction techniques, classifier design, biomechanical modeling, image registration

  6. SOCIAL NETWORK ANALYSIS FOR ASSESSING SOCIAL CAPITAL IN BIOSECURITY ECOLITERACY

    Directory of Open Access Journals (Sweden)

    Sang Putu Kaler Surata

    2016-02-01

    Biosecurity ecoliteracy (BEL) is a view of literacy that applies ecological concepts to promote in-depth understanding, critical reflection, creative thinking, self-consciousness, and communication and social skills in analyzing and managing issues around plant health/living, animal health/living and the risks associated with the environment. We used social network analysis (SNA) to evaluate two distinct forms of social capital in BEL: social cohesion and network structure. The study employed cooperative learning in BEL with 30 undergraduate teacher-training students. Data were then analyzed using UCINET software. We found a tendency for social cohesion to increase after students participated in BEL, supported by several SNA measures (density, closeness and degree) whose values at the end of BEL differed statistically from those at the beginning. The social structure map (sociogram) after BEL showed that students were much more likely to cluster in groups than in the sociogram before BEL. Thus BEL, through cooperative learning, was able to promote social capital. In addition, SNA proved a useful tool for evaluating the achievement levels of social capital in BEL in the form of network cohesion and network structure.
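
    The SNA measures cited (density, degree, closeness) can be reproduced on any small network; the sketch below uses networkx in place of UCINET, with invented student names and ties.

```python
# Small illustration of the SNA measures named above on a toy
# friendship network; the edges are fabricated for demonstration.

import networkx as nx

g = nx.Graph()
g.add_edges_from([
    ("ana", "budi"), ("ana", "citra"), ("budi", "citra"),
    ("dewi", "eka"), ("citra", "dewi"),
])

print(nx.density(g))            # share of possible ties present
print(nx.degree_centrality(g))  # normalized degree per student
print(nx.closeness_centrality(g))  # closeness per student
```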

  7. Assessment of academic departments efficiency using data envelopment analysis

    Directory of Open Access Journals (Sweden)

    Salah R. Agha

    2011-07-01

    Purpose: In this age of knowledge economy, universities play an important role in the development of a country. As government subsidies to universities have been decreasing, more efficient use of resources becomes important for university administrators. This study evaluates the relative technical efficiencies of academic departments at the Islamic University in Gaza (IUG) during the years 2004-2006. Design/methodology/approach: This study applies Data Envelopment Analysis (DEA) to assess the relative technical efficiency of the academic departments. The inputs are operating expenses, credit hours and training resources, while the outputs are number of graduates, promotions and public service activities. The potential improvements and super efficiency are computed for inefficient and efficient departments, respectively. Further, multiple linear regression is used to develop a relationship between super efficiency and input and output variables. Findings: Results show that the average efficiency score is 68.5% and that there are 10 efficient departments out of the 30 studied. It is noted that departments in the faculty of science, engineering and information technology have to greatly reduce their laboratory expenses. The department of economics and finance was found to have the highest super efficiency score among the efficient departments. Finally, it was found that promotions have the greatest contribution to the super efficiency scores while public service activities come next. Research limitations/implications: The paper focuses only on academic departments at a single university. Further, DEA is deterministic in nature. Practical implications: The findings offer insights into the inputs and outputs that significantly contribute to efficiencies so that inefficient departments can focus on these factors. Originality/value: Prior studies have used only one type of DEA (BCC) and they did not explicitly answer the question posed by the inefficient
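
    For readers unfamiliar with DEA, the sketch below solves the standard input-oriented efficiency linear program for each decision-making unit with scipy; the departments, inputs and outputs are invented, and the study's exact model (BCC with super efficiency) would additionally impose a convexity constraint and exclude the evaluated unit from the reference set.

```python
# Sketch of input-oriented DEA efficiency scores via linear
# programming, on invented data (3 departments, 2 inputs, 1 output).
# This is the standard CCR formulation; adding sum(lambda) == 1 as an
# equality constraint would give the BCC (variable returns-to-scale)
# variant.

import numpy as np
from scipy.optimize import linprog

X = np.array([[20.0, 300], [30.0, 200], [40.0, 500]])  # inputs per DMU
Y = np.array([[60.0], [80.0], [90.0]])                 # outputs per DMU
n = len(X)

for o in range(n):
    # Decision variables: [theta, lambda_1..lambda_n]; minimize theta.
    c = np.r_[1.0, np.zeros(n)]
    # Inputs: sum_j lambda_j * x_ij - theta * x_io <= 0
    A_in = np.c_[-X[o].reshape(-1, 1), X.T]
    # Outputs: -sum_j lambda_j * y_rj <= -y_ro
    A_out = np.c_[np.zeros((Y.shape[1], 1)), -Y.T]
    res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.r_[np.zeros(X.shape[1]), -Y[o]],
                  bounds=[(0, None)] * (n + 1))
    print(f"DMU {o}: efficiency = {res.x[0]:.3f}")
```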

  8. Problems of method of technology assessment. A methodological analysis; Methodenprobleme des Technology Assessment; Eine methodologische Analyse

    Energy Technology Data Exchange (ETDEWEB)

    Zimmermann, V.

    1993-03-01

    The study undertakes to analyse the theoretical and methodological structure of Technology Assessment (TA). It is based on a survey of TA studies, which provided an important condition for theoretically sound statements on methodological aspects of TA. It was established that the main basic theoretical problems of TA are in the field of dealing with complexity. This is also apparent in the constitution of problems, the most elementary and central approach of TA. Scientifically founded constitution of problems and the corresponding construction of models call for interdisciplinary scientific work. Interdisciplinarity in the TA research process is achieved at the level of virtual networks, these networks being composed of individuals suited to teamwork. The emerging network structures have an objective-organizational and an ideational basis. The objective-organizational basis is mainly the result of team composition and the external affiliations of the team members. The ideational basis of the virtual network is represented by the team members' mode of thinking, which is individually located at a multidisciplinary level. The theoretical 'skeleton' of the TA knowledge system, which is represented by process-knowledge-based linkage structures, can be generated and also processed in connection with the knowledge on types of problems, areas of analysis and procedures to deal with complexity. Within this process, disciplinary knowledge is a necessary but not a sufficient condition. Metatheoretical and metadisciplinary knowledge and the correspondingly processed complexity of models are the basis for the necessary methodological awareness that allows TA to become designable as a research procedure. (orig./HP)

  9. Performance Analysis of the Capability Assessment Tool for Sustainable Manufacturing

    Directory of Open Access Journals (Sweden)

    Enda Crossin

    2013-08-01

    This paper explores the performance of a novel capability assessment tool, developed to identify capability gaps and associated training and development requirements across the supply chain for environmentally-sustainable manufacturing. The tool was developed to assess 170 capabilities that have been clustered with respect to key areas of concern such as managing energy, water, material resources, carbon emissions and waste as well as environmental management practices for sustainability. Two independent expert teams used the tool to assess a sample group of five first and second tier sports apparel and footwear suppliers within the supply chain of a global sporting goods manufacturer in Asia. The paper addresses the reliability and robustness of the developed assessment method by formulating the expected links between the assessment results. The management practices of the participating suppliers were shown to be closely connected to their performance in managing their resources and emissions. The companies’ initiatives in implementing energy efficiency measures were found to be generally related to their performance in carbon emissions management. The suppliers were also asked to undertake a self-assessment by using a short questionnaire. The large gap between the comprehensive assessment and these in-house self-assessments revealed the suppliers’ misconceptions about their capabilities.

  10. Life Cycle Assessment Software for Product and Process Sustainability Analysis

    Science.gov (United States)

    Vervaeke, Marina

    2012-01-01

    In recent years, life cycle assessment (LCA), a methodology for assessment of environmental impacts of products and services, has become increasingly important. This methodology is applied by decision makers in industry and policy, product developers, environmental managers, and other non-LCA specialists working on environmental issues in a wide…

  11. Ways of Writing: Linguistic Analysis of Self-Assessment and Traditional Assignments

    Science.gov (United States)

    Peden, Blaine F.; Carroll, David W.

    2008-01-01

    Scholars of teaching and learning have endorsed self-assessment assignments as a way to encourage greater reflection by students. However, no studies to date have compared writing in self-assessment with traditional academic assignments. We performed a quantitative text analysis of students' language in self-assessment versus traditional…

  12. Uncertainty analysis and global sensitivity analysis of techno-economic assessments for biodiesel production.

    Science.gov (United States)

    Tang, Zhang-Chun; Zhenzhou, Lu; Zhiwen, Liu; Ningcong, Xiao

    2015-01-01

    There are various uncertain parameters in the techno-economic assessments (TEAs) of biodiesel production, including capital cost, interest rate, feedstock price, maintenance rate, biodiesel conversion efficiency, glycerol price and operating cost. However, few studies focus on the influence of these parameters on TEAs. This paper investigated the effects of these parameters on the life cycle cost (LCC) and the unit cost (UC) in the TEAs of biodiesel production. The results show that LCC and UC vary considerably when uncertain parameters are involved. Based on the uncertainty analysis, three global sensitivity analysis (GSA) methods are utilized to quantify the contribution of each individual uncertain parameter to LCC and UC. The GSA results reveal that the feedstock price and the interest rate produce considerable effects on the TEAs. These results can provide a useful guide for entrepreneurs when they plan plants.
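
    Variance-based GSA of the kind applied here can be sketched with the SALib package (assumed available): sample the uncertain parameters, evaluate the TEA model, and estimate Sobol indices. The unit-cost function and parameter bounds below are invented placeholders, not the paper's model.

```python
# Sketch of variance-based global sensitivity analysis for a
# techno-economic quantity, using SALib. All figures are illustrative.

import numpy as np
from SALib.sample import saltelli
from SALib.analyze import sobol

problem = {
    "num_vars": 3,
    "names": ["feedstock_price", "interest_rate", "conversion_eff"],
    "bounds": [[0.4, 0.9],       # $/kg feedstock
               [0.03, 0.12],     # annual interest rate
               [0.85, 0.98]],    # biodiesel conversion efficiency
}

def unit_cost(x):
    """Toy unit production cost, $/L (placeholder TEA model)."""
    feed, rate, eff = x[:, 0], x[:, 1], x[:, 2]
    capital_charge = 0.15 * (1 + rate) ** 10 / 10  # crude annualization
    return (feed / eff) + capital_charge + 0.05    # + fixed opex

X = saltelli.sample(problem, 1024)  # (N*(2D+2), 3) sample matrix
Y = unit_cost(X)
Si = sobol.analyze(problem, Y)
print(Si["S1"])  # first-order Sobol indices
print(Si["ST"])  # total-order Sobol indices
```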

  13. Coronary plaque composition as assessed by greyscale intravascular ultrasound and radiofrequency spectral data analysis

    NARCIS (Netherlands)

    N. Gonzalo (Nieves); H.M. Garcia-Garcia (Hector); J.M.R. Ligthart (Jürgen); G.A. Rodriguez-Granillo (Gaston); E. Meliga (Emanuele); Y. Onuma (Yoshinobu); J.C.H. Schuurbiers (Johan); N. Bruining (Nico); P.W.J.C. Serruys (Patrick)

    2008-01-01

    textabstractObjectives: (i) To explore the relation between greyscale intravascular ultrasound (IVUS) plaque qualitative classification and IVUS radiofrequency data (RFD) analysis tissue types; (ii) to evaluate if plaque composition as assessed by RFD analysis can be predicted by visual assessment o

  14. Assessing the validity of discourse analysis: transdisciplinary convergence

    Science.gov (United States)

    Jaipal-Jamani, Kamini

    2014-12-01

    Research studies using discourse analysis approaches make claims about phenomena or issues based on interpretation of written or spoken text, which includes images and gestures. How are findings/interpretations from discourse analysis validated? This paper proposes transdisciplinary convergence as a way to validate discourse analysis approaches to research. The argument is made that discourse analysis explicitly grounded in semiotics, systemic functional linguistics, and critical theory, offers a credible research methodology. The underlying assumptions, constructs, and techniques of analysis of these three theoretical disciplines can be drawn on to show convergence of data at multiple levels, validating interpretations from text analysis.

  15. Scout: orbit analysis and hazard assessment for NEOCP objects

    Science.gov (United States)

    Farnocchia, Davide; Chesley, Steven R.; Chamberlin, Alan B.

    2016-10-01

    It typically takes a few days for a newly discovered asteroid to be officially recognized as a real object. During this time, the tentative discovery is published on the Minor Planet Center's Near-Earth Object Confirmation Page (NEOCP) until additional observations confirm that the object is a real asteroid rather than an observational artifact or an artificial object. Also, NEOCP objects could have a limited observability window and yet be scientifically interesting, e.g., radar and lightcurve targets, mini-moons (temporary Earth captures), mission-accessible targets, close approachers or even impactors. For instance, the only two asteroids discovered before an impact, 2008 TC3 and 2014 AA, both reached the Earth less than a day after discovery. For these reasons we developed Scout, an automated system that provides an orbital and hazard assessment for NEOCP objects within minutes after the observations are available. Scout's rapid analysis increases the chances of securing the trajectory of interesting NEOCP objects before the ephemeris uncertainty grows too large or the observing geometry becomes unfavorable. The generally short observation arcs, perhaps only a few hours or even less, lead to severe degeneracies in the orbit estimation process. To overcome these degeneracies Scout relies on systematic ranging, a technique that derives possible orbits by scanning a grid in the poorly constrained space of topocentric range and range rate, while the plane-of-sky position and motion are directly tied to the recorded observations. This scan allows us to derive a distribution of the possible orbits and in turn identify the NEOCP objects of most interest to prioritize follow-up efforts. In particular, Scout ranks objects according to the likelihood of an impact, estimates the close approach distance, the Earth-relative minimum orbit intersection distance and v-infinity, and computes scores to identify objects more likely to be an NEO, a km-sized NEO, a Potentially
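
    The systematic ranging idea (scan assumed topocentric range and range rate, keeping grid nodes consistent with the astrometry) can be caricatured as below. The geometry, grid and admissibility test are drastically simplified, observer motion is ignored, and all numbers are invented; this is only a schematic of the approach, not Scout's implementation.

```python
# Highly simplified sketch of systematic ranging: scan a grid of
# assumed range and range rate, build a candidate state for each node
# from the observed line of sight and its rate, and keep nodes whose
# two-body energy implies a bound heliocentric orbit.

import numpy as np

GM_SUN = 1.32712440018e11  # km^3/s^2

def candidate_states(r_obs_helio, rho_hat, rho_hat_dot):
    """Yield (rho, rho_dot, position, velocity) over a coarse grid."""
    for rho in np.geomspace(1e5, 5e8, 40):            # km
        for rho_dot in np.linspace(-30.0, 30.0, 25):  # km/s
            r = r_obs_helio + rho * rho_hat           # object position
            # Crude velocity: observer motion neglected.
            v = rho_dot * rho_hat + rho * rho_hat_dot
            yield rho, rho_dot, r, v

def is_bound(r, v):
    """Two-body energy test: negative energy -> bound orbit."""
    return 0.5 * v @ v - GM_SUN / np.linalg.norm(r) < 0.0

# Toy observation geometry (invented numbers): observer at 1 au, unit
# line of sight and its time derivative from the astrometry.
r_obs = np.array([1.496e8, 0.0, 0.0])
rho_hat = np.array([0.0, 1.0, 0.0])
rho_hat_dot = np.array([1e-7, 0.0, 2e-8])  # 1/s

admissible = [(rho, rho_dot)
              for rho, rho_dot, r, v in
              candidate_states(r_obs, rho_hat, rho_hat_dot)
              if is_bound(r, v)]
print(len(admissible), "grid nodes give bound heliocentric orbits")
```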

  16. Assessing the Validity of Discourse Analysis: Transdisciplinary Convergence

    Science.gov (United States)

    Jaipal-Jamani, Kamini

    2014-01-01

    Research studies using discourse analysis approaches make claims about phenomena or issues based on interpretation of written or spoken text, which includes images and gestures. How are findings/interpretations from discourse analysis validated? This paper proposes transdisciplinary convergence as a way to validate discourse analysis approaches to…

  17. An Application of Conjoint Analysis in Agricultural Sustainability Assessment

    OpenAIRE

    Sydorovych, Olha; Wossink, Ada

    2008-01-01

    Increasing public interest in the concept of sustainable agriculture has resulted in the development of a number of methods that could be used for the assessment of sustainability of various agricultural production systems. Because of its complex, multi-dimensional nature, sustainability is most often assessed using numerous indicators, which make aggregate comparisons among systems difficult. In this paper we propose a methodology that could be beneficial in aggregate sustainability assessme...

  18. No-Reference Video Quality Assessment using Codec Analysis

    DEFF Research Database (Denmark)

    Søgaard, Jacob; Forchhammer, Søren; Korhonen, Jari

    2015-01-01

    A no-reference video quality assessment (VQA) method is presented for videos distorted by H.264/AVC and MPEG-2. The assessment is performed without access to the bit-stream. Instead we analyze and estimate coefficients based on decoded pixels. The approach involves distinguishing between the two...... (SVR). For validation purposes, the proposed method was tested on two databases. In both cases good performance compared with state of the art full, reduced, and no-reference VQA algorithms was achieved....

  19. Radiological assessment. A textbook on environmental dose analysis

    Energy Technology Data Exchange (ETDEWEB)

    Till, J.E.; Meyer, H.R. (eds.)

    1983-09-01

    Radiological assessment is the quantitative process of estimating the consequences to humans resulting from the release of radionuclides to the biosphere. It is a multidisciplinary subject requiring the expertise of a number of individuals in order to predict source terms, describe environmental transport, calculate internal and external dose, and extrapolate dose to health effects. Until now, no comprehensive book has been available that describes, at a uniform and comprehensive level, the techniques and models used in radiological assessment. Radiological Assessment is based on material presented at the 1980 Health Physics Society Summer School held in Seattle, Washington. The material has been expanded and edited to make it comprehensive in scope and useful as a text. Topics covered include (1) source terms for nuclear facilities and medical and industrial sites; (2) transport of radionuclides in the atmosphere; (3) transport of radionuclides in surface waters; (4) transport of radionuclides in groundwater; (5) terrestrial and aquatic food chain pathways; (6) reference man, a system for internal dose calculations; (7) internal dosimetry; (8) external dosimetry; (9) models for special-case radionuclides; (10) calculation of health effects in irradiated populations; (11) evaluation of uncertainties in environmental radiological assessment models; (12) regulatory standards for environmental releases of radionuclides; (13) development of computer codes for radiological assessment; and (14) assessment of accidental releases of radionuclides.

  20. Performance Assessment and Authentic Assessment: A Conceptual Analysis of the Literature

    OpenAIRE

    Torulf Palm

    2008-01-01

    Performance assessment and authentic assessment are recurrent terms in the literature on education and educational research. They have both been given a number of different meanings and unclear definitions and are in some publications not defined at all. Such uncertainty of meaning causes difficulties in interpretation and communication and can cause clouded or misleading research conclusions. This paper reviews the meanings attached to these concepts in the literature and describes the simil...

  1. Operator Alertness/Workload Assessment Using Stochastic Model-Based Analysis of Myoelectric Signals.

    Science.gov (United States)

    1985-11-01

    Operator Alertness/Workload Assessment Using Stochastic Model-Based Analysis of Myoelectric Signals. A. Madni, C. Conaway, S. Otsubo, Y. Chu. Phase II Interim Report, November 1985, prepared for the Air Force Office of Scientific Research.

  2. Windfarm Generation Assessment for Reliability Analysis of Power Systems

    DEFF Research Database (Denmark)

    Barberis Negra, Nicola; Bak-Jensen, Birgitte; Holmstrøm, O.

    2007-01-01

    Due to the fast development of wind generation in the past ten years, increasing interest has been paid to techniques for assessing different aspects of power systems with a large amount of installed wind generation. One of these aspects concerns power system reliability. Windfarm modelling plays...... in a reliability model and the generation of a windfarm is evaluated by means of sequential Monte Carlo simulation. Results are used to analyse how each mentioned factor influences the assessment, and why and when they should be included in the model....
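
    A toy version of the Monte Carlo generation assessment might look like the following: synthetic hourly wind speeds pass through a turbine power curve with a crude availability model. A genuinely sequential simulation would sample time-to-failure and time-to-repair durations instead of independent hourly outages; all figures (Weibull parameters, power curve, outage rate) are invented.

```python
# Minimal sketch of Monte Carlo assessment of windfarm generation.

import numpy as np

rng = np.random.default_rng(42)
HOURS = 8760
N_TURBINES = 10
RATED_KW = 2000.0

def power_curve(v):
    """Piecewise turbine curve: cut-in 4, rated 13, cut-out 25 m/s."""
    p = np.zeros_like(v)
    ramp = (v >= 4) & (v < 13)
    p[ramp] = RATED_KW * ((v[ramp] - 4) / 9) ** 3
    p[(v >= 13) & (v < 25)] = RATED_KW
    return p

wind = rng.weibull(2.0, HOURS) * 8.0          # hourly wind speed, m/s
up = rng.random((HOURS, N_TURBINES)) > 0.03   # simple 3% outage model

farm_kw = power_curve(wind)[:, None] * up     # per-turbine hourly output
annual_energy = farm_kw.sum() / 1e3           # kWh -> MWh
print(f"simulated annual energy: {annual_energy:,.0f} MWh")
```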

  3. Windfarm Generation Assessment for Reliability Analysis of Power Systems

    DEFF Research Database (Denmark)

    Negra, Nicola Barberis; Holmstrøm, Ole; Bak-Jensen, Birgitte

    2007-01-01

    Due to the fast development of wind generation in the past ten years, increasing interest has been paid to techniques for assessing different aspects of power systems with a large amount of installed wind generation. One of these aspects concerns power system reliability. Windfarm modelling plays...... in a reliability model and the generation of a windfarm is evaluated by means of sequential Monte Carlo simulation. Results are used to analyse how each mentioned factor influences the assessment, and why and when they should be included in the model....

  4. IoA Experience on Aerial Delivery

    Science.gov (United States)

    2006-10-01

    IEEE Transactions on Automatic Control, Vol. 42, No. 9, pp. 1200-1221. [3] Graffstein, J. et... Delay Controller, IEEE Transactions on Automatic Control, Vol. AC-24, No. 2, pp. 370-372. [13] Xie, L.-L., Guo, L. (2000). How much Uncertainty can be Dealt with by Feedback, IEEE Transactions on Automatic Control, Vol. 45, No. 12, pp. 2203-2217. [14] Youcef-Toumi, K., Wu,

  5. Environmental Impact Assessment for Socio-Economic Analysis of Chemicals

    DEFF Research Database (Denmark)

    Calow, Peter; Biddinger, G; Hennes, C;

    This report describes the requirements for, and illustrates the application of, a methodology for a socio-economic analysis (SEA), especially as it might be adopted in the framework of REACH.

  6. A Comparative Assessment of Greek Universities' Efficiency Using Quantitative Analysis

    Science.gov (United States)

    Katharaki, Maria; Katharakis, George

    2010-01-01

    In part due to the increased demand for higher education, typical evaluation frameworks for universities often address the key issue of available resource utilisation. This study seeks to estimate the efficiency of 20 public universities in Greece through quantitative analysis (including performance indicators, data envelopment analysis (DEA) and…

  7. Performance Assessment and Authentic Assessment: A Conceptual Analysis of the Literature

    Directory of Open Access Journals (Sweden)

    Torulf Palm

    2008-04-01

    Performance assessment and authentic assessment are recurrent terms in the literature on education and educational research. They have both been given a number of different meanings and unclear definitions and are in some publications not defined at all. Such uncertainty of meaning causes difficulties in interpretation and communication and can cause clouded or misleading research conclusions. This paper reviews the meanings attached to these concepts in the literature and describes the similarities and wide range of differences between the meanings of each concept.

  8. Using Empirical Article Analysis to Assess Research Methods Courses

    Science.gov (United States)

    Bachiochi, Peter; Everton, Wendi; Evans, Melanie; Fugere, Madeleine; Escoto, Carlos; Letterman, Margaret; Leszczynski, Jennifer

    2011-01-01

    Developing students who can apply their knowledge of empirical research is a key outcome of the undergraduate psychology major. This learning outcome was assessed in two research methods courses by having students read and analyze a condensed empirical journal article. At the start and end of the semester, students in multiple sections of an…

  9. Windfarm generation assessment for reliability analysis of power systems

    DEFF Research Database (Denmark)

    Negra, N.B.; Holmstrøm, O.; Bak-Jensen, B.;

    2007-01-01

    Due to the fast development of wind generation in the past ten years, increasing interest has been paid to techniques for assessing different aspects of power systems with a large amount of installed wind generation. One of these aspects concerns power system reliability. Windfarm modelling plays...

  10. Repeater Analysis for Combining Information from Different Assessments

    Science.gov (United States)

    Haberman, Shelby; Yao, Lili

    2015-01-01

    Admission decisions frequently rely on multiple assessments. As a consequence, it is important to explore rational approaches to combine the information from different educational tests. For example, U.S. graduate schools usually receive both TOEFL iBT® scores and GRE® General scores of foreign applicants for admission; however, little guidance…

  11. Analysis of Differential Item Functioning in the NAEP History Assessment.

    Science.gov (United States)

    Zwick, Rebecca; Ercikan, Kadriye

    The Mantel-Haenszel approach for investigating differential item functioning (DIF) was applied to U.S. history items that were administered as part of the National Assessment of Educational Progress (NAEP). DIF analyses were based on the responses of 7,743 students in grade 11. On some items, Blacks, Hispanics, and females performed more poorly…
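
    The Mantel-Haenszel DIF procedure stratifies examinees by matching score and pools one 2x2 table per stratum into a common odds ratio; the sketch below implements that calculation on fabricated responses, including the ETS delta-scale transformation.

```python
# Sketch of the Mantel-Haenszel DIF statistic on fabricated data.

import numpy as np

rng = np.random.default_rng(1)
n = 2000
group = rng.integers(0, 2, n)    # 0 = reference, 1 = focal
total = rng.integers(0, 21, n)   # matching (total) score, 0-20
# Item responses with a slight built-in advantage for the reference group.
logit = (total - 10) / 3 + 0.4 * (group == 0)
item = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

num = den = 0.0
for k in np.unique(total):       # one 2x2 table per score stratum
    m = total == k
    a = np.sum(m & (group == 0) & (item == 1))  # reference correct
    b = np.sum(m & (group == 0) & (item == 0))  # reference incorrect
    c = np.sum(m & (group == 1) & (item == 1))  # focal correct
    d = np.sum(m & (group == 1) & (item == 0))  # focal incorrect
    t = a + b + c + d
    if t:
        num += a * d / t
        den += b * c / t

alpha_mh = num / den
print(f"MH common odds ratio: {alpha_mh:.2f}")
print(f"ETS delta scale (MH D-DIF): {-2.35 * np.log(alpha_mh):.2f}")
```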

  12. ASSESSMENT OF REGIONAL EFFICIENCY IN CROATIA USING DATA ENVELOPMENT ANALYSIS

    Directory of Open Access Journals (Sweden)

    Danijela Rabar

    2013-02-01

    In this paper, the regional efficiency of Croatian counties is measured over a three-year period (2005-2007) using Data Envelopment Analysis (DEA). The set of inputs and outputs consists of seven socioeconomic indicators. The analysis is carried out using models assuming variable returns to scale. DEA identifies efficient counties as benchmark members and inefficient counties, which are analyzed in detail to determine the sources and amounts of their inefficiency. To enable proper monitoring of development dynamics, window analysis is applied. Based on the results, guidelines for implementing the improvements necessary to achieve efficiency are given. The analysis reveals great disparities among counties. In order to alleviate naturally, historically and politically conditioned unequal county positions, over which economic policy makers do not have total control, a categorical approach is introduced as an extension to the basic DEA models. This approach, combined with window analysis, changes the relations among efficiency scores in favor of continental counties.

  13. A Bootstrap Generalization of Modified Parallel Analysis for IRT Dimensionality Assessment

    Science.gov (United States)

    Finch, Holmes; Monahan, Patrick

    2008-01-01

    This article introduces a bootstrap generalization to the Modified Parallel Analysis (MPA) method of test dimensionality assessment using factor analysis. This methodology, based on the use of Marginal Maximum Likelihood nonlinear factor analysis, provides for the calculation of a test statistic based on a parametric bootstrap using the MPA…

  14. Modeling and Analysis on Radiological Safety Assessment of Low- and Intermediate Level Radioactive Waste Repository

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Youn Myoung; Jung, Jong Tae; Kang, Chul Hyung (and others)

    2008-04-15

    A modeling study and analysis has been carried out in technical support of the safety and performance assessment of the low- and intermediate-level radioactive waste (LILW) repository, partially needed for the radiological environmental impact report that is essential for the construction and operation licenses of the repository. The study covers such essential areas for the safety and performance assessment of the LILW repository and its licensing as gas generation and migration in and around the repository, risk analysis and environmental impact during transportation of LILW, biosphere modeling and assessment of the flux-to-dose conversion factors for human exposure, as well as regional and global groundwater modeling and analysis.

  15. A root cause analysis approach to risk assessment of a pipeline network for Kuwait Oil Company

    Energy Technology Data Exchange (ETDEWEB)

    Davies, Ray J.; Alfano, Tony D. [Det Norske Veritas (DNV), Rio de Janeiro, RJ (Brazil); Waheed, Farrukh [Kuwait Oil Company, Ahmadi (Kuwait); Komulainen, Tiina [Kongsberg Oil and Gas Technologies, Sandvika (Norway)

    2009-07-01

    A large-scale risk assessment was performed by Det Norske Veritas (DNV) for the entire Kuwait Oil Company (KOC) pipeline network. This risk assessment was unique in that it incorporated the assessment of all major sources of process-related risk faced by KOC and included root-cause, management-system-related risks in addition to technical risks related to more immediate causes. The assessment was conducted across the entire pipeline network, with the scope divided into three major categories: (1) integrity management; (2) operations; and (3) management systems. Aspects of integrity management were ranked and prioritized using a custom algorithm based on critical data sets. A detailed quantitative risk assessment was then used to further evaluate those issues deemed unacceptable, and finally a cost-benefit analysis approach was used to compare and select improvement options. The operations assessment involved computer modeling of the entire pipeline network to assess bottlenecks, perform surge and erosion analysis, and identify opportunities within the network that could potentially lead to increased production. The management system assessment was performed by conducting a gap analysis on the existing system and by prioritizing those improvement actions that best aligned with KOC's strategic goals for pipelines. Using a broad, three-pronged approach to their overall risk assessment, KOC achieved a thorough, root cause analysis-based understanding of risks to their system as well as a detailed list of recommended remediation measures that were merged into a 5-year improvement plan. (author)

  16. Analysis of Management Behavior Assessments and Affect on Productivity

    Science.gov (United States)

    2005-06-10

    Theories associated with the effects of management behavior on employee beliefs and attitudes include Herzberg's Two-Factor Theory and Vroom's Expectancy and Equity... achieved, when recognized, when given additional responsibility, and when advanced in job position. Vroom's Expectancy and Equity Theories of Motivation... degree or higher. The results of the study proved the validity of the Vroom and Yetton theory, but the employee assessments of management decision-making

  17. Orbital data applications for space objects conjunction assessment and situation analysis

    CERN Document Server

    Chen, Lei; Liang, Yan-Gang; Li, Ke-Bo

    2017-01-01

    This book introduces readers to the application of orbital data on space objects in the contexts of conjunction assessment and space situation analysis, including theories and methodologies. It addresses the main topics involved in space object conjunction assessment, such as: orbital error analysis of space objects; close approach analysis; the calculation, analysis and application of collision probability; and the comprehensive assessment of collision risk. In addition, selected topics on space situation analysis are also presented, including orbital anomaly and space event analysis, and so on. The book offers a valuable guide for researchers and engineers in the fields of astrodynamics, space telemetry, tracking and command (TT&C), space surveillance, space situational awareness, and space debris, as well as for graduates majoring in flight vehicle design and related fields.

  18. Assessment of the Assessment Tool: Analysis of Items in a Non-MCQ Mathematics Exam

    Directory of Open Access Journals (Sweden)

    Heba Bakr Khoshaim

    2016-01-01

    Assessment is one of the vital steps in the teaching and learning process. The reported action research examines the effectiveness of an assessment process and inspects the validity of exam questions used for the assessment purpose. The instructors of a college-level mathematics course studied questions used in the final exams during the academic years 2013-2014 and 2014-2015. Using the data from 206 students, the researchers analyzed 54 exam questions with regard to the complexity level, the difficulty coefficient and the discrimination coefficient. Findings indicated that the complexity level correlated with the difficulty coefficient for only one of three semesters. In addition, the correlation between the discrimination coefficient and the difficulty coefficient was found to be statistically significant in all three semesters. The results suggest that all three exams were acceptable; however, further attention should be given to the complexity level of questions used in mathematical tests, and questions of moderate difficulty level better classify students' performance.
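
    The two item indices analyzed, the difficulty coefficient and the discrimination coefficient, are commonly computed as the proportion correct and the corrected item-total correlation; the sketch below does so on a fabricated response matrix sized like the study's (206 students, 54 questions).

```python
# Sketch of classical item analysis on a fabricated 0/1 response
# matrix (rows = students, columns = exam questions).

import numpy as np

rng = np.random.default_rng(7)
responses = (rng.random((206, 54)) < rng.uniform(0.3, 0.9, 54)).astype(int)

difficulty = responses.mean(axis=0)  # proportion answering correctly

total = responses.sum(axis=1)

def point_biserial(item, total):
    """Item-total correlation with the item removed from the total."""
    rest = total - item
    return np.corrcoef(item, rest)[0, 1]

discrimination = np.array([point_biserial(responses[:, j], total)
                           for j in range(responses.shape[1])])

print(difficulty[:5].round(2), discrimination[:5].round(2))
```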

  19. Risk Assessment of Infrastructure System of Systems with Precursor Analysis.

    Science.gov (United States)

    Guo, Zhenyu; Haimes, Yacov Y

    2016-08-01

    Physical infrastructure systems are commonly composed of interconnected and interdependent subsystems, which in their essence constitute system of systems (S-o-S). System owners and policy researchers need tools to foresee potential emergent forced changes and to understand their impact so that effective risk management strategies can be developed. We develop a systemic framework for precursor analysis to support the design of an effective and efficient precursor monitoring and decision support system with the ability to (i) identify and prioritize indicators of evolving risks of system failure; and (ii) evaluate uncertainties in precursor analysis to support informed and rational decision making. This integrated precursor analysis framework is comprised of three processes: precursor identification, prioritization, and evaluation. We use an example of a highway bridge S-o-S to demonstrate the theories and methodologies of the framework. Bridge maintenance processes involve many interconnected and interdependent functional subsystems and decision-making entities and bridge failure can have broad social and economic consequences. The precursor analysis framework, which constitutes an essential part of risk analysis, examines the impact of various bridge inspection and maintenance scenarios. It enables policy researchers and analysts who are seeking a risk perspective on bridge infrastructure in a policy setting to develop more risk informed policies and create guidelines to efficiently allocate limited risk management resources and mitigate severe consequences resulting from bridge failures.

  20. Quantitative assessment of human motion using video motion analysis

    Science.gov (United States)

    Probe, John D.

    1993-01-01

    In the study of the dynamics and kinematics of the human body a wide variety of technologies has been developed. Photogrammetric techniques are well documented and are known to provide reliable positional data from recorded images. Often these techniques are used in conjunction with cinematography and videography for analysis of planar motion, and to a lesser degree three-dimensional motion. Cinematography has been the most widely used medium for movement analysis. Excessive operating costs and the lag time required for film development, coupled with recent advances in video technology, have allowed video based motion analysis systems to emerge as a cost effective method of collecting and analyzing human movement. The Anthropometric and Biomechanics Lab at Johnson Space Center utilizes the video based Ariel Performance Analysis System (APAS) to develop data on shirtsleeved and space-suited human performance in order to plan efficient on-orbit intravehicular and extravehicular activities. APAS is a fully integrated system of hardware and software for biomechanics and the analysis of human performance and generalized motion measurement. Major components of the complete system include the video system, the AT compatible computer, and the proprietary software.

  1. Assessing Teacher Performance in the Classroom: Pattern Analysis Applied to Interaction Data.

    Science.gov (United States)

    Shymansky, James A.

    1978-01-01

    Selected data from a junior high school science assessment study are presented to demonstrate the potential of pattern analysis for teacher evaluation, and to illustrate its superiority over other classroom observation techniques. (CP)

  2. Cost-effectiveness analysis: adding value to assessment of animal health welfare and production.

    Science.gov (United States)

    Babo Martins, S; Rushton, J

    2014-12-01

    Cost-effectiveness analysis (CEA) has been extensively used in economic assessments in fields related to animal health, namely in human health where it provides a decision-making framework for choices about the allocation of healthcare resources. Conversely, in animal health, cost-benefit analysis has been the preferred tool for economic analysis. In this paper, the use of CEA in related areas and the role of this technique in assessments of animal health, welfare and production are reviewed. Cost-effectiveness analysis can add further value to these assessments, particularly in programmes targeting animal welfare or animal diseases with an impact on human health, where outcomes are best valued in natural effects rather than in monetary units. Importantly, CEA can be performed during programme implementation stages to assess alternative courses of action in real time.
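
    The core CEA quantity is the incremental cost-effectiveness ratio: the extra cost per extra unit of natural effect relative to a comparator. A minimal sketch, with invented programme figures:

```python
# Tiny sketch of the incremental cost-effectiveness ratio (ICER).

def icer(cost_new, effect_new, cost_base, effect_base):
    """Extra cost per extra unit of effect of the new programme."""
    return (cost_new - cost_base) / (effect_new - effect_base)

# Hypothetical programme: 120k cost, 400 cases averted;
# baseline: 50k cost, 150 cases averted.
print(f"ICER: {icer(120_000, 400, 50_000, 150):.0f} per case averted")
```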

  3. 78 FR 27235 - Technical Guidance for Assessing Environmental Justice in Regulatory Analysis

    Science.gov (United States)

    2013-05-09

    From the Federal Register Online via the Government Publishing Office: Environmental Protection Agency, Technical Guidance for Assessing Environmental Justice in Regulatory Analysis... Contact: Office of Policy, National Center for Environmental Economics, Mail code 1809T, Environmental Protection...

  4. Model Analysis Assessing the dynamics of student learning

    CERN Document Server

    Bao, L; Bao, Lei; Redish, Edward F.

    2002-01-01

    In this paper we present a method of modeling and analysis that permits the extraction and quantitative display of detailed information about the effects of instruction on a class's knowledge. The method relies on a cognitive model that represents student thinking in terms of mental models. Students frequently fail to recognize relevant conditions that lead to appropriate uses of their models. As a result, they can use multiple models inconsistently. Once the most common mental models have been determined by qualitative research, they can be mapped onto a multiple-choice test. Model analysis permits the interpretation of such a situation. We illustrate the use of our method by analyzing results from the FCI.

  5. Assessing Texture of Slub-Yarn Fabric Using Image Analysis

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    The application of digital image processing to the classification of slub-yarn texture is discussed. The texture of the slub-yarn fabric is analyzed using texture analysis techniques, and the influence of the slub-yarn parameters on the fabric texture is discussed. Results indicate that the texture of the slub-yarn fabric can be reliably measured using gray-level co-occurrence matrix (GLCM) analysis. The four GLCM indices, the angular second moment, the contrast, the inverse difference moment and the correlation, are sensitive to changes in the slub-yarn parameters and can be regarded as the major indices for the texture.
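
    The four GLCM indices named can be computed with scikit-image, whose 'homogeneity' property corresponds to the inverse difference moment and 'ASM' to the angular second moment; the patch below is random noise standing in for a fabric image.

```python
# Brief sketch of the four GLCM texture indices on a synthetic patch.

import numpy as np
from skimage.feature import graycomatrix, graycoprops

rng = np.random.default_rng(3)
patch = rng.integers(0, 64, (128, 128), dtype=np.uint8)  # stand-in image

glcm = graycomatrix(patch, distances=[1], angles=[0],
                    levels=64, symmetric=True, normed=True)

for prop in ("ASM", "contrast", "homogeneity", "correlation"):
    print(prop, float(graycoprops(glcm, prop)[0, 0]))
```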

  6. Numerical analysis and geotechnical assessment of mine scale model

    Institute of Scientific and Technical Information of China (English)

    Khanal Manoj; Adhikary Deepak; Balusu Rao

    2012-01-01

    Various numerical methods are available to model, simulate, analyse and interpret results; however, a major task is to select a reliable and appropriate tool to perform a realistic assessment of any problem. For a model to be representative of a realistic mining scenario, a verified tool must be chosen to assess mine roof support requirements and address the geotechnical risks associated with longwall mining. Dependable tools provide a safe working environment, increased production, efficient management of resources and reduced environmental impacts of mining. Although various methods, for example analytical, experimental and empirical, are being adopted in mining, numerical tools have recently become popular due to advances in computer hardware and numerical methods. Empirical rules based on past experience do provide a general guide; however, due to the heterogeneous nature of mine geology (i.e., no two mine sites are identical), numerical simulation of mine-site-specific conditions lends better insight into some underlying issues. The paper highlights the use of a continuum-mechanics-based tool in coal mining with a mine-scale model. Continuum modelling can provide close-to-accurate stress fields and deformations. The paper describes the use of existing mine data to calibrate and validate the model parameters, which are then used to assess geotechnical issues related to installing a new high-capacity longwall mine at the mine site. A variety of parameters, for example chock convergence, caveability of the overlying sandstones, and abutment and vertical stresses, have been estimated.

  7. A Risk-Analysis Approach to Implementing Web-Based Assessment

    Science.gov (United States)

    Ricketts, Chris; Zakrzewski, Stan

    2005-01-01

    Computer-Based Assessment is a risky business. This paper proposes the use of a model for web-based assessment systems that identifies pedagogic, operational, technical (non web-based), web-based and financial risks. The strategies and procedures for risk elimination or reduction arise from risk analysis and management and are the means by which…

  8. An Analysis of Large-Scale Writing Assessments in Canada (Grades 5-8)

    Science.gov (United States)

    Peterson, Shelley Stagg; McClay, Jill; Main, Kristin

    2011-01-01

    This paper reports on an analysis of large-scale assessments of Grades 5-8 students' writing across 10 provinces and 2 territories in Canada. Theory, classroom practice, and the contributions and constraints of large-scale writing assessment are brought together with a focus on Grades 5-8 writing in order to provide both a broad view of…

  9. Literary translation and quality assessment analysis – its significance in translation training

    OpenAIRE

    2004-01-01

    This paper aims to highlight the role of translation quality assessment in translation training so as to develop students’ translation competence and skills to face translation problems. An analysis to assess literary translation quality is proposed before proceeding to discuss its pedagogical implementation.

  10. Assessing Key Assumptions of Network Meta-Analysis: A Review of Methods

    Science.gov (United States)

    Donegan, Sarah; Williamson, Paula; D'Alessandro, Umberto; Tudur Smith, Catrin

    2013-01-01

    Background: Homogeneity and consistency assumptions underlie network meta-analysis (NMA). Methods exist to assess the assumptions but they are rarely and poorly applied. We review and illustrate methods to assess homogeneity and consistency. Methods: Eligible articles focussed on indirect comparison or NMA methodology. Articles were sought by…

  11. An Analysis of State Autism Educational Assessment Practices and Requirements.

    Science.gov (United States)

    Barton, Erin E; Harris, Bryn; Leech, Nancy; Stiff, Lillian; Choi, Gounah; Joel, Tiffany

    2016-03-01

    States differ in the procedures and criteria used to identify ASD. These differences are likely to impact the prevalence and age of identification for children with ASD. The purpose of the current study was to examine the specific state variations in ASD identification and eligibility criteria requirements. We examined variations by state in autism assessment practices and the proportion of children eligible for special education services under the autism category. Overall, our findings suggest that ASD identification practices vary across states, but most states use federal guidelines, at least in part, to set their requirements. Implications and recommendations for policy and practice are discussed.

  12. Cost-effectiveness analysis of computer-based assessment

    Directory of Open Access Journals (Sweden)

    Pauline Loewenberger

    2003-12-01

    Full Text Available The need for more cost-effective and pedagogically acceptable combinations of teaching and learning methods to sustain increasing student numbers means that the use of innovative methods, using technology, is accelerating. There is an expectation that economies of scale might provide greater cost-effectiveness whilst also enhancing student learning. The difficulties and complexities of these expectations are considered in this paper, which explores the challenges faced by those wishing to evaluate the cost-effectiveness of computer-based assessment (CBA). The paper outlines the outcomes of a survey which attempted to gather information about the costs and benefits of CBA.

  13. Psychometric Analysis of the Diagnostic Evaluation of Language Variation Assessment

    Science.gov (United States)

    Petscher, Yaacov; Connor, Carol McDonald; Al Otaiba, Stephanie

    2012-01-01

    This study investigated the psychometrics of the "Diagnostic Evaluation of Language Variation-Screening Test" (DELV-S) test using confirmatory factor analysis, item response theory, and differential item functioning (DIF). Responses from 1,764 students in kindergarten through second grade were used in the study, with results indicating…

  14. Using Rasch Analysis to Identify Uncharacteristic Responses to Undergraduate Assessments

    Science.gov (United States)

    Edwards, Antony; Alcock, Lara

    2010-01-01

    Rasch Analysis is a statistical technique that is commonly used to analyse both test data and Likert survey data, to construct and evaluate question item banks, and to evaluate change in longitudinal studies. In this article, we introduce the dichotomous Rasch model, briefly discussing its assumptions. Then, using data collected in an…
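    For reference, the dichotomous Rasch model mentioned above gives the probability of a correct response as a logistic function of the difference between person ability theta and item difficulty b. A minimal sketch follows, with standardized residuals as one common way of flagging uncharacteristic responses; the numeric values are illustrative assumptions.

```python
import math

def rasch_p(theta, b):
    """P(correct) for ability theta and item difficulty b (dichotomous Rasch)."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def std_residual(x, theta, b):
    """Standardized residual of an observed response x in {0, 1}."""
    p = rasch_p(theta, b)
    return (x - p) / math.sqrt(p * (1.0 - p))

print(rasch_p(0.5, -1.0))           # able student, easy item: high probability
print(std_residual(0, 0.5, -1.0))   # a wrong answer here is uncharacteristic
```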

  15. Assessment of competence in simulated flexible bronchoscopy using motion analysis

    DEFF Research Database (Denmark)

    Collela, Sara; Svendsen, Morten Bo Søndergaard; Konge, Lars

    2015-01-01

    Background: Flexible bronchoscopy should be performed with a correct posture and a straight scope to optimize bronchoscopy performance and at the same time minimize the risk of work-related injuries and endoscope damage. Objectives: We aimed to test whether an automatic motion analysis system cou...

  16. Technical quality assessment of an optoelectronic system for movement analysis

    Science.gov (United States)

    Di Marco, R.; Rossi, S.; Patanè, F.; Cappa, P.

    2015-02-01

    Optoelectronic systems (OS) are widely used in gait analysis to evaluate the motor performance of healthy subjects and patients. The accuracy of marker trajectory reconstruction depends on several aspects: the number of cameras, the dimension and position of the calibration volume, and the chosen calibration procedure. In this paper we propose a methodology to evaluate the effects of these sources of error on the reconstruction of marker trajectories. The novel contribution of the present work is that the dimensions of the tested calibration volumes are comparable with those normally used in gait analysis; in addition, to simulate trajectories during clinical gait analysis, we provide non-default marker paths as inputs. Several calibration procedures are implemented and the same trial is processed with each calibration file, also considering different camera configurations. The RMSEs between the measured trajectories and the optimal ones are calculated for each comparison. To investigate significant differences between the computed indices, an ANOVA is performed. The RMSE is sensitive to variations in the calibration volume and the camera configuration and is always below 43 mm.
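    A minimal sketch of the RMSE index and ANOVA comparison described above; the "measured" trajectories are synthetic stand-ins generated by adding noise to an "optimal" path, so the noise levels and array sizes are assumptions, not the study's data.

```python
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(0)
optimal = rng.uniform(0, 2000, size=(500, 3))  # optimal 3D marker path (mm)

# Two hypothetical camera configurations with different noise levels.
config_a = optimal + rng.normal(0, 5, size=optimal.shape)
config_b = optimal + rng.normal(0, 8, size=optimal.shape)

def rmse(measured, reference):
    """Root-mean-square error of 3D point distances along a trajectory."""
    return np.sqrt(np.mean(np.sum((measured - reference) ** 2, axis=1)))

print(rmse(config_a, optimal), rmse(config_b, optimal))

# ANOVA on per-frame Euclidean errors: is the configuration effect significant?
errs_a = np.linalg.norm(config_a - optimal, axis=1)
errs_b = np.linalg.norm(config_b - optimal, axis=1)
print(f_oneway(errs_a, errs_b))
```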

  17. No-Reference Video Quality Assessment using Codec Analysis

    DEFF Research Database (Denmark)

    Søgaard, Jacob; Forchhammer, Søren; Korhonen, Jari

    2015-01-01

    …propose a novel estimation method of the quantization in H.264/AVC videos without bitstream access, which can also be used for Peak Signal-to-Noise Ratio (PSNR) estimation. The results from the MPEG-2 and H.264/AVC analysis are mapped to a perceptual measure of video quality by Support Vector Regression…

  18. Biogas upgrading technologies: Energetic analysis and environmental impact assessment

    Institute of Scientific and Technical Information of China (English)

    Yajing Xu; Ying Huang; Bin Wu; Xiangping Zhang; Suojiang Zhang

    2015-01-01

    Biogas upgrading, removing CO2 and other trace components from raw biogas, is a necessary step before the biogas can be used as a vehicle fuel or supplied to the natural gas grid. In this work, three technologies for biogas upgrading, i.e., pressurized water scrubbing (PWS), monoethanolamine aqueous scrubbing (MAS) and ionic liquid scrubbing (ILS), are studied and assessed in terms of their energy consumption and environmental impacts using process simulation and the green degree method. A non-random two-liquid (NRTL) and Henry's law property method for a CO2 separation system with the ionic liquid 1-butyl-3-methylimidazolium bis(trifluoromethylsulfonyl)imide ([bmim][Tf2N]) is established and verified with experimental data. The assessment results indicate that the specific energy consumption of ILS and PWS is almost the same and much less than that of MAS. High purity CO2 product can be obtained by the MAS and ILS methods, whereas no pure CO2 is recovered with PWS. On the environmental side, ILS has the highest green degree production value, while MAS and PWS produce serious environmental impacts.

  19. Forest ecosystem health assessment and analysis in China

    Institute of Scientific and Technical Information of China (English)

    XIAO Fengjin; OUYANG Hua; ZHANG Qiang; FU Bojie; ZHANG Zhicheng

    2004-01-01

    Based on survey data from more than 300 forest sample plots, forestry statistical data, remote sensing information from the NOAA AVHRR database and daily meteorological data from 300 stations, we selected vigor, organization and resilience as the indicators to assess large-scale forest ecosystem health in China and analyzed the spatial pattern of forest ecosystem health and its influencing factors. The assessment indicated that forest ecosystem health decreases along latitude and longitude gradients. The healthy forests are mainly distributed in natural forests, tropical rainforests and seasonal rainforests, followed by the northeast national forest zone, the subtropical forest zone and the southwest forest zone, while the unhealthy forests were mainly located in the warm temperate zone and the Xinjiang-Mongolia forest zone. The coefficient of correlation between the Forest Ecosystem Health Index (FEHI) and annual average precipitation was 0.58 (p<0.01), while that between the FEHI and annual mean temperature was 0.49 (p<0.01), indicating that precipitation and temperature affect the pattern of the FEHI, with precipitation's effect being the stronger of the two. We also measured the correlation coefficients between the FEHI and NPP, biodiversity and resistance, which were 0.64, 0.76 and 0.81 (p<0.01) respectively. The order of effect on forest ecosystem health was vigor, organization and resistance.

  20. An Analysis Report of 2014 CALA Self-Assessment Survey

    Directory of Open Access Journals (Sweden)

    Jian Anna Xiong

    2016-06-01

    Full Text Available On the occasion of CALA's 40th anniversary in 2014, the 2013 Board of Directors appointed a Self-Assessment Task Force to conduct an assessment survey with special focus on members' awareness of CALA's organizational structure and policies, its services to members, the extent of participation in events sponsored by CALA, and the level of satisfaction with CALA leadership. Although only one-fifth of the active members responded to the survey, the answers and feedback identified areas for organizational improvement and showed how active members view the current state of CALA. Essential findings from the survey include: 1) the growth of overseas membership as a demographic trend, 2) a need to recruit student members, 3) a high percentage of CALA members not aware of CALA's Mission/Vision/Goal, 4) conflicting data on CALA's leadership, 5) low ratings (from 10-30% of respondents) on eleven out of twelve rating questions, and 6) strong support for CALA as a representative organization of Chinese American librarians in North America. The findings of the survey will serve as a valuable reference for future strategic planning and for carrying out CALA's long-term goals.

  1. A new tool for risk analysis and assessment in petrochemical plants

    Directory of Open Access Journals (Sweden)

    El-Arkam Mechhoud

    2016-09-01

    Full Text Available The aim of our work was the implementation of a new automated tool dedicated to risk analysis and assessment in petrochemical plants, based on a combination of two analysis methods: HAZOP (HAZard and OPerability) and FMEA (Failure Mode and Effect Analysis). Assessment of accident scenarios is also considered. The principal advantage of the combined analysis is to speed up hazard identification and risk assessment and to forecast the nature and impact of such accidents. Plant parameters are analyzed under a graphical interface to facilitate the exploitation of our developed approach. This automated analysis brings out the different deviations of the operating parameters of any system in the plant. Possible causes of these deviations, their consequences and preventive actions are identified. The result is risk minimization and dependability enhancement of the considered system.

  2. THE MANAGEMENT OF TRANSFUSION SERVICES, ANALYSIS AND ASSESSMENT

    Science.gov (United States)

    Begic, Dzenana; Mujicic, Ermina; Coric, Jozo; Zunic, Lejla

    2016-01-01

    Introduction: The hospital blood bank (HBB) needs to provide adequate amounts of blood and blood products for surgeries in a timely manner. For various surgical programs, assessments are performed of the average number of blood doses needed for surgery. By using two types of requisitions, BT/AB (blood type/antibody) and BT/AB/MT (blood type/antibody/match test), for pretransfusion immunohaematological testing, the General Hospital "Prim. Dr. Abdulah Nakas" achieved more rational consumption of blood and blood derivatives and financial savings through a reduced number of match tests (MT). Goal: To determine the total number of pre-operative requisitions (BT/AB and BT/AB/MT) for blood and blood products at the surgical departments of the General Hospital "Prim. Dr. Abdulah Nakas" in the period June 1, 2014 - December 31, 2014, to analyze the consumption/return of blood in reserve in relation to the surgical disciplines and the total savings in MT, and to conduct an MSBOS (Maximum Surgical Blood Ordering Schedule) assessment. Results: The total number of preoperative requisitions for blood and blood products in the surgical wards was 927, of which 623 (67.2%) were tested by BT/AB, while 304 (32.8%) were tested by BT/AB/MT. In total, 617 units of blood and blood products were transfused; 275 units were not transfused. The probability of transfusion for surgery was 51.3%, highest for surgical intensive care (70.4%) and lowest for the department of general surgery (37.2%). The indicator of efficient resource management was best at the delivery ward (0.89), while the total for the surgical wards was 0.69. On average, 2.1 units of blood were required per surgery. By using the two types of requisitions for pretransfusion immunohaematological testing (BT/AB and BT/AB/MT), more rational use of MT was achieved: in 623 requests for BT/AB only 61 MT were performed. The average number of blood units issued in accordance with these requirements is 0

  3. Regional Hazard Analysis For Use In Vulnerability And Risk Assessment

    Directory of Open Access Journals (Sweden)

    Maris Fotios

    2015-09-01

    Full Text Available A method for supporting an operational regional risk and vulnerability analysis for hydrological hazards is suggested and applied on the island of Cyprus. The method aggregates the output of a hydrological flow model, forced by observed temperatures and precipitation, with observed discharge data. A scheme supported by observed discharge is applied for model calibration. A comparison of different calibration schemes indicated that the same model parameters can be used for the entire country. In addition, it was demonstrated that, for operational purposes, it is sufficient to rely on a few stations. Model parameters were adjusted to account for land use, and thus for the vulnerability of elements at risk, by comparing observed and simulated flow patterns using all components of the hydrological model. The results can be used for regional risk and vulnerability analysis in order to increase the resilience of the affected population.

  4. Dermal type I collagen assessment by digital image analysis*

    OpenAIRE

    Brianezi, Gabrielli; Grandi, Fabrizio; Bagatin, Ediléia; Enokihara, Mílvia Maria S. S.; Miot, Hélio Amante [UNESP

    2015-01-01

    Type I collagen is the main dermal component, and its evaluation is relevant to quantitative studies in dermatopathology. However, visual gradation (0 to 4+) has low precision and high subjectivity levels. This study aimed to develop and validate a digital morphometric analysis technique to estimate type I collagen levels in the papillary dermis. Four evaluators visually quantified (0 to 4+) the density of type I collagen in 63 images of forearm skin biopsies marked by immunohistochemistry an...

  5. Assessment of the SFC database for analysis and modeling

    Science.gov (United States)

    Centeno, Martha A.

    1994-01-01

    SFC is one of the four clusters that make up the Integrated Work Control System (IWCS), which will integrate the shuttle processing databases at Kennedy Space Center (KSC). The IWCS framework will enable communication among the four clusters and add new data collection protocols. The Shop Floor Control (SFC) module has been operational for two and a half years; however, at this stage, automatic links to the other three modules have not been implemented yet, except for a partial link to IOS (CASPR). SFC revolves around a DB/2 database with PFORMS acting as the database management system (DBMS). PFORMS is an off-the-shelf DB/2 application that provides a set of data entry screens and query forms. The main dynamic entity in the SFC and IOS databases is a task; thus, the physical storage location and update privileges are driven by the status of the WAD. As we explored the SFC values, we realized that there was much to do before actually engaging in continuous analysis of the SFC data. Halfway into this effort, it was realized that full-scale analysis would have to be a future third phase of the effort. We therefore concentrated on getting to know the contents of the database and on establishing an initial set of tools to start the continuous analysis process. Specifically, we set out to: (1) provide specific procedures for statistical models, so as to enhance the TP-OAO office analysis and modeling capabilities; (2) design a data exchange interface; (3) prototype the interface to provide inputs to SCRAM; and (4) design a modeling database. These objectives were set with the expectation that, if met, they would provide former TP-OAO engineers with tools to help them demonstrate the importance of process-based analyses. The latter, in turn, will help them obtain the cooperation of various organizations in charting out their individual processes.

  6. Comparative analysis of deterministic and probabilistic fracture mechanical assessment tools

    Energy Technology Data Exchange (ETDEWEB)

    Heckmann, Klaus [Gesellschaft fuer Anlagen- und Reaktorsicherheit (GRS) gGmbH, Koeln (Germany); Saifi, Qais [VTT Technical Research Centre of Finland, Espoo (Finland)

    2016-11-15

    Uncertainties in material properties, manufacturing processes, loading conditions and damage mechanisms complicate the quantification of structural reliability. Probabilistic structural mechanics codes serve as tools for assessing leak and break probabilities of nuclear piping components. Probabilistic fracture mechanics tools have been compared in different benchmark activities, usually revealing minor but systematic discrepancies between the results of different codes. In this joint paper, probabilistic fracture mechanics codes are compared. Crack initiation, crack growth and the influence of in-service inspections are analyzed. Example cases for stress corrosion cracking and fatigue under LWR conditions are analyzed. The evolution of annual failure probabilities during simulated operation time is investigated, in order to identify the reasons for differences in the results of the different codes. The comparison of the tools is used for further improvement of the codes applied by the partners.

  7. Image analysis for dental bone quality assessment using CBCT imaging

    Science.gov (United States)

    Suprijanto; Epsilawati, L.; Hajarini, M. S.; Juliastuti, E.; Susanti, H.

    2016-03-01

    Cone beam computerized tomography (CBCT) is one of the X-ray imaging modalities applied in dentistry. This modality can visualize the oral region in 3D at high resolution. A CBCT jaw image carries potential information for the assessment of bone quality that is often used for pre-operative implant planning. We propose a comparison method based on the normalized histogram (NH) of the region of the inter-dental septum and premolar teeth. The NH characteristics from normal and abnormal bone conditions are then compared and analyzed. Four test parameters are proposed: the difference between the teeth and bone average intensity (s), the ratio between the bone and teeth average intensity (n), the difference between the teeth and bone peak values of the NH (Δp), and the ratio between the teeth and bone NH ranges (r). The results showed that n, s, and Δp have the potential to be classification parameters of dental calcium density.
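    A minimal sketch of the four test parameters, under a simplified reading where s and n are computed from mean region intensities and Δp and r from the normalized histograms; the two intensity samples are synthetic placeholders for bone and teeth regions of interest.

```python
import numpy as np

def nh_parameters(bone, teeth, bins=256, value_range=(0, 255)):
    """Compute s, n, dp, r from two grayscale ROI samples."""
    hb, _ = np.histogram(bone, bins=bins, range=value_range, density=True)
    ht, _ = np.histogram(teeth, bins=bins, range=value_range, density=True)
    s = teeth.mean() - bone.mean()    # teeth-bone average intensity difference
    n = bone.mean() / teeth.mean()    # bone/teeth average intensity ratio
    dp = ht.max() - hb.max()          # teeth-bone NH peak-value difference
    r = np.ptp(teeth) / np.ptp(bone)  # teeth/bone intensity-range ratio
    return s, n, dp, r

rng = np.random.default_rng(1)
bone = rng.normal(90, 15, 5000).clip(0, 255)    # hypothetical bone ROI
teeth = rng.normal(180, 10, 5000).clip(0, 255)  # hypothetical teeth ROI
print(nh_parameters(bone, teeth))
```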

  8. Assessment and analysis components of physical fitness of students

    Directory of Open Access Journals (Sweden)

    Kashuba V.A.

    2012-08-01

    Full Text Available Components of students' physical fitness are assessed, and the internal and external factors affecting students' quality of life are analyzed. The study involved more than 200 students. It found that students represent a category of people with elevated risk factors, including nervous and mental tension and constant disruptions of eating, work and leisure patterns; their way of life shows a lack of care for their own health. It is noted that the existing approaches to promoting physical fitness of students are inefficient and require the development and implementation of brand new theoretical foundations and practical approaches to the problem of increasing student activity. It is argued that the forms, methods and learning tools used today in higher education practice cannot fully ensure the implementation of approaches to promoting students' physical fitness and do not meet the requirements for preparing the modern health professional.

  9. Game theoretic analysis of environmental impact assessment system in China

    Institute of Scientific and Technical Information of China (English)

    CHENG Hongguang; QI Ye; PU Xiao; GONG Li

    2007-01-01

    The environmental impact assessment (EIA) system has been established in China since 1973. In present EIA cases there are, in general, four participants: governments, enterprises, EIA organizations and the public. The public bears both social costs and social duties, and supervises the social costs produced by enterprises discharging pollutants. However, public participation is mostly deputized by governments, which severely weakens the independence of the public as a participant in EIA. In this paper, the different attitudes of the participants and their optional strategies are described by a game model. To address these deficiencies in EIA, a three-party (governments, enterprises and EIA organizations), multi-phase dynamic iterative game model is established, drawing on iterative game theory, dynamic games of incomplete information and perfect Bayesian equilibrium theory, to analyze the reciprocal relations among governments, EIA organizations and enterprises. The results show that in the short run, economic benefit outweighs social benefit: governments and enterprises both prefer not to undertake EIA that would reveal social costs, and since EIA organizations' income comes from enterprises, collusion forms between them to protect economic benefit. In the long run, the social benefit lost through environmental pollution must be recuperated sooner or later, and environmental deterioration will undermine economic benefit, so both governments and enterprises will eventually pursue high social benefit and be willing to undertake EIA, which helps increase private benefit. EIA organizations will make fair assessments when their economic benefits are ensured. At present, the public as silent victims cannot take an actual part in EIA. The EIA system must be improved to break the present three-party equilibrium and bring the public into the equilibrium to exert public supervision.

  10. Pilot analysis of the Motivation Assessment for Team Readiness, Integration, and Collaboration (MATRICx) using Rasch analysis.

    Science.gov (United States)

    Mallinson, Trudy; Lotrecchiano, Gaetano R; Schwartz, Lisa S; Furniss, Jeremy; Leblanc-Beaudoin, Tommy; Lazar, Danielle; Falk-Krzesinski, Holly J

    2016-10-01

    Healthcare services and the production of healthcare knowledge are increasingly dependent on highly functioning, multidisciplinary teams, requiring greater awareness of individuals' readiness to collaborate in translational science teams. Yet there is no comprehensive tool for assessing individual motivations and threats to collaboration that can guide the preparation of individuals for work on well-functioning teams. This prospective pilot study evaluated the preliminary psychometric properties of the Motivation Assessment for Team Readiness, Integration, and Collaboration (MATRICx). We examined 55 items of the MATRICx in a sample of 125 faculty, students and researchers, using contemporary psychometric methods (Rasch analysis). We found that the motivator and threat items formed separate constructs relative to collaboration readiness. Further, respondents who identified themselves as inexperienced at working on collaborative projects defined the motivation construct differently from experienced respondents. These results are consistent with differences in strategic alliances described in the literature; for example, inexperienced respondents reflected features of cooperation and coordination, such as concern with sharing information and compatibility of goals, whereas the more experienced respondents were concerned with issues that reflect a collective purpose, more typical of collaborative alliances. While these different types of alliances are usually described as varying along a continuum, our findings suggest that collaboration might be better thought of as a qualitatively different state than cooperation or coordination. These results need to be replicated in larger samples, but the findings have implications for the development and design of educational interventions that aim to ready scientists and clinicians for greater interdisciplinary work.

  11. Alternative model for assessment administration and analysis: An example from the E-CLASS

    CERN Document Server

    Wilcox, Bethany R; Hobbs, Robert D; Aiken, John M; Welch, Nathan M; Lewandowski, H J

    2016-01-01

    The primary model for dissemination of conceptual and attitudinal assessments that has been used within the physics education research (PER) community is to create a high quality, validated assessment, make it available for a wide range of instructors to use, and provide minimal (if any) support to assist with administration or analysis of the results. Here, we present and discuss an alternative model for assessment dissemination, which is characterized by centralized data collection and analysis. This model also provides a greater degree of support for both researchers and instructors. Specifically, we describe our experiences developing a centralized, automated system for an attitudinal assessment we previously created to examine students' epistemologies and expectations about experimental physics. This system provides a proof-of-concept that we use to discuss the advantages associated with centralized administration and data collection for research-based assessments in PER. We also discuss the challenges t...

  12. How to assess the Efficiency and "Uncertainty" of Global Sensitivity Analysis?

    Science.gov (United States)

    Haghnegahdar, Amin; Razavi, Saman

    2016-04-01

    Sensitivity analysis (SA) is an important paradigm for understanding model behavior, characterizing uncertainty, improving model calibration, etc. Conventional "global" SA (GSA) approaches are rooted in different philosophies, resulting in different and sometimes conflicting and/or counter-intuitive assessments of sensitivity. Moreover, most global sensitivity techniques are highly computationally demanding if they are to generate robust and stable sensitivity metrics over the entire model response surface. Accordingly, a novel sensitivity analysis method called Variogram Analysis of Response Surfaces (VARS) is introduced to overcome these issues. VARS uses the variogram concept to efficiently provide a comprehensive assessment of global sensitivity across a range of scales within the parameter space. Based on the VARS principles, in this study we present ideas for assessing (1) the efficiency of GSA algorithms and (2) the level of confidence we can assign to a sensitivity assessment. We use multiple hydrological models with different levels of complexity to illustrate the new ideas.
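    To illustrate the variogram idea behind VARS, a minimal sketch: for one parameter, estimate the directional variogram gamma(h) = 0.5 * E[(y(x+h) - y(x))^2] of the model response at several perturbation scales h. The two-parameter test function stands in for a real hydrological model and is purely an assumption.

```python
import numpy as np

def model(x):
    """Stand-in response surface with two parameters (an assumption)."""
    return np.sin(3 * x[..., 0]) + 0.5 * x[..., 1] ** 2

rng = np.random.default_rng(5)
base = rng.uniform(0, 1, size=(2000, 2))  # random points in parameter space

# Directional variogram of the response along parameter 1 at several scales.
for h in (0.05, 0.1, 0.2, 0.3):
    shifted = base.copy()
    # Perturb parameter 1 by h (points near the boundary are clipped
    # for simplicity in this sketch).
    shifted[:, 0] = np.clip(shifted[:, 0] + h, 0, 1)
    gamma = 0.5 * np.mean((model(shifted) - model(base)) ** 2)
    print(f"h={h:.2f}  gamma={gamma:.4f}")
```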

  13. Assessing Canadian Bank Branch Operating Efficiency Using Data Envelopment Analysis

    Science.gov (United States)

    Yang, Zijiang

    2009-10-01

    In today's economy and society, performance analyses in the service industries attract more and more attention. This paper presents an evaluation of 240 branches of one large Canadian bank in the Greater Toronto Area using Data Envelopment Analysis (DEA). Special emphasis was placed on how to present the DEA results to management so as to give more guidance on what to manage and how to accomplish the changes. Finally, the potential management uses of the DEA results are presented. All findings are discussed in the context of the Canadian banking market.
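    As one concrete form of the DEA evaluation described above, a minimal sketch of the input-oriented CCR envelopment model solved with scipy.optimize.linprog; the four branches, their inputs (staff, operating cost) and outputs (loans, deposits) are illustrative assumptions, not the paper's data.

```python
import numpy as np
from scipy.optimize import linprog

# Rows = branches (DMUs); inputs: staff, cost; outputs: loans, deposits.
X = np.array([[20., 150.], [30., 200.], [25., 120.], [55., 300.]])    # inputs
Y = np.array([[100., 90.], [130., 120.], [160., 110.], [180., 210.]])  # outputs
n = X.shape[0]

def ccr_efficiency(k):
    """min theta s.t. sum_j lam_j x_j <= theta*x_k and sum_j lam_j y_j >= y_k."""
    c = np.r_[1.0, np.zeros(n)]                     # variables: [theta, lam_1..lam_n]
    A_in = np.c_[-X[k][:, None], X.T]               # lam @ x - theta * x_k <= 0
    A_out = np.c_[np.zeros((Y.shape[1], 1)), -Y.T]  # -lam @ y <= -y_k
    res = linprog(c,
                  A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.r_[np.zeros(X.shape[1]), -Y[k]],
                  bounds=[(0, None)] * (n + 1))
    return res.fun  # theta = 1 means the branch is CCR-efficient

for k in range(n):
    print(f"branch {k}: efficiency = {ccr_efficiency(k):.3f}")
```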

  14. A suite of models to support the quantitative assessment of spread in pest risk analysis

    NARCIS (Netherlands)

    Robinet, C.; Kehlenbeck, H.; Werf, van der W.

    2012-01-01

    In the framework of the EU project PRATIQUE (KBBE-2007-212459, Enhancements of pest risk analysis techniques) a suite of models was developed to support the quantitative assessment of spread in pest risk analysis. This dataset contains the model codes (R language) for the four models in the suite. Three

  15. The Development of a Content Analysis Model for Assessing Students' Cognitive Learning in Asynchronous Online Discussions

    Science.gov (United States)

    Yang, Dazhi; Richardson, Jennifer C.; French, Brian F.; Lehman, James D.

    2011-01-01

    The purpose of this study was to develop and validate a content analysis model for assessing students' cognitive learning in asynchronous online discussions. It adopted a fully mixed methods design, in which qualitative and quantitative methods were employed sequentially for data analysis and interpretation. Specifically, the design was a…

  16. Using Language Sample Analysis to Assess Spoken Language Production in Adolescents

    Science.gov (United States)

    Miller, Jon F.; Andriacchi, Karen; Nockerts, Ann

    2016-01-01

    Purpose: This tutorial discusses the importance of language sample analysis and how Systematic Analysis of Language Transcripts (SALT) software can be used to simplify the process and effectively assess the spoken language production of adolescents. Method: Over the past 30 years, thousands of language samples have been collected from typical…

  17. Alternative model for administration and analysis of research-based assessments

    Science.gov (United States)

    Wilcox, Bethany R.; Zwickl, Benjamin M.; Hobbs, Robert D.; Aiken, John M.; Welch, Nathan M.; Lewandowski, H. J.

    2016-06-01

    Research-based assessments represent a valuable tool for both instructors and researchers interested in improving undergraduate physics education. However, the historical model for disseminating and propagating conceptual and attitudinal assessments developed by the physics education research (PER) community has not resulted in widespread adoption of these assessments within the broader community of physics instructors. Within this historical model, assessment developers create high quality, validated assessments, make them available for a wide range of instructors to use, and provide minimal (if any) support to assist with administration or analysis of the results. Here, we present and discuss an alternative model for assessment dissemination, which is characterized by centralized data collection and analysis. This model provides a greater degree of support for both researchers and instructors in order to more explicitly support adoption of research-based assessments. Specifically, we describe our experiences developing a centralized, automated system for an attitudinal assessment we previously created to examine students' epistemologies and expectations about experimental physics. This system provides a proof of concept that we use to discuss the advantages associated with centralized administration and data collection for research-based assessments in PER. We also discuss the challenges that we encountered while developing, maintaining, and automating this system. Ultimately, we argue that centralized administration and data collection for standardized assessments is a viable and potentially advantageous alternative to the default model characterized by decentralized administration and analysis. Moreover, with the help of online administration and automation, this model can support the long-term sustainability of centralized assessment systems.

  18. Underground Test Area Subproject Phase I Data Analysis Task. Volume VIII - Risk Assessment Documentation Package

    Energy Technology Data Exchange (ETDEWEB)

    None

    1996-12-01

    Volume VIII of the documentation for the Phase I Data Analysis Task performed in support of the current Regional Flow Model, Transport Model, and Risk Assessment for the Nevada Test Site Underground Test Area Subproject contains the risk assessment documentation. Because of the size and complexity of the model area, a considerable quantity of data was collected and analyzed in support of the modeling efforts. The data analysis task was consequently broken into eight subtasks, and descriptions of each subtask's activities are contained in one of the eight volumes that comprise the Phase I Data Analysis Documentation.

  19. Assessment

    Institute of Scientific and Technical Information of China (English)

    Geoff Brindley

    2005-01-01

    Introduction: terminology and key concepts. The term assessment refers to a variety of ways of collecting information on a learner's language ability or achievement. Although testing and assessment are often used interchangeably, the latter is an umbrella term encompassing measurement instruments administered on a 'one-off' basis, such as tests, as well as qualitative methods of monitoring and recording student learning, such as observation, simulations or project work. Assessment is also distinguished from evaluation, which is concerned with the overall language programme and not just with what individual students have learnt. Proficiency assessment refers to the assessment of general language abilities acquired by the learner independent of a course of study. This kind of assessment is often done through the administration of standardised commercial language-proficiency tests. On the other hand, assessment of achievement aims to establish what a student has learned in relation to a particular course or curriculum (and is thus frequently carried out by the teacher). Achievement assessment may be based either on the specific content of the course or on the course objectives (Hughes 1989).

  20. Use Of Risk Analysis Frameworks In Urban Flood Assessments

    DEFF Research Database (Denmark)

    Arnbjerg-Nielsen, Karsten; Madsen, Henrik

    In the period 1960-1990, rapid urban development took place all over Europe, and notably in Denmark urban sprawl occurred around many cities. Favorable economic conditions ensured that the urbanization continued, although at a lower rate, until recently. However, from 1990 to the present an increase in extreme precipitation has been observed, corresponding to an increase of design levels of at least 30%. Analysis of climate change model output has given clear evidence that further increases in extreme precipitation must be expected in the future due to anthropogenic emissions of greenhouse gasses… with better decision support tools. Some of the developments are risk frameworks that encompass economic and/or ethical evaluation of climate change adaptation options and improved risk management. This line of development is based on a societal evaluation of maximizing the outcome for society…

  1. Seismic Assessment of an RC Building Using Pushover Analysis

    Directory of Open Access Journals (Sweden)

    R. A. Hakim

    2014-08-01

    Full Text Available Current research and observations indicate that parts of the Kingdom of Saudi Arabia lie in regions of low to moderate seismicity. Most buildings there were designed only for gravity loads and were poorly detailed to accommodate lateral loads. This study aims to investigate building performance in resisting expected seismic loading. Two 3D frames were investigated using pushover analysis according to ATC-40. One was designed according to a design practice that considers only the gravity load, and the other was designed according to the Saudi Building Code (SBC-301). Results showed that the building designed only for gravity load was inadequate. On the other hand, the building designed according to SBC-301 satisfies the Immediate Occupancy (IO) acceptance criteria of ATC-40.

  2. Jelly pineapple syneresis assessment via univariate and multivariate analysis

    Directory of Open Access Journals (Sweden)

    Carlos Alberto da Silva Ledo

    2010-09-01

    Full Text Available This evaluation of pineapple jelly analyzes the occurrence of syneresis using univariate and multivariate analysis. Pineapple jelly has a low pectin concentration; therefore, high-methoxyl pectin was added at concentrations of 0.50%, 0.75% and 1.00%, corresponding to slow, medium and fast gel formation. The pH, acidity, Brix and syneresis of the jelly were measured. The highest pectin concentration showed the lowest release of water (syneresis). This result shows that 1.00% pectin is necessary to form the gel and obtain a suitable texture.

  3. A Framework for Flood Risk Analysis and Benefit Assessment of Flood Control Measures in Urban Areas

    OpenAIRE

    Li, Chaochao; Cheng, Xiaotao; Li, Na; Du, Xiaohe; Yu, Qian; Kan, Guangyuan

    2016-01-01

    Flood risk analysis is more complex in urban areas than that in rural areas because of their closely packed buildings, different kinds of land uses, and large number of flood control works and drainage systems. The purpose of this paper is to propose a practical framework for flood risk analysis and benefit assessment of flood control measures in urban areas. Based on the concept of disaster risk triangle (hazard, vulnerability and exposure), a comprehensive analysis method and a general proc...

  4. Polygraph Test Results Assessment by Regression Analysis Methods

    Directory of Open Access Journals (Sweden)

    K. A. Leontiev

    2014-01-01

    Full Text Available The paper considers the problem of determining the importance of the questions asked of an examinee during a judicial psychophysiological polygraph examination, using methods of mathematical statistics. It proposes a classification algorithm based on logistic regression as an optimal Bayesian classifier, with weight coefficients for the polygraph-recorded physiological parameters and without assuming independence of the measured signs. Binary classification is performed on the polygraph examination results after preliminary normalization and standardization, a test of the hypothesis that the obtained data are normally distributed, and maximum-likelihood estimation of the linear regression coefficients between inputs and responses. The logistic curve then divides the signs into two classes, "significant" and "insignificant". The efficiency of the model is estimated by means of ROC (Receiver Operating Characteristic) analysis. It is shown that the minimum sample must contain the results of at least 45 measurements. This approach ensures a reliable result provided that the expert polygraphologist possesses sufficient qualification and follows the testing technique.
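    A minimal sketch of the classification step described above (standardize the recorded physiological parameters, fit a logistic regression, evaluate with ROC analysis); the synthetic data, the four feature channels and the coefficients are illustrative assumptions.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(42)
n = 45  # the minimum sample size suggested above

# Four hypothetical physiological channels (e.g. skin response, heart rate,
# respiration, blood pressure) and a binary "significant question" label.
X = rng.normal(size=(n, 4))
y = (X @ np.array([1.2, -0.8, 0.5, 0.0]) + rng.normal(0, 1, n) > 0).astype(int)

model = make_pipeline(StandardScaler(), LogisticRegression())
model.fit(X, y)
scores = model.predict_proba(X)[:, 1]

print("weight coefficients:", model[-1].coef_)  # per-parameter weights
print("ROC AUC:", roc_auc_score(y, scores))     # efficiency via ROC analysis
```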

  5. Job analysis and student assessment tool: perfusion education clinical preceptor.

    Science.gov (United States)

    Riley, Jeffrey B

    2007-09-01

    The perfusion education system centers on the cardiac surgery operating room and the perfusionist teacher who serves as a preceptor for the perfusion student. One method to improve the quality of perfusion education is to create a valid method for perfusion students to give feedback to clinical teachers. The preceptor job analysis consisted of a literature review and interviews with preceptors to list their critical tasks, critical incidents, and cognitive and behavioral competencies. Behaviorally anchored rating traits associated with the preceptors' tasks were identified. Students voted to validate the instrument items. The perfusion instructor rating instrument with a 0-4, "very weak" to "very strong" Likert rating scale was used. The five preceptor traits for student evaluation of clinical instruction (SECI) are as follows: The clinical instructor (1) encourages self-learning, (2) encourages clinical reasoning, (3) meets student's learning needs, (4) gives continuous feedback, and (5) represents a good role model. Scores from 430 student-preceptor relationships for 28 students rotating at 24 affiliate institutions with 134 clinical instructors were evaluated. The mean overall good preceptor average (GPA) was 3.45 +/- 0.76 and was skewed to the left, ranging from 0.0 to 4.0 (median = 3.8). Only 21 of the SECI relationships earned a GPA education program.

  6. Rorschach assessment of traumatized refugees: an exploratory factor analysis.

    Science.gov (United States)

    Opaas, Marianne; Hartmann, Ellen

    2013-01-01

    Fifty-one multitraumatized mental health patients with refugee backgrounds completed the Rorschach (Meyer & Viglione, 2008), Harvard Trauma Questionnaire, and Hopkins Symptom Checklist-25 (Mollica, McDonald, Massagli, & Silove, 2004), and the World Health Organization Quality of Life-BREF questionnaire (WHOQOL Group, 1998) before the start of treatment. The purpose was to gain more in-depth knowledge of an understudied patient group and to provide a prospective basis for later analyses of treatment outcome. Factor analysis of trauma-related Rorschach variables gave 2 components explaining 60% of the variance; the first was interpreted as trauma-related flooding versus constriction and the second as adequate versus impaired reality testing. Component 1 correlated positively with self-reported reexperiencing symptoms of posttraumatic stress (r = .32, p < .05). Component 2 correlated positively with self-reported quality of life in the physical, psychological, and social relationships domains (r = .34, .32, and .35, p < .05), and negatively with anxiety (r = -.33, p < .05). Each component also correlated significantly with resources like work experience, education, and language skills.

  7. Analysis of criteria for performance assessment of hospital organizations' management

    Directory of Open Access Journals (Sweden)

    Fabricio Henrique dos Santos Simões

    2013-06-01

    Hospitals as service providers are undergoing rapid technological change, and in a highly competitive healthcare market, where quality and efficiency are evaluated through quality indicators, such evaluation is inevitable. This research compares performance indicators of hospitals managed under the model of social health organizations with those of directly managed public hospitals in the State of São Paulo. The research is a qualitative literature review that synthesizes the knowledge produced through analysis of the results reported in studies by expert authors. The quality of care is a major concern of health institutions. The literature review found that excellent quality of care requires evaluating the results of the services offered to the user through indicators, which generate information that supports the development of guidelines for health policies. This study shows that monitoring of indicators, with health professionals themselves doing the measuring, enables decision making based on their results; the use of indicators also allows professionals to modify and improve their practice for the quality and efficiency of care.

  8. Desiccant dehumidification and cooling systems assessment and analysis

    Energy Technology Data Exchange (ETDEWEB)

    Collier, R.K. Jr. [Collier Engineering, Reno, NV (United States)

    1997-09-01

    The objective of this report is to provide a preliminary analysis of the principles, sensitivities, and potential for national energy savings of desiccant cooling and dehumidification systems. The report is divided into four sections. Section I deals with the maximum theoretical performance of ideal desiccant cooling systems. Section II looks at the performance effects of non-ideal behavior of system components. Section III examines the effects of outdoor air properties on desiccant cooling system performance. Section IV analyzes the applicability of desiccant cooling systems to reduce primary energy requirements for providing space conditioning in buildings. A basic desiccation process performs no useful work (cooling); that is, a desiccant material drying air is close to an isenthalpic process, merely converting latent energy to sensible energy. Only when heat exchange is applied to the desiccated air is any cooling accomplished. This characteristic is generic to all desiccant cycles and critical to understanding their operation. The analyses of Section I show that desiccant cooling cycles can theoretically achieve extremely high thermal COPs (>2). The general conclusion from Section II is that ventilation air processing is the most viable application for the solid desiccant equipment analyzed. The results of the seasonal simulations performed in Section III indicate that, generally, the seasonal performance of the desiccant system does not change significantly from that predicted for outdoor conditions. Results from Section IV show that all of the candidate desiccant systems can save energy relative to standard vapor-compression systems. The largest energy savings are achieved by the enthalpy exchange device.

  9. FEBEX II Project Post-mortem analysis EDZ assessment

    Energy Technology Data Exchange (ETDEWEB)

    Bazargan Sabet, B.; Shao, H.; Autio, J.; Elorza, F. J.

    2004-07-01

    …variations in the propagation velocities of acoustic waves. A cylindrical block of granite 38.8 cm in diameter and 40 cm high was analysed along 2D transversal sections in six radial directions. Different inverse tomographic strategies were used to analyse the measured data, which showed no evidence of the existence of an EDZ in the FEBEX gallery. However, a preferential direction of wave propagation, similar to the maximum compression direction of the stress tensor, appeared. As for in situ investigations, the hydraulic connectivity of the drift was assessed at eleven locations in the heated area, including the granite matrix and lamprophyre dykes, and at six locations in undisturbed zones. In the granite matrix area, a pulse test using pressurised air with stepwise pressure increase was conducted to determine gas entry pressure. In the fractured area, a constant-flow-rate gas injection test was conducted. Only two locations with higher permeability were detected: one in a natural fracture in the lamprophyre dyke and the other at the interface between lamprophyre and granite. Where numerical investigations are concerned, several analyses of the FEBEX in situ experiment were carried out to determine whether the generation of a potential EDZ in the surrounding rock was possible. Stresses were calculated by a 1D fully coupled thermo-hydromechanical model and by 2D and 3D thermo-mechanical models. Results compared with the available data on the compressive strength of the Grimsel granite show that, in the worst case studied, the state of stresses induced by the excavation and heating phases remains far below the critical curve. (Author)

  10. Interconnectivity among Assessments from Rating Agencies: Using Cluster and Correlation Analysis

    Directory of Open Access Journals (Sweden)

    Jaroslav Krejčíř

    2014-09-01

    Full Text Available The aim of this paper is to determine whether there is a dependency among the assessments of the leading rating agencies. Rating agencies are an important part of the global economy, and great attention has been paid to their activities since the financial crisis of 2007, of which credit rating agencies were identified as one of the main causes. This paper focuses on the existence of mutual interconnectivity among the assessments of three leading rating agencies. The method used is based on cluster analysis, followed by correlation analysis and a test of independence. The credit rating assessments of Greece and Spain were chosen for the determination of this mutual interconnectivity because these are the most discussed euro-area countries. A significant dependence among the assessments of the different rating agencies has been demonstrated.
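    A minimal sketch of the paper's two steps applied to ordinal rating histories: hierarchical clustering of the agencies followed by a Spearman correlation with its independence test; the numeric rating series are hypothetical (lower numbers meaning better grades), not the actual Greek or Spanish ratings.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.stats import spearmanr

# Hypothetical yearly sovereign ratings mapped to numbers (lower = better).
moodys = np.array([3, 4, 6, 9, 13, 14, 15, 15])
sp = np.array([3, 4, 7, 9, 12, 14, 15, 15])
fitch = np.array([3, 5, 6, 10, 13, 13, 15, 14])

# Step 1: hierarchical clustering of the agencies' rating histories.
Z = linkage(np.vstack([moodys, sp, fitch]), method="average")
print("cluster labels:", fcluster(Z, t=2, criterion="maxclust"))

# Step 2: correlation analysis with a test of independence.
rho, p = spearmanr(moodys, sp)
print(f"Spearman rho = {rho:.2f}, p = {p:.4f}")  # small p rejects independence
```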

  11. Predictive Network-Centric Intelligence: Toward a Total-Systems Transformation of Analysis and Assessment

    Science.gov (United States)

    2006-09-15

    …data are required to bring it to full fruition and prove its superiority to the inevitable legions of doubters, as Galileo Galilei could attest. (Winner, 2006 Galileo Award. Two previous Galileo papers have also called for the application of scientific method to intelligence assessment; see Bruce, James: "Dynamic Adaptation".)

  12. SAFETY ANALYSIS AND RISK ASSESSMENT FOR BRIDGES HEALTH MONITORING WITH MONTE CARLO METHODS

    OpenAIRE

    2016-01-01

    With the increasing requirements for building safety in the past few decades, health monitoring and risk assessment of structures have become more and more important. Especially as traffic loads grow heavier, risk assessment for bridges is essential. In this paper we take advantage of Monte Carlo methods to analyze the safety of bridges and to monitor the risk of destructive damage. One main goal of health monitoring is to reduce the risk of unexpected damage to engineered structures.

  13. Contractor Past Performance Information: An Analysis of Assessment Narratives and Objective Ratings

    Science.gov (United States)

    2015-05-01

    Contractor Past Performance Information: An Analysis of Assessment Narratives and Objective Ratings. Rene G. Rendon, Uday Apte, Michael Dixon. Surviving fragments of the abstract report that assessment narratives were compared with their associated objective scores, and that CPARS deficiencies provide less-than-optimal information to the acquisition team that relies on these reports for source…

  14. Risk assessment of Kermanshah gas storage tanks by energy trace and barrier analysis (2014)

    OpenAIRE

    M. Ghanbari Kakavandi; F. Rajati; H. Ashtarian; SY. Hosseini

    2016-01-01

    Background: Despite the financial cost and the loss of millions of lives due to industrial accidents, such accidents are often preventable through risk assessment methods and control measures. Objective: To assess the safety of gas storage tanks in the Kermanshah oil refinery by Energy Trace and Barrier Analysis (ETBA). Methods: This descriptive case study was conducted on the gas storage tanks of the Kermanshah oil refinery. An energy checklist was used to identify energy types. Energy flows were tracked and then, manageme...

  15. Endogenous allergens and compositional analysis in the allergenicity assessment of genetically modified plants.

    Science.gov (United States)

    Fernandez, A; Mills, E N C; Lovik, M; Spoek, A; Germini, A; Mikalsen, A; Wal, J M

    2013-12-01

    Allergenicity assessment of genetically modified (GM) plants is one of the key pillars of the safety assessment process for these products. As part of this evaluation, one concern is to verify that unintended effects (e.g. over-expression of endogenous allergens) relevant to food safety have not occurred due to the genetic modification. Novel technologies are now available and could be used as complementary and/or alternative methods to those based on human sera for the assessment of endogenous allergenicity. In view of these developments, and as a step forward in the allergenicity assessment of GM plants, it is recommended that known endogenous allergens be included in the compositional analysis as additional parameters to be measured.

  16. Sensitivity analysis of a two-dimensional quantitative microbiological risk assessment: keeping variability and uncertainty separated.

    Science.gov (United States)

    Busschaert, Pieter; Geeraerd, Annemie H; Uyttendaele, Mieke; Van Impe, Jan F

    2011-08-01

    The aim of quantitative microbiological risk assessment is to estimate the risk of illness caused by the presence of a pathogen in a food type, and to study the impact of interventions. Because of inherent variability and uncertainty, risk assessments are generally conducted stochastically, and where possible it is advised to characterize variability separately from uncertainty. Sensitivity analysis indicates to which of the input variables the outcome of a quantitative microbiological risk assessment is most sensitive. Although a number of methods exist to apply sensitivity analysis to a risk assessment with probabilistic input variables (such as contamination, storage temperature, storage duration, etc.), it is challenging to perform sensitivity analysis when a risk assessment includes a separate characterization of the variability and uncertainty of the input variables. A procedure is proposed that focuses on the relation between the risk estimates obtained by Monte Carlo simulation and the location of pseudo-randomly sampled input variables within their uncertainty and variability distributions. Within this procedure, two methods are used, an ANOVA-like model and Sobol sensitivity indices, to obtain and compare the impact of the variability and of the uncertainty of all input variables, and of model uncertainty and scenario uncertainty. As a case study, this methodology is applied to a risk assessment estimating the risk of contracting listeriosis through consumption of deli meats.
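    A minimal sketch of the two-dimensional Monte Carlo structure described above, with uncertainty sampled in an outer loop and variability in an inner loop; the exponential dose-response model and all parameter values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(7)
n_unc, n_var = 100, 1000  # outer (uncertainty) and inner (variability) sizes

# Outer loop: uncertain log10 dose-response parameter r.
r_param = rng.normal(-12.0, 0.5, n_unc)

risks = np.empty((n_unc, n_var))
for i in range(n_unc):
    # Inner loop: serving-to-serving variability of the ingested dose (CFU).
    dose = 10 ** rng.normal(2.0, 1.5, n_var)
    # Exponential dose-response model: P(ill) = 1 - exp(-r * dose).
    risks[i] = 1.0 - np.exp(-(10 ** r_param[i]) * dose)

# Each row is one variability distribution; spread across rows is uncertainty.
mean_risk = risks.mean(axis=1)
print("median of the mean risk:", np.median(mean_risk))
print("95% uncertainty interval:", np.percentile(mean_risk, [2.5, 97.5]))
```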

  17. Critical analysis of frameworks and approaches to assess the environmental risks of nanomaterials

    DEFF Research Database (Denmark)

    Grieger, Khara Deanne; Linkov, Igor; Hansen, Steffen Foss;

    Khara D. Grieger, Igor Linkov, Steffen Foss Hansen, Anders Baun (Technical University of Denmark, Kgs. Lyngby, Denmark; Environmental Laboratory, U.S. Army Corps of Engineers, Brookline, USA). Scientists, organizations, governments, and policy-makers are currently involved in reviewing, adapting, and formulating risk assessment frameworks and strategies to understand and assess the potential environmental risks of engineered nanomaterials (NM). It is becoming…

  18. Establishment of a Risk Assessment Framework for Analysis of the Spread of Highly Pathogenic Avian Influenza

    Institute of Scientific and Technical Information of China (English)

    LI Jing; WANG Jing-fei; WU Chun-yan; YANG Yan-tao; JI Zeng-tao; WANG Hong-bin

    2007-01-01

    To evaluate the risk of highly pathogenic avian influenza (HPAI) in mainland China, a risk assessment framework was built. Risk factors were determined by analyzing the epidemic data using the brainstorming method; the analytic hierarchy process was designed to weigh the risk factors, and integrated multicriteria analysis was used to evaluate the final result. The completed framework included the risk factor system, data standards for risk factors, weights of risk factors, and integrated assessment methods. This risk assessment framework can be used to quantitatively analyze the outbreak and spread of HPAI in mainland China.

  19. Application of a Cloud Model-Set Pair Analysis in Hazard Assessment for Biomass Gasification Stations

    Science.gov (United States)

    Yan, Fang; Xu, Kaili

    2017-01-01

    Because a biomass gasification station involves various hazard factors, hazard assessment is needed and significant. In this article, the cloud model (CM) is employed to improve set pair analysis (SPA), and a novel hazard assessment method for biomass gasification stations is proposed based on cloud model-set pair analysis (CM-SPA). In this method, cloud weight is proposed as the index weight. In contrast to the index weights of other methods, cloud weight is expressed by cloud descriptors; hence, the randomness and fuzziness of cloud weight make it effective at reflecting the linguistic variables of experts. Then, the cloud connection degree (CCD) is proposed to replace the connection degree (CD), and the calculation algorithm of CCD is worked out. By utilizing the CCD, the hazard assessment results are expressed by normal clouds reflected by cloud descriptors, and the hazard grade is confirmed by analyzing the cloud descriptors. After that, two biomass gasification stations undergo hazard assessment via CM-SPA and AHP-based SPA, respectively. The comparison of assessment results illustrates that CM-SPA is suitable and effective for the hazard assessment of a biomass gasification station and makes the assessment results more reasonable and scientific. PMID:28076440

  20. Seismic fragility analysis of a nuclear building based on probabilistic seismic hazard assessment and soil-structure interaction analysis

    Energy Technology Data Exchange (ETDEWEB)

    Gonzalez, R.; Ni, S.; Chen, R.; Han, X.M. [CANDU Energy Inc, Mississauga, Ontario (Canada); Mullin, D. [New Brunswick Power, Point Lepreau, New Brunswick (Canada)

    2016-09-15

    Seismic fragility analyses are conducted as part of seismic probabilistic safety assessment (SPSA) for nuclear facilities. Probabilistic seismic hazard assessment (PSHA) has been undertaken for a nuclear power plant in eastern Canada. The Uniform Hazard Spectra (UHS) obtained from the PSHA are characterized by high frequency content, which differs from the original plant design basis earthquake spectral shape. Seismic fragility calculations for the service building of a CANDU 6 nuclear power plant suggest that the high frequency effects of the UHS can be mitigated through site response analysis with site-specific geological conditions and state-of-the-art soil-structure interaction analysis. In this paper, it is shown that by performing a detailed seismic analysis using the latest technology, the conservatism embedded in the original seismic design can be quantified and the seismic capacity of the building in terms of High Confidence of Low Probability of Failure (HCLPF) can be improved. (author)
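
    The HCLPF capacity mentioned in the abstract is conventionally derived from a lognormal fragility model. A minimal sketch of that standard relation follows; the median capacity and logarithmic standard deviations used are illustrative, not values from this plant.

```python
import math

def hclpf(a_m, beta_r, beta_u):
    """High Confidence of Low Probability of Failure capacity (in g).

    a_m: median seismic capacity; beta_r / beta_u: logarithmic standard
    deviations for aleatory randomness and epistemic uncertainty.
    Conventionally: 95% confidence of less than 5% failure probability.
    """
    return a_m * math.exp(-1.645 * (beta_r + beta_u))

# Hypothetical fragility parameters, not values for this plant
print(f"HCLPF = {hclpf(a_m=0.9, beta_r=0.25, beta_u=0.35):.2f} g")
```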

  1. Degradation Assessment and Fault Diagnosis for Roller Bearing Based on AR Model and Fuzzy Cluster Analysis

    Directory of Open Access Journals (Sweden)

    Lingli Jiang

    2011-01-01

    Full Text Available This paper proposes a new approach combining an autoregressive (AR) model and fuzzy cluster analysis for bearing fault diagnosis and degradation assessment. The AR model is an effective approach for extracting fault features, but it is generally applicable to stationary signals, whereas the fault vibration signals of a roller bearing are non-stationary and non-Gaussian. To address this problem, the parameters of the AR model are estimated based on higher-order cumulants. The AR parameters are then taken as feature vectors, and fuzzy cluster analysis is applied to perform classification and pattern recognition. Experimental results show that the proposed method can identify various types and severities of bearing faults. This study is significant for non-stationary and non-Gaussian signal analysis, fault diagnosis and degradation assessment.
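
    To illustrate the feature-extraction step, the sketch below fits AR coefficients by ordinary least squares and returns them as a feature vector. Note that the paper estimates the AR parameters from higher-order cumulants to cope with non-Gaussian signals; least squares is used here only as a simpler stand-in, and the input signal is synthetic.

```python
import numpy as np

def ar_features(signal, order=8):
    """Least-squares AR(p) coefficients used as a feature vector.

    (The paper estimates AR parameters from higher-order cumulants; plain
    least squares is used here only to illustrate feature extraction.)
    """
    x = np.asarray(signal, dtype=float)
    # Column k holds the signal at lag k+1
    X = np.column_stack([x[order - k - 1: len(x) - k - 1] for k in range(order)])
    y = x[order:]
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coeffs

# Synthetic AR(2) process standing in for a vibration record
rng = np.random.default_rng(1)
sig = np.zeros(2050)
for t in range(2, len(sig)):
    sig[t] = 0.6 * sig[t - 1] - 0.3 * sig[t - 2] + rng.standard_normal()
print(np.round(ar_features(sig, order=4), 2))   # expect roughly [0.6, -0.3, 0, 0]
```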

  2. Socio-economic analysis : a tool for assessing the potential of nanotechnologies

    OpenAIRE

    Brignon, Jean-Marc

    2011-01-01

    International audience; Cost-Benefit Analysis (CBA) has a long history, especially in the USA, of being used for the assessment of new regulation, new infrastructure and more recently for new technologies. Under the denomination of Socio-Economic Analysis (SEA), this concept is used in EU safety and environmental regulation, especially for the placing of chemicals on the market (REACh regulation) and the operation of industrial installations (Industrial Emissions Directive). As far as REACh a...

  3. Elusive Critical Elements of Transformative Risk Assessment Practice and Interpretation: Is Alternatives Analysis the Next Step?

    Science.gov (United States)

    Francis, Royce A

    2015-11-01

    This article argues that "game-changing" approaches to risk analysis must focus on "democratizing" risk analysis in the same way that information technologies have democratized access to, and production of, knowledge. This argument is motivated by the author's reading of Goble and Bier's analysis, "Risk Assessment Can Be a Game-Changing Information Technology-But Too Often It Isn't" (Risk Analysis, 2013; 33: 1942-1951), in which living risk assessments are shown to be "game changing" in probabilistic risk analysis. In this author's opinion, Goble and Bier's article focuses on living risk assessment's potential for transforming risk analysis from the perspective of risk professionals-yet, the game-changing nature of information technologies has typically achieved a much broader reach. Specifically, information technologies change who has access to, and who can produce, information. From this perspective, the author argues that risk assessment is not a game-changing technology in the same way as the printing press or the Internet because transformative information technologies reduce the cost of production of, and access to, privileged knowledge bases. The author argues that risk analysis does not reduce these costs. The author applies Goble and Bier's metaphor to the chemical risk analysis context, and in doing so proposes key features that transformative risk analysis technology should possess. The author also discusses the challenges and opportunities facing risk analysis in this context. These key features include: clarity in information structure and problem representation, economical information dissemination, increased transparency to nonspecialists, democratized manufacture and transmission of knowledge, and democratic ownership, control, and interpretation of knowledge. The chemical safety decision-making context illustrates the impact of changing the way information is produced and accessed in the risk context. Ultimately, the author concludes that although

  4. An Interactive Assessment Framework for Visual Engagement: Statistical Analysis of a TEDx Video

    Science.gov (United States)

    Farhan, Muhammad; Aslam, Muhammad

    2017-01-01

    This study aims to assess the visual engagement of the video lectures. This analysis can be useful for the presenter and student to find out the overall visual attention of the videos. For this purpose, a new algorithm and data collection module are developed. Videos can be transformed into a dataset with the help of data collection module. The…

  5. Self-evaluation of assessment programs: A cross-case analysis

    NARCIS (Netherlands)

    Baartman, Liesbeth; Prins, Frans; Kirschner, Paul A.; Van der Vleuten, Cees

    2011-01-01

    Baartman, L. K. J., Prins, F. J., Kirschner, P. A., & Van der Vleuten, C. P. M. (2011). Self-evaluation of assessment programs: A cross-case analysis. Evaluation and Program Planning, 34, 206-216. doi: 10.1016/j.evalprogplan.2011.03.001

  6. Reliability of 1H NMR analysis for assessment of lipid oxidation at frying temperatures

    Science.gov (United States)

    The reliability of a method using 1H NMR analysis for assessment of oil oxidation at a frying temperature was examined. During heating and frying at 180 °C, changes of soybean oil signals in the 1H NMR spectrum including olefinic (5.16-5.30 ppm), bisallylic (2.70-2.88 ppm), and allylic (1.94-2.1...

  7. Software Integration of Life Cycle Assessment and Economic Analysis for Process Evaluation

    DEFF Research Database (Denmark)

    Kalakula, Sawitree; Malakula, Pomthong; Siemanonda, Kitipat

    2013-01-01

    This study is focused on the sustainable process design of bioethanol production from cassava rhizome. The study includes: process simulation, sustainability analysis, economic evaluation and life cycle assessment (LCA). A steady state process simulation is performed to generate a base case desig...

  8. Assessing Model Fit: Caveats and Recommendations for Confirmatory Factor Analysis and Exploratory Structural Equation Modeling

    Science.gov (United States)

    Perry, John L.; Nicholls, Adam R.; Clough, Peter J.; Crust, Lee

    2015-01-01

    Despite the limitations of overgeneralizing cutoff values for confirmatory factor analysis (CFA; e.g., Marsh, Hau, & Wen, 2004), they are still often employed as golden rules for assessing factorial validity in sport and exercise psychology. The purpose of this study was to investigate the appropriateness of using the CFA approach with these…

  9. Further potentials in the joint implementation of life cycle assessment and data envelopment analysis.

    Science.gov (United States)

    Iribarren, Diego; Vázquez-Rowe, Ian; Moreira, María Teresa; Feijoo, Gumersindo

    2010-10-15

    The combined application of Life Cycle Assessment and Data Envelopment Analysis has been recently proposed to provide a tool for the comprehensive assessment of the environmental and operational performance of multiple similar entities. Among the acknowledged advantages of LCA+DEA methodology, eco-efficiency verification and avoidance of average inventories are usually highlighted. However, given the novelty of LCA+DEA methods, a high number of additional potentials remain unexplored. In this sense, there are some features worth detailing, given their potential to enhance LCA performance. Emphasis is laid on the improved interpretation of LCA results through the complementary use of DEA with respect to: (i) super-efficiency analysis to facilitate the selection of reference performers, (ii) inter- and intra-assessments of multiple data sets within any specific sector for benchmarking and trend analysis purposes, (iii) integration of an economic dimension in order to enrich sustainability assessments, and (iv) window analysis to evaluate environmental impact efficiency over a certain period of time. Furthermore, the capability of LCA+DEA methodology to be generally implemented in a wide range of scenarios is discussed. These further potentials are explained and demonstrated via the presentation of brief case studies based on real data sets.
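
    As a sketch of the DEA half of LCA+DEA, the following computes input-oriented CCR efficiencies with scipy, treating hypothetical LCA inventory values (e.g., per-unit energy use and emissions) as DEA inputs and a normalized functional output as the DEA output. The data are invented, and the model choice (CCR rather than, say, BCC or window DEA) is only one of several used in this literature.

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, j0):
    """Input-oriented CCR efficiency of unit j0 (envelopment form).

    X: (m inputs x n units), Y: (s outputs x n units).
    Returns theta in (0, 1]; 1 means the unit is on the efficient frontier.
    """
    m, n = X.shape
    s = Y.shape[0]
    c = np.r_[1.0, np.zeros(n)]                  # minimize theta
    A_in = np.hstack([-X[:, [j0]], X])           # X @ lam <= theta * x0
    A_out = np.hstack([np.zeros((s, 1)), -Y])    # Y @ lam >= y0
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.r_[np.zeros(m), -Y[:, j0]]
    bounds = [(0, None)] * (n + 1)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
    return res.x[0]

# Hypothetical LCA inventory inputs (e.g., energy use, GWP) for 4 units
X = np.array([[4.0, 2.5, 5.0, 3.0],
              [1.2, 0.8, 2.0, 1.1]])
Y = np.array([[1.0, 1.0, 1.0, 1.0]])   # one normalized functional output
print([round(ccr_efficiency(X, Y, j), 3) for j in range(4)])
```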

  10. Language Assessment Impacts in China:a Tentative Analysis of TEM8

    Institute of Scientific and Technical Information of China (English)

    Cui; Yingqiong; Cheng; Hongying

    2015-01-01

    The paper aims to present a tentative analysis of the impacts of language assessment on relevant parties. It starts by discussing the connotation of test impacts and then analyses the impacts of TEM8 on test takers, teachers and society. The importance of such impacts is also revealed in this paper.

  11. Substituted plan analysis in the environmental impact assessment of Yongchuan wastewater treatment project

    Institute of Scientific and Technical Information of China (English)

    FANG Jun-hua

    2006-01-01

    A substituted plan in environmental impact assessment (EIA) mainly refers to the treatment technology and the alternative plant site, and it also includes many kinds of environmental protection measures. This paper makes a detailed analysis of the treatment technology, the alternative plant site, the intended use of the discharged water and the disposal of sludge in the Yongchuan wastewater treatment project.

  12. Computational Psycholinguistic Analysis and Its Application in Psychological Assessment of College Students

    Science.gov (United States)

    Kucera, Dalibor; Havigerová, Jana M.

    2015-01-01

    The paper deals with the issue of computational psycholinguistic analysis (CPA) and its experimental application in basic psychological and pedagogical assessment. CPA is a new method which may potentially provide interesting, psychologically relevant information about the author of a particular text, regardless of the text's factual (semantic)…

  13. Identifying Barriers in Implementing Outcomes-Based Assessment Program Review: A Grounded Theory Analysis

    Science.gov (United States)

    Bresciani, Marilee J.

    2011-01-01

    The purpose of this grounded theory study was to identify the typical barriers encountered by faculty and administrators when implementing outcomes-based assessment program review. An analysis of interviews with faculty and administrators at nine institutions revealed a theory that faculty and administrators' promotion, tenure (if applicable),…

  14. Taxometric Analysis of the Antisocial Features Scale of the Personality Assessment Inventory in Federal Prison Inmates

    Science.gov (United States)

    Walters, Glenn D.; Diamond, Pamela M.; Magaletta, Philip R.; Geyer, Matthew D.; Duncan, Scott A.

    2007-01-01

    The Antisocial Features (ANT) scale of the Personality Assessment Inventory (PAI) was subjected to taxometric analysis in a group of 2,135 federal prison inmates. Scores on the three ANT subscales--Antisocial Behaviors (ANT-A), Egocentricity (ANT-E), and Stimulus Seeking (ANT-S)--served as indicators in this study and were evaluated using the…

  15. A Critical Examination of the Assessment Analysis Capabilities of OCLC ACAS

    Science.gov (United States)

    Lyons, Lucy E.

    2005-01-01

    Over 500 libraries have employed OCLC's iCAS and its successor Automated Collection Assessment and Analysis Services (ACAS) as bibliometric tools to evaluate monograph collections. This examination of ACAS reveals both its methodological limitations and its feasibility as an indicator of collecting patterns. The results can be used to maximize the…

  16. A critical assessment of the calculation and analysis of thermodynamic parameters from adsorption data

    OpenAIRE

    2015-01-01

    Proper analysis of thermodynamic parameters obtained from adsorption data is a basic requirement for the characterization and optimization of an adsorption-dependent process like the action of organic corrosion inhibitors. Thus, this work aims at presenting a critical assessment of typical flawed examples from the literature together with alternative good practice to be considered, for preference.

  17. Assessment of Smolt Condition for Travel Time Analysis, 1993-1994 Annual Report.

    Energy Technology Data Exchange (ETDEWEB)

    Schrock, Robin M; Beeman, John W; VanderKooi, Scott P [US Geological Survey, Western Fisheries Research Center, Columbia River Research Laboratory, Cook, WA

    1999-02-01

    The assessment of smolt condition for travel time analysis (ASCTTA) project provided information on the level of smoltification in Columbia River hatchery and wild salmonid stocks to the Fish Passage Center (FPC), for the primary purpose of in-river management of flows.

  18. MONTHLY VARIATION IN SPERM MOTILITY IN COMMON CARP ASSESSED USING COMPUTER-ASSISTED SPERM ANALYSIS (CASA)

    Science.gov (United States)

    Sperm motility variables from the milt of the common carp Cyprinus carpio were assessed using a computer-assisted sperm analysis (CASA) system across several months (March-August 1992) known to encompass the natural spawning period. Two-year-old pond-raised males obtained each mo...

  19. Investigating Faculty Familiarity with Assessment Terminology by Applying Cluster Analysis to Interpret Survey Data

    Science.gov (United States)

    Raker, Jeffrey R.; Holme, Thomas A.

    2014-01-01

    A cluster analysis was conducted with a set of survey data on chemistry faculty familiarity with 13 assessment terms. Cluster groupings suggest a high, middle, and low overall familiarity with the terminology and an independent high and low familiarity with terms related to fundamental statistics. The six resultant clusters were found to be…

  20. Use of Analog Functional Analysis in Assessing the Function of Mealtime Behavior Problems.

    Science.gov (United States)

    Girolami, Peter A.; Scotti, Joseph R.

    2001-01-01

    This study applied the methodology of an analog experimental (functional) analysis of behavior to the specific interaction between parents and three children with mental retardation exhibiting food refusal and related mealtime problems. Analog results were highly consistent with other forms of functional assessment data, including interviews,…

  1. Local Assessment: Using Genre Analysis to Validate Directed Self-Placement

    Science.gov (United States)

    Gere, Anne Ruggles; Aull, Laura; Escudero, Moises Damian Perales; Lancaster, Zak; Lei, Elizabeth Vander

    2013-01-01

    Grounded in the principle that writing assessment should be locally developed and controlled, this article describes a study that contextualizes and validates the decisions that students make in the modified Directed Self-Placement (DSP) process used at the University of Michigan. The authors present results of a detailed text analysis of…

  2. Assessment of balance in propensity score analysis in the medical literature: A systematic review

    NARCIS (Netherlands)

    Ali, M. Sanni; Groenwold, Rolf H.H.; Belitser, Svetlana V.; Pestman, Wiebe R.; Hoes, Arno W.; Roes, Kit C.B.; Boer, Ade; Klungel, Olaf H.

    2013-01-01

    Background: Assessing balance on covariate distributions between treatment groups with a given propensity score (PS) is a crucial step in PS analysis. Several methodological papers comparing different balance measures have been published in the last decade. However, the current practice on measurin

  3. Dynamic Ecocentric Assessment Combining Emergy and Data Envelopment Analysis: Application to Wind Farms

    Directory of Open Access Journals (Sweden)

    Mario Martín-Gamboa

    2016-01-01

    Full Text Available Most current life-cycle approaches take an anthropocentric standpoint for the evaluation of human-dominated activities. However, this perspective is insufficient when it comes to assessing the contribution of natural resources to production processes. In this respect, emergy analysis evaluates human-driven systems from a donor-side perspective, accounting for the environmental effort performed to make the resources available. This article presents a novel methodological framework, which combines emergy analysis and dynamic Data Envelopment Analysis (DEA) for the ecocentric assessment of multiple resembling entities over an extended period of time. The use of this approach is shown through a case study of wind energy farms. Furthermore, the results obtained are compared with those of previous studies from two different angles. On the one hand, a comparison with results from anthropocentric approaches (combined life cycle assessment and DEA) is drawn. On the other hand, results from similar ecocentric approaches, but without a dynamic model, are also subject to comparison. The combined use of emergy analysis and dynamic DEA is found to be a valid methodological framework for the computation of resource efficiency and the valuation of ecosystem services. It complements traditional anthropocentric assessments while appropriately including relevant time effects.

  4. Explore the Usefulness of Person-Fit Analysis on Large-Scale Assessment

    Science.gov (United States)

    Cui, Ying; Mousavi, Amin

    2015-01-01

    The current study applied the person-fit statistic, lz, to data from a Canadian provincial achievement test to explore the usefulness of conducting person-fit analysis on large-scale assessments. Item parameter estimates were compared before and after the misfitting student responses, as identified by lz, were removed. The…
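
    For reference, the lz statistic applied in the study is the standardized log-likelihood of a response pattern under an item response theory model. A minimal sketch under a two-parameter logistic model, with invented item parameters, follows; large negative values flag aberrant response patterns.

```python
import numpy as np

def lz_statistic(responses, theta, a, b):
    """Standardized log-likelihood person-fit statistic lz (2PL IRT).

    responses: 0/1 item scores; theta: ability estimate;
    a, b: item discrimination and difficulty parameters.
    """
    p = 1 / (1 + np.exp(-a * (theta - b)))
    l0 = np.sum(responses * np.log(p) + (1 - responses) * np.log(1 - p))
    expected = np.sum(p * np.log(p) + (1 - p) * np.log(1 - p))
    variance = np.sum(p * (1 - p) * np.log(p / (1 - p)) ** 2)
    return (l0 - expected) / np.sqrt(variance)

# Hypothetical 10-item test and one examinee with theta = 0.5
rng = np.random.default_rng(2)
a = rng.uniform(0.8, 2.0, 10)
b = rng.normal(0, 1, 10)
u = (rng.random(10) < 1 / (1 + np.exp(-a * (0.5 - b)))).astype(int)
print(round(lz_statistic(u, theta=0.5, a=a, b=b), 2))
```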

  5. Cluster Analysis of Assessment in Anatomy and Physiology for Health Science Undergraduates

    Science.gov (United States)

    Brown, Stephen; White, Sue; Power, Nicola

    2016-01-01

    Academic content common to health science programs is often taught to a mixed group of students; however, content assessment may be consistent for each discipline. This study used a retrospective cluster analysis on such a group, first to identify high and low achieving students, and second, to determine the distribution of students within…

  6. A meta-analysis of the relationship between individual assessments and job performance.

    Science.gov (United States)

    Morris, Scott B; Daisley, Rebecca L; Wheeler, Megan; Boyer, Peggy

    2015-01-01

    Though individual assessments are widely used in selection settings, very little research exists to support their criterion-related validity. A random-effects meta-analysis was conducted of 39 individual assessment validation studies. For the current research, individual assessments were defined as any employee selection procedure that involved (a) multiple assessment methods, (b) administered to an individual examinee, and (c) relying on assessor judgment to integrate the information into an overall evaluation of the candidate's suitability for a job. Assessor recommendations were found to be useful predictors of job performance, although the level of validity varied considerably across studies. Validity tended to be higher for managerial than nonmanagerial occupations and for assessments that included a cognitive ability test. Validity was not moderated by the degree of standardization of the assessment content or by use of multiple assessors for each candidate. However, higher validities were found when the same assessor was used across all candidates than when different assessors evaluated different candidates. These results should be interpreted with caution, given a small number of studies for many of the moderator subgroups as well as considerable evidence of publication bias. These limitations of the available research base highlight the need for additional empirical work to inform individual assessment practices. (PsycINFO Database Record (c) 2015 APA, all rights reserved).
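
    A random-effects meta-analysis of the kind reported here is typically pooled with the DerSimonian-Laird estimator. The sketch below shows that standard computation on invented effect sizes and sampling variances; it does not use the study's data.

```python
import numpy as np

def random_effects_meta(effects, variances):
    """DerSimonian-Laird random-effects pooled estimate.

    effects: per-study effect sizes (e.g., validity coefficients after a
    Fisher z-transform); variances: their sampling variances.
    """
    e, v = np.asarray(effects), np.asarray(variances)
    w = 1 / v                                   # fixed-effect weights
    fixed = np.sum(w * e) / np.sum(w)
    Q = np.sum(w * (e - fixed) ** 2)            # heterogeneity statistic
    C = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (Q - (len(e) - 1)) / C)     # between-study variance
    w_star = 1 / (v + tau2)
    pooled = np.sum(w_star * e) / np.sum(w_star)
    se = np.sqrt(1 / np.sum(w_star))
    return pooled, se, tau2

# Invented study-level effects and variances
print(random_effects_meta([0.31, 0.22, 0.45, 0.18], [0.01, 0.02, 0.015, 0.03]))
```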

  7. Analysis And Assessment Of The Security Method Against Incidental Contamination In The Collective Water Supply System

    Directory of Open Access Journals (Sweden)

    Szpak Dawid

    2015-09-01

    Full Text Available The paper presents the main types of surface water incidental contamination and the security methods against incidental contamination in water sources. An analysis and assessment of the collective water supply system (CWSS) protection against incidental contamination was conducted. Failure Mode and Effects Analysis (FMEA) was used. The FMEA method allows analysis of the product or process, identification of weak points, and implementation of corrections and new solutions to eliminate the sources of undesirable events. The developed methodology is shown in an application case. It was found that the risk of water contamination in the water-pipe network of the analyzed CWSS caused by incidental contamination of the water source is at a controlled level.
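
    FMEA prioritization usually reduces to ranking failure modes by their risk priority number (RPN), the product of severity, occurrence, and detection ratings. A minimal sketch with invented failure modes and 1-10 ratings:

```python
# Minimal RPN (risk priority number) bookkeeping for an FMEA worksheet;
# the failure modes and ratings below are hypothetical.
failure_modes = [
    # (description, severity, occurrence, detection)
    ("contaminant enters at intake", 9, 3, 4),
    ("late detection by monitoring", 7, 4, 6),
    ("operator alerting fails",      8, 2, 3),
]

for desc, s, o, d in sorted(failure_modes, key=lambda m: -(m[1] * m[2] * m[3])):
    print(f"RPN={s * o * d:4d}  {desc}")
```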

  8. Novel approach to continuous adventitious respiratory sound analysis for the assessment of bronchodilator response

    Science.gov (United States)

    Fiz, José Antonio; Martínez-Rivera, Carlos; Torrents, Aurora; Ruiz-Manzano, Juan; Jané, Raimon

    2017-01-01

    Background A thorough analysis of continuous adventitious sounds (CAS) can provide distinct and complementary information about bronchodilator response (BDR), beyond that provided by spirometry. Nevertheless, previous approaches to CAS analysis were limited by certain methodology issues. The aim of this study is to propose a new integrated approach to CAS analysis that contributes to improving the assessment of BDR in clinical practice for asthma patients. Methods Respiratory sounds and flow were recorded in 25 subjects, including 7 asthma patients with positive BDR (BDR+), assessed by spirometry, 13 asthma patients with negative BDR (BDR-), and 5 controls. A total of 5149 acoustic components were characterized using the Hilbert spectrum, and used to train and validate a support vector machine classifier, which distinguished acoustic components corresponding to CAS from those corresponding to other sounds. Once the method was validated, BDR was assessed in all participants by CAS analysis, and compared to BDR assessed by spirometry. Results BDR+ patients had a homogeneous, high change in the number of CAS after bronchodilation, which agreed with the positive BDR by spirometry, indicating high reversibility of airway obstruction. Nevertheless, we also found an appreciable change in the number of CAS in many BDR- patients, revealing alterations in airway obstruction that were not detected by spirometry. We propose a categorization for the change in the number of CAS, which allowed us to stratify BDR- patients into three consistent groups. Of the 13 BDR- patients, 6 had a high response, similar to BDR+ patients, 4 had a noteworthy medium response, and 1 had a low response. Conclusions In this study, a new non-invasive and integrated approach to CAS analysis is proposed as a highly sensitive tool for assessing BDR in terms of acoustic parameters which, together with spirometry parameters, contribute to improving the stratification of BDR levels in patients with
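
    The classification step described above can be sketched as follows: simple envelope and instantaneous-frequency features are extracted from the analytic (Hilbert-transformed) signal and fed to a support vector machine. The synthetic "wheeze" and "noise" signals and the two toy features are illustrative only; the study characterizes components with the full Hilbert spectrum.

```python
import numpy as np
from scipy.signal import hilbert
from sklearn.svm import SVC

def component_features(x, fs):
    """Toy acoustic features from the analytic signal: mean envelope
    amplitude and mean instantaneous frequency."""
    analytic = hilbert(x)
    envelope = np.abs(analytic)
    inst_freq = np.diff(np.unwrap(np.angle(analytic))) * fs / (2 * np.pi)
    return [envelope.mean(), inst_freq.mean()]

rng = np.random.default_rng(3)
fs, t = 5000, np.arange(0, 0.2, 1 / 5000)
# Tonal, CAS-like components vs. broadband "other sound" components
wheeze = [np.sin(2 * np.pi * rng.uniform(200, 600) * t) for _ in range(40)]
noise = [rng.standard_normal(t.size) * 0.5 for _ in range(40)]

X = np.array([component_features(x, fs) for x in wheeze + noise])
y = np.array([1] * 40 + [0] * 40)
clf = SVC(kernel="rbf").fit(X, y)
print("training accuracy:", clf.score(X, y))
```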

  9. Application of synthetic principal component analysis model to mine area farmland heavy metal pollution assessment

    Institute of Scientific and Technical Information of China (English)

    WANG Cong-lu; WU Chao; WANG Wei-jun

    2008-01-01

    Referring to GB5618-1995 on heavy metal pollution and using the statistical analysis software SPSS, the major pollutants responsible for mine area farmland heavy metal pollution were identified by variable clustering analysis. Assessment and classification of the mine area farmland heavy metal pollution situation were done by synthetic principal components analysis (PCA). The results show that variable clustering analysis is efficient in identifying the principal components of mine area farmland heavy metal pollution. Sorting and clustering were done on the synthetic principal component scores of the soil samples, which are given by synthetic principal components analysis. The data structure of soil heavy metal contamination, the relationships among soil samples, and the pollution levels of different soil samples are thereby discovered. The results of mine area farmland heavy metal pollution quality assessed and classified with synthetic component scores reflect the influence of both the major and compound heavy metal pollutants. The identification and assessment results can provide reference and guidance for proposing control measures of mine area farmland heavy metal pollution and for focusing on the key treatment region.
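
    A minimal sketch of the synthetic PCA scoring idea, on invented soil data: samples are standardized, projected onto principal components, and ranked by a composite score that weights each component score by its explained-variance ratio. The exact weighting scheme in the paper may differ.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Hypothetical soil samples x heavy-metal concentrations (mg/kg)
rng = np.random.default_rng(4)
metals = rng.lognormal(mean=2.0, sigma=0.6, size=(30, 6))   # 30 samples, 6 metals

Z = StandardScaler().fit_transform(metals)
pca = PCA().fit(Z)
scores = pca.transform(Z)

# Synthetic score: component scores weighted by explained-variance ratios
weights = pca.explained_variance_ratio_
synthetic = scores @ weights

ranking = np.argsort(synthetic)[::-1]        # most-polluted samples first
print("top 5 samples by synthetic PCA score:", ranking[:5])
```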

  10. Socio-economic analysis: a tool for assessing the potential of nanotechnologies

    Science.gov (United States)

    Brignon, Jean-Marc

    2011-07-01

    Cost-Benefit Analysis (CBA) has a long history, especially in the USA, of being used for the assessment of new regulation, new infrastructure and, more recently, new technologies. Under the denomination of Socio-Economic Analysis (SEA), this concept is used in EU safety and environmental regulation, especially for the placing of chemicals on the market (REACh regulation) and the operation of industrial installations (Industrial Emissions Directive). As far as REACh and other EU legislation apply specifically to nanomaterials in the future, SEA might become an important assessment tool for nanotechnologies. The most important asset of SEA regarding nanomaterials is the comparison with alternatives in socio-economic scenarios, which is key to understanding how a nanomaterial "socially" performs in comparison with its alternatives. "Industrial economics" methods should be introduced in SEAs to make industry and the regulator share common concepts and visions about the economic competitiveness implications of regulating nanotechnologies. SEA and Life Cycle Analysis (LCA) can complement each other: socio-economic LCA is increasingly seen as a complete assessment tool for nanotechnologies, but the perspectives of Social LCA and SEA are different, and the respective merits and limitations of both approaches should be kept in mind. SEA is a "pragmatic regulatory impact analysis" that uses a cost/benefit analysis framework but remains open to other disciplines than economics, and open to the participation of stakeholders in the construction of scenarios of the deployment of technologies and the identification of alternatives. SEA is "pragmatic" in the sense that it is driven by the purpose of assessing "what happens" with the introduction of nanotechnology, and uses methodologies such as Life Cycle Analysis only as far as they really contribute to that goal. We think that, being pragmatic, SEA is also adaptive, which is a key quality for handling the novelty of

  11. Strategic Environmental Assessment Framework for Landscape-Based, Temporal Analysis of Wetland Change in Urban Environments

    Science.gov (United States)

    Sizo, Anton; Noble, Bram F.; Bell, Scott

    2016-03-01

    This paper presents and demonstrates a spatial framework for the application of strategic environmental assessment (SEA) in the context of change analysis for urban wetland environments. The proposed framework is focused on two key stages of the SEA process: scoping and environmental baseline assessment. These stages are arguably the most information-intense phases of SEA and have a significant effect on the quality of the SEA results. The study aims to meet the needs for proactive frameworks to assess and protect wetland habitat and services more efficiently, toward the goal of advancing more intelligent urban planning and development design. The proposed framework, adopting geographic information system and remote sensing tools and applications, supports the temporal evaluation of wetland change and sustainability assessment based on landscape indicator analysis. The framework was applied to a rapidly developing urban environment in the City of Saskatoon, Saskatchewan, Canada, analyzing wetland change and land-use pressures from 1985 to 2011. The SEA spatial scale was rescaled from administrative urban planning units to an ecologically meaningful area. Landscape change assessed was based on a suite of indicators that were subsequently rolled up into a single, multi-dimensional, and easy to understand and communicate index to examine the implications of land-use change for wetland sustainability. The results show that despite the recent extremely wet period in the Canadian prairie region, land-use change contributed to increasing threats to wetland sustainability.

  13. Bibliometric analysis of global environmental assessment research in a 20-year period

    Energy Technology Data Exchange (ETDEWEB)

    Li, Wei, E-mail: weili@bnu.edu.cn; Zhao, Yang

    2015-01-15

    Based on a sample of 113,468 publications on environmental assessment (EA) from the past 20 years, we used a bibliometric analysis to study the literature in terms of trends of growth, subject categories and journals, international collaboration, geographic distribution of publications, and scientific research issues. By applying thresholds to network centralities, a core group of countries can be distinguished as part of the international collaboration network. An analysis of frequently used keywords found that the priority in assessment would gradually change from project environmental impact assessment (EIA) to strategic environmental assessment (SEA). Decision-theoretic approaches (e.g., environmental indicator selection, life cycle assessment), along with new technologies and methods (e.g., geographic information systems and modeling), have been widely applied in the EA research field over the past 20 years. Hot spots such as "biodiversity" and "climate change" have been emphasized in current EA research, a trend that will likely continue in the future. The h-index has been used to evaluate the research quality among countries all over the world, while the improvement of developing countries' EA systems is becoming a popular research topic. Our study reveals patterns in scientific outputs and academic collaborations and serves as an alternative and innovative way of revealing global research trends in the EA research field.

  14. Complex health care interventions: Characteristics relevant for ethical analysis in health technology assessment.

    Science.gov (United States)

    Lysdahl, Kristin Bakke; Hofmann, Bjørn

    2016-01-01

    Complexity entails methodological challenges in assessing health care interventions. In order to address these challenges, a series of characteristics of complexity have been identified in the Health Technology Assessment (HTA) literature. These characteristics are primarily identified and developed to facilitate effectiveness, safety, and cost-effectiveness analysis. However, ethics is also a constitutive part of HTA, and it is not given that the conceptions of complexity that appears relevant for effectiveness, safety, and cost-effectiveness analysis are also relevant and directly applicable for ethical analysis in HTA. The objective of this article is therefore to identify and elaborate a set of key characteristics of complex health care interventions relevant for addressing ethical aspects in HTA. We start by investigating the relevance of the characteristics of complex interventions, as defined in the HTA literature. Most aspects of complexity found to be important when assessing effectiveness, safety, and efficiency turn out also to be relevant when assessing ethical issues of a given health technology. However, the importance and relevance of the complexity characteristics may differ when addressing ethical issues rather than effectiveness. Moreover, the moral challenges of a health care intervention may themselves contribute to the complexity. After identifying and analysing existing conceptions of complexity, we synthesise a set of five key characteristics of complexity for addressing ethical aspects in HTA: 1) multiple and changing perspectives, 2) indeterminate phenomena, 3) uncertain causality, 4) unpredictable outcome, and 5) ethical complexity. This may serve as an analytic tool in addressing ethical issues in HTA of complex interventions.

  15. Sustainable development in the building industry: an analysis and assessment tool for design of disassembly

    Science.gov (United States)

    Graubner, Carl-Alexander; Reiche, Katja

    2001-02-01

    Ecologically Sustainable Development (ESD) has been embraced by governments worldwide, and as building plays a key role in development, it is implicated in this movement. Consideration of the whole life cycle of a building is a major aspect when assessing its sustainability. While the reduction of operating energy and the optimization of building material selection have been a main focus of research in Europe, the consideration of maintenance during operation or the demolition of a building at the end of its life has usually been neglected. Aiming for sustainability, the conservation of materials and energy must be realized by applying a closed-system approach on a long-term time scale. Therefore building materials are to be recycled, building elements are to be reused, and buildings are to be more flexible. Designing to facilitate the disassembly of building elements is expected to be an improvement for sustainable buildings. A tool for the assessment of building elements has been developed that focuses on connection selection, its influence on material and energy flow, as well as the quality of building waste materials. The assessment of material production and erection processes, using Life Cycle Assessment, is completed with a qualitative/quantitative classification of demolition processes and disposal scenarios, considering environmental, economic and technical aspects. An analysis of floor elements has confirmed that Design for Disassembly is very promising for the improvement of sustainable buildings but that improvement potentials can differ considerably. Details of the analysis tool developed and an analysis of building elements are shown in this article.

  16. Evaluation of SOVAT: An OLAP-GIS decision support system for community health assessment data analysis

    Directory of Open Access Journals (Sweden)

    Parmanto Bambang

    2008-06-01

    Full Text Available Abstract Background Data analysis in community health assessment (CHA) involves the collection, integration, and analysis of large numerical and spatial data sets in order to identify health priorities. Geographic Information Systems (GIS) enable management and analysis of spatial data, but have limitations in performing analysis of numerical data because of their traditional database architecture. On-Line Analytical Processing (OLAP) is a multidimensional data warehouse designed to facilitate querying of large numerical data sets. Coupling the spatial capabilities of GIS with the numerical analysis of OLAP might enhance CHA data analysis. OLAP-GIS systems have been developed by university researchers and corporations, yet their potential for CHA data analysis is not well understood. To evaluate the potential of an OLAP-GIS decision support system for CHA problem solving, we compared OLAP-GIS to the standard information technology (IT) currently used by many public health professionals. Methods SOVAT, an OLAP-GIS decision support system developed at the University of Pittsburgh, was compared against current IT for data analysis for CHA. For this study, current IT was considered the combined use of SPSS and GIS ("SPSS-GIS"). Graduate students, researchers, and faculty in the health sciences at the University of Pittsburgh were recruited. Each round consisted of: an instructional video of the system being evaluated, two practice tasks, five assessment tasks, and one post-study questionnaire. Objective and subjective measurement included: task completion time, success in answering the tasks, and system satisfaction. Results Thirteen individuals participated. Inferential statistics were analyzed using linear mixed model analysis. SOVAT differed statistically significantly (α = .01) from SPSS-GIS for satisfaction and time. Conclusion Using SOVAT, tasks were completed more efficiently, with a higher rate of success, and with greater satisfaction, than the

  17. Rasch analysis in the development of a rating scale for assessment of mobility after stroke

    DEFF Research Database (Denmark)

    Engberg, A; Garde, B; Kreiner, S

    1995-01-01

    The study describes the development of a rating scale for assessment of mobility after stroke. It was based on 74 first-stroke patients, 40 men and 34 women, each assessed three times during rehabilitation. Their median age was 69 years, and they represented all degrees of severity of paresis. Co...... in the 10-item subscores; 3) the score sum is independent of age, side of hemiparesis, and gender of the patient. Latent trait analysis (Rasch) was found to be an ideal model for statistical investigation of these properties....

  18. An Analysis Of Tensile Test Results to Assess the Innovation Risk for an Additive Manufacturing Technology

    Directory of Open Access Journals (Sweden)

    Adamczak Stanisław

    2015-03-01

    Full Text Available The aim of this study was to assess the innovation risk for an additive manufacturing process. The analysis was based on the results of static tensile tests obtained for specimens made of photocured resin. The assessment involved analyzing the measurement uncertainty by applying the FMEA method. The structure of the causes and effects of the discrepancies was illustrated using the Ishikawa diagram. The risk priority numbers were calculated. The uncertainty of the tensile test measurement was determined for three printing orientations. The results suggest that the material used to fabricate the tensile specimens shows clear anisotropy of the properties in relation to the printing direction.

  19. Structural integrity assessment by using finite element analysis based on damage mechanics

    Energy Technology Data Exchange (ETDEWEB)

    Oh, Chang Sik; Kim, Nak Hyun; Kim, Yun Jae [Korea University, Seoul (Korea, Republic of)

    2009-07-01

    This paper introduces structural integrity assessment using finite element analysis based on damage mechanics. Several FE damage models, such as the GTN model, have been proposed up to the present. These damage models have their own advantages and disadvantages, so it is important to select the proper damage model for the integrity assessment of the structure of interest. In this paper, several selected damage models are applied to simulate the fracture behaviour of structures with various geometries, and the FE results are compared with the experimental results. These models are implemented in the general-purpose FE program ABAQUS via user-defined subroutines.
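
    For reference, the yield surface of the GTN (Gurson-Tvergaard-Needleman) model named in the abstract is commonly written as below, where σ_eq is the von Mises equivalent stress, σ_m the mean (hydrostatic) stress, σ_y the matrix yield stress, f* the effective void volume fraction, and q1, q2, q3 fitting parameters. The notation follows the standard damage-mechanics literature rather than this paper's specific formulation.

```latex
\Phi = \left(\frac{\sigma_{\mathrm{eq}}}{\sigma_{y}}\right)^{2}
     + 2\,q_{1} f^{*} \cosh\!\left(\frac{3\,q_{2}\,\sigma_{m}}{2\,\sigma_{y}}\right)
     - \left(1 + q_{3}\,{f^{*}}^{2}\right) = 0
```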

  20. How does scientific risk assessment of GM crops fit within the wider risk analysis?

    Science.gov (United States)

    Johnson, Katy L; Raybould, Alan F; Hudson, Malcolm D; Poppy, Guy M

    2007-01-01

    The debate concerning genetically modified crops illustrates confusion between the role of scientists and that of wider society in regulatory decision making. We identify two fundamental misunderstandings, which, if rectified, would allow progress with confidence. First, scientific risk assessment needs to test well-defined hypotheses, not simply collect data. Second, risk assessments need to be placed in the wider context of risk analysis to enable the wider 'non-scientific' questions to be considered in regulatory decision making. Such integration and understanding is urgently required because the challenges to regulation will escalate as scientific progress advances.

  1. Modification of ACAP for use in accuracy assessment of safety analysis software at OPG

    Energy Technology Data Exchange (ETDEWEB)

    Popescu, A.I.; Pascoe, J.; Luxat, J.C. [Ontario Power Generation Inc., Nuclear Safety Technology Dept., Toronto, Ontario (Canada)

    2002-07-01

    ACAP (Automated Code Assessment Program) is a software tool designed to compare either code results with experimental measurements or the results of two codes, by means of figures of merit (FOM). The FOM are based on equations used in approximation theory, time-series data analysis, and statistical analysis, and provide an objective assessment of the agreement between individual or suite comparisons. This paper describes new ACAP features and FOM developed and implemented at OPG. These modifications were made to increase productivity and to enable ACAP to quantify the accuracy of SA software in support of the OPG Software Qualification process. The capabilities added to ACAP are focused on data spectral analysis and normalcy assessment of the residual value distribution. The latter functionality is provided by new FOM that compare the distribution of measured-minus-computed data with a normal distribution using the normal probability plot method. Other new features implemented at OPG relate to linkage with other Windows applications, improvement of the plot engine (power spectra graphics, Q-Q plots), etc. Application of ACAP to quantify the accuracy of the GOTHIC code's modeling of buoyancy-induced gas mixing using the LSGMF tests is described. Analyses of 170 experiment/code prediction comparisons using ACAP indicate that it is well suited for qualification of accuracy for software used in Safety Analysis. (author)
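
    One of the normalcy FOM described above can be sketched as the correlation coefficient of a normal probability (Q-Q) plot of the measured-minus-computed residuals; values near 1.0 indicate approximately normal residuals. The data below are synthetic, and the exact FOM definition used in ACAP may differ.

```python
import numpy as np
from scipy import stats

def normality_fom(measured, computed):
    """Figure of merit from a normal probability (Q-Q) plot of residuals:
    the correlation of the ordered residuals with normal quantiles."""
    residuals = np.asarray(measured) - np.asarray(computed)
    (osm, osr), (slope, intercept, r) = stats.probplot(residuals, dist="norm")
    return r

# Synthetic "experiment" vs. "code prediction" with small Gaussian error
rng = np.random.default_rng(5)
computed = np.sin(np.linspace(0, 6, 200))
measured = computed + rng.normal(0, 0.05, 200)
print(f"Q-Q correlation FOM: {normality_fom(measured, computed):.4f}")
```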

  2. A multicriteria decision analysis model and risk assessment framework for carbon capture and storage.

    Science.gov (United States)

    Humphries Choptiany, John Michael; Pelot, Ronald

    2014-09-01

    Multicriteria decision analysis (MCDA) has been applied to various energy problems to incorporate a variety of qualitative and quantitative criteria, usually spanning environmental, social, engineering, and economic fields. MCDA and associated methods such as life-cycle assessments and cost-benefit analysis can also include risk analysis to address uncertainties in criteria estimates. One technology now being assessed to help mitigate climate change is carbon capture and storage (CCS). CCS is a new process that captures CO2 emissions from fossil-fueled power plants and injects them into geological reservoirs for storage. It presents a unique challenge to decisionmakers (DMs) due to its technical complexity, range of environmental, social, and economic impacts, variety of stakeholders, and long time spans. The authors have developed a risk assessment model using a MCDA approach for CCS decisions such as selecting between CO2 storage locations and choosing among different mitigation actions for reducing risks. The model includes uncertainty measures for several factors, utility curve representations of all variables, Monte Carlo simulation, and sensitivity analysis. This article uses a CCS scenario example to demonstrate the development and application of the model based on data derived from published articles and publicly available sources. The model allows high-level DMs to better understand project risks and the tradeoffs inherent in modern, complex energy decisions.
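
    A minimal sketch of the Monte Carlo MCDA step, with invented utility distributions and criterion weights for two hypothetical storage sites: each simulated draw yields a weighted-sum utility per site, and the outranking probability summarizes the comparison under uncertainty.

```python
import numpy as np

rng = np.random.default_rng(6)
n = 10_000

# Two hypothetical CO2 storage sites scored on three criteria (0-1 utilities)
# with uncertainty expressed as beta distributions (parameters illustrative)
sites = {
    "site A": [rng.beta(8, 2, n), rng.beta(5, 5, n), rng.beta(6, 3, n)],
    "site B": [rng.beta(6, 3, n), rng.beta(7, 2, n), rng.beta(4, 5, n)],
}
weights = np.array([0.5, 0.3, 0.2])   # stakeholder criterion weights

# Weighted-sum utility per Monte Carlo draw, then an outranking probability
totals = {name: weights @ np.vstack(u) for name, u in sites.items()}
prob_a_best = np.mean(totals["site A"] > totals["site B"])
print(f"P(site A outranks site B) = {prob_a_best:.2f}")
```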

  3. A Framework for Flood Risk Analysis and Benefit Assessment of Flood Control Measures in Urban Areas.

    Science.gov (United States)

    Li, Chaochao; Cheng, Xiaotao; Li, Na; Du, Xiaohe; Yu, Qian; Kan, Guangyuan

    2016-08-05

    Flood risk analysis is more complex in urban areas than that in rural areas because of their closely packed buildings, different kinds of land uses, and large number of flood control works and drainage systems. The purpose of this paper is to propose a practical framework for flood risk analysis and benefit assessment of flood control measures in urban areas. Based on the concept of disaster risk triangle (hazard, vulnerability and exposure), a comprehensive analysis method and a general procedure were proposed for urban flood risk analysis. Urban Flood Simulation Model (UFSM) and Urban Flood Damage Assessment Model (UFDAM) were integrated to estimate the flood risk in the Pudong flood protection area (Shanghai, China). S-shaped functions were adopted to represent flood return period and damage (R-D) curves. The study results show that flood control works could significantly reduce the flood risk within the 66-year flood return period and the flood risk was reduced by 15.59%. However, the flood risk was only reduced by 7.06% when the flood return period exceeded 66-years. Hence, it is difficult to meet the increasing demands for flood control solely relying on structural measures. The R-D function is suitable to describe the changes of flood control capacity. This frame work can assess the flood risk reduction due to flood control measures, and provide crucial information for strategy development and planning adaptation.
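
    The S-shaped return-period/damage (R-D) relation can be sketched with a logistic curve, and the expected annual damage obtained by integrating damage over annual exceedance probability. The parameters below are illustrative, not the fitted Pudong values.

```python
import numpy as np

def damage(T, d_max=1e9, T_mid=66.0, k=0.08):
    """S-shaped return-period/damage (R-D) curve in logistic form.
    (Parameters are illustrative, not the paper's fitted values.)"""
    return d_max / (1 + np.exp(-k * (T - T_mid)))

# Expected annual damage: integrate damage over exceedance probability p = 1/T
T = np.logspace(0, 3, 500)            # return periods of 1..1000 years
p = 1.0 / T                           # annual exceedance probabilities
ead = -np.trapz(damage(T), p)         # p decreases as T grows, hence the sign
print(f"expected annual damage ~ {ead:.3e} (monetary units)")
```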

  4. Using Two Different Approaches to Assess Dietary Patterns: Hypothesis-Driven and Data-Driven Analysis

    Directory of Open Access Journals (Sweden)

    Ágatha Nogueira Previdelli

    2016-09-01

    Full Text Available The use of dietary patterns to assess dietary intake has become increasingly common in nutritional epidemiology studies due to the complexity and multidimensionality of the diet. Currently, two main approaches have been widely used to assess dietary patterns: data-driven and hypothesis-driven analysis. Since the methods explore different angles of dietary intake, using both approaches simultaneously might yield complementary and useful information; thus, we aimed to use both approaches to gain knowledge of adolescents' dietary patterns. Food intake from a cross-sectional survey with 295 adolescents was assessed by 24 h dietary recall (24HR). In hypothesis-driven analysis, based on the American National Cancer Institute method, the usual intake of Brazilian Healthy Eating Index Revised components was estimated. In the data-driven approach, the usual intake of foods/food groups was estimated by the Multiple Source Method. In the results, hypothesis-driven analysis showed low scores for Whole grains, Total vegetables, Total fruit and Whole fruits, while, in data-driven analysis, fruits and whole grains were not presented in any pattern. High intakes of sodium, fats and sugars were observed in hypothesis-driven analysis, with low total scores for Sodium, Saturated fat and SoFAA (calories from solid fat, alcohol and added sugar) components in agreement, while the data-driven approach showed the intake of several foods/food groups rich in these nutrients, such as butter/margarine, cookies, chocolate powder, whole milk, cheese, processed meat/cold cuts and candies. In this study, using both approaches at the same time provided consistent and complementary information with regard to assessing the overall dietary habits that will be important in order to drive public health programs, and improve their efficiency to monitor and evaluate the dietary patterns of populations.

  5. Using Two Different Approaches to Assess Dietary Patterns: Hypothesis-Driven and Data-Driven Analysis

    Science.gov (United States)

    Previdelli, Ágatha Nogueira; de Andrade, Samantha Caesar; Fisberg, Regina Mara; Marchioni, Dirce Maria

    2016-01-01

    The use of dietary patterns to assess dietary intake has become increasingly common in nutritional epidemiology studies due to the complexity and multidimensionality of the diet. Currently, two main approaches have been widely used to assess dietary patterns: data-driven and hypothesis-driven analysis. Since the methods explore different angles of dietary intake, using both approaches simultaneously might yield complementary and useful information; thus, we aimed to use both approaches to gain knowledge of adolescents' dietary patterns. Food intake from a cross-sectional survey with 295 adolescents was assessed by 24 h dietary recall (24HR). In hypothesis-driven analysis, based on the American National Cancer Institute method, the usual intake of Brazilian Healthy Eating Index Revised components was estimated. In the data-driven approach, the usual intake of foods/food groups was estimated by the Multiple Source Method. In the results, hypothesis-driven analysis showed low scores for Whole grains, Total vegetables, Total fruit and Whole fruits, while, in data-driven analysis, fruits and whole grains were not presented in any pattern. High intakes of sodium, fats and sugars were observed in hypothesis-driven analysis, with low total scores for Sodium, Saturated fat and SoFAA (calories from solid fat, alcohol and added sugar) components in agreement, while the data-driven approach showed the intake of several foods/food groups rich in these nutrients, such as butter/margarine, cookies, chocolate powder, whole milk, cheese, processed meat/cold cuts and candies. In this study, using both approaches at the same time provided consistent and complementary information with regard to assessing the overall dietary habits that will be important in order to drive public health programs, and improve their efficiency to monitor and evaluate the dietary patterns of populations. PMID:27669289

  6. Using Two Different Approaches to Assess Dietary Patterns: Hypothesis-Driven and Data-Driven Analysis.

    Science.gov (United States)

    Previdelli, Ágatha Nogueira; de Andrade, Samantha Caesar; Fisberg, Regina Mara; Marchioni, Dirce Maria

    2016-09-23

    The use of dietary patterns to assess dietary intake has become increasingly common in nutritional epidemiology studies due to the complexity and multidimensionality of the diet. Currently, two main approaches have been widely used to assess dietary patterns: data-driven and hypothesis-driven analysis. Since the methods explore different angles of dietary intake, using both approaches simultaneously might yield complementary and useful information; thus, we aimed to use both approaches to gain knowledge of adolescents' dietary patterns. Food intake from a cross-sectional survey with 295 adolescents was assessed by 24 h dietary recall (24HR). In hypothesis-driven analysis, based on the American National Cancer Institute method, the usual intake of Brazilian Healthy Eating Index Revised components was estimated. In the data-driven approach, the usual intake of foods/food groups was estimated by the Multiple Source Method. In the results, hypothesis-driven analysis showed low scores for Whole grains, Total vegetables, Total fruit and Whole fruits, while, in data-driven analysis, fruits and whole grains were not presented in any pattern. High intakes of sodium, fats and sugars were observed in hypothesis-driven analysis, with low total scores for Sodium, Saturated fat and SoFAA (calories from solid fat, alcohol and added sugar) components in agreement, while the data-driven approach showed the intake of several foods/food groups rich in these nutrients, such as butter/margarine, cookies, chocolate powder, whole milk, cheese, processed meat/cold cuts and candies. In this study, using both approaches at the same time provided consistent and complementary information with regard to assessing the overall dietary habits that will be important in order to drive public health programs, and improve their efficiency to monitor and evaluate the dietary patterns of populations.

  7. Root-cause analysis and health failure mode and effect analysis: two leading techniques in health care quality assessment.

    Science.gov (United States)

    Shaqdan, Khalid; Aran, Shima; Daftari Besheli, Laleh; Abujudeh, Hani

    2014-06-01

    In this review article, the authors provide a detailed series of guidelines for effectively performing root-cause analysis (RCA) and health failure mode and effect analysis (HFMEA). RCA is a retrospective approach used to ascertain the "root cause" of a problem that has already occurred, whereas HFMEA is a prospective risk assessment tool whose aim is to recognize risks to patient safety. RCA and HFMEA are used to prevent errors or their recurrence, to create a safer workplace, maintain high standards in health care quality, and incorporate time-saving and cost-saving modifications that favorably affect the patient care environment. The principles and techniques provided here should allow reviewers to better understand the features of RCA and HFMEA and how to apply these processes appropriately. These principles include how to organize a team, identify root causes, weed out proximate causes, graphically describe the process, conduct a hazard analysis, and develop and implement potential action plans.

  8. Application of Item Analysis to Assess Multiple-Choice Examinations in the Mississippi Master Cattle Producer Program

    Science.gov (United States)

    Parish, Jane A.; Karisch, Brandi B.

    2013-01-01

    Item analysis can serve as a useful tool in improving multiple-choice questions used in Extension programming. It can identify gaps between instruction and assessment. An item analysis of Mississippi Master Cattle Producer program multiple-choice examination responses was performed to determine the difficulty of individual examinations, assess the…
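
    Classical item analysis of the kind described reduces to two per-item statistics: difficulty (the proportion of examinees answering correctly) and discrimination (here, the point-biserial correlation of the item with the rest-of-test score). A minimal sketch on an invented response matrix:

```python
import numpy as np

def item_analysis(scores):
    """Classical item analysis for a 0/1 response matrix (rows = examinees).

    Returns per-item difficulty (proportion correct) and discrimination
    (point-biserial correlation with the rest-of-test score).
    """
    scores = np.asarray(scores, dtype=float)
    difficulty = scores.mean(axis=0)
    disc = []
    for j in range(scores.shape[1]):
        rest = scores.sum(axis=1) - scores[:, j]   # exclude the item itself
        disc.append(np.corrcoef(scores[:, j], rest)[0, 1])
    return difficulty, np.array(disc)

# Hypothetical 50-examinee, 8-item exam driven by a latent ability
rng = np.random.default_rng(7)
ability = rng.normal(size=(50, 1))
item_difficulty = np.linspace(-1, 1, 8)
demo = (rng.random((50, 8)) < 1 / (1 + np.exp(-(ability - item_difficulty)))).astype(int)

p, r = item_analysis(demo)
print("difficulty:", np.round(p, 2))
print("discrimination:", np.round(r, 2))
```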

  9. Risk Assessment of Repetitive Movements in Olive Growing: Analysis of Annual Exposure Level Assessment Models with the OCRA Checklist.

    Science.gov (United States)

    Proto, A R; Zimbalatti, G

    2015-10-01

    In Italy, one of the main agricultural crops is represented by the cultivation of olive trees. Olive cultivation characterizes the Italian agricultural landscape and national agricultural economics. Italy is the world's second largest producer of olive oil. Because olive cultivation requires the largest labor force in southern Italy, the aim of this research was to assess the risk of biomechanical overload of the workers' upper limbs. The objective, therefore, was to determine the level of risk that workers are exposed to in each phase of the production process. In Calabria, the second most important region in Italy for both the production of olive oil and cultivated area, there are 113,907 olive farms (83% of all farms) and 250,000 workers. To evaluate the risk of repetitive movements, all of the work tasks performed by workers on 100 farms in Calabria were analyzed. A total of 430 workers were interviewed over the four-year research period. To evaluate the level of exposure to repetitive movements, the OCRA (occupational repetitive actions) checklist was adopted. This checklist was the primary analytical tool during the preliminary risk assessment and in a given working situation. The analysis suggested by the OCRA checklist starts with pre-assigned scores (increasing in value with intensification of risk) for each of four main risk factors and additional factors. Between 2010 and 2013, surveys were conducted using the OCRA checklist with the aim of verifying musculoskeletal risks. The results obtained from the study of 430 workers allowed us to identify the level of exposure to risk. This analysis was conducted in the workplace to examine in detail the repetitive movements performed by the workers. The research was divided into two phases: first to provide preliminary information on the different tasks performed in olive growing, and second to assign a percentage to each task of the total hours worked in a year. Based on the results, this method could well

  10. Structured Assessment Approach: a microcomputer-based insider-vulnerability analysis tool

    Energy Technology Data Exchange (ETDEWEB)

    Patenaude, C.J.; Sicherman, A.; Sacks, I.J.

    1986-01-01

    The Structured Assessment Approach (SAA) was developed to help assess the vulnerability of safeguards systems to insiders in a staged manner. For physical security systems, the SAA identifies possible diversion paths which are not safeguarded under various facility operating conditions and insiders who could defeat the system via direct access, collusion or indirect tampering. For material control and accounting systems, the SAA identifies those who could block the detection of a material loss or diversion via data falsification or equipment tampering. The SAA, originally designed to run on a mainframe computer, has been converted to run on a personal computer. Many features have been added to simplify and facilitate its use for conducting vulnerability analysis. For example, the SAA input, which is a text-like data file, is easily readable and can provide documentation of facility safeguards and assumptions used for the analysis.

  11. Chemometric Analysis for Pollution Source Assessment of Harbour Sediments in Arctic Locations

    DEFF Research Database (Denmark)

    Pedersen, Kristine B.; Lejon, Tore; Jensen, Pernille Erland

    2015-01-01

    Pollution levels, pollutant distribution and potential source assessments based on multivariate analysis (chemometrics) were made for harbour sediments from two Arctic locations; Hammerfest in Norway and Sisimiut in Greenland. High levels of heavy metals were detected in addition to organic...... pollutants. Preliminary assessments based on principal component analysis (PCA) revealed different sources and pollutant distribution in the sediments of the two harbours. Tributyltin (TBT) was, however, found to originate from point source(s), and the highest concentrations of TBT in both harbours were...... indicated relation primarily to German, Russian and American mixtures in Hammerfest; and American, Russian and Japanese mixtures in Sisimiut. PCA was shown to be an important tool for identifying pollutant sources and differences in pollutant composition in relation to sediment characteristics....
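
    A minimal sketch of the kind of PCA-based source screening described above, using synthetic sediment chemistry data (the variable names and values are placeholders, not the Hammerfest or Sisimiut measurements):

      # Sketch of a PCA-based source assessment on a sediment chemistry matrix
      # (rows = sediment samples, columns = measured pollutants).
      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.preprocessing import StandardScaler

      rng = np.random.default_rng(0)
      # 12 samples x 5 variables (e.g., Cu, Pb, Zn, TBT, PAH) -- synthetic
      X = rng.normal(size=(12, 5)) + np.outer(rng.normal(size=12), [1, 1, 0.5, 0, 0])

      X_std = StandardScaler().fit_transform(X)   # autoscale before PCA
      pca = PCA(n_components=2).fit(X_std)

      print("explained variance ratio:", pca.explained_variance_ratio_)
      print("loadings (PC1, PC2):")
      print(pca.components_.T)   # variables loading together suggest a common source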

  12. Integrating Life-cycle Assessment into Transport Cost-benefit Analysis

    DEFF Research Database (Denmark)

    Manzo, Stefano; Salling, Kim Bang

    2016-01-01

    -term sustainability of a transport infrastructure project. In the present study we suggest to overcome this limit by combining a conventional life-cycle assessment approach with standard transport cost-benefit analysis. The suggested methodology is tested upon a case study project related to the construction of a new....... Additionally, they can significantly modify the weight of the different components of the overall project costs – evidently becoming a significant part of the estimated construction cost. Therefore, the suggested approach guarantees a higher quality of information thus providing decision makers with a more......Traditional transport Cost-Benefit Analysis (CBA) commonly ignores the indirect environmental impacts of an infrastructure project deriving from the overall life-cycle of the different project components. Such indirect impacts are instead of key importance in order to assess the long...

  13. ASSESSMENT OF PLASTIC FLOWS AND STOCKS IN SERBIA USING MATERIAL FLOW ANALYSIS

    Directory of Open Access Journals (Sweden)

    Goran Vujić

    2010-01-01

    Full Text Available Material flow analysis (MFA) was used to assess the amounts of plastic material flows and stocks that are annually produced, consumed, imported, exported, collected, recycled, and disposed of in landfills in Serbia. The analysis revealed that approximately 269,000 tons of plastic materials are directly disposed of in uncontrolled landfills in Serbia without any pretreatment, and that significant amounts of these materials have already accumulated in the landfills. The substantial amounts of landfilled plastics represent not only a loss of valuable resources, but also pose a serious threat to the environment and human health; if the trend of direct plastic landfilling continues, Serbia will face grave consequences.
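
    The MFA bookkeeping behind such an assessment reduces to simple mass balances. In the sketch below, all flows are illustrative assumptions except the approximately 269,000 t/yr landfilled figure quoted above:

      # Sketch of the MFA mass balance: apparent consumption = production +
      # imports - exports, and whatever is not collected for recycling is
      # assumed landfilled.

      production, imports, exports = 150_000, 250_000, 60_000   # t/yr (illustrative)
      consumption = production + imports - exports              # t/yr entering use
      recycled = 71_000                                         # t/yr (illustrative)

      landfilled = consumption - recycled
      print(f"apparent consumption: {consumption:,} t/yr")
      print(f"landfilled without pretreatment: {landfilled:,} t/yr")  # ~269,000 t/yr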

  14. Experimental assessment of computer codes used for safety analysis of integral reactors

    Energy Technology Data Exchange (ETDEWEB)

    Falkov, A.A.; Kuul, V.S.; Samoilov, O.B. [OKB Mechanical Engineering, Nizhny Novgorod (Russian Federation)

    1995-09-01

    Peculiarities of integral reactor thermohydraulics in accidents are associated with presence of noncondensable gas in built-in pressurizer, absence of pumped ECCS, use of guard vessel for LOCAs localisation and passive RHRS through in-reactor HXs. These features defined the main trends in experimental investigations and verification efforts for computer codes applied. The paper reviews briefly the performed experimental investigation of thermohydraulics of AST-500, VPBER600-type integral reactors. The characteristic of UROVEN/MB-3 code for LOCAs analysis in integral reactors and results of its verification are given. The assessment of RELAP5/mod3 applicability for accident analysis in integral reactor is presented.

  15. The importance of accurate anatomic assessment for the volumetric analysis of the amygdala

    Directory of Open Access Journals (Sweden)

    L. Bonilha

    2005-03-01

    Full Text Available There is a wide range of values reported in volumetric studies of the amygdala. The use of single plane thick magnetic resonance imaging (MRI) may prevent the correct visualization of anatomic landmarks and yield imprecise results. To assess whether there is a difference between volumetric analysis of the amygdala performed with single plane MRI 3-mm slices and with multiplanar analysis of MRI 1-mm slices, we studied healthy subjects and patients with temporal lobe epilepsy. We performed manual delineation of the amygdala on T1-weighted inversion recovery, 3-mm coronal slices and manual delineation of the amygdala on three-dimensional volumetric T1-weighted images with 1-mm slice thickness. The data were compared using a dependent t-test. There was a significant difference between the volumes obtained by the coronal plane-based measurements and the volumes obtained by three-dimensional analysis (P < 0.001). An incorrect estimate of the amygdala volume may preclude a correct analysis of the biological effects of alterations in amygdala volume. Three-dimensional analysis is preferred because it is based on more extensive anatomical assessment and the results are similar to those obtained in post-mortem studies.
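
    A minimal sketch of the dependent (paired) t-test used to compare the two measurement protocols, with synthetic volumes standing in for the study data:

      # Paired t-test comparing amygdala volumes from 3-mm coronal slices vs.
      # 1-mm multiplanar images. Volumes below (mm^3) are synthetic placeholders.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(1)
      vol_3mm = rng.normal(1700, 150, size=20)          # coronal 3-mm estimates
      vol_1mm = vol_3mm + rng.normal(120, 60, size=20)  # 1-mm estimates, systematically larger

      t, p = stats.ttest_rel(vol_1mm, vol_3mm)
      print(f"paired t = {t:.2f}, p = {p:.2g}")         # the study reports P < 0.001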

  16. USE OF MULTIPARAMETER ANALYSIS OF LABORATORY BIOMARKERS TO ASSESS RHEUMATOID ARTHRITIS ACTIVITY

    Directory of Open Access Journals (Sweden)

    A. A. Novikov

    2015-01-01

    Full Text Available The key component in the management of patients with rheumatoid arthritis (RA) is regular control of RA activity. The quantitative assessment of a patient's status allows the development of standardized indications for anti-rheumatic therapy. Objective: to identify the laboratory biomarkers able to reflect RA activity. Subjects and methods: Fifty-eight patients with RA and 30 age- and sex-matched healthy donors were examined. The patients were divided into high/moderate and mild disease activity groups according to DAS28. The serum concentrations of 30 biomarkers were measured using immunonephelometric assay, enzyme immunoassay, and xMAP technology. Results and discussion: Multivariate analysis could identify the factors mostly related to high/moderate RA activity according to DAS28, such as fibroblast growth factor-2, monocyte chemoattractant protein-1, interleukins (IL) 1α, 6, and 15, and tumor necrosis factor-α, and could create a prognostic model for RA activity assessment. ROC analysis has shown that this model has excellent diagnostic efficiency in differentiating high/moderate versus low RA activity. Conclusion: To create a subjective-assessment-independent immunological multiparameter index of greater diagnostic accuracy than the laboratory parameters routinely used in clinical practice may be a qualitatively new step in assessing and monitoring RA activity.
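
    A minimal sketch of building a multiparameter biomarker index and checking its discrimination with ROC analysis; the data are synthetic, and the logistic model is a generic stand-in for the study's multivariate procedure:

      # Fit a logistic model on biomarker concentrations and report the ROC AUC.
      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.metrics import roc_auc_score

      rng = np.random.default_rng(2)
      n = 58
      # columns: FGF-2, MCP-1, IL-1a, IL-6, IL-15, TNF-a (log concentrations, synthetic)
      X = rng.normal(size=(n, 6))
      y = (X @ np.array([0.8, 0.6, 0.5, 1.0, 0.7, 0.9]) + rng.normal(0, 1, n)) > 0
      # y = 1 -> high/moderate activity (DAS28), y = 0 -> low activity

      model = LogisticRegression().fit(X, y)
      auc = roc_auc_score(y, model.predict_proba(X)[:, 1])
      print(f"apparent AUC = {auc:.2f}")   # the study reports excellent diagnostic efficiency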

  17. Feasibility analysis of widely accepted indicators as key ones in river health assessment

    Institute of Scientific and Technical Information of China (English)

    FENG Yan; KANG Bin; YANG Liping

    2012-01-01

    Index systems for river health assessment are difficult to use in practice, due to the complex and specialized indicators they adopt. In this paper, some key indicators that can be applied to river health assessment in general were selected, based on an analysis of 45 assessment index systems with 902 variables drawn from around 150 papers and documents published in 1972-2010. According to the fields covered by the variables, they were divided into four groups: habitat condition, water environment, biotic status and water utilization. After the variables were combined into indicators, the adoption frequency and degree of acceptance of each indicator in the above systems were calculated, and widely accepted indicators that can reflect different aspects of river condition were selected as candidate key indicators. Based on correlation analysis among the candidate key indicators, 8 indicators were finally suggested as the key indicators for assessing river health: coverage rate of riparian vegetation, reserved rate of wetland, river continuity, the changing rate of water flow, the ratio of reaching the water quality standard, fish index of biotic integrity, the ratio of water utilization, and land use.

  18. Quantitative image analysis in the assessment of diffuse large B-cell lymphoma.

    Science.gov (United States)

    Chabot-Richards, Devon S; Martin, David R; Myers, Orrin B; Czuchlewski, David R; Hunt, Kristin E

    2011-12-01

    Proliferation rates in diffuse large B-cell lymphoma have been associated with conflicting outcomes in the literature, more often with high proliferation associated with poor prognosis. In most studies, the proliferation rate was estimated by a pathologist using an immunohistochemical stain for the monoclonal antibody Ki-67. We hypothesized that a quantitative image analysis algorithm would give a more accurate estimate of the proliferation rate, leading to better associations with survival. In all, 84 cases of diffuse large B-cell lymphoma were selected according to the World Health Organization criteria. Ki-67 percentage positivity estimated by the pathologist was recorded from the original report. The same slides were then scanned using an Aperio ImageScope, and Ki-67 percentage positivity was calculated using a computer-based quantitative immunohistochemistry nuclear algorithm. In addition, chart review was performed and survival time was recorded. The Ki-67 percentage estimated by the pathologist from the original report was significantly correlated with that obtained by quantitative image analysis, although the pathologist estimates were significantly higher (P=0.021). There was less agreement at lower Ki-67 percentages. Comparison of Ki-67 percentage positivity versus survival did not show a significant association with either the pathologist estimate or quantitative image analysis. However, although not significant, there was a trend of worse survival at higher proliferation rates detected by the pathologist but not by quantitative image analysis. Interestingly, our data suggest that the Ki-67 percentage positivity as assessed by the pathologist may be more closely associated with survival outcome than that identified by quantitative image analysis. This may indicate that pathologists are better at selecting appropriate areas of the slide. More cases are needed to assess whether this finding would be statistically significant. Due to the good correlation between pathologist estimate and quantitative image

  19. Seeking Missing Pieces in Science Concept Assessments: Reevaluating the Brief Electricity and Magnetism Assessment through Rasch Analysis

    Science.gov (United States)

    Ding, Lin

    2014-01-01

    Discipline-based science concept assessments are powerful tools to measure learners' disciplinary core ideas. Among many such assessments, the Brief Electricity and Magnetism Assessment (BEMA) has been broadly used to gauge student conceptions of key electricity and magnetism (E&M) topics in college-level introductory physics courses.…
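
    The Rasch model underlying such a reanalysis gives the probability of a correct response as a logistic function of the difference between person ability and item difficulty. A minimal sketch with hypothetical difficulties:

      # Dichotomous Rasch model: P(correct) = exp(theta - b) / (1 + exp(theta - b)),
      # where theta is person ability and b is item difficulty (both in logits).
      # Values below are illustrative, not BEMA estimates.
      import math

      def rasch_p(theta: float, b: float) -> float:
          """Probability of a correct response under the Rasch model."""
          return 1.0 / (1.0 + math.exp(-(theta - b)))

      item_difficulties = [-1.2, -0.4, 0.3, 1.1]   # hypothetical BEMA-like items
      for theta in (-1.0, 0.0, 1.0):
          probs = [rasch_p(theta, b) for b in item_difficulties]
          expected_score = sum(probs)
          print(f"theta={theta:+.1f}: " + " ".join(f"{p:.2f}" for p in probs)
                + f"  expected score={expected_score:.2f}")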

  1. Assessment of Non-Traditional Isotopic Ratios by Mass Spectrometry for Analysis of Nuclear Activities

    Science.gov (United States)

    2016-03-01

    Distribution Statement A. Approved for public release. [The remainder of this record consists of report documentation form fields and a unit conversion table; no abstract text is recoverable.]

  2. Chemical Warfare Agent Operational Exposure Hazard Assessment Research: FY07 Report and Analysis

    Science.gov (United States)

    2010-07-01

    ...accidentally exposed to VX vapor. The method employs GC-MS/MS on a triple quadrupole mass spectrometer using stable isotope dilution for quantitation... Technique for Assessing Exposure to VX via GC-MS/MS Analysis... guinea pigs were anesthetized using isoflurane (3% induction, 1.5-2% maintenance, with oxygen). All procedures were performed using aseptic technique.

  3. Conference Analysis Report of Assessments on Defect and Damage for a High Temperature Structure

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Hyeong Yeon

    2008-11-15

    This report presents an analysis of state-of-the-art research trends in creep-fatigue damage, defect assessment of high-temperature structures, and the development of heat-resistant materials and their behavior at high temperature, based on the papers presented at two international conferences: ASME PVP 2008, held in Chicago in July 2008, and CF-5 (the 5th International Conference on Creep, Fatigue and Creep-Fatigue), held in Kalpakkam, India, in September 2008.

  4. Defense Acquisitions: Assessment of Institute for Defense Analyses C-130 Avionics Modernization Program Analysis

    Science.gov (United States)

    2014-05-29

    The Air Force's C-130 Avionics Modernization Program (AMP), which entered development in 2001, was to standardize and upgrade the cockpit and avionics for

  5. Case Study: Sensitivity Analysis of the Barataria Basin Barrier Shoreline Wetland Value Assessment Model

    Science.gov (United States)

    2014-07-01

    By S. Kyle McKay and J. Craig Fischenich. OVERVIEW: Sensitivity analysis is a technique for... relevance of questions posed during an Independent External Peer Review (IEPR). BARATARIA BASIN BARRIER SHORELINE (BBBS) STUDY: On average... scale restoration projects to reduce marsh loss and maintain these wetlands as healthy functioning ecosystems.

  6. Dynamic Ecocentric Assessment Combining Emergy and Data Envelopment Analysis: Application to Wind Farms

    OpenAIRE

    Mario Martín-Gamboa; Diego Iribarren

    2016-01-01

    Most of current life-cycle approaches show an anthropocentric standpoint for the evaluation of human-dominated activities. However, this perspective is insufficient when it comes to assessing the contribution of natural resources to production processes. In this respect, emergy analysis evaluates human-driven systems from a donor-side perspective, accounting for the environmental effort performed to make the resources available. This article presents a novel methodological framework, which co...

  7. Analysis on evaluation ability of nonlinear safety assessment model of coal mines based on artificial neural network

    Institute of Scientific and Technical Information of China (English)

    SHI Shi-liang; LIU Hai-bo; LIU Ai-hua

    2004-01-01

    Based on an integrated analysis of the merits and shortcomings of various methods used in the safety assessment of coal mines, and combining the nonlinear features of mine safety sub-systems, this paper establishes a neural network assessment model of mine safety, analyzes the ability of artificial neural networks to evaluate the mine safety state, and lays the theoretical foundation for using artificial neural networks in the systematic optimization of mine safety assessment and for obtaining reasonably accurate safety assessment results.

  8. ANALYSIS OF ENVIRONMENTAL FRAGILITY USING MULTI-CRITERIA ANALYSIS (MCE) FOR INTEGRATED LANDSCAPE ASSESSMENT

    Directory of Open Access Journals (Sweden)

    Abimael Cereda Junior

    2014-01-01

    Full Text Available The Geographic Information Systems brought greater possibilities to the representation and interpretation of the landscape, as well as to integrated analysis. However, this approach does not dispense with technical and methodological substantiation for achieving the computational universe. This work is grounded in ecodynamics and empirical analysis of natural and anthropogenic environmental fragility, and aims to propose and present an integrated paradigm of Multi-criteria Analysis and a Fuzzy Logic Model of Environmental Fragility, taking as a case study the Basin of Monjolinho Stream in São Carlos-SP. The use of this methodology allowed for a reduction of subjectivism in the influences of decision criteria, whose factors might have their cartographic expression, respecting the complex integrated landscape.

  9. Peak Pc Prediction in Conjunction Analysis: Conjunction Assessment Risk Analysis. Pc Behavior Prediction Models

    Science.gov (United States)

    Vallejo, J.J.; Hejduk, M.D.; Stamey, J. D.

    2015-01-01

    Satellite conjunction risk is typically evaluated through the probability of collision (Pc), which considers both the conjunction geometry and the uncertainties in both state estimates. Conjunction events are initially discovered through Joint Space Operations Center (JSpOC) screenings, usually seven days before the Time of Closest Approach (TCA); however, JSpOC continues to track the objects and issue conjunction updates. Changes in the state estimates and the reduced propagation time cause the Pc to change as the event develops. These changes are a combination of potentially predictable development and unpredictable changes in the state estimate covariance. An operationally useful datum is the peak Pc: if it can reasonably be inferred that the peak Pc value has passed, then risk assessment can be conducted against this peak value, and if this value is below the remediation level, the event intensity can be relaxed. Can the peak Pc location be reasonably predicted?
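
    One common formulation reduces the Pc to an integral of the relative-position error density over the combined hard-body circle in the 2-D encounter plane. The Monte Carlo sketch below illustrates this; the miss vector, covariance, and hard-body radius are invented for illustration:

      # Monte Carlo estimate of the 2-D probability of collision (Pc):
      # sample relative-position errors and count hits inside the hard-body radius.
      import numpy as np

      rng = np.random.default_rng(3)
      miss = np.array([150.0, 80.0])       # relative miss vector in plane (m)
      cov = np.array([[200.0**2, 0.3*200*120],
                      [0.3*200*120, 120.0**2]])   # combined covariance (m^2)
      hbr = 20.0                           # combined hard-body radius (m)

      samples = rng.multivariate_normal(miss, cov, size=1_000_000)
      pc = np.mean(np.hypot(samples[:, 0], samples[:, 1]) < hbr)
      print(f"Pc ~ {pc:.2e}")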

  10. Assessing the Reliability of Digitalized Cephalometric Analysis in Comparison with Manual Cephalometric Analysis

    Science.gov (United States)

    Farooq, Mohammed Umar; Khan, Mohd. Asadullah; Imran, Shahid; Qureshi, Arshad; Ahmed, Syed Afroz; Kumar, Sujan; Rahman, Mohd. Aziz Ur

    2016-01-01

    Introduction: For more than seven decades, orthodontists have used cephalometric analysis as one of the main diagnostic tools, which can be performed manually or by software. The use of computers in treatment planning is expected to avoid errors and make it less time consuming, with effective evaluation and high reproducibility. Aim: This study was done to evaluate and compare the accuracy and reliability of cephalometric measurements between the computerized method of direct digital radiographs and conventional tracing. Materials and Methods: Digital and conventional hand-tracing cephalometric analyses of 50 patients were done. Thirty anatomical landmarks were defined on each radiograph by a single investigator, and 5 skeletal analyses (Steiner, Wits, Tweed, McNamara, Rakosi Jarabak) and 28 variables were calculated. Results: The variables showed consistency between the two methods, except for 1-NA, Y-axis and interincisal angle measurements, which were higher in manual tracing, and a higher facial axis angle in digital tracing. Conclusion: Most of the commonly used measurements were accurate, except some measurements between the digital tracing with FACAD® and manual methods. The advantages of digital imaging, such as enhancement, transmission, archiving and low radiation dosages, make it preferred over the conventional method in daily use. PMID:27891451

  11. Life Cycle Assessment and Life Cycle Cost Analysis of Magnesia Spinel Brick Production

    Directory of Open Access Journals (Sweden)

    Aysun Özkan

    2016-07-01

    Full Text Available Sustainable use of natural resources in the production of construction materials has become a necessity both in Europe and Turkey. Construction products in Europe should have European Conformity (CE) and an Environmental Product Declaration (EPD), an independently verified and registered document in line with the European standard EN 15804. An EPD certificate can be created by performing a Life Cycle Assessment (LCA) study. In this particular work, an LCA study was carried out for refractory brick production for environmental assessment. In addition to the LCA, a Life Cycle Cost (LCC) analysis was also applied for economic assessment. Firstly, a cradle-to-gate LCA was performed for one ton of magnesia spinel refractory brick. The CML IA method included in the licensed SimaPro 8.0.1 software was chosen to calculate impact categories (namely, abiotic depletion, global warming potential, acidification potential, eutrophication potential, human toxicity, ecotoxicity, ozone depletion potential, and photochemical oxidation potential). The LCC analysis was performed by developing a cost model for internal and external cost categories within the software. The results were supported by a sensitivity analysis. According to the results, the production of raw materials and the firing process in magnesia spinel brick production were found to have several negative effects on the environment and were costly.

  12. Assessing musculo-articular stiffness using free oscillations: theory, measurement and analysis.

    Science.gov (United States)

    Ditroilo, Massimiliano; Watsford, Mark; Murphy, Aron; De Vito, Giuseppe

    2011-12-01

    Stiffness, the relationship between applied load and elastic deformation, is an important neuromechanical component related to muscular performance and injury risk. The free-oscillation technique is a popular method for stiffness assessment. There has been wide application of this technique assessing a variety of musculature, including the triceps surae, knee flexors, knee extensors and pectorals. The methodology involves the modelling of the system as a linear damped mass-spring system. The use of such a model has certain advantages and limitations that will be discussed within this review. Perhaps the major advantage of such a model is the specificity of the measure, whereby it is possible for the assessment conditions to simulate the type of loading witnessed during functional tasks and sporting situations. High levels of reliability and construct validity have typically been reported using such procedures. Despite these assurances of accuracy, a number of issues have also been identified. The literature reveals some concerns surrounding the use of a linear model for stiffness assessment. Further, procedural issues surrounding the administration of the perturbation, attention focus of the participant during the perturbation, signal collection, data processing and analysis, presentation of stiffness as a linear or torsional value, assessment load (single vs multiple vs maximal) and the stiffness-load relationship have been identified, and are all fundamentally related to the quality of the calculated output data. Finally, several important considerations for practitioners have been recommended to ensure the quality and consistency of stiffness data collection, processing and interpretation.

  13. Assessing the influence of Environmental Impact Assessments on science and policy: an analysis of the Three Gorges Project.

    Science.gov (United States)

    Tullos, Desiree

    2009-07-01

    The need to understand and minimize negative environmental outcomes associated with large dams has both contributed to and benefited from the introduction and subsequent improvements in the Environmental Impact Assessment (EIA) process. However, several limitations in the EIA process remain, including those associated with the uncertainty and significance of impact projections. These limitations are directly related to the feedback between science and policy, with information gaps in scientific understanding discovered through the EIA process contributing valuable recommendations on critical focus areas for prioritizing and funding research within the fields of ecological conservation and river engineering. This paper presents an analysis of the EIA process for the Three Gorges Project (TGP) in China as a case study for evaluating this feedback between the EIA and science and policy. For one of the best-studied public development projects in the world, this paper presents an investigation into whether patterns exist between the scientific interest (via number of publications) in environmental impacts and (a) the identification of impacts as uncertain or priority by the EIA, (b) decisions or political events associated with the dam, and (c) impact type. This analysis includes the compilation of literature on TGP, characterization of ecosystem interactions and responses to TGP through a hierarchy of impacts, coding of EIA impacts as "uncertain" impacts that require additional study and "priority" impacts that have particularly high significance, mapping of an event chronology to relate policies, institutional changes, and decisions about TGP as "events" that could influence the focus and intensity of scientific investigation, and analysis of the number of publications by impact type and order within the impact hierarchy. From these analyses, it appears that the availability and consistency of scientific information limit the accuracy of environmental impact

  14. Predictive capacity of risk assessment scales and clinical judgment for pressure ulcers: a meta-analysis.

    Science.gov (United States)

    García-Fernández, Francisco Pedro; Pancorbo-Hidalgo, Pedro L; Agreda, J Javier Soldevilla

    2014-01-01

    A systematic review with meta-analysis was completed to determine the capacity of risk assessment scales and nurses' clinical judgment to predict pressure ulcer (PU) development. Electronic databases were searched for prospective studies on the validity and predictive capacity of PUs risk assessment scales published between 1962 and 2010 in English, Spanish, Portuguese, Korean, German, and Greek. We excluded gray literature sources, integrative review articles, and retrospective or cross-sectional studies. The methodological quality of the studies was assessed according to the guidelines of the Critical Appraisal Skills Program. Predictive capacity was measured as relative risk (RR) with 95% confidence intervals. When 2 or more valid original studies were found, a meta-analysis was conducted using a random-effect model and sensitivity analysis. We identified 57 studies, including 31 that included a validation study. We also retrieved 4 studies that tested clinical judgment as a risk prediction factor. Meta-analysis produced the following pooled predictive capacity indicators: Braden (RR = 4.26); Norton (RR = 3.69); Waterlow (RR = 2.66); Cubbin-Jackson (RR = 8.63); EMINA (RR = 6.17); Pressure Sore Predictor Scale (RR = 21.4); and clinical judgment (RR = 1.89). Pooled analysis of 11 studies found adequate risk prediction capacity in various clinical settings; the Braden, Norton, EMINA (mEntal state, Mobility, Incontinence, Nutrition, Activity), Waterlow, and Cubbin-Jackson scales showed the highest predictive capacity. The clinical judgment of nurses was found to achieve inadequate predictive capacity when used alone, and should be used in combination with a validated scale.
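
    A minimal sketch of the random-effects pooling (DerSimonian-Laird) that typically produces pooled RR values of this kind; the study-level inputs are illustrative, not the review's data:

      # DerSimonian-Laird random-effects pooling of log relative risks.
      import math

      # (log RR, variance of log RR) for hypothetical validation studies
      studies = [(math.log(3.5), 0.10), (math.log(4.8), 0.08),
                 (math.log(2.9), 0.15), (math.log(5.2), 0.12)]

      w = [1 / v for _, v in studies]                       # fixed-effect weights
      y = [lr for lr, _ in studies]
      ybar = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
      q = sum(wi * (yi - ybar) ** 2 for wi, yi in zip(w, y))
      c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
      tau2 = max(0.0, (q - (len(studies) - 1)) / c)         # between-study variance

      w_re = [1 / (v + tau2) for _, v in studies]           # random-effects weights
      pooled = sum(wi * yi for wi, yi in zip(w_re, y)) / sum(w_re)
      se = math.sqrt(1 / sum(w_re))
      print(f"pooled RR = {math.exp(pooled):.2f} "
            f"(95% CI {math.exp(pooled - 1.96*se):.2f}-{math.exp(pooled + 1.96*se):.2f})")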

  15. Rapid quality assessment of Radix Aconiti Preparata using direct analysis in real time mass spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    Zhu Hongbin; Wang Chunyan; Qi Yao [Changchun Center of Mass Spectrometry and Chemical Biology Laboratory, Changchun Institute of Applied Chemistry, Chinese Academy of Sciences, Changchun 130022 (China); University of Chinese Academy of Sciences, Beijing 100039 (China); Song Fengrui, E-mail: songfr@ciac.jl.cn [Changchun Center of Mass Spectrometry and Chemical Biology Laboratory, Changchun Institute of Applied Chemistry, Chinese Academy of Sciences, Changchun 130022 (China); Liu Zhiqiang; Liu Shuying [Changchun Center of Mass Spectrometry and Chemical Biology Laboratory, Changchun Institute of Applied Chemistry, Chinese Academy of Sciences, Changchun 130022 (China)

    2012-11-08

    Highlights: ► DART MS combined with PCA and HCA was used to rapidly identify markers of Radix Aconiti. ► The DART MS behavior of six aconitine-type alkaloids was investigated. ► Chemical markers were recognized between the qualified and unqualified samples. ► DART MS was shown to be an effective tool for quality control of Radix Aconiti Preparata. - Abstract: This study presents a novel and rapid method to identify chemical markers for the quality control of Radix Aconiti Preparata, a widely used traditional herbal medicine. In the method, the samples with a fast extraction procedure were analyzed using direct analysis in real time mass spectrometry (DART MS) combined with multivariate data analysis. At present, the quality assessment approach of Radix Aconiti Preparata was based on the two processing methods recorded in Chinese Pharmacopoeia for the purpose of reducing the toxicity of Radix Aconiti and ensuring its clinical therapeutic efficacy. In order to ensure the safety and effectivity in clinical use, the processing degree of Radix Aconiti should be well controlled and assessed. In the paper, hierarchical cluster analysis and principal component analysis were performed to evaluate the DART MS data of Radix Aconiti Preparata samples in different processing times. The results showed that the well processed Radix Aconiti Preparata, unqualified processed and the raw Radix Aconiti could be clustered reasonably corresponding to their constituents. The loading plot shows that the main chemical markers having the most influence on the discrimination amongst the qualified and unqualified samples were mainly some monoester diterpenoid aconitines and diester diterpenoid aconitines, i.e. benzoylmesaconine, hypaconitine, mesaconitine, neoline, benzoylhypaconine, benzoylaconine, fuziline, aconitine and 10-OH-mesaconitine. The established DART MS approach in

  16. Genetic diversity analysis of highly incomplete SNP genotype data with imputations: an empirical assessment.

    Science.gov (United States)

    Fu, Yong-Bi

    2014-03-13

    Genotyping by sequencing (GBS) recently has emerged as a promising genomic approach for assessing genetic diversity on a genome-wide scale. However, concerns are not lacking about the uniquely large unbalance in GBS genotype data. Although some genotype imputation has been proposed to infer missing observations, little is known about the reliability of a genetic diversity analysis of GBS data, with up to 90% of observations missing. Here we performed an empirical assessment of accuracy in genetic diversity analysis of highly incomplete single nucleotide polymorphism genotypes with imputations. Three large single-nucleotide polymorphism genotype data sets for corn, wheat, and rice were acquired, and missing data with up to 90% of missing observations were randomly generated and then imputed for missing genotypes with three map-independent imputation methods. Estimating heterozygosity and inbreeding coefficient from original, missing, and imputed data revealed variable patterns of bias from assessed levels of missingness and genotype imputation, but the estimation biases were smaller for missing data without genotype imputation. The estimates of genetic differentiation were rather robust up to 90% of missing observations but became substantially biased when missing genotypes were imputed. The estimates of topology accuracy for four representative samples of interested groups generally were reduced with increased levels of missing genotypes. Probabilistic principal component analysis based imputation performed better in terms of topology accuracy than those analyses of missing data without genotype imputation. These findings are not only significant for understanding the reliability of the genetic diversity analysis with respect to large missing data and genotype imputation but also are instructive for performing a proper genetic diversity analysis of highly incomplete GBS or other genotype data.

  17. Relative urban ecosystem health assessment: a method integrating comprehensive evaluation and detailed analysis.

    Science.gov (United States)

    Su, Meirong; Yang, Zhifeng; Chen, Bin

    2010-12-01

    Regarding the basic roles of urban ecosystem health assessment (i.e., discovering the comprehensive health status, and diagnosing the limiting factors of urban ecosystems), the general framework integrating comprehensive evaluation and detailed analysis is established, from both bottom-up and top-down directions. Emergy-based health indicators are established to reflect the urban ecosystem health status from a biophysical viewpoint. Considering the intrinsic uncertainty and relativity of urban ecosystem health, set pair analysis is combined with the emergy-based indicators to fill the general framework and evaluate the relative health level of urban ecosystems. These techniques are favorable for understanding the overall urban ecosystem health status and confirming the limiting factors of concerned urban ecosystems from biophysical perspective. Moreover, clustering analysis is applied by combining the health status with spatial geographical conditions. Choosing 26 typical Chinese cities in 2005, relative comprehensive urban ecosystem health levels were evaluated. The higher health levels of Xiamen, Qingdao, Shenzhen, and Zhuhai are in particular contrast to those of Wuhan, Beijing, Yinchuan, and Harbin, which are relatively poor. In addition, the conditions of each factor and related indicators are investigated through set pair analysis, from which the critical limiting factors of Beijing are confirmed. According to clustering analysis results, the urban ecosystems studied are divided into four groups. It is concluded that the proposed framework of urban ecosystem health assessment, which integrates comprehensive evaluation and detailed analysis and is fulfilled by emergy synthesis and set pair analysis, can serve as a useful tool to conduct diagnosis of urban ecosystem health.

  18. Performance assessment of air quality monitoring networks using principal component analysis and cluster analysis

    Energy Technology Data Exchange (ETDEWEB)

    Lu, Wei-Zhen [Department of Building and Construction, City University of Hong Kong (China); He, Hong-Di [Department of Building and Construction, City University of Hong Kong (China); Logistics Research Center, Shanghai Maritime University, Shanghai (China); Dong, Li-yun [Shanghai Institute of Applied Mathematics and Mechanics, Shanghai University, Shanghai (China)

    2011-03-15

    This study aims to evaluate the performance of two statistical methods, principal component analysis and cluster analysis, for the management of the air quality monitoring network of Hong Kong and the reduction of associated expenses. The specific objectives include: (i) to identify city areas with similar air pollution behavior; and (ii) to locate emission sources. The statistical methods were applied to the mass concentrations of sulphur dioxide (SO₂), respirable suspended particulates (RSP) and nitrogen dioxide (NO₂), collected in the monitoring network of Hong Kong from January 2001 to December 2007. The results demonstrate that, for each pollutant, the monitoring stations are grouped into different classes based on their air pollution behavior. Monitoring stations located in nearby areas are characterized by the same specific air pollution characteristics, suggesting that the air quality monitoring system could be managed more effectively. The redundant equipment should be transferred to other monitoring stations to allow further enlargement of the monitored area. Additionally, the existence of different air pollution behaviors in the monitoring network is explained by the variability of wind directions across the region. The results imply that the air quality problem in Hong Kong is not only a local problem arising mainly from street-level pollution, but also a regional problem originating from the Pearl River Delta region. (author)

  19. Assessing the Goodness of Fit of Phylogenetic Comparative Methods: A Meta-Analysis and Simulation Study.

    Directory of Open Access Journals (Sweden)

    Dwueng-Chwuan Jhwueng

    Full Text Available Phylogenetic comparative methods (PCMs) have been applied widely in analyzing data from related species but their fit to data is rarely assessed. Can one determine whether any particular comparative method is typically more appropriate than others by examining comparative data sets? I conducted a meta-analysis of 122 phylogenetic data sets found by searching all papers in JEB, Blackwell Synergy and JSTOR published in 2002-2005 for the purpose of assessing the fit of PCMs. The number of species in these data sets ranged from 9 to 117. I used the Akaike information criterion to compare PCMs, and then fit PCMs to bivariate data sets through REML analysis. Correlation estimates between two traits and bootstrapped confidence intervals of correlations from each model were also compared. For phylogenies of less than one hundred taxa, the Independent Contrast method and the independent, non-phylogenetic models provide the best fit. For bivariate analysis, correlations from different PCMs are qualitatively similar so that actual correlations from real data seem to be robust to the PCM chosen for the analysis. Therefore, researchers might apply the PCM they believe best describes the evolutionary mechanisms underlying their data.
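
    A minimal sketch of the AIC comparison used to rank candidate models; the log-likelihoods and parameter counts are invented for illustration:

      # AIC model comparison: AIC = 2k - 2 ln L, and Akaike weights give the
      # relative support for each candidate model.
      import math

      models = {
          # name: (max log-likelihood, number of parameters) -- hypothetical
          "independent contrasts": (-120.3, 2),
          "non-phylogenetic":      (-121.0, 2),
          "Ornstein-Uhlenbeck":    (-119.8, 3),
      }

      aic = {name: 2 * k - 2 * lnl for name, (lnl, k) in models.items()}
      best = min(aic.values())
      weights = {name: math.exp(-(a - best) / 2) for name, a in aic.items()}
      total = sum(weights.values())

      for name in sorted(aic, key=aic.get):
          print(f"{name:22s} AIC={aic[name]:7.1f} weight={weights[name]/total:.2f}")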

  20. Risk Assessment Method for Offshore Structure Based on Global Sensitivity Analysis

    Directory of Open Access Journals (Sweden)

    Zou Tao

    2012-01-01

    Full Text Available Based on global sensitivity analysis (GSA), this paper proposes a new risk assessment method for offshore structure design. This method first quantifies the significance of all random variables and their parameters; by comparing their degrees of importance, minor factors can be neglected, simplifying the subsequent global uncertainty analysis work. Global uncertainty analysis (GUA) is an effective way to study the complexity and randomness of natural events. Since field-measured data and statistical results often have inevitable errors and uncertainties which lead to inaccurate prediction and analysis, the risk in the design stage of offshore structures caused by uncertainties in environmental loads, sea level, and marine corrosion must be taken into account. In this paper, the multivariate compound extreme value distribution model (MCEVD) is applied to predict the extreme sea state of wave, current, and wind. The maximum structural stress and deformation of a Jacket platform are analyzed and compared with different design standards. The calculation results sufficiently demonstrate the new risk assessment method's rationality and security.
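
    A minimal sketch of variance-based GSA: first-order Sobol indices estimated with an A/B sample-matrix estimator for a toy load model. The response function and input ranges are invented; this is a generic GSA illustration rather than the paper's procedure:

      # First-order Sobol indices via the Saltelli (2010) A/B-matrix estimator.
      import numpy as np

      rng = np.random.default_rng(4)

      def model(x):
          # toy structural response: wave height dominates, wind is minor
          wave, current, wind = x[:, 0], x[:, 1], x[:, 2]
          return 4.0 * wave ** 2 + 1.5 * current + 0.2 * wind

      n, k = 100_000, 3
      A = rng.uniform(0, 1, (n, k))
      B = rng.uniform(0, 1, (n, k))
      yA, yB = model(A), model(B)
      var_y = np.var(np.concatenate([yA, yB]))

      for i, name in enumerate(["wave", "current", "wind"]):
          ABi = A.copy()
          ABi[:, i] = B[:, i]                     # replace column i with B's column
          yABi = model(ABi)
          s1 = np.mean(yB * (yABi - yA)) / var_y  # Saltelli 2010 estimator
          print(f"S1[{name}] = {s1:.2f}")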

  1. Using multi-criteria decision analysis to assess the vulnerability of drinking water utilities.

    Science.gov (United States)

    Joerin, Florent; Cool, Geneviève; Rodriguez, Manuel J; Gignac, Marc; Bouchard, Christian

    2010-07-01

    Outbreaks of microbiological waterborne disease have increased governmental concern regarding the importance of drinking water safety. Considering the multi-barrier approach to safe drinking water may improve management decisions to reduce contamination risks. However, the application of this approach must consider numerous and diverse kinds of information simultaneously. This makes it difficult for authorities to apply the approach to decision making. For this reason, multi-criteria decision analysis can be helpful in applying the multi-barrier approach to vulnerability assessment. The goal of this study is to propose an approach based on a multi-criteria analysis method in order to rank drinking water utilities (DWUs) based on their vulnerability to microbiological contamination. This approach is illustrated with an application carried out on 28 DWUs supplied by groundwater in the Province of Québec, Canada. The multi-criteria analysis method chosen is the Measuring Attractiveness by a Categorical Based Evaluation Technique (MACBETH) methodology, allowing the assessment of a microbiological vulnerability indicator (MVI) for each DWU. Results are presented on a scale ranking DWUs from less vulnerable to most vulnerable to contamination. MVI results are tested using a sensitivity analysis on barrier weights and they are also compared with historical data on contamination at the utilities. The investigation demonstrates that MVI provides a good representation of the vulnerability of DWUs to microbiological contamination.
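
    The study's MACBETH procedure derives value scales from qualitative pairwise judgments; as a rough stand-in, the sketch below ranks utilities with a simple additive weighted model. Barrier names, scores, and weights are hypothetical:

      # Simplified multi-criteria ranking of drinking water utilities.
      # NOTE: an additive weighted sum is only a rough stand-in for MACBETH.

      barriers = ["source protection", "treatment", "distribution", "monitoring"]
      weights = [0.30, 0.35, 0.20, 0.15]          # hypothetical barrier weights

      # vulnerability scores per barrier, 0 (robust) to 100 (vulnerable)
      utilities = {
          "DWU-A": [20, 35, 40, 10],
          "DWU-B": [60, 55, 30, 45],
          "DWU-C": [35, 20, 70, 25],
      }

      def mvi(scores, w):
          """Additive microbiological vulnerability indicator (stand-in)."""
          return sum(s * wi for s, wi in zip(scores, w))

      ranking = sorted(utilities, key=lambda u: mvi(utilities[u], weights), reverse=True)
      for u in ranking:
          print(f"{u}: MVI = {mvi(utilities[u], weights):.1f}")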

  2. An integrated factor analysis model for product eco-design based on full life cycle assessment

    Directory of Open Access Journals (Sweden)

    Zhi fang Zhou

    2016-02-01

    Full Text Available Purpose: Among the methods of comprehensive analysis for a product or an enterprise, there exist defects and deficiencies in traditional standard cost analyses and life cycle assessment methods. For example, some methods only emphasize one dimension (such as economic or environmental factors) while neglecting other relevant dimensions. This paper builds a factor analysis model of resource value flow, based on full life cycle assessment and eco-design theory, in order to expose the relevant internal logic between these two factors. Design/methodology/approach: The model considers the efficient multiplication of resources, economic efficiency, and environmental efficiency as its core objectives. The model studies the status of resource value flow during the entire life cycle of a product, and gives an in-depth analysis on the mutual logical relationship of product performance, value, resource consumption, and environmental load to reveal the symptoms and potentials in different dimensions. Originality/value: This provides comprehensive, accurate and timely decision-making information for enterprise managers regarding product eco-design, as well as production and management activities. To conclude, it verifies the applicability of this evaluation and analysis model using a Chinese SUV manufacturer as an example.

  3. Authorship bias in violence risk assessment? A systematic review and meta-analysis.

    Science.gov (United States)

    Singh, Jay P; Grann, Martin; Fazel, Seena

    2013-01-01

    Various financial and non-financial conflicts of interests have been shown to influence the reporting of research findings, particularly in clinical medicine. In this study, we examine whether this extends to prognostic instruments designed to assess violence risk. Such instruments have increasingly become a routine part of clinical practice in mental health and criminal justice settings. The present meta-analysis investigated whether an authorship effect exists in the violence risk assessment literature by comparing predictive accuracy outcomes in studies where the individuals who designed these instruments were study authors with independent investigations. A systematic search from 1966 to 2011 was conducted using PsycINFO, EMBASE, MEDLINE, and US National Criminal Justice Reference Service Abstracts to identify predictive validity studies for the nine most commonly used risk assessment tools. Tabular data from 83 studies comprising 104 samples were collected, information on two-thirds of which was received directly from study authors for the review. Random effects subgroup analysis and metaregression were used to explore evidence of an authorship effect. We found a substantial and statistically significant authorship effect. Overall, studies authored by tool designers reported predictive validity findings around two times higher than those of investigations reported by independent authors (DOR=6.22 [95% CI=4.68-8.26] in designers' studies vs. DOR=3.08 [95% CI=2.45-3.88] in independent studies). As there was evidence of an authorship effect, we also examined disclosure rates. None of the 25 studies where tool designers or translators were also study authors published a conflict of interest statement to that effect, despite a number of journals requiring that potential conflicts be disclosed. The field of risk assessment would benefit from routine disclosure and registration of research studies. The extent to which similar conflict of interests exists in those

  4. Authorship bias in violence risk assessment? A systematic review and meta-analysis.

    Directory of Open Access Journals (Sweden)

    Jay P Singh

    Full Text Available Various financial and non-financial conflicts of interests have been shown to influence the reporting of research findings, particularly in clinical medicine. In this study, we examine whether this extends to prognostic instruments designed to assess violence risk. Such instruments have increasingly become a routine part of clinical practice in mental health and criminal justice settings. The present meta-analysis investigated whether an authorship effect exists in the violence risk assessment literature by comparing predictive accuracy outcomes in studies where the individuals who designed these instruments were study authors with independent investigations. A systematic search from 1966 to 2011 was conducted using PsycINFO, EMBASE, MEDLINE, and US National Criminal Justice Reference Service Abstracts to identify predictive validity studies for the nine most commonly used risk assessment tools. Tabular data from 83 studies comprising 104 samples were collected, information on two-thirds of which was received directly from study authors for the review. Random effects subgroup analysis and metaregression were used to explore evidence of an authorship effect. We found a substantial and statistically significant authorship effect. Overall, studies authored by tool designers reported predictive validity findings around two times higher than those of investigations reported by independent authors (DOR=6.22 [95% CI=4.68-8.26] in designers' studies vs. DOR=3.08 [95% CI=2.45-3.88] in independent studies). As there was evidence of an authorship effect, we also examined disclosure rates. None of the 25 studies where tool designers or translators were also study authors published a conflict of interest statement to that effect, despite a number of journals requiring that potential conflicts be disclosed. The field of risk assessment would benefit from routine disclosure and registration of research studies. The extent to which similar conflict of interests exists

  5. Complex health care interventions: Characteristics relevant for ethical analysis in health technology assessment

    Directory of Open Access Journals (Sweden)

    Lysdahl, Kristin Bakke

    2016-03-01

    Full Text Available Complexity entails methodological challenges in assessing health care interventions. In order to address these challenges, a series of characteristics of complexity have been identified in the Health Technology Assessment (HTA) literature. These characteristics are primarily identified and developed to facilitate effectiveness, safety, and cost-effectiveness analysis. However, ethics is also a constitutive part of HTA, and it is not given that the conceptions of complexity that appear relevant for effectiveness, safety, and cost-effectiveness analysis are also relevant and directly applicable for ethical analysis in HTA. The objective of this article is therefore to identify and elaborate a set of key characteristics of complex health care interventions relevant for addressing ethical aspects in HTA. We start by investigating the relevance of the characteristics of complex interventions, as defined in the HTA literature. Most aspects of complexity found to be important when assessing effectiveness, safety, and efficiency turn out also to be relevant when assessing ethical issues of a given health technology. However, the importance and relevance of the complexity characteristics may differ when addressing ethical issues rather than effectiveness. Moreover, the moral challenges of a health care intervention may themselves contribute to the complexity. After identifying and analysing existing conceptions of complexity, we synthesise a set of five key characteristics of complexity for addressing ethical aspects in HTA: (1) multiple and changing perspectives, (2) indeterminate phenomena, (3) uncertain causality, (4) unpredictable outcome, and (5) ethical complexity. This may serve as an analytic tool in addressing ethical issues in HTA of complex interventions.

  6. Risk Analysis and Assessment of Overtopping Concerning Sea Dikes in the Case of Storm Surge

    Institute of Scientific and Technical Information of China (English)

    王莉萍; 黄桂玲; 陈正寿; 梁丙臣; 刘桂林

    2014-01-01

    Risk analysis and assessment relating to coastal structures has been one of the hot topics in the area of coastal protection recently. In this paper, from three aspects of joint return period of multiple loads, dike failure rate and dike continuous risk prevention respectively, three new risk analysis methods concerning overtopping of sea dikes are developed. It is worth noting that the factors of storm surge which leads to overtopping are also considered in the three methods. In order to verify and estimate the effectiveness and reliability of the newly developed methods, quantified mutual information is adopted. By means of case testing, it can be found that different prior variables might be selected dividedly, according to the requirement of special engineering application or the dominance of loads. Based on the selection of prior variables, the correlating risk analysis method can be successfully applied to practical engineering.

  7. Use of image analysis to assess color response on plants caused by herbicide application

    DEFF Research Database (Denmark)

    Asif, Ali; Streibig, Jens Carl; Duus, Joachim;

    2013-01-01

    In herbicide-selectivity experiments, response can be measured by visual inspection, stand counts, plant mortality, and biomass. Some response types are relative to nontreated control. We developed a nondestructive method by analyzing digital color images to quantify color changes in leaves caused......, cycloxydim, diquat dibromide, and fluazifop-p-butyl were described with a log-logistic dose–response model, and the relationship between visual inspection and image analysis was calculated at the effective doses that cause 50% and 90% response (ED50 and ED90, respectively). The ranges of HSB components...... for the green and nongreen parts of the plants and soil were different. The relative potencies were not significantly different from one, indicating that visual and image analysis estimations were about the same. The comparison results suggest that image analysis can be used to assess color changes of plants...

  8. Quantitative assessment of in-solution digestion efficiency identifies optimal protocols for unbiased protein analysis

    DEFF Research Database (Denmark)

    Leon, Ileana R; Schwämmle, Veit; Jensen, Ole N;

    2013-01-01

    a combination of qualitative and quantitative LC-MS/MS methods and statistical data analysis. In contrast to previous studies we employed both standard qualitative as well as data-independent quantitative workflows to systematically assess trypsin digestion efficiency and bias using mitochondrial protein...... conditions (buffer, RapiGest, deoxycholate, urea), and two methods for removal of detergents prior to analysis of peptides (acid precipitation or phase separation with ethyl acetate). Our data-independent quantitative LC-MS/MS workflow quantified over 3700 distinct peptides with 96% completeness between all...... protocols and replicates, with an average 40% protein sequence coverage and an average of 11 peptides identified per protein. Systematic quantitative and statistical analysis of physicochemical parameters demonstrated that deoxycholate-assisted in-solution digestion combined with phase transfer allows...

  9. An assessment of turbulence models for linear hydrodynamic stability analysis of strongly swirling jets

    CERN Document Server

    Rukes, Lothar; Oberleithner, Kilian

    2016-01-01

    Linear stability analysis has proven to be a useful tool in the analysis of dominant coherent structures, such as the von Kármán vortex street and the global spiral mode associated with the vortex breakdown of swirling jets. In recent years, linear stability analysis has been applied successfully to turbulent time-mean flows, instead of laminar base-flows, which requires turbulent models that account for the interaction of the turbulent field with the coherent structures. To retain the stability equations of laminar flows, the Boussinesq approximation with a spatially nonuniform but isotropic eddy viscosity is typically employed. In this work we assess the applicability of this concept to turbulent strongly swirling jets, a class of flows that is particularly unsuited for isotropic eddy viscosity models. Indeed we find that unsteady RANS simulations only match with experiments with a Reynolds stress model that accounts for an anisotropic eddy viscosity. However, linear stability anal...

  10. Analysis of research publications that relate to bioterrorism and risk assessment.

    Science.gov (United States)

    Barker, Gary C

    2013-09-01

    Research relating to bioterrorism and its associated risks is interdisciplinary and is performed with a wide variety of objectives. Although published reports of this research have appeared only in the past decade, there has been a steady increase in their number and a continuous diversification of sources, content, and document types. In this analysis, we explored a large set of published reports, identified from accessible indices using simple search techniques, and tried to rationalize the patterns and connectivity of the research subjects rather than the detailed content. The analysis is based on a connectivity network representation built from author-assigned keywords. Network analysis reveals a strong relationship between research aimed at bioterrorism risks and research identified with public health. Additionally, the network identifies clusters of keywords centered on emergency preparedness and food safety issues. The network structure includes a large amount of meta-information that can be used for assessment and planning of research activity and for framing specific research interests.

  11. Comparative SWOT analysis of strategic environmental assessment systems in the Middle East and North Africa region.

    Science.gov (United States)

    Rachid, G; El Fadel, M

    2013-08-15

    This paper presents a SWOT analysis of SEA systems in the Middle East North Africa region through a comparative examination of the status, application and structure of existing systems based on country-specific legal, institutional and procedural frameworks. The analysis is coupled with the multi-attribute decision making method (MADM) within an analytical framework that involves both performance analysis based on predefined evaluation criteria and countries' self-assessment of their SEA system through open-ended surveys. The results show heterogenous status with a general delayed progress characterized by varied levels of weaknesses embedded in the legal and administrative frameworks and poor integration with the decision making process. Capitalizing on available opportunities, the paper highlights measures to enhance the development and enactment of SEA in the region.

  12. [Objective assessment of facial paralysis using infrared thermography and formal concept analysis].

    Science.gov (United States)

    Liu, Xu-Long; Hong, Wen-Xue; Liu, Jie-Min

    2014-04-01

    This paper presented a novel approach to objective assessment of facial nerve paralysis based on infrared thermography and formal concept analysis. Sixty five patients with facial nerve paralysis on one side were included in the study. The facial temperature distribution images of these 65 patients were captured by infrared thermography every five days during a one-month period. First, the facial thermal images were pre-processed to identify six potential regions of bilateral symmetry by using image segmentation techniques. Then, the temperature differences on the left and right sides of the facial regions were extracted and analyzed. Finally, the authors explored the relationships between the statistical averages of those temperature differences and the House-Brackmann score for objective assessment of the degree of nerve damage in facial nerve paralysis by using formal concept analysis. The results showed that the facial temperature distribution of patients with facial nerve paralysis exhibited a contralateral asymmetry, and the bilateral temperature differences of the facial regions were greater than 0.2 degrees C, whereas in normal healthy individuals these temperature differences were less than 0.2 degrees C. The Spearman correlation coefficient between the bilateral temperature differences of the facial regions and the degree of facial nerve damage was an average of 0.508, which was statistically significant (p < 0.05). If the temperature differences of the bilateral facial regions were greater than 0.2 degrees C and all were less than 0.5 degrees C, facial nerve paralysis could be determined as mild to moderate; if one of the temperature differences of bilateral symmetry was greater than 0.5 degrees C, facial nerve paralysis could be determined as serious. In conclusion, this paper presents an automated technique for the computerized analysis of thermal images to objectively assess facial nerve related thermal dysfunction by using formal concept analysis theory, which may benefit the clinical diagnosis and
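
    The reported decision rule reduces to simple threshold checks on bilateral temperature differences. A minimal sketch in Python, assuming six region-mean temperatures per side (all values hypothetical):

      # Sketch of the reported decision rule on bilateral temperature
      # differences (all temperatures hypothetical, degrees C).
      left = [33.1, 33.4, 32.9, 33.0, 33.6, 33.2]
      right = [33.0, 33.9, 33.5, 33.1, 34.2, 33.3]

      diffs = [abs(l - r) for l, r in zip(left, right)]

      if all(d < 0.2 for d in diffs):
          grade = "within the healthy range (no thermal asymmetry)"
      elif any(d > 0.5 for d in diffs):
          grade = "serious facial nerve paralysis"
      else:
          grade = "mild to moderate facial nerve paralysis"
      print([round(d, 2) for d in diffs], grade)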

  13. A Comparative Analysis on Assessment of Land Carrying Capacity with Ecological Footprint Analysis and Index System Method.

    Directory of Open Access Journals (Sweden)

    Yao Qian

    Full Text Available Land carrying capacity (LCC) explains whether the local land resources are effectively used to support economic activities and/or human population. LCC is commonly evaluated with two approaches, namely ecological footprint analysis (EFA) and the index system method (ISM). EFA is helpful to investigate the effects of different land categories whereas ISM can be used to evaluate the contributions of social, environmental, and economic factors. Here we compared the two LCC-evaluation approaches with data collected from Xiamen City, a typical region where rapid economic growth and urbanization are found in China. The results show that LCC assessments with EFA and ISM not only complement each other but also are mutually supportive. Both assessments suggest that decreases in arable land and increasingly high energy consumption have major negative effects on LCC and threaten sustainable development for Xiamen City. It is important for the local policy makers, planners and designers to reduce ecological deficits by controlling fossil energy consumption, protecting arable land and forest land from being converted into other land types, and slowing down the speed of urbanization, and to promote sustainability by controlling rural-to-urban immigration, increasing the hazard-free treatment rate of household garbage, and raising energy consumption per unit industrial added value. Although EFA seems more appropriate for estimating LCC for a resource-output or self-sufficient region and ISM is more suitable for a resource-input region, both approaches should be employed when performing LCC assessment anywhere in the world.
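
    The EFA side of such a comparison rests on simple arithmetic: demand in each land category is consumption divided by average yield, scaled by an equivalence factor. A minimal sketch with placeholder numbers (none taken from the Xiamen study; the 71 GJ/ha energy absorption factor is an assumption):

      # Minimal EFA arithmetic: footprint = consumption / average yield,
      # scaled by an equivalence factor. All numbers are placeholders.
      grain_t = 5.0e5                 # annual grain consumption (t)
      grain_yield = 2.74              # world-average yield (t/ha)
      fossil_energy_gj = 2.0e7        # annual fossil energy use (GJ)

      ef_arable = grain_t / grain_yield * 2.5       # arable equivalence factor
      ef_energy = fossil_energy_gj / 71.0 * 1.1     # energy-land equivalence
      total_ef_gha = ef_arable + ef_energy
      print(f"total footprint: {total_ef_gha:.3e} gha")
      # ecological deficit = total footprint - locally available biocapacity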

  14. Spatial assessment of air quality patterns in Malaysia using multivariate analysis

    Science.gov (United States)

    Dominick, Doreena; Juahir, Hafizan; Latif, Mohd Talib; Zain, Sharifuddin M.; Aris, Ahmad Zaharin

    2012-12-01

    This study aims to investigate possible sources of air pollutants and the spatial patterns within the eight selected Malaysian air monitoring stations based on a two-year database (2008-2009). Multivariate analysis was applied to the dataset. It incorporated Hierarchical Agglomerative Cluster Analysis (HACA) to assess the spatial patterns, Principal Component Analysis (PCA) to determine the major sources of the air pollution and Multiple Linear Regression (MLR) to assess the percentage contribution of each air pollutant. The HACA results grouped the eight monitoring stations into three different clusters, based on the characteristics of the air pollutants and meteorological parameters. The PCA analysis showed that the major sources of air pollution were emissions from motor vehicles, aircraft, industries and areas of high population density. The MLR analysis demonstrated that the main pollutant contributing to variability in the Air Pollutant Index (API) at all stations was particulate matter with a diameter of less than 10 μm (PM10). Further MLR analysis showed that the main air pollutant influencing the high concentration of PM10 was carbon monoxide (CO). This was due to combustion processes, particularly originating from motor vehicles. Meteorological factors such as ambient temperature, wind speed and humidity were also noted to influence the concentration of PM10.
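
    The HACA/PCA/MLR chain described above can be prototyped on any observations-by-variables matrix. A minimal sketch in Python (the column meanings and all numbers are assumptions, not the study's data; the study clustered station profiles with the same hierarchical call):

      # Sketch of the HACA/PCA/MLR chain on a samples x variables matrix.
      import numpy as np
      from scipy.cluster.hierarchy import fcluster, linkage
      from sklearn.decomposition import PCA
      from sklearn.linear_model import LinearRegression

      rng = np.random.default_rng(0)
      X = rng.normal(size=(200, 6))   # e.g. PM10, CO, NO2, SO2, O3, temperature
      api = X @ np.array([0.8, 0.5, 0.2, 0.1, 0.1, -0.3]) + rng.normal(0, 0.2, 200)

      # Ward clustering of column profiles (station profiles, in the study)
      clusters = fcluster(linkage(X.T, method="ward"), t=3, criterion="maxclust")
      pca = PCA(n_components=2).fit(X)        # loadings hint at source types
      mlr = LinearRegression().fit(X, api)    # coefficients rank contributions
      print(clusters, pca.components_.round(2), mlr.coef_.round(2))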

  15. Taylor Dispersion Analysis as a promising tool for assessment of peptide-peptide interactions.

    Science.gov (United States)

    Høgstedt, Ulrich B; Schwach, Grégoire; van de Weert, Marco; Østergaard, Jesper

    2016-10-10

    Protein-protein and peptide-peptide (self-)interactions are of key importance in understanding the physicochemical behavior of proteins and peptides in solution. However, due to the small size of peptide molecules, characterization of these interactions is more challenging than for proteins. In this work, we show that protein-protein and peptide-peptide interactions can advantageously be investigated by measurement of the diffusion coefficient using Taylor Dispersion Analysis. Through comparison to Dynamic Light Scattering it was shown that Taylor Dispersion Analysis is well suited for the characterization of protein-protein interactions of solutions of α-lactalbumin and human serum albumin. The peptide-peptide interactions of three selected peptides were then investigated in a concentration range spanning from 0.5 mg/ml up to 80 mg/ml using Taylor Dispersion Analysis. This determination indicated that multibody interactions significantly affect the peptide-peptide interactions at concentration levels above 25 mg/ml for the two charged peptides. Relative viscosity measurements, performed using the capillary based setup applied for Taylor Dispersion Analysis, showed that the viscosity of the peptide solutions increased with concentration. Our results indicate that a viscosity difference between run buffer and sample in Taylor Dispersion Analysis may result in overestimation of the measured diffusion coefficient. Thus, Taylor Dispersion Analysis provides a practical, but as yet primarily qualitative, approach to assessment of the colloidal stability of both peptide and protein formulations.
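
    In Taylor Dispersion Analysis, the diffusion coefficient follows from the width of the recorded taylorgram via the Taylor-Aris relation D = r^2 t_R / (24 sigma_t^2). A minimal sketch with illustrative numbers (not values from the study):

      # Sketch: D from a taylorgram via the Taylor-Aris relation
      # D = r**2 * t_R / (24 * sigma_t**2). Numbers are illustrative only.
      r = 37.5e-6        # capillary inner radius (m)
      t_R = 300.0        # peak residence time (s)
      sigma_t2 = 18.0    # temporal variance of the peak (s**2)

      D = r ** 2 * t_R / (24.0 * sigma_t2)
      print(f"D = {D:.2e} m^2/s")
      # a hydrodynamic radius then follows from the Stokes-Einstein equation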

  16. Risky business: factor analysis of survey data - assessing the probability of incorrect dimensionalisation.

    Directory of Open Access Journals (Sweden)

    Cees van der Eijk

    Full Text Available This paper undertakes a systematic assessment of the extent to which factor analysis identifies the correct number of latent dimensions (factors) when applied to ordered-categorical survey items (so-called Likert items). We simulate 2400 data sets of uni-dimensional Likert items that vary systematically over a range of conditions such as the underlying population distribution, the number of items, the level of random error, and characteristics of items and item-sets. Each of these datasets is factor analysed in a variety of ways that are frequently used in the extant literature, or that are recommended in current methodological texts. These include exploratory factor retention heuristics such as Kaiser's criterion, Parallel Analysis and a non-graphical scree test, and (for exploratory and confirmatory analyses) evaluations of model fit. These analyses are conducted on the basis of Pearson and polychoric correlations. We find that, irrespective of the particular mode of analysis, factor analysis applied to ordered-categorical survey data very often leads to over-dimensionalisation. The magnitude of this risk depends on the specific way in which factor analysis is conducted, the number of items, the properties of the set of items, and the underlying population distribution. The paper concludes with a discussion of the consequences of over-dimensionalisation, and a brief mention of alternative modes of analysis that are much less prone to such problems.

  17. Risky business: factor analysis of survey data - assessing the probability of incorrect dimensionalisation.

    Science.gov (United States)

    van der Eijk, Cees; Rose, Jonathan

    2015-01-01

    This paper undertakes a systematic assessment of the extent to which factor analysis identifies the correct number of latent dimensions (factors) when applied to ordered-categorical survey items (so-called Likert items). We simulate 2400 data sets of uni-dimensional Likert items that vary systematically over a range of conditions such as the underlying population distribution, the number of items, the level of random error, and characteristics of items and item-sets. Each of these datasets is factor analysed in a variety of ways that are frequently used in the extant literature, or that are recommended in current methodological texts. These include exploratory factor retention heuristics such as Kaiser's criterion, Parallel Analysis and a non-graphical scree test, and (for exploratory and confirmatory analyses) evaluations of model fit. These analyses are conducted on the basis of Pearson and polychoric correlations. We find that, irrespective of the particular mode of analysis, factor analysis applied to ordered-categorical survey data very often leads to over-dimensionalisation. The magnitude of this risk depends on the specific way in which factor analysis is conducted, the number of items, the properties of the set of items, and the underlying population distribution. The paper concludes with a discussion of the consequences of over-dimensionalisation, and a brief mention of alternative modes of analysis that are much less prone to such problems.
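
    The core of the simulation design described in the two records above is easy to reproduce: generate uni-dimensional Likert items, then count the factors retained by Kaiser's criterion versus Parallel Analysis. A minimal sketch (permutation-based Parallel Analysis on Pearson correlations; all design choices here are assumptions):

      # Sketch of one simulation cell: uni-dimensional Likert items, factors
      # retained by Kaiser's criterion vs. permutation-based Parallel Analysis.
      import numpy as np

      rng = np.random.default_rng(1)
      n, k = 500, 8
      latent = rng.normal(size=(n, 1))
      cont = latent @ rng.uniform(0.4, 0.8, size=(1, k)) + rng.normal(size=(n, k))
      cuts = np.quantile(cont, [0.2, 0.4, 0.6, 0.8])
      likert = np.digitize(cont, cuts)                 # 5 ordered categories

      eig = np.linalg.eigvalsh(np.corrcoef(likert.T))[::-1]
      kaiser = int((eig > 1.0).sum())

      rand_eigs = np.array([
          np.linalg.eigvalsh(np.corrcoef(rng.permuted(likert, axis=0).T))[::-1]
          for _ in range(200)
      ])
      pa = int((eig > np.percentile(rand_eigs, 95, axis=0)).sum())
      print(kaiser, pa)    # counts above 1 illustrate over-dimensionalisation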

  18. Quantitative assessment of p-glycoprotein expression and function using confocal image analysis.

    Science.gov (United States)

    Hamrang, Zahra; Arthanari, Yamini; Clarke, David; Pluen, Alain

    2014-10-01

    P-glycoprotein is implicated in clinical drug resistance; thus, rapid quantitative analysis of its expression and activity is of paramount importance to the design and success of novel therapeutics. The scope for the application of quantitative imaging and image analysis tools in this field is reported here at "proof of concept" level. P-glycoprotein expression was utilized as a model for quantitative immunofluorescence and subsequent spatial intensity distribution analysis (SpIDA). Following expression studies, p-glycoprotein inhibition as a function of verapamil concentration was assessed in two cell lines using live cell imaging of intracellular Calcein retention and a routine monolayer fluorescence assay. Intercellular and sub-cellular distributions in the expression of the p-glycoprotein transporter between parent and MDR1-transfected Madin-Darby Canine Kidney cell lines were examined. We have demonstrated that quantitative imaging can provide dose-response parameters while permitting direct microscopic analysis of intracellular fluorophore distributions in live and fixed samples. Analysis with SpIDA offers the ability to detect heterogeneity in the distribution of labeled species, and in conjunction with live cell imaging and immunofluorescence staining may be applied to the determination of pharmacological parameters or analysis of biopsies providing a rapid prognostic tool.
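
    The dose-response parameters mentioned above are typically obtained by fitting a sigmoidal model to per-condition image statistics. A minimal sketch, assuming hypothetical Calcein-retention readouts versus verapamil concentration:

      # Sketch: Hill-type dose-response fit to hypothetical Calcein-retention
      # readouts versus verapamil concentration.
      import numpy as np
      from scipy.optimize import curve_fit

      conc = np.array([0.1, 0.3, 1.0, 3.0, 10.0, 30.0])     # uM verapamil
      resp = np.array([5.0, 9.0, 22.0, 55.0, 80.0, 92.0])   # % retention

      def hill(c, bottom, top, ec50, n):
          return bottom + (top - bottom) / (1.0 + (ec50 / c) ** n)

      popt, _ = curve_fit(hill, conc, resp, p0=[0.0, 100.0, 3.0, 1.0])
      print(dict(zip(["bottom", "top", "EC50", "hill_n"], popt.round(2))))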

  19. Discrete wavelet transform analysis of surface electromyography for the fatigue assessment of neck and shoulder muscles.

    Science.gov (United States)

    Chowdhury, Suman Kanti; Nimbarte, Ashish D; Jaridi, Majid; Creese, Robert C

    2013-10-01

    Assessment of neuromuscular fatigue is essential for early detection and prevention of risks associated with work-related musculoskeletal disorders. In recent years, discrete wavelet transform (DWT) of surface electromyography (SEMG) has been used to evaluate muscle fatigue, especially during dynamic contractions when the SEMG signal is non-stationary. However, its application to the assessment of work-related neck and shoulder muscle fatigue is not well established. Therefore, the purpose of this study was to establish DWT analysis as a suitable method to conduct quantitative assessment of neck and shoulder muscle fatigue under dynamic repetitive conditions. Ten human participants performed 40 min of fatiguing repetitive arm and neck exertions while SEMG data from the upper trapezius and sternocleidomastoid muscles were recorded. Ten of the most commonly used wavelet functions were used to conduct the DWT analysis. Spectral changes estimated using the power of wavelet coefficients in the 12-23 Hz frequency band showed the highest sensitivity to fatigue induced by the dynamic repetitive exertions. Although most of the wavelet functions tested in this study reasonably demonstrated the expected power trend with fatigue development and recovery, the overall performance of the "Rbio3.1" wavelet in terms of power estimation and statistical significance was better than the remaining nine wavelets.
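
    The band-power calculation underlying this result can be sketched with PyWavelets. Assuming a 1500 Hz sampling rate (an assumption; at that rate the level-6 detail band spans roughly 11.7-23.4 Hz, close to the reported 12-23 Hz band):

      # Sketch: power of DWT detail coefficients in the ~12-23 Hz band using
      # the 'rbio3.1' wavelet; the sampling rate is an assumption.
      import numpy as np
      import pywt

      fs = 1500                              # Hz, assumed sampling rate
      rng = np.random.default_rng(2)
      semg = rng.normal(size=10 * fs)        # placeholder for one SEMG epoch

      # wavedec returns [cA6, cD6, cD5, ..., cD1];
      # cD6 covers roughly fs/2**7 .. fs/2**6 Hz
      coeffs = pywt.wavedec(semg, "rbio3.1", level=6)
      band_power = float(np.mean(coeffs[1] ** 2))
      print(band_power)   # tracked epoch by epoch, its trend indicates fatigue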

  20. Musculoskeletal Disorders Assessment and Posture Analysis by LUBA among Female Hairdressers in Tehran, 2015

    Directory of Open Access Journals (Sweden)

    Mohammad Khandan

    2017-01-01

    Full Text Available Background & Aims of the Study: Musculoskeletal disorders (MSDs) are among the main occupational diseases in the workplace. Workers in occupations such as hairdressing are exposed to multiple risk factors for these problems. The study was conducted to assess MSDs and analyze postures among female hairdressers in Tehran, 2015. Materials and Methods: In this cross-sectional research, 114 participants were studied. To collect data, a demographic questionnaire, a body map for assessment of MSDs, and the Postural Loading on the Upper Body Assessment (LUBA) method for posture evaluation were used. Data were analyzed by Mann-Whitney, Kruskal-Wallis and Spearman correlation tests through SPSS-V20. Results: The mean and standard deviation of age and experience of the participants were 5.34±8.9 and 10±8 years, respectively. In addition, they worked 9.8 hours per day on average. One hundred and thirteen (99.12%) persons had experienced pain in at least one part of their musculoskeletal system. Most hairdressers reported leg, lower back, and neck and shoulder pain. According to the posture assessment, 94.2% of participants were at high risk of exposure to risk factors for MSDs. Conclusion: Findings showed that MSDs are prevalent among hairdressers and that the work situations require immediate correction. Redesign of workstations and tools, work-rest cycles and reduction in repetitive motions can help to improve working conditions.

  1. Transcription factor motif quality assessment requires systematic comparative analysis [version 2; referees: 2 approved]

    Directory of Open Access Journals (Sweden)

    Caleb Kipkurui Kibet

    2016-03-01

    Full Text Available Transcription factor (TF) binding site prediction remains a challenge in gene regulatory research due to degeneracy and potential variability in binding sites in the genome. Dozens of algorithms designed to learn binding models (motifs) have generated many motifs available in research papers, with a subset making it to databases like JASPAR, UniPROBE and Transfac. The presence of many versions of motifs from the various databases for a single TF and the lack of a standardized assessment technique make it difficult for biologists to make an appropriate choice of binding model and for algorithm developers to benchmark, test and improve on their models. In this study, we review and evaluate the approaches in use, highlight differences and demonstrate the difficulty of defining a standardized motif assessment approach. We review scoring functions, motif length, test data and the type of performance metrics used in prior studies as some of the factors that influence the outcome of a motif assessment. We show that the scoring functions and statistics used in motif assessment influence ranking of motifs in a TF-specific manner. We also show that TF binding specificity can vary by source of genomic binding data. We also demonstrate that information content of a motif is not in isolation a measure of motif quality but is influenced by TF binding behaviour. We conclude that there is a need for an easy-to-use tool that presents all available evidence for a comparative analysis.
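
    One of the quantities discussed above, the information content of a motif, is computed directly from its position weight matrix. A minimal sketch assuming a uniform background (the PWM shown is illustrative):

      # Sketch: information content (bits) of a position weight matrix under
      # a uniform background; the PWM itself is illustrative.
      import numpy as np

      pwm = np.array([                 # rows = positions; columns = A, C, G, T
          [0.70, 0.10, 0.10, 0.10],
          [0.05, 0.05, 0.85, 0.05],
          [0.25, 0.25, 0.25, 0.25],
      ])
      with np.errstate(divide="ignore", invalid="ignore"):
          plogp = np.where(pwm > 0, pwm * np.log2(pwm), 0.0)
      ic = 2.0 + plogp.sum(axis=1)     # 2 bits max per DNA position
      print(ic.round(2), float(ic.sum()))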

  2. A methodological framework for hydromorphological assessment, analysis and monitoring (IDRAIM) aimed at promoting integrated river management

    Science.gov (United States)

    Rinaldi, M.; Surian, N.; Comiti, F.; Bussettini, M.

    2015-12-01

    A methodological framework for hydromorphological assessment, analysis and monitoring (named IDRAIM) has been developed with the specific aim of supporting the management of river processes by integrating the objectives of ecological quality and flood risk mitigation. The framework builds on existing and up-to-date geomorphological concepts and approaches and has been tested on several Italian streams. The framework includes the following four phases: (1) catchment-wide characterization of the fluvial system; (2) evolutionary trajectory reconstruction and assessment of current river conditions; (3) description of future trends of channel evolution; and (4) identification of management options. The framework provides specific consideration of the temporal context, in terms of reconstructing the trajectory of past channel evolution as a basis for interpreting present river conditions and future trends. A series of specific tools has been developed for the assessment of river conditions, in terms of morphological quality and channel dynamics. These include: the Morphological Quality Index (MQI), the Morphological Dynamics Index (MDI), the Event Dynamics Classification (EDC), and the river morphodynamic corridors (MC and EMC). The monitoring of morphological parameters and indicators, alongside the assessment of future scenarios of channel evolution provides knowledge for the identification, planning and prioritization of actions for enhancing morphological quality and risk mitigation.

  3. An educationally inspired illustration of two-dimensional Quantitative Microbiological Risk Assessment (QMRA) and sensitivity analysis.

    Science.gov (United States)

    Vásquez, G A; Busschaert, P; Haberbeck, L U; Uyttendaele, M; Geeraerd, A H

    2014-11-03

    Quantitative Microbiological Risk Assessment (QMRA) is a structured methodology used to assess the risk involved in ingestion of a pathogen. It applies mathematical models combined with an accurate exploitation of data sets, represented by distributions and - in the case of two-dimensional Monte Carlo simulations - their hyperparameters. This research aims to highlight background information, assumptions and truncations of a two-dimensional QMRA and advanced sensitivity analysis. We believe that such a detailed listing is not always clearly presented in actual risk assessment studies, while it is essential to ensure reliable and realistic simulations and interpretations. As a case-study, we consider the occurrence of listeriosis in smoked fish products in Belgium during the period 2008-2009, using two-dimensional Monte Carlo and two sensitivity analysis methods (Spearman correlation and Sobol sensitivity indices) to estimate the most relevant factors of the final risk estimate. A risk estimate of 0.018% per consumption of contaminated smoked fish by an immunocompromised person was obtained. The final estimate of listeriosis cases (23) is within the actual reported result obtained for the same period and for the same population. Variability in the final risk estimate is determined by the variability regarding (i) consumer refrigerator temperatures, (ii) the reference growth rate of L. monocytogenes, (iii) the minimum growth temperature of L. monocytogenes and (iv) consumer portion size. Variability regarding the initial contamination level of L. monocytogenes tends to appear as a determinant of risk variability only when the minimum growth temperature is not included in the sensitivity analysis; when it is included, the impact of the variability in the initial contamination level of L. monocytogenes disappears. Uncertainty determinants of the final risk indicated the need to gather more information on the reference growth rate and the minimum
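
    The variability dimension of such an assessment, together with a Spearman-based sensitivity ranking, can be sketched in a few lines. All distributions, parameters and the simplified growth/dose-response chain below are illustrative assumptions, not the study's model:

      # Sketch: variability dimension of a Monte Carlo QMRA with a
      # Spearman-based sensitivity ranking (all parameters illustrative).
      import numpy as np
      from scipy.stats import spearmanr

      rng = np.random.default_rng(3)
      n = 10_000
      temp = rng.normal(7.0, 2.5, n)         # refrigerator temperature (deg C)
      t_min = rng.normal(-1.2, 0.5, n)       # minimum growth temperature (deg C)
      mu_ref = rng.lognormal(-1.5, 0.3, n)   # reference growth rate (1/h)
      portion = rng.lognormal(3.5, 0.4, n)   # portion size (g)
      c0 = rng.lognormal(0.5, 1.0, n)        # initial contamination (CFU/g)

      # Ratkowsky-type secondary model; storage time folded into the exponent.
      growth = mu_ref * np.clip(temp - t_min, 0.0, None) ** 2
      dose = c0 * portion * np.exp(growth)
      risk = 1.0 - np.exp(-1e-12 * dose)     # exponential dose-response

      for name, x in [("temperature", temp), ("t_min", t_min),
                      ("mu_ref", mu_ref), ("portion", portion), ("c0", c0)]:
          print(name, round(spearmanr(x, risk).correlation, 2))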

  4. Numerical daemons in hydrological modeling: Effects on uncertainty assessment, sensitivity analysis and model predictions

    Science.gov (United States)

    Kavetski, D.; Clark, M. P.; Fenicia, F.

    2011-12-01

    Hydrologists often face sources of uncertainty that dwarf those normally encountered in many engineering and scientific disciplines. Especially when representing large scale integrated systems, internal heterogeneities such as stream networks, preferential flowpaths, vegetation, etc, are necessarily represented with a considerable degree of lumping. The inputs to these models are themselves often the products of sparse observational networks. Given the simplifications inherent in environmental models, especially lumped conceptual models, does it really matter how they are implemented? At the same time, given the complexities usually found in the response surfaces of hydrological models, increasingly sophisticated analysis methodologies are being proposed for sensitivity analysis, parameter calibration and uncertainty assessment. Quite remarkably, rather than being caused by the model structure/equations themselves, in many cases model analysis complexities are consequences of seemingly trivial aspects of the model implementation - often, literally, whether the start-of-step or end-of-step fluxes are used! The extent of problems can be staggering, including (i) degraded performance of parameter optimization and uncertainty analysis algorithms, (ii) erroneous and/or misleading conclusions of sensitivity analysis, parameter inference and model interpretations and, finally, (iii) poor reliability of a calibrated model in predictive applications. While the often nontrivial behavior of numerical approximations has long been recognized in applied mathematics and in physically-oriented fields of environmental sciences, it remains a problematic issue in many environmental modeling applications. Perhaps detailed attention to numerics is only warranted for complicated engineering models? Would not numerical errors be an insignificant component of total uncertainty when typical data and model approximations are present? Is this really a serious issue beyond some rare isolated

  5. The Structured Assessment Approach: A microcomputer-based insider-vulnerability analysis tool

    Energy Technology Data Exchange (ETDEWEB)

    Patenaude, C.J.; Sicherman, A.; Sacks, I.J.

    1986-01-01

    The Structured Assessment Approach (SAA) was developed to help assess the vulnerability of safeguards systems to insiders in a staged manner. For physical security systems, the SAA identifies possible diversion paths which are not safeguarded under various facility operating conditions and insiders who could defeat the system via direct access, collusion or indirect tampering. For material control and accounting systems, the SAA identifies those who could block the detection of a material loss or diversion via data falsification or equipment tampering. The SAA, originally designed to run on a mainframe computer, has been converted to run on a personal computer. Many features have been added to simplify and facilitate its use for conducting vulnerability analysis. The SAA microcomputer based approach is discussed in this paper.

  6. Predictive Validity of Pressure Ulcer Risk Assessment Tools for Elderly: A Meta-Analysis.

    Science.gov (United States)

    Park, Seong-Hi; Lee, Young-Shin; Kwon, Young-Mi

    2016-04-01

    Preventing pressure ulcers is one of the most challenging goals for today's health care providers. Currently used tools which assess the risk of pressure ulcer development rarely evaluate the accuracy of predictability, especially in older adults. The current study aimed at providing a systematic review and meta-analysis of 29 studies using three pressure ulcer risk assessment tools: the Braden, Norton, and Waterlow Scales. Overall predictive validities of pressure ulcer risks in the pooled sensitivity and specificity indicated a similar range with a moderate accuracy level in all three scales, while heterogeneity showed more than 80% variability among studies. The studies applying the Braden Scale used five different cut-off points, representing the primary cause of heterogeneity. Results indicate that commonly used screening tools for pressure ulcer risk have limitations regarding validity and accuracy for use with older adults due to heterogeneity among studies.

  7. Assessing the Queuing Process Using Data Envelopment Analysis: an Application in Health Centres.

    Science.gov (United States)

    Safdar, Komal A; Emrouznejad, Ali; Dey, Prasanta K

    2016-01-01

    Queuing is among the most important criteria for assessing the performance and efficiency of any service industry, including healthcare. Data Envelopment Analysis (DEA) is one of the most widely-used techniques for performance measurement in healthcare. However, no queue management application has been reported in the health-related DEA literature. Most of the studies regarding patient flow systems had the objective of improving an already existing appointment system. The current study presents a novel application of DEA for assessing the queuing process at an outpatients' department of a large public hospital in a developing country where appointment systems do not exist. The main aim of the current study is to demonstrate the usefulness of DEA modelling in the evaluation of a queue system. The patient flow pathway considered for this study consists of two stages: consultation with a doctor and pharmacy. The DEA results indicated that waiting times and other related queuing variables need considerable minimisation at both stages.
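
    Behind a DEA study like this sits a small linear program per decision-making unit (DMU). A minimal sketch of the input-oriented CCR envelopment model with scipy (the input/output matrix is hypothetical queue data, not the hospital's):

      # Sketch: input-oriented CCR efficiency per DMU via the envelopment LP.
      # Inputs/outputs are hypothetical (e.g. mean wait, staff -> patients).
      import numpy as np
      from scipy.optimize import linprog

      X = np.array([[4.0, 2.0, 3.0, 5.0],         # inputs (rows) x DMUs (cols)
                    [60.0, 45.0, 50.0, 70.0]])
      Y = np.array([[100.0, 80.0, 90.0, 110.0]])  # outputs x DMUs

      def ccr_efficiency(o):
          m, n = X.shape
          c = np.r_[1.0, np.zeros(n)]             # minimise theta
          A_ub = np.vstack([
              np.c_[-X[:, o], X],                     # X @ lam <= theta * x_o
              np.c_[np.zeros((Y.shape[0], 1)), -Y],   # Y @ lam >= y_o
          ])
          b_ub = np.r_[np.zeros(m), -Y[:, o]]
          res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * (n + 1))
          return res.fun                          # efficiency score in (0, 1]

      print([round(ccr_efficiency(o), 3) for o in range(X.shape[1])])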

  8. Nondestructive inspection assessment of eddy current and electrochemical analysis to separate inconel and stainless steel alloys

    Energy Technology Data Exchange (ETDEWEB)

    Moore, D.G.; Sorensen, N.R.

    1998-02-01

    This report presents a nondestructive inspection assessment of eddy current and electrochemical analysis to separate inconel alloys from stainless steel alloys as well as an evaluation of cleaning techniques to remove a thermal oxide layer on aircraft exhaust components. The results of this assessment are presented in terms of how effective each technique classifies a known exhaust material. Results indicate that either inspection technique can separate inconel and stainless steel alloys. Based on the experiments conducted, the electrochemical spot test is the optimum for use by airframe and powerplant mechanics. A spot test procedure is proposed for incorporation into the Federal Aviation Administration Advisory Circular 65-9A Airframe & Powerplant Mechanic - General Handbook. 3 refs., 70 figs., 7 tabs.

  9. Using statistical analysis and artificial intelligence tools for automatic assessment of video sequences

    Science.gov (United States)

    Ekobo Akoa, Brice; Simeu, Emmanuel; Lebowsky, Fritz

    2014-01-01

    This paper proposes two novel approaches to Video Quality Assessment (VQA). Both approaches attempt to develop video evaluation techniques capable of replacing human judgment when rating video quality in subjective experiments. The underlying study consists of selecting fundamental quality metrics based on Human Visual System (HVS) models and using artificial intelligence solutions as well as advanced statistical analysis. This new combination enables suitable video quality ratings while taking multiple quality metrics as input. The first method uses a neural network based machine learning process. The second method evaluates video quality using a non-linear regression model. The efficiency of the proposed methods is demonstrated by comparing their results with those of existing work done on synthetic video artifacts. The results obtained by each method are compared with scores from a database resulting from subjective experiments.

  10. Does bioimpedance analysis or measurement of natriuretic peptides aid volume assessment in peritoneal dialysis patients?

    Science.gov (United States)

    Davenport, Andrew

    2013-01-01

    Cardiovascular mortality remains the commonest cause of death for peritoneal dialysis patients. As such, preventing persistent hypervolemia is important. On the other hand, hypovolemia may potentially risk episodes of acute kidney injury and loss of residual renal function, a major determinant of peritoneal dialysis technique survival. Bioimpedance has developed from a single-frequency research tool to a multi-frequency bioelectrical impedance analysis readily available in the clinic and capable of measuring extracellular, intracellular, and total body water. Similarly, natriuretic peptides released from the heart because of myocardial stretch and increased intracardiac volume have also been variously reported to be helpful in assessing volume status in peritoneal dialysis patients. The question then arises whether these newer technologies and biomarkers have supplanted the time-honored clinical assessment of hydration status or whether they are merely adjuncts that aid the experienced clinician.

  11. A computer-based image analysis method for assessing the severity of hip joint osteoarthritis

    Science.gov (United States)

    Boniatis, Ioannis; Costaridou, Lena; Cavouras, Dionisis; Panagiotopoulos, Elias; Panayiotakis, George

    2006-12-01

    A computer-based image analysis method was developed for assessing the severity of hip osteoarthritis (OA). Eighteen pelvic radiographs of patients with verified unilateral hip OA were digitized and enhanced employing custom developed software. Two ROIs corresponding to osteoarthritic and contralateral-physiological radiographic Hip Joint Spaces (HJSs) were determined on each radiograph. Textural features were extracted from the HJS-ROIs utilizing the run-length matrices and Laws' textural measures. A k-Nearest Neighbour based hierarchical tree structure was designed for classifying hips into three OA severity categories labeled as "Normal", "Mild/Moderate", and "Severe". Employing the run-length features, the overall classification accuracy of the hierarchical tree structure was 86.1%. The utilization of Laws' textural measures improved the system classification performance, providing an overall classification accuracy of 94.4%. The proposed method may be of value to physicians in assessing the severity of hip OA.

  12. Assessment of Static Delamination Propagation Capabilities in Commercial Finite Element Codes Using Benchmark Analysis

    Science.gov (United States)

    Orifici, Adrian C.; Krueger, Ronald

    2010-01-01

    With capabilities for simulating delamination growth in composite materials becoming available, the need for benchmarking and assessing these capabilities is critical. In this study, benchmark analyses were performed to assess the delamination propagation simulation capabilities of the VCCT implementations in Marc and MD Nastran. Benchmark delamination growth results for Double Cantilever Beam, Single Leg Bending and End Notched Flexure specimens were generated using a numerical approach. This numerical approach was developed previously, and involves comparing results from a series of analyses at different delamination lengths to a single analysis with automatic crack propagation. Specimens were analyzed with three-dimensional and two-dimensional models, and compared with previous analyses using Abaqus. The results demonstrated that the VCCT implementations in Marc and MD Nastran were capable of accurately replicating the benchmark delamination growth results and that the use of the numerical benchmarks offers advantages over benchmarking using experimental and analytical results.

  13. Confirmatory factor analysis of the Neuropsychological Assessment Battery of the LADIS study

    DEFF Research Database (Denmark)

    Moleiro, Carla; Madureira, Sofia; Verdelho, Ana

    2013-01-01

    Age-related white matter changes have been associated with cognitive functioning, even though their role is not fully understood. This work aimed to test a 3-factor model of the neuropsychological assessment battery and evaluate how the model fit the data longitudinally. Confirmatory factor...... analysis (CFA) was used to investigate the dimensions of a structured set of neuropsychological tests administered to a multicenter, international sample of independent older adults (LADIS study). Six hundred and thirty-eight older adults completed baseline neuropsychological, clinical, functional...... and motor assessments, which were repeated each year for a 3-year follow-up. CFA provided support for a 3-factor model. These factors involve the dimensions of executive functions, memory functions, and speed and motor control abilities. Performance decreased in most neuropsychological measures. Results...

  14. Automatic Speech Signal Analysis for Clinical Diagnosis and Assessment of Speech Disorders

    CERN Document Server

    Baghai-Ravary, Ladan

    2013-01-01

    Automatic Speech Signal Analysis for Clinical Diagnosis and Assessment of Speech Disorders provides a survey of methods designed to aid clinicians in the diagnosis and monitoring of speech disorders such as dysarthria and dyspraxia, with an emphasis on the signal processing techniques, statistical validity of the results presented in the literature, and the appropriateness of methods that do not require specialized equipment, rigorously controlled recording procedures or highly skilled personnel to interpret results. Such techniques offer the promise of a simple and cost-effective, yet objective, assessment of a range of medical conditions, which would be of great value to clinicians. The ideal scenario would begin with the collection of examples of the clients’ speech, either over the phone or using portable recording devices operated by non-specialist nursing staff. The recordings could then be analyzed initially to aid diagnosis of conditions, and subsequently to monitor the clients’ progress and res...

  15. Resource efficiency of urban sanitation systems. A comparative assessment using material and energy flow analysis

    Energy Technology Data Exchange (ETDEWEB)

    Meinzinger, Franziska

    2010-07-01

    Within the framework of sustainable development, it is important to find ways of reducing natural resource consumption and to change towards closed-loop management. As in many other spheres, increased resource efficiency has also become an important issue in sanitation. In particular, nutrient recovery for agriculture, increased energy efficiency and saving of natural water resources can contribute to more resource-efficient sanitation systems. To assess the resource efficiency of alternative developments, a systems perspective is required. The present study applies a combined cost, energy and material flow analysis (ceMFA) as a system analysis method to assess the resource efficiency of urban sanitation systems. This includes the discussion of relevant criteria and assessment methods. The main focus of this thesis is the comparative assessment of different systems, based on two case studies: Hamburg in Germany and Arba Minch in Ethiopia. A range of possible system developments including source separation (e.g. diversion of urine or blackwater) is defined and compared with the current situation as a reference system. The assessment is carried out using computer simulations based on model equations. The model equations not only integrate mass and nutrient flows, but also the energy and cost balances of the different systems. In order to assess the impact of different assumptions and calculation parameters, sensitivity analyses and parameter variations complete the calculations. Based on the simulations, the following general conclusions can be drawn: None of the systems shows an overall benefit with regard to all investigated criteria, namely nutrients, energy, water and costs. Yet, the results of the system analysis can be used as a basis for decision making if a case-related weighting is introduced. The systems show varying potential for the recovery of nutrients from (source separated) wastewater flows. For the case study of Hamburg up to 29% of the mineral

  16. Pesticide Flow Analysis to Assess Human Exposure in Greenhouse Flower Production in Colombia

    Directory of Open Access Journals (Sweden)

    Claudia R. Binder

    2013-03-01

    Full Text Available Human exposure assessment tools represent a means for understanding human exposure to pesticides in agricultural activities and managing possible health risks. This paper presents a pesticide flow analysis modeling approach developed to assess human exposure to pesticide use in greenhouse flower crops in Colombia, focusing on dermal and inhalation exposure. This approach is based on the material flow analysis methodology. The transfer coefficients were obtained using the whole body dosimetry method for dermal exposure and the button personal inhalable aerosol sampler for inhalation exposure, using the tracer uranine as a pesticide surrogate. The case study was a greenhouse rose farm in the Bogota Plateau in Colombia. The approach was applied to estimate the exposure to pesticides such as mancozeb, carbendazim, propamocarb hydrochloride, fosetyl, carboxin, thiram, dimethomorph and mandipropamide. We found dermal absorption estimations close to the AOEL reference values for the pesticides carbendazim, mancozeb, thiram and mandipropamide during the study period. In addition, high values of dermal exposure were found on the forearms, hands, chest and legs of study participants, indicating weaknesses in the overlapping areas of the personal protective equipment parts. These results show how the material flow analysis methodology can be applied in the field of human exposure for early recognition of the dispersion of pesticides and support the development of measures to improve operational safety during pesticide management. Furthermore, the model makes it possible to identify the status quo of the health risk faced by workers in the study area.
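
    In flow terms, the exposure estimate is the handled mass multiplied by an empirically measured transfer coefficient per body compartment. A minimal sketch with placeholder coefficients (not the study's measured values):

      # Sketch: exposure = mass handled x transfer coefficient per compartment.
      # Coefficients and the daily mass are placeholders, not measured values.
      applied_g = 250.0                      # active ingredient handled per day
      transfer = {                           # fraction reaching each compartment
          "hands": 2.0e-4, "forearms": 1.2e-4, "chest": 0.8e-4,
          "legs": 1.0e-4, "inhaled": 0.5e-5,
      }
      exposure_mg = {k: applied_g * f * 1000.0 for k, f in transfer.items()}
      print(exposure_mg)   # mg/day, to be compared against AOEL-derived limits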

  17. [Multi-component quantitative analysis combined with chromatographic fingerprint for quality assessment of Onosma hookeri].

    Science.gov (United States)

    Aga, Er-bu; Nie, Li-juan; Dongzhi, Zhuo-ma; Wang, Ju-le

    2015-11-01

    A method for simultaneous determination of shikonin, acetyl shikonin and β, β'-dimethylpropene shikonin in Onosma hookeri and the chromatographic fingerprint was established by HPLC-DAD on an Agilent Zorbax SB column with a gradient elution of acetonitrile and water at 0.8 mL/min, 30 degrees C. The quality assessment was conducted by comparing the content differences of the three naphthoquinone constituents, in combination with chromatographic fingerprint analysis and systems cluster analysis, among 7 batches of radix O. hookeri. The content of the three naphthoquinone constituents showed wide variation across the 7 batches. The similarity values of the fingerprints of samples 5, 6 and 7 were above 0.99, samples 2 and 3 above 0.97, samples 3 and 4 above 0.90, and the other samples larger than 0.8, which was consistent with the content of the three naphthoquinone constituents. The 7 samples were roughly divided into 4 categories. The results indicate that the sourcing of this medicine is complex and its quality uneven. The established HPLC fingerprints and the quantitative analysis method can be used efficiently for quality assessment of O. hookeri.

  18. Pesticide flow analysis to assess human exposure in greenhouse flower production in Colombia.

    Science.gov (United States)

    Lesmes-Fabian, Camilo; Binder, Claudia R

    2013-03-25

    Human exposure assessment tools represent a means for understanding human exposure to pesticides in agricultural activities and managing possible health risks. This paper presents a pesticide flow analysis modeling approach developed to assess human exposure to pesticide use in greenhouse flower crops in Colombia, focusing on dermal and inhalation exposure. This approach is based on the material flow analysis methodology. The transfer coefficients were obtained using the whole body dosimetry method for dermal exposure and the button personal inhalable aerosol sampler for inhalation exposure, using the tracer uranine as a pesticide surrogate. The case study was a greenhouse rose farm in the Bogota Plateau in Colombia. The approach was applied to estimate the exposure to pesticides such as mancozeb, carbendazim, propamocarb hydrochloride, fosetyl, carboxin, thiram, dimethomorph and mandipropamide. We found dermal absorption estimations close to the AOEL reference values for the pesticides carbendazim, mancozeb, thiram and mandipropamide during the study period. In addition, high values of dermal exposure were found on the forearms, hands, chest and legs of study participants, indicating weaknesses in the overlapping areas of the personal protective equipment parts. These results show how the material flow analysis methodology can be applied in the field of human exposure for early recognition of the dispersion of pesticides and support the development of measures to improve operational safety during pesticide management. Furthermore, the model makes it possible to identify the status quo of the health risk faced by workers in the study area.

  19. Assessing the effect of data pretreatment procedures for principal components analysis of chromatographic data.

    Science.gov (United States)

    McIlroy, John W; Smith, Ruth Waddell; McGuffin, Victoria L

    2015-12-01

    Following publication of the National Academy of Sciences report "Strengthening Forensic Science in the United States: A Path Forward", there has been increasing interest in the application of multivariate statistical procedures for the evaluation of forensic evidence. However, prior to statistical analysis, variance from sources other than the sample must be minimized through application of data pretreatment procedures. This is necessary to ensure that subsequent statistical analysis of the data provides meaningful results. The purpose of this work was to evaluate the effect of pretreatment procedures on multivariate statistical analysis of chromatographic data obtained for a reference set of diesel fuels. Diesel was selected due to its chemical complexity and forensic relevance, both for fire debris and environmental forensic applications. Principal components analysis (PCA) was applied to the untreated chromatograms to assess association of replicates and discrimination among the different diesel samples. The chromatograms were then pretreated by sequentially applying the following procedures: background correction, smoothing, retention-time alignment, and normalization. The effect of each procedure on association and discrimination was evaluated based on the association of replicates in the PCA scores plot. For these data, background correction and smoothing offered minimal improvement, whereas alignment and normalization offered the greatest improvement in the association of replicates and discrimination among highly similar samples. Further, prior to pretreatment, the first principal component accounted for only non-sample sources of variance. Following pretreatment, these sources were minimized and the first principal component accounted for significant chemical differences among the diesel samples. These results highlight the need for pretreatment procedures and provide a metric to assess the effect of pretreatment on subsequent multivariate statistical
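
    The pretreatment sequence evaluated above can be prototyped as a small pipeline. In the sketch below (Python; the background-correction and alignment steps are simple stand-ins, not necessarily the procedures used in the study), each chromatogram is corrected, smoothed, aligned to a reference by cross-correlation, normalized, and passed to PCA:

      # Sketch of the pretreatment chain ahead of PCA.
      import numpy as np
      from scipy.signal import savgol_filter
      from sklearn.decomposition import PCA

      def pretreat(chrom, reference):
          chrom = chrom - np.percentile(chrom, 5)       # background correction
          chrom = savgol_filter(chrom, window_length=11, polyorder=3)  # smoothing
          lag = np.argmax(np.correlate(reference, chrom, "full")) - (chrom.size - 1)
          chrom = np.roll(chrom, lag)                   # retention-time alignment
          return chrom / np.abs(chrom).sum()            # total-area normalization

      rng = np.random.default_rng(4)
      ref = np.exp(-0.5 * ((np.arange(500) - 250) / 8.0) ** 2)
      chroms = np.array([
          pretreat(np.roll(ref, rng.integers(-5, 6)) +
                   rng.normal(scale=0.01, size=500), ref)
          for _ in range(12)
      ])
      scores = PCA(n_components=2).fit_transform(chroms)
      print(scores.round(3))   # tight replicate clusters = effective pretreatment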

  20. A statistical analysis to assess the maturity and stability of six composts.

    Science.gov (United States)

    Komilis, Dimitrios P; Tziouvaras, Ioannis S

    2009-05-01

    Despite the long-time application of organic waste derived composts to crops, there is still no universally accepted index to assess compost maturity and stability. The research presented in this article investigated the suitability of seven types of seeds for use in germination bioassays to assess the maturity and phytotoxicity of six composts. The composts used in the study were derived from cow manure, sea weeds, olive pulp, poultry manure and municipal solid waste. The seeds used in the germination bioassays were radish, pepper, spinach, tomato, cress, cucumber and lettuce. Data were analyzed with an analysis of variance at two levels and with pair-wise comparisons. The analysis revealed that composts rendered as phytotoxic to one type of seed could enhance the growth of another type of seed. Therefore, germination indices, which ranged from 0% to 262%, were highly dependent on the type of seed used in the germination bioassay. The poultry manure compost was highly phytotoxic to all seeds. At the 99% confidence level, the type of seed and the interaction between the seeds and the composts were found to significantly affect germination. In addition, the stability of composts was assessed by their microbial respiration, which ranged from approximately 4 to 16 g O2/kg organic matter and from 2.6 to approximately 11 g CO2-C/kg C, after seven days. Initial average oxygen uptake rates were all less than approximately 0.35 g O2/kg organic matter/h for all six composts. A high statistically significant correlation coefficient was calculated between the cumulative carbon dioxide production, over a 7-day period, and the radish seed germination index. It appears that a germination bioassay with radish can be a valid test to assess both compost stability and compost phytotoxicity.
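
    The germination index itself is a simple ratio against a control. A minimal sketch with placeholder counts and root lengths:

      # Sketch: germination index (GI) against a control (placeholder values).
      germinated, germinated_control = 18, 20
      root_mm, root_control_mm = 22.0, 25.0

      gi = (germinated / germinated_control) * (root_mm / root_control_mm) * 100
      print(f"GI = {gi:.0f}%")   # values above 100% indicate growth enhancement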

  1. Safety assessment technology on the free drop impact and puncture analysis of the cask for radioactive material transport

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Dew Hey [Korea Institute of Nuclear Safety, Taejon (Korea, Republic of); Lee, Young Shin; Ryu, Chung Hyun; Kim, Hyun Su; Lee, Ho Chul; Hong, Song Jin; Choi, Young Jin; Lee, Jae Hyung; Na, Jae Yun [Chungnam National Univ., Taejon (Korea, Republic of)

    2001-03-15

    In this study, the regulatory conditions and analysis conditions for free drop and puncture impact analysis are examined in order to develop a safety assessment technology. Impact analysis is performed with the finite element method, one of the many analysis methods available for the shipping cask. LS-DYNA3D and ABAQUS are suitable for the free drop and puncture impact analysis of the shipping cask. For the analysis model, the KSC-4 shipping cask for transporting spent nuclear fuel is investigated. The results from LS-DYNA3D and ABAQUS correspond completely, and the integrity of the shipping cask is verified. This study provides the regulatory staff with a reliable safety assessment technology, so that efficient and reliable regulatory tasks can be performed using the standard safety assessment technology.

  2. Tuck Jump Assessment: An Exploratory Factor Analysis in a College Age Population.

    Science.gov (United States)

    Lininger, Monica R; Smith, Craig A; Chimera, Nicole J; Hoog, Philipp; Warren, Meghan

    2017-03-01

    Lininger, MR, Smith, CA, Chimera, NJ, Hoog, P, and Warren, M. Tuck Jump Assessment: An exploratory factor analysis in a college age population. J Strength Cond Res 31(3): 653-659, 2017. Due to the high rate of noncontact lower extremity injuries that occur in the collegiate setting, medical personnel are implementing screening mechanisms to identify those athletes that may be at risk for certain injuries before starting a sports season. The tuck jump assessment (TJA) was created as a "clinician friendly" tool to identify lower extremity landing technique flaws during a plyometric activity. Ten technique flaws are assessed during the TJA as either present or absent, and the flaws are then summed for an overall score. Through expert consensus, these 10 technique flaws have been grouped into 5 modifiable risk factors: ligament dominance, quadriceps dominance, leg dominance or residual injury deficits, trunk dominance ("core" dysfunction), and technique perfection. Research has not investigated the psychometric properties of the TJA technique flaws or the modifiable risk factors. The present study is a psychometric analysis of the TJA technique flaws, measuring the internal structure with an exploratory factor analysis (EFA) using data from collegiate athletes (n = 90) and a general college cohort (n = 99). The EFA suggested a 3-factor model accounting for 46% of the variance. The 3 factors were defined as fatigue, distal landing pattern, and proximal control. These results differ from the 5 modifiable risk categories previously suggested and may question the use of a single score, a unidimensional construct, of the TJA for injury screening.

  3. OVERVIEW ON BNL ASSESSMENT OF SEISMIC ANALYSIS METHODS FOR DEEPLY EMBEDDED NPP STRUCTURES.

    Energy Technology Data Exchange (ETDEWEB)

    XU,J.; COSTANTINO, C.; HOFMAYER, C.; GRAVES, H.

    2007-04-01

    A study was performed by Brookhaven National Laboratory (BNL) under the sponsorship of the U. S. Nuclear Regulatory Commission (USNRC), to determine the applicability of established soil-structure interaction analysis methods and computer programs to deeply embedded and/or buried (DEB) nuclear power plant (NPP) structures. This paper provides an overview of the BNL study including a description and discussions of analyses performed to assess relative performance of various SSI analysis methods typically applied to NPP structures, as well as the importance of interface modeling for DEB structures. There are four main elements contained in the BNL study: (1) Review and evaluation of existing seismic design practice, (2) Assessment of simplified vs. detailed methods for SSI in-structure response spectrum analysis of DEB structures, (3) Assessment of methods for computing seismic induced earth pressures on DEB structures, and (4) Development of the criteria for benchmark problems which could be used for validating computer programs for computing seismic responses of DEB NPP structures. The BNL study concluded that the equivalent linear SSI methods, including both simplified and detailed approaches, can be extended to DEB structures and produce acceptable SSI response calculations, provided that the SSI response induced by the ground motion is very much within the linear regime or the non-linear effect is not anticipated to control the SSI response parameters. The BNL study also revealed that the response calculation is sensitive to the modeling assumptions made for the soil/structure interface and application of a particular material model for the soil.

  4. Wavelet transform analysis to assess oscillations in pial artery pulsation at the human cardiac frequency.

    Science.gov (United States)

    Winklewski, P J; Gruszecki, M; Wolf, J; Swierblewska, E; Kunicka, K; Wszedybyl-Winklewska, M; Guminski, W; Zabulewicz, J; Frydrychowski, A F; Bieniaszewski, L; Narkiewicz, K

    2015-05-01

    Pial artery adjustments to changes in blood pressure (BP) may last only seconds in humans. Using a novel method called near-infrared transillumination backscattering sounding (NIR-T/BSS) that allows for the non-invasive measurement of pial artery pulsation (cc-TQ) in humans, we aimed to assess the relationship between spontaneous oscillations in BP and cc-TQ at frequencies between 0.5 Hz and 5 Hz. We hypothesized that analysis of very short data segments would enable the estimation of changes in the cardiac contribution to the BP vs. cc-TQ relationship during very rapid pial artery adjustments to external stimuli. BP and pial artery oscillations during baseline (70 s and 10 s signals) and the response to maximal breath-hold apnea were studied in eighteen healthy subjects. The cc-TQ was measured using NIR-T/BSS; cerebral blood flow velocity, the pulsatility index and the resistive index were measured using Doppler ultrasound of the left internal carotid artery; heart rate and beat-to-beat systolic and diastolic blood pressure were recorded using a Finometer; end-tidal CO2 was measured using a medical gas analyzer. Wavelet transform analysis was used to assess the relationship between BP and cc-TQ oscillations. The recordings lasting 10 s and representing 10 cycles with a frequency of ~1 Hz provided sufficient accuracy with respect to wavelet coherence and wavelet phase coherence values and yielded similar results to those obtained from approximately 70 cycles (70 s). A slight but significant decrease in wavelet coherence between augmented BP and cc-TQ oscillations was observed by the end of apnea. Wavelet transform analysis can be used to assess the relationship between BP and cc-TQ oscillations at cardiac frequency using signal intervals as short as 10 s. Apnea slightly decreases the contribution of cardiac activity to BP and cc-TQ oscillations.
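
    Wavelet phase coherence at the cardiac frequency can be estimated from two signals with a complex Morlet continuous wavelet transform. A minimal sketch (synthetic 10 s signals standing in for BP and cc-TQ; the wavelet parameters are assumptions):

      # Sketch: wavelet phase coherence near the cardiac frequency (~1 Hz).
      # The signals are synthetic stand-ins for BP and cc-TQ.
      import numpy as np
      import pywt

      fs = 100.0
      t = np.arange(0.0, 10.0, 1.0 / fs)   # a 10 s segment, ~10 cardiac cycles
      rng = np.random.default_rng(5)
      bp = np.sin(2 * np.pi * 1.0 * t) + 0.3 * rng.normal(size=t.size)
      cc_tq = np.sin(2 * np.pi * 1.0 * t + 0.6) + 0.3 * rng.normal(size=t.size)

      wavelet = "cmor1.5-1.0"
      scale = pywt.frequency2scale(wavelet, 1.0 / fs)   # 1 Hz in cycles/sample
      coef_bp, _ = pywt.cwt(bp, [scale], wavelet, sampling_period=1.0 / fs)
      coef_tq, _ = pywt.cwt(cc_tq, [scale], wavelet, sampling_period=1.0 / fs)

      dphi = np.angle(coef_bp[0]) - np.angle(coef_tq[0])
      phase_coherence = np.abs(np.mean(np.exp(1j * dphi)))
      print(round(float(phase_coherence), 3))   # 1 = locked phase, 0 = none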

  5. Limit Load and Buckling Analysis for Assessing Hanford Single-Shell Tank Dome Structural Integrity

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, Kenneth I.; Deibler, John E.; Julyk, Larry J.; Karri, Naveen K.; Pilli, Siva Prasad

    2012-12-07

    The U.S. Department of Energy, Office of River Protection has commissioned a structural analysis of record (AOR) for the Hanford single shell tanks (SSTs) to assess their structural integrity. The analysis used finite element techniques to predict the tank response to the historical thermal and operating loads. The analysis also addressed the potential tank response to a postulated design basis earthquake. The combined response to static and seismic loads was then evaluated against the design requirements of American Concrete Institute (ACI) standard, ACI-349-06, for nuclear safety-related concrete structures. Further analysis was conducted to estimate the plastic limit load and the elastic-plastic buckling capacity of the tanks. The limit load and buckling analyses estimate the margin between the applied loads and the limiting load capacities of the tank structure. The potential for additional dome loads from waste retrieval equipment and the addition of large dome penetrations to accommodate retrieval equipment has generated additional interest in the limit load and buckling analyses. This paper summarizes the structural analysis methods that were used to evaluate the limit load and buckling of the single shell tanks.

  6. Accuracy of qualitative analysis for assessment of skilled baseball pitching technique.

    Science.gov (United States)

    Nicholls, Rochelle; Fleisig, Glenn; Elliott, Bruce; Lyman, Stephen; Osinski, Edmund

    2003-07-01

    Baseball pitching must be performed with correct technique if injuries are to be avoided and performance maximized. High-speed video analysis is accepted as the most accurate and objective method for evaluation of baseball pitching mechanics. The aim of this research was to develop an equivalent qualitative analysis method for use with standard video equipment. A qualitative analysis protocol (QAP) was developed for 24 kinematic variables identified as important to pitching performance. Twenty male baseball pitchers were videotaped using 60 Hz camcorders, and their technique was evaluated using the QAP by two independent raters. Each pitcher was also assessed using a 6-camera 200 Hz Motion Analysis system (MAS). Four QAP variables (22%) showed significant similarity with MAS results. Inter-rater reliability showed agreement on 33% of QAP variables. It was concluded that a complete and accurate profile of an athlete's pitching mechanics cannot be made using the QAP in its current form, but it is possible that such simple forms of biomechanical analysis could yield accurate results before 3-D methods become obligatory.

  7. The Development and Implementation of an Instrument to Assess Students’ Data Analysis Skills in Molecular Biology

    Directory of Open Access Journals (Sweden)

    Brian J. Rybarczyk

    2014-03-01

    Developing visual literacy skills is an important component of scientific literacy in undergraduate science education. Comprehension, analysis, and interpretation are parts of visual literacy that describe related data analysis skills important for learning in the biological sciences. The Molecular Biology Data Analysis Test (MBDAT) was developed to measure students' data analysis skills connected with scientific reasoning when analyzing and interpreting scientific data generated from experimental research. The skills analyzed included basic skills, such as identifying patterns and trends in data and connecting data to the method that generated them, and advanced skills, such as distinguishing positive and negative controls, synthesizing conclusions, determining whether data support a hypothesis, and predicting alternative or next-step experiments. Construct and content validity were established, and the calculated statistical parameters demonstrate that the MBDAT is valid and reliable for measuring students' data analysis skills in molecular and cell biology contexts. The instrument also measures students' perceived confidence in their data interpretation abilities. As scientific research continues to evolve in complexity, interpretation of scientific information in visual formats will continue to be an important component of scientific literacy. Thus science education will need to support and assess students' development of these skills as part of their scientific training.

  8. Earthquake Hazard Mitigation Using a Systems Analysis Approach to Risk Assessment

    Science.gov (United States)

    Legg, M.; Eguchi, R. T.

    2015-12-01

    The earthquake hazard mitigation goal is to reduce losses due to severe natural events. The first step is to conduct a Seismic Risk Assessment consisting of 1) hazard estimation, 2) vulnerability analysis, 3) exposure compilation. Seismic hazards include ground deformation, shaking, and inundation. The hazard estimation may be probabilistic or deterministic. Probabilistic Seismic Hazard Assessment (PSHA) is generally applied to site-specific Risk assessments, but may involve large areas as in a National Seismic Hazard Mapping program. Deterministic hazard assessments are needed for geographically distributed exposure such as lifelines (infrastructure), but may be important for large communities. Vulnerability evaluation includes quantification of fragility for construction or components including personnel. Exposure represents the existing or planned construction, facilities, infrastructure, and population in the affected area. Risk (expected loss) is the product of the quantified hazard, vulnerability (damage algorithm), and exposure which may be used to prepare emergency response plans, retrofit existing construction, or use community planning to avoid hazards. The risk estimate provides data needed to acquire earthquake insurance to assist with effective recovery following a severe event. Earthquake Scenarios used in Deterministic Risk Assessments provide detailed information on where hazards may be most severe, what system components are most susceptible to failure, and to evaluate the combined effects of a severe earthquake to the whole system or community. Casualties (injuries and death) have been the primary factor in defining building codes for seismic-resistant construction. Economic losses may be equally significant factors that can influence proactive hazard mitigation. Large urban earthquakes may produce catastrophic losses due to a cascading of effects often missed in PSHA. Economic collapse may ensue if damaged workplaces, disruption of utilities, and
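
    The risk equation stated here (expected loss as the product of quantified hazard, vulnerability, and exposure) can be made concrete with a toy calculation; all numbers below are hypothetical, for illustration only.

    ```python
    # Toy illustration of: expected loss = hazard x vulnerability x exposure.
    hazard = 0.10         # annual probability of damaging ground shaking, assumed
    vulnerability = 0.25  # expected damage ratio given that shaking occurs, assumed
    exposure = 4.0e9      # replacement value of the building stock (USD), assumed

    expected_annual_loss = hazard * vulnerability * exposure
    print(f"expected annual loss: ${expected_annual_loss:,.0f}")  # $100,000,000
    ```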

  9. Assessing movement control in children with mental retardation: a generalizability analysis of observers.

    Science.gov (United States)

    Ulrich, D A; Riggen, K J; Ozmun, J C; Screws, D P; Cleland, F E

    1989-09-01

    A generalizability study was conducted to determine the percentage of variance associated with observers and trials when assessing movement control in children with mild mental retardation. One group of observers received competency-based training and another group experienced informal training. A series of decision studies employing the variance component estimates indicated that different conditions of observation need to be employed based on the type of training received in movement control analysis. Observers receiving informal training needed to observe twice as many trials of the kick, jump, and overhand throw compared to competency-trained observers to reach an acceptable level of reliability.
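
    The decision-study logic described here, in which the number of trials required depends on the estimated variance components, can be sketched with the standard generalizability coefficient; the variance components below are invented, not the study's estimates.

    ```python
    # Decision-study sketch: the generalizability coefficient for a mean over
    # n observed trials is E(rho^2) = var_person / (var_person + var_error / n).
    var_person = 0.60   # true between-child variance, hypothetical
    var_error = 1.20    # trial-by-observer residual variance, hypothetical

    def g_coefficient(n_trials: int) -> float:
        return var_person / (var_person + var_error / n_trials)

    # How many observed trials until reliability reaches 0.80?
    for n in range(1, 13):
        print(n, round(g_coefficient(n), 3))   # reaches 0.80 at n = 8 here
    ```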

  10. Automatic mechanical fault assessment of small wind energy systems in microgrids using electric signature analysis

    DEFF Research Database (Denmark)

    Skrimpas, Georgios Alexandros; Marhadi, Kun Saptohartyadi; Jensen, Bogi Bech

    2013-01-01

    A microgrid is a cluster of power generation, consumption and storage systems capable of operating either independently or as part of a macrogrid. The mechanical condition of the power production units, such as the small wind turbines, is considered of crucial importance especially in the case of islanded operation. In this paper, the fault assessment is achieved efficiently and consistently via electric signature analysis (ESA). In ESA the fault related frequency components are manifested as sidebands of the existing current and voltage time harmonics. The energy content between the fundamental, 5… …element model where dynamic eccentricity and bearing outer race defect are simulated under varying fault severity and electric loading conditions.
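
    A simplified illustration of the ESA idea described above: a mechanical fault amplitude-modulates the stator current, so fault energy appears as sidebands around the supply fundamental. The signal parameters and the band-energy measure below are assumptions, not the paper's exact criterion.

    ```python
    import numpy as np

    fs, f_supply, f_fault = 5000.0, 50.0, 7.0   # Hz; all illustrative
    t = np.arange(0, 2.0, 1 / fs)
    # Amplitude modulation of the stator current by a mechanical fault
    current = (1 + 0.05 * np.sin(2 * np.pi * f_fault * t)) * np.sin(2 * np.pi * f_supply * t)

    spectrum = np.abs(np.fft.rfft(current)) / t.size
    freqs = np.fft.rfftfreq(t.size, 1 / fs)

    def band_energy(f0: float, half_width: float = 1.0) -> float:
        # Sum spectral energy within +/- half_width Hz of f0
        mask = np.abs(freqs - f0) <= half_width
        return float(np.sum(spectrum[mask] ** 2))

    # Fault sidebands appear at f_supply +/- f_fault
    for f0 in (f_supply - f_fault, f_supply, f_supply + f_fault):
        print(f"{f0:5.1f} Hz band energy: {band_energy(f0):.3e}")
    ```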

  11. Guidebook in using Cost Benefit Analysis and strategic environmental assessment for environmental planning in China

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2011-07-01

    Environmental planning in China may benefit from greater use of Cost Benefit Analysis (CBA) and Strategic Environmental Assessment (SEA) methodologies. We provide guidance on using these methodologies. Part I and II show the principles behind the methodologies as well as their theoretical structure. Part III demonstrates the methodologies in action in a range of different good practice examples. The case studies and theoretical expositions are intended to teach by way of example as well as by understanding the principles, and to help planners use the methodologies as correctly as possible.

  12. [Assessment of ultraviolet radiation penetration into human skin. I. Theoretical analysis].

    Science.gov (United States)

    Cader, A; Jankowski, J

    1995-01-01

    This is one of two articles under the same title, "Assessment of ultraviolet radiation penetrating into human skin", which together present part of broader studies in this area. They aim at identifying the biophysical aspects of the effects of ultraviolet radiation on human skin. In order to characterise such parameters as UV reflectance from the skin surface and the UV absorption and dispersion coefficients, it is necessary to develop appropriate methods. In Part I, "Theoretical analysis", theoretical principles for interpreting measurements of radiation dispersed in different geometrical configurations are presented. They can serve as a basis for estimating the values of UV linear absorption and dispersion coefficients in skin tissues.

  13. Using computerized text analysis to assess communication within an Italian type 1 diabetes Facebook group

    Directory of Open Access Journals (Sweden)

    Alda Troncone

    2015-11-01

    The purpose of this study was to assess messages posted by mothers of children with type 1 diabetes in the Italian Facebook group “Mamme e diabete” using computerized text analysis. The data suggest that these mothers use online discussion boards as a place to seek and provide information to better manage the disease’s daily demands—especially those tasks linked to insulin correction and administration, control of food intake, and bureaucratic duties, as well as to seek and give encouragement and to share experiences regarding diabetes and related impact on their life. The implications of these findings for the management of diabetes are discussed.

  14. Stock assessment of Haliporoides triarthrus (Fam. Solenoceridae) off Mozambique: a preliminary analysis

    OpenAIRE

    Torstensen, E.; Pacule, H.

    1992-01-01

    The pink shrimp, Haliporoides triarthrus, is an important species in the deep-water shrimp fishery in Mozambique. Total catches are in the range of 1,500 to 2,700 tons, with the pink shrimp accounting for 70-90%. Estimates of growth parameters and of natural mortality are used for a preliminary assessment of the fishery, based on length-structured virtual population analysis and yield-per-recruit analyses. With an arbitrarily chosen terminal fishing mortality F, the results indicate a situati...

  15. Probability analysis of geological processes: a useful tool for the safety assessment of radioactive waste disposal

    Energy Technology Data Exchange (ETDEWEB)

    D'Alessandro, M.; Murray, C.N.; Bertozzi, G.; Girardi, F.

    1980-05-01

    In the development of methods for the assessment of the risk associated with the disposal of radioactive wastes over periods up to 10^6 years, much discussion has occurred on the use of probability analysis for geological processes. The applicability and limitations of this concept are related to the proper use of the geological data-base and the critical interpretation of probability distributions. The interpretation of geological phenomena in terms of probability is discussed and an example of application to the determination of faulting probability is illustrated. The method has been used for the determination of failure probability of geological segregation of a waste repository in a clay formation.
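
    As one concrete reading of faulting probability, a Poisson model of fault movements is a common simplification (not necessarily the authors' exact formulation); with a rate estimated from the geological record, the probability of at least one event over the assessment horizon follows directly. The rate below is hypothetical.

    ```python
    import math

    lam = 1.0e-7        # assumed fault-movement rate per year, illustrative
    for T in (1e3, 1e4, 1e5, 1e6):
        # Poisson process: P(>= 1 event in T years) = 1 - exp(-lam * T)
        p = 1 - math.exp(-lam * T)
        print(f"T = 10^{int(math.log10(T))} y: P(>=1 event) = {p:.4f}")
    ```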

  16. Using computerized text analysis to assess communication within an Italian type 1 diabetes Facebook group.

    Science.gov (United States)

    Troncone, Alda; Cascella, Crescenzo; Chianese, Antonietta; Iafusco, Dario

    2015-07-01

    The purpose of this study was to assess messages posted by mothers of children with type 1 diabetes in the Italian Facebook group "Mamme e diabete" using computerized text analysis. The data suggest that these mothers use online discussion boards as a place to seek and provide information to better manage the disease's daily demands-especially those tasks linked to insulin correction and administration, control of food intake, and bureaucratic duties, as well as to seek and give encouragement and to share experiences regarding diabetes and related impact on their life. The implications of these findings for the management of diabetes are discussed.

  17. Semantic Pattern Analysis for Verbal Fluency Based Assessment of Neurological Disorders

    Energy Technology Data Exchange (ETDEWEB)

    Sukumar, Sreenivas R [ORNL]; Ainsworth, Keela C [ORNL]; Brown, Tyler C [ORNL]

    2014-01-01

    In this paper, we present preliminary results of semantic pattern analysis of verbal fluency tests used for assessing cognitive psychological and neuropsychological disorders. We posit that recent advances in semantic reasoning and artificial intelligence can be combined to create a standardized computer-aided diagnosis tool to automatically evaluate and interpret verbal fluency tests. Towards that goal, we derive novel semantic similarity (phonetic, phonemic and conceptual) metrics and present the predictive capability of these metrics on a de-identified dataset of participants with and without neurological disorders.

  18. A sensitivity analysis of a radiological assessment model for Arctic waters

    DEFF Research Database (Denmark)

    Nielsen, S.P.

    1998-01-01

    A model based on compartment analysis has been developed to simulate the dispersion of radionuclides in Arctic waters for an assessment of doses to man. The model predicts concentrations of radionuclides in the marine environment and doses to man from a range of exposure pathways. A parameter sensitivity … scavenging, water-sediment interaction, biological uptake, ice transport and fish migration. Two independent evaluations of the release of radioactivity from dumped nuclear waste in the Kara Sea have been used as source terms for the dose calculations.
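
    Compartment analysis of the kind described reduces to a system of first-order ODEs for the radionuclide inventory in each box. Below is a two-box sketch with hypothetical exchange rates (the actual model has many more compartments and processes):

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    k12, k21 = 0.5, 0.05    # water-exchange rates between boxes (1/y), assumed
    lam = np.log(2) / 30.0  # decay constant for a ~30-y half-life nuclide (1/y)

    def rhs(t, c):
        # First-order transfer between boxes plus radioactive decay
        c1, c2 = c
        return [-(k12 + lam) * c1 + k21 * c2,
                k12 * c1 - (k21 + lam) * c2]

    sol = solve_ivp(rhs, (0, 50), [1.0, 0.0], t_eval=np.linspace(0, 50, 6))
    for t, c1, c2 in zip(sol.t, *sol.y):
        print(f"t={t:4.0f} y  source box={c1:.3f}  adjacent box={c2:.3f}")
    ```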

  19. [Graphical procedures for assessing person-fit in item factor analysis].

    Science.gov (United States)

    Ferrando Piera, Pere Joan; Morales Vives, Fàbia

    2010-05-01

    Flagging the individuals who did not answer consistently can be very useful in certain applied domains, especially in clinical and personnel selection areas. Identification of inconsistent patterns prevents erroneous interpretations of test scores. Two graphic procedures based on linear factor analysis are proposed in this paper. They allow the possible causes of low intra-individual consistency to be assessed once a pattern has been flagged as inconsistent. Moreover, these procedures allow us to identify the items that have contributed the most to the inconsistency. The procedures are illustrated with some empirical examples in personality. Lastly, implications of the results in the construction of personality measures are discussed.

  20. Body electrical loss analysis (BELA) in the assessment of visceral fat: a demonstration

    OpenAIRE

    Blomqvist Kim H; Lundbom Jesper; Lundbom Nina; Sepponen Raimo E

    2011-01-01

    Background: Body electrical loss analysis (BELA) is a new non-invasive way to assess visceral fat depot size through the use of electromagnetism. BELA has worked well in phantom measurements, but the technology is not yet fully validated. Methods: Ten volunteers (5 men and 5 women, age: 22-60 y, BMI: 21-30 kg/m2, waist circumference: 73-108 cm) were measured with the BELA instrument and with cross-sectional magnetic resonance imaging (MRI) at the navel level, navel +5 cm and navel -5 c...

  1. Alignment Content Analysis of NAEP 2009 Reading Assessment Analysis Based on Method of Surveys of Enacted Curriculum

    Science.gov (United States)

    Blank, Rolf K.; Smithson, John

    2010-01-01

    Beginning in summer 2009, the complete set of NAEP student assessment items for the grades 4 and 8 Science and Reading 2009 assessments was analyzed for comparison to the National Assessment of Educational Progress (NAEP) Item Specifications, which are based on the NAEP Assessment Frameworks for these subjects (National Assessment Governing Board,…

  2. Development of a Standard Protocol for the Harmonic Analysis of Radial Pulse Wave and Assessing Its Reliability in Healthy Humans

    OpenAIRE

    2015-01-01

    This study was aimed to establish a standard protocol and to quantitatively assess the reliability of harmonic analysis of the radial pulse wave measured by a harmonic wave analyzer (TD01C system). Both intraobserver and interobserver assessments were conducted to investigate whether the values of harmonics are stable in successive measurements. An intraclass correlation coefficient (ICC) and a Bland–Altman plot were used for this purpose. For the reliability assessments of the intraobserver ...
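
    For the intraclass correlation coefficient mentioned in this record, here is a self-contained ICC(2,1) computation (two-way random effects, absolute agreement, single measurement; a common choice for observer reliability, though the record does not state which variant was used) on an invented ratings matrix:

    ```python
    import numpy as np

    x = np.array([[9.1, 9.3, 9.0],    # rows: subjects, cols: repeated measures
                  [7.2, 7.0, 7.4],
                  [5.9, 6.2, 6.1],
                  [8.4, 8.1, 8.3],
                  [6.8, 6.9, 6.5]])   # invented harmonic-amplitude ratings
    n, k = x.shape
    grand = x.mean()
    ms_rows = k * np.sum((x.mean(axis=1) - grand) ** 2) / (n - 1)
    ms_cols = n * np.sum((x.mean(axis=0) - grand) ** 2) / (k - 1)
    ss_err = np.sum((x - x.mean(axis=1, keepdims=True)
                       - x.mean(axis=0, keepdims=True) + grand) ** 2)
    ms_err = ss_err / ((n - 1) * (k - 1))
    # Shrout & Fleiss ICC(2,1)
    icc_2_1 = (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err
                                    + k * (ms_cols - ms_err) / n)
    print(f"ICC(2,1) = {icc_2_1:.3f}")
    ```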

  3. On sustainability assessment of technical systems. Experience from systems analysis with the ORWARE and Ecoeffect tools

    Energy Technology Data Exchange (ETDEWEB)

    Assefa, Getachew [Royal Inst. of Technology, Stockholm (Sweden). Dept. of Chemical Engineering]

    2006-06-15

    Engineering research and development work is undergoing a reorientation from focusing on specific parts of different systems to a broader perspective of systems level, albeit at a slower pace. This reorientation should be further developed and enhanced with the aim of organizing and structuring our technical systems in meeting sustainability requirements in the face of global ecological threats that have far-reaching social and economic implications, which can no longer be captured using a conventional approach to research. Until a list of universally acceptable, clear, and measurable indicators of sustainable development is developed, the work with sustainability metrics should continue to evolve as a relative measure of ecological, economic, and social performance of human activities in general, and technical systems in particular. This work can be done by comparing the relative performance of alternative technologies of providing the same well-defined function or service; or by characterizing technologies that enjoy different levels of societal priorities using relevant performance indicators. In both cases, concepts and methods of industrial ecology play a vital role. This thesis is about the development and application of a systematic approach for the assessment of the performance of technical systems from the perspective of systems analysis, sustainability, sustainability assessment, and industrial ecology. The systematic approach developed and characterized in this thesis advocates for a simultaneous assessment of the ecological, economic, and social dimensions of performance of technologies in avoiding sub-optimization and problem shifting between dimensions. It gives a holistic picture by taking a life cycle perspective of all important aspects. The systematic assessment of technical systems provides an even-handed assessment resulting in a cumulative knowledge. A modular structure of the approach makes it flexible enough in terms of comparing a number of

  4. Application of finite element analysis for assessing biocompatibility of intra-arterial catheters and probes.

    Science.gov (United States)

    Bedingham, W; Neavin, T D

    1991-01-01

    A commercial finite element modeling program (FIDAP) was adapted to compute the fluid dynamics of laminar blood flow around an intra-arterial catheter and/or sensor probe. The model provided an accurate transient solution to the Navier-Stokes equations under pulsatile blood flow conditions. To simulate the compliance in the catheter tubing set, a second order convolution integral was incorporated into the boundary conditions. The saline drip rate and catheter compliance could be specified, and the bulk blood flow, blood pressure, and heart rate were varied to simulate specific patient conditions. Analysis of the transient solution was used to assess probable sites for thrombus activation and deposition. The transient velocity and pressure fields identified regions of separated flow and recirculation. The computed shear rates and stresses were used to predict hemolysis, platelet activation, and thrombus formation. Analysis of particle paths provided an estimate of residence times and thrombus deposition sites.

  5. Method of assessing a lipid-related health risk based on ion mobility analysis of lipoproteins

    Science.gov (United States)

    Benner, W. Henry; Krauss, Ronald M.; Blanche, Patricia J.

    2010-12-14

    A medical diagnostic method and instrumentation system for analyzing noncovalently bonded agglomerated biological particles is described. The method and system comprises: a method of preparation for the biological particles; an electrospray generator; an alpha particle radiation source; a differential mobility analyzer; a particle counter; and data acquisition and analysis means. The medical device is useful for the assessment of human diseases, such as cardiac disease risk and hyperlipidemia, by rapid quantitative analysis of lipoprotein fraction densities. Initially, purification procedures are described to reduce an initial blood sample to an analytical input to the instrument. The measured sizes from the analytical sample are correlated with densities, resulting in a spectrum of lipoprotein densities. The lipoprotein density distribution can then be used to characterize cardiac and other lipid-related health risks.

  6. Human Error Assessment and Reduction Technique (HEART) and Human Factor Analysis and Classification System (HFACS)

    Science.gov (United States)

    Alexander, Tiffaney Miller

    2017-01-01

    Research results have shown that more than half of aviation, aerospace and aeronautics mishap incidents are attributed to human error. As a part of Quality within space exploration ground processing operations, the underlying contributors and causes of human error must be identified and classified in order to manage human error. This presentation will provide a framework and methodology using the Human Error Assessment and Reduction Technique (HEART) and Human Factor Analysis and Classification System (HFACS) as an analysis tool to identify contributing factors, their impact on human error events, and to predict the Human Error Probabilities (HEPs) of future occurrences. This research methodology was applied (retrospectively) to six (6) NASA ground processing operations scenarios and thirty (30) years of Launch Vehicle related mishap data. This modifiable framework can be used and followed by other space and similar complex operations.
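
    The HEART quantification step can be sketched as follows: a generic task's nominal human error probability is scaled by each applicable error-producing condition (EPC), weighted by its assessed proportion of affect. The nominal value and EPC choices below are illustrative, not NASA's assessed values.

    ```python
    # HEART: HEP = nominal * product over EPCs of ((multiplier - 1) * proportion + 1)
    nominal_hep = 0.003           # e.g., a routine, well-practised task; assumed
    epcs = [                      # (EPC multiplier, assessed proportion of affect)
        (11.0, 0.4),              # e.g., shortage of time; illustrative
        (3.0, 0.2),               # e.g., poor system feedback; illustrative
    ]

    hep = nominal_hep
    for multiplier, proportion in epcs:
        hep *= (multiplier - 1.0) * proportion + 1.0
    print(f"assessed HEP = {hep:.4f}")   # 0.003 * 5.0 * 1.4 = 0.021
    ```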

  7. Recent developments in imaging system assessment methodology, FROC analysis and the search model.

    Science.gov (United States)

    Chakraborty, Dev P

    2011-08-21

    A frequent problem in imaging is assessing whether a new imaging system is an improvement over an existing standard. Observer performance methods, in particular the receiver operating characteristic (ROC) paradigm, are widely used in this context. In ROC analysis lesion location information is not used and consequently scoring ambiguities can arise in tasks, such as nodule detection, involving finding localized lesions. This paper reviews progress in the free-response ROC (FROC) paradigm in which the observer marks and rates suspicious regions and the location information is used to determine whether lesions were correctly localized. Reviewed are FROC data analysis, a search-model for simulating FROC data, predictions of the model and a method for estimating the parameters. The search model parameters are physically meaningful quantities that can guide system optimization.

  8. The ICR142 NGS validation series: a resource for orthogonal assessment of NGS analysis.

    Science.gov (United States)

    Ruark, Elise; Renwick, Anthony; Clarke, Matthew; Snape, Katie; Ramsay, Emma; Elliott, Anna; Hanks, Sandra; Strydom, Ann; Seal, Sheila; Rahman, Nazneen

    2016-01-01

    To provide a useful community resource for orthogonal assessment of NGS analysis software, we present the ICR142 NGS validation series. The dataset includes high-quality exome sequence data from 142 samples together with Sanger sequence data at 730 sites; 409 sites with variants and 321 sites at which variants were called by an NGS analysis tool, but no variant is present in the corresponding Sanger sequence. The dataset includes 286 indel variants and 275 negative indel sites, and thus the ICR142 validation dataset is of particular utility in evaluating indel calling performance. The FASTQ files and Sanger sequence results can be accessed in the European Genome-phenome Archive under the accession number EGAS00001001332.

  9. Assessment of gait symmetry for Talus Valgus children based on experimental kinematic analysis

    Science.gov (United States)

    Toth-Tascau, Mirela; Pasca, Oana; Vigaru, Cosmina; Rusu, Lucian

    2013-10-01

    The general purpose of this study was to assess gait symmetry in Talus Valgus deformity based on experimental kinematic analysis. As this foot condition generally occurs in children, the study focused on two five-year-old children: one healthy, serving as the control subject, and one with bilateral Talus Valgus deformity. The kinematic experimental analysis was conducted using a Zebris CMS-HS Measuring System. Bilateral symmetry was analyzed using two methods: an index of symmetry (SI) calculated for spatio-temporal parameters (stance phase, swing phase, and step length) and a kinematic parameter (the maximum value of the dorsiflexion - plantar flexion angle in the ankle joint), and an unpaired t-test comparing the mean values of the dorsiflexion - plantar flexion angle in the ankle joint between the left and right sides. The study evidenced good bilateral symmetry in the case of the control subject and quantified the asymmetry in the case of the subject with Talus Valgus deformity.
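
    One common formulation of the symmetry index used for such spatio-temporal parameters (the paper may define SI slightly differently) is the left-right difference normalized by the mean; the step lengths below are invented.

    ```python
    def symmetry_index(right: float, left: float) -> float:
        # SI = 100 * (R - L) / (0.5 * (R + L)); values near 0% indicate symmetry
        return 100.0 * (right - left) / (0.5 * (right + left))

    step_right, step_left = 0.42, 0.38   # step length (m), hypothetical
    print(f"SI = {symmetry_index(step_right, step_left):.1f}%")  # 10.0%
    ```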

  10. Combining a building simulation with energy systems analysis to assess the benefits of natural ventilation

    DEFF Research Database (Denmark)

    Oropeza-Perez, Ivan; Østergaard, Poul Alberg; Remmen, Arne

    2013-01-01

    This article shows the combination of a thermal air flow simulation program with an energy systems analysis model in order to assess the use of natural ventilation as a method for saving energy within residential buildings in large-scale scenarios. The aim is to show the benefits of utilizing natural airflow instead of active systems such as mechanical ventilation or air-conditioning in buildings where the indoor temperature is over the upper limit of the comfort range. The combination is done by introducing the energy saving output - calculated with a model of natural ventilation - into the energy systems analysis. Results show that for an energy system such as the Mexican, with a relatively simple connection between supply and demand of electricity, natural ventilation mainly creates savings, whereas in the Danish system, the system operation is also affected by energy savings through natural ventilation.

  11. Neo-Deterministic and Probabilistic Seismic Hazard Assessments: a Comparative Analysis

    Science.gov (United States)

    Peresan, Antonella; Magrin, Andrea; Nekrasova, Anastasia; Kossobokov, Vladimir; Panza, Giuliano F.

    2016-04-01

    Objective testing is the key issue towards any reliable seismic hazard assessment (SHA). Different earthquake hazard maps must demonstrate their capability in anticipating ground shaking from future strong earthquakes before an appropriate use for different purposes - such as engineering design, insurance, and emergency management. Quantitative assessment of maps performances is an essential step also in scientific process of their revision and possible improvement. Cross-checking of probabilistic models with available observations and independent physics based models is recognized as major validation procedure. The existing maps from the classical probabilistic seismic hazard analysis (PSHA), as well as those from the neo-deterministic analysis (NDSHA), which have been already developed for several regions worldwide (including Italy, India and North Africa), are considered to exemplify the possibilities of the cross-comparative analysis in spotting out limits and advantages of different methods. Where the data permit, a comparative analysis versus the documented seismic activity observed in reality is carried out, showing how available observations about past earthquakes can contribute to assess performances of the different methods. Neo-deterministic refers to a scenario-based approach, which allows for consideration of a wide range of possible earthquake sources as the starting point for scenarios constructed via full waveforms modeling. The method does not make use of empirical attenuation models (i.e. Ground Motion Prediction Equations, GMPE) and naturally supplies realistic time series of ground shaking (i.e. complete synthetic seismograms), readily applicable to complete engineering analysis and other mitigation actions. The standard NDSHA maps provide reliable envelope estimates of maximum seismic ground motion from a wide set of possible scenario earthquakes, including the largest deterministically or historically defined credible earthquake. In addition

  12. Confirmatory factor analysis of a Spanish version of the sex fantasy questionnaire: assessing gender differences.

    Science.gov (United States)

    Sierra, Juan Carlos; Ortega, Virgilio; Zubeidat, Ihab

    2006-01-01

    The objective of this study was to validate the factor structure of Wilson's Sex Fantasy Questionnaire (SFQ; Wilson, 1978; Wilson & Lang, 1981) using a Spanish version. In order to do this, we conducted confirmatory factor analysis on two nonclinical samples containing 195 men and 315 women. Both groups were tested for the structure proposed by Wilson and also for some alternative models. Confirmatory factor analysis showed that four factors were reasonably distinct, especially for the men. We proposed a shortened version of the instrument that would have sufficient psychometric guarantees for assessing sexual fantasies in both genders. This abridged version improved the fit of the four-factor oblique model equally for both the samples of men and women. In light of the results of the validation hypothesis established with some criterion variables (dyadic sexual desire, unconventional sex, homophobia), we discuss discrepancies between both versions.

  13. Assessment of computational issues associated with analysis of high-lift systems

    Science.gov (United States)

    Balasubramanian, R.; Jones, Kenneth M.; Waggoner, Edgar G.

    1992-01-01

    Thin-layer Navier-Stokes calculations for wing-fuselage configurations from subsonic to hypersonic flow regimes are now possible. However, efficient, accurate solutions for using these codes for two- and three-dimensional high-lift systems have yet to be realized. A brief overview of salient experimental and computational research is presented. An assessment of the state-of-the-art relative to high-lift system analysis and identification of issues related to grid generation and flow physics which are crucial for computational success in this area are also provided. Research in support of the high-lift elements of NASA's High Speed Research and Advanced Subsonic Transport Programs which addresses some of the computational issues is presented. Finally, fruitful areas of concentrated research are identified to accelerate overall progress for high lift system analysis and design.

  14. Evolution and Implementation of the NASA Robotic Conjunction Assessment Risk Analysis Concept of Operations

    Science.gov (United States)

    Newman, Lauri K.; Frigm, Ryan C.; Duncan, Matthew G.; Hejduk, Matthew D.

    2014-01-01

    Reacting to potential on-orbit collision risk in an operational environment requires timely and accurate communication and exchange of data, information, and analysis to ensure informed decision-making for safety of flight and responsible use of the shared space environment. To accomplish this mission, it is imperative that all stakeholders effectively manage resources: devoting necessary and potentially intensive resource commitment to responding to high-risk conjunction events and preventing unnecessary expenditure of resources on events of low collision risk. After 10 years of operational experience, the NASA Robotic Conjunction Assessment Risk Analysis (CARA) is modifying its Concept of Operations (CONOPS) to ensure this alignment of collision risk and resource management. This evolution manifests itself in the approach to characterizing, reporting, and refining of collision risk. Implementation of this updated CONOPS is expected to have a demonstrated improvement on the efficacy of JSpOC, CARA, and owner/operator resources.

  15. Hydrodynamic analysis, performance assessment, and actuator design of a flexible tail propulsor in an artificial alligator

    Science.gov (United States)

    Philen, Michael; Neu, Wayne

    2011-09-01

    The overall objective of this research is to develop analysis tools for determining actuator requirements and assessing viable actuator technology for the design of a flexible tail propulsor in an artificial alligator. A simple hydrodynamic model that includes both reactive and resistive forces along the tail is proposed, and the calculated mean thrust agrees well with conventional estimates of drag. Using the hydrodynamic model forces as an input, studies are performed for an alligator ranging in size from 1 cm to 2 m at swimming speeds of 0.3-1.8 body lengths per second, with five antagonistic pairs of actuators distributed along the length of the tail. Several smart materials are considered for the actuation system, and preliminary analysis results indicate that the acrylic electroactive polymer and the flexible matrix composite actuators are potential artificial muscle technologies for the system.

  16. The Assessment of a Tutoring Program to Meet CAS Standards Using a SWOT Analysis and Action Plan

    Science.gov (United States)

    Fullmer, Patricia

    2009-01-01

    This article summarizes the use of SWOT (Strengths, Weaknesses, Opportunities, and Threats) analysis and subsequent action planning as a tool of self-assessment to meet CAS (Council for the Advancement of Standards in Higher Education) requirements for systematic assessment. The use of the evaluation results to devise improvements to increase the…

  17. Assessing hippocampal functional reserve in temporal lobe epilepsy: A multi-voxel pattern analysis of fMRI data

    OpenAIRE

    Bonnici, Heidi M; Sidhu, Meneka; Chadwick, Martin J.; Duncan, John S.; Maguire, Eleanor A.

    2013-01-01

    Assessing the functional reserve of key memory structures in the medial temporal lobes (MTL) of pre-surgical patients with intractable temporal lobe epilepsy (TLE) remains a challenge. Conventional functional MRI (fMRI) memory paradigms have yet to fully convince of their ability to confidently assess the risk of a post-surgical amnesia. An alternative fMRI analysis method, multi-voxel pattern analysis (MVPA), focuses on the patterns of activity across voxels in specific brain regions...

  18. Uncertainty analysis based on probability bounds (p-box) approach in probabilistic safety assessment.

    Science.gov (United States)

    Karanki, Durga Rao; Kushwaha, Hari Shankar; Verma, Ajit Kumar; Ajit, Srividya

    2009-05-01

    A wide range of uncertainties will be introduced inevitably during the process of performing a safety assessment of engineering systems. The impact of all these uncertainties must be addressed if the analysis is to serve as a tool in the decision-making process. Uncertainties present in the components (input parameters of model or basic events) of model output are propagated to quantify its impact in the final results. There are several methods available in the literature, namely, method of moments, discrete probability analysis, Monte Carlo simulation, fuzzy arithmetic, and Dempster-Shafer theory. All the methods are different in terms of characterizing at the component level and also in propagating to the system level. All these methods have different desirable and undesirable features, making them more or less useful in different situations. In the probabilistic framework, which is most widely used, probability distribution is used to characterize uncertainty. However, in situations in which one cannot specify (1) parameter values for input distributions, (2) precise probability distributions (shape), and (3) dependencies between input parameters, these methods have limitations and are found to be not effective. In order to address some of these limitations, the article presents uncertainty analysis in the context of level-1 probabilistic safety assessment (PSA) based on a probability bounds (PB) approach. PB analysis combines probability theory and interval arithmetic to produce probability boxes (p-boxes), structures that allow the comprehensive propagation through calculation in a rigorous way. A practical case study is also carried out with the developed code based on the PB approach and compared with the two-phase Monte Carlo simulation results.
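
    As a rough illustration of the p-box idea, and not the article's rigorous interval-arithmetic propagation, a two-loop Monte Carlo over an imprecisely known parameter produces a band of output CDFs whose envelope plays the role of a p-box. All numbers and the toy model (output = 1.5 x input) are invented.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    mean_interval = (2.0, 3.0)      # imprecise mean of an exponential input
    n_outer, n_inner = 50, 2000
    x_grid = np.linspace(0, 20, 200)

    cdfs = []
    for mu in np.linspace(*mean_interval, n_outer):   # outer loop: epistemic
        samples = rng.exponential(mu, n_inner) * 1.5  # inner loop: aleatory
        cdfs.append(np.mean(samples[:, None] <= x_grid, axis=0))

    lower_cdf = np.min(cdfs, axis=0)   # the p-box bounds on the output CDF
    upper_cdf = np.max(cdfs, axis=0)
    i = int(np.searchsorted(x_grid, 5.0))
    print(f"P(output <= 5) lies in [{lower_cdf[i]:.2f}, {upper_cdf[i]:.2f}]")
    ```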

  19. Review and analysis of parameters for assessing transport of environmentally released radionuclides through agriculture

    Energy Technology Data Exchange (ETDEWEB)

    Baes, C.F. III; Sharp, R.D.; Sjoreen, A.L.; Shor, R.W.

    1984-09-01

    Most of the default parameters incorporated into the TERRA computer code are documented, including a literature review and systematic analysis of the element-specific transfer parameters B_v, B_r, F_m, F_f, and K_d. This review and analysis suggest default values which are consistent with the modeling approaches taken in TERRA and may be acceptable for most assessment applications of the computer code. However, particular applications of the code and additional analysis of elemental transport may require alternative default values. Use of the values reported herein in other computer codes simulating terrestrial transport is not advised without careful interpretation of the limitations and scope of these analyses. An approach to the determination of vegetation-specific interception fractions is also discussed. The limitations of this approach are many, and its use indicates the need for analysis of deposition, interception, and weathering processes. Judgement must be exercised in the interpretation of the plant surface concentrations generated. Finally, the location-specific agricultural, climatological, and population parameters in the default SITE data base are documented. These parameters are intended as alternatives to the average values currently used. Indeed, areas in the United States where intensive crop, milk, or beef production occurs will be reflected in the parameter values, as will areas where little agricultural activity occurs. However, the original information sources contained some small errors, and the interpolation and conversion methods used will add more. Parameters used in TERRA not discussed herein are discussed in the companion report, ORNL-5785, where the models employed in TERRA and its coding are described. These reports together provide documentation of the TERRA code and its use in assessments. 96 references, 78 figures, 21 tables.

  20. Integration of Gis-analysis and Atmospheric Modelling For Nuclear Risk and Vulnerability Assessment

    Science.gov (United States)

    Rigina, O.; Baklanov, A.; Mahura, A.

    The paper is devoted to the problems of residential radiation risk and territorial vulnerability with respect to nuclear sites in Europe. The study suggests two approaches, based on an integration of GIS analysis and atmospheric modelling, to calculate radiation risk/vulnerability. First, modelling simulations were done for a number of case studies, based on real data, such as reactor core inventory and estimations from known accidents, for a number of typical meteorological conditions and different accident scenarios. Then, using these simulations and the population database as input data, the GIS analysis reveals the administrative units at highest risk with respect to the mean individual and collective doses received by the population. Then, two alternative methods were suggested to assess the probabilistic risk to the population in case of a severe accident at the Kola and Leningrad NPPs (as examples) based on social-geophysical factors: proximity to the accident site, population density and presence of critical groups, and the probabilities of wind trajectories and precipitation. The two latter probabilities were calculated by the atmospheric trajectory models and statistical methods for many years of data. The GIS analysis was done for the Nordic countries as an example. GIS-based spatial analyses integrated with mathematical modelling allow a common methodological approach to be developed for the complex assessment of regional vulnerability and residential radiation risk, by merging together the separate aspects: modelling of consequences, probabilistic analysis of atmospheric flows, dose estimation, etc. The approach was capable of creating risk/vulnerability maps of the Nordic countries and revealing the most vulnerable provinces with respect to the radiation risk sites.

  1. The Strategic Environment Assessment bibliographic network: A quantitative literature review analysis

    Energy Technology Data Exchange (ETDEWEB)

    Caschili, Simone, E-mail: s.caschili@ucl.ac.uk [UCL QASER Lab, University College London, Gower Street, London WC1E 6BT (United Kingdom); De Montis, Andrea; Ganciu, Amedeo; Ledda, Antonio; Barra, Mario [Dipartimento di Agraria, University of Sassari, viale Italia, 39, 07100 Sassari (Italy)

    2014-07-01

    Academic literature has been continuously growing at such a pace that it can be difficult to follow the progression of scientific achievements; hence, the need to dispose of quantitative knowledge support systems to analyze the literature of a subject. In this article we utilize network analysis tools to build a literature review of scientific documents published in the multidisciplinary field of Strategic Environment Assessment (SEA). The proposed approach helps researchers to build unbiased and comprehensive literature reviews. We collect information on 7662 SEA publications and build the SEA Bibliographic Network (SEABN) employing the basic idea that two publications are interconnected if one cites the other. We apply network analysis at macroscopic (network architecture), mesoscopic (sub graph) and microscopic levels (node) in order to i) verify what network structure characterizes the SEA literature, ii) identify the authors, disciplines and journals that are contributing to the international discussion on SEA, and iii) scrutinize the most cited and important publications in the field. Results show that the SEA is a multidisciplinary subject; the SEABN belongs to the class of real small world networks with a dominance of publications in Environmental studies over a total of 12 scientific sectors. Christopher Wood, Olivia Bina, Matthew Cashmore, and Andrew Jordan are found to be the leading authors while Environmental Impact Assessment Review is by far the scientific journal with the highest number of publications in SEA studies. - Highlights: • We utilize network analysis to analyze scientific documents in the SEA field. • We build the SEA Bibliographic Network (SEABN) of 7662 publications. • We apply network analysis at macroscopic, mesoscopic and microscopic network levels. • We identify SEABN architecture, relevant publications, authors, subjects and journals.
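
    The construction rule stated above (two publications are interconnected if one cites the other) maps directly onto a directed graph. Below is a minimal sketch with invented records, using networkx; the in-degree view corresponds to the microscopic (node-level) analysis and the density/connectivity checks to the macroscopic level.

    ```python
    import networkx as nx

    # Invented placeholder records: (citing paper, cited paper)
    citations = [("paper_A", "paper_C"), ("paper_B", "paper_C"),
                 ("paper_B", "paper_D"), ("paper_D", "paper_C")]

    g = nx.DiGraph(citations)

    # Microscopic level: the most-cited node (highest in-degree)
    most_cited = max(g.nodes, key=g.in_degree)
    print("most cited:", most_cited, "with", g.in_degree(most_cited), "citations")

    # Macroscopic level: density and weak connectivity of the network
    print("density:", nx.density(g))
    print("weakly connected:", nx.is_weakly_connected(g))
    ```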

  2. Literature review and analysis of the application of health outcome assessment instruments in Chinese medicine

    Institute of Scientific and Technical Information of China (English)

    Feng-bin Liu; Zheng-kun Hou; Yun-ying Yang; Pei-wu Li; Qian-wen Li; Nelson Xie; Jing-wei Li

    2013-01-01

    OBJECTIVE: To evaluate the application of health assessment instruments in Chinese medicine. METHODS: According to a pre-defined search strategy, a comprehensive literature search for all articles published in the China National Knowledge Infrastructure databases was conducted. The resulting articles that met the defined inclusion and exclusion criteria were used for analysis. RESULTS: A total of 97 instruments for health outcome assessment in Chinese medicine have been used in fundamental and theoretical research, and 14 of these were also used in 29 clinical trials that were randomized controlled trials, or descriptive or cross-sectional studies. In 2,152 Chinese medicine-based studies that used instruments in their methodology, more than 150 questionnaires were identified. Among the identified questionnaires, 51 were used in more than 10 articles (0.5%). Most of these instruments were developed in Western countries, and few studies (4%) used the instrument as the primary evidence for their conclusions. CONCLUSION: Usage of instruments for health outcome assessment in Chinese medicine is increasing rapidly; however, current limitations include selection rationale, result interpretation and standardization, which must be addressed accordingly.

  3. Assessment of occupational safety risks in Floridian solid waste systems using Bayesian analysis.

    Science.gov (United States)

    Bastani, Mehrad; Celik, Nurcin

    2015-10-01

    Safety risks embedded within solid waste management systems continue to be a significant issue and are prevalent at every step in the solid waste management process. To recognise and address these occupational hazards, it is necessary to discover the potential safety concerns that cause them, as well as their direct and/or indirect impacts on the different types of solid waste workers. In this research, our goal is to statistically assess occupational safety risks to solid waste workers in the state of Florida. Here, we first review the related standard industrial codes for the major solid waste management methods, including recycling, incineration, landfilling, and composting. Then, a quantitative assessment of major risks is conducted based on the collected data, using Bayesian data analysis and predictive methods. The risks estimated in this study for the period 2005-2012 are then compared with historical statistics (1993-1997) from previous assessment studies. The results show that the injury rates among refuse collectors for both musculoskeletal and dermal injuries have decreased from 88 and 15 to 16 and 3 injuries per 1000 workers, respectively. However, a contrasting trend is observed for the injury rates among recycling workers, for whom musculoskeletal and dermal injuries have increased from 13 and 4 to 14 and 6 injuries per 1000 workers, respectively. Lastly, a linear regression model is proposed to identify the major contributors to the high numbers of musculoskeletal and dermal injuries.
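
    A minimal sketch of the kind of Bayesian rate estimation such an assessment relies on, assuming a Poisson model for injury counts with a conjugate Gamma prior (the study's actual model and data are not reproduced; counts and exposure below are hypothetical):

    ```python
    from scipy import stats

    a0, b0 = 1.0, 1.0          # weakly informative Gamma(shape, rate) prior
    injuries = 112             # observed musculoskeletal injuries, assumed
    worker_thousands = 7.0     # exposure in thousands of worker-years, assumed

    # Conjugate update: Poisson likelihood + Gamma prior -> Gamma posterior
    a_post = a0 + injuries
    b_post = b0 + worker_thousands
    posterior = stats.gamma(a_post, scale=1.0 / b_post)

    print(f"posterior mean rate: {posterior.mean():.1f} per 1000 workers")
    print("95% credible interval:",
          [round(q, 1) for q in posterior.ppf([0.025, 0.975])])
    ```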

  4. Advancing effects analysis for integrated, large-scale wildfire risk assessment.

    Science.gov (United States)

    Thompson, Matthew P; Calkin, David E; Gilbertson-Day, Julie W; Ager, Alan A

    2011-08-01

    In this article, we describe the design and development of a quantitative, geospatial risk assessment tool intended to facilitate monitoring trends in wildfire risk over time and to provide information useful in prioritizing fuels treatments and mitigation measures. The research effort is designed to develop, from a strategic view, a first approximation of how both fire likelihood and intensity influence risk to social, economic, and ecological values at regional and national scales. Three main components are required to generate wildfire risk outputs: (1) burn probability maps generated from wildfire simulations, (2) spatially identified highly valued resources (HVRs), and (3) response functions that describe the effects of fire (beneficial or detrimental) on the HVR. Analyzing fire effects has to date presented a major challenge to integrated risk assessments, due to a limited understanding of the type and magnitude of changes wrought by wildfire to ecological and other nonmarket values. This work advances wildfire effects analysis, recognizing knowledge uncertainty and appropriately managing it through the use of an expert systems approach. Specifically, this work entailed consultation with 10 fire and fuels program management officials from federal agencies with fire management responsibilities in order to define quantitative resource response relationships as a function of fire intensity. Here, we demonstrate a proof-of-concept application of the wildland fire risk assessment tool, using the state of Oregon as a case study.

  5. Improving sustainability by technology assessment and systems analysis: the case of IWRM Indonesia

    Science.gov (United States)

    Nayono, S.; Lehmann, A.; Kopfmüller, J.; Lehn, H.

    2016-09-01

    To support the implementation of the IWRM-Indonesia process in a water-scarce and sanitation-poor region of Central Java (Indonesia), sustainability assessments of several technology options of water supply and sanitation were carried out based on the conceptual framework of the integrative sustainability concept of the German Helmholtz association. In the case of water supply, the assessment was based on the life-cycle analysis and life-cycle-costing approach. In the sanitation sector, the focus was set on developing an analytical tool to improve planning procedures in the area of investigation, which can be applied in general to developing and newly emerging countries. Because sanitation systems in particular can be regarded as socio-technical systems, their permanent operability is closely related to cultural or religious preferences which influence acceptability. Therefore, the design of the tool and the assessment of sanitation technologies took into account the views of relevant stakeholders. The key results of the analyses are presented in this article.

  6. Statistical analysis of data from limiting dilution cloning to assess monoclonality in generating manufacturing cell lines.

    Science.gov (United States)

    Quiroz, Jorge; Tsao, Yung-Shyeng

    2016-07-08

    Assurance of monoclonality of recombinant cell lines is a critical issue to gain regulatory approval in biological license application (BLA). Some of the requirements of regulatory agencies are the use of proper documentations and appropriate statistical analysis to demonstrate monoclonality. In some cases, one round may be sufficient to demonstrate monoclonality. In this article, we propose the use of confidence intervals for assessing monoclonality for limiting dilution cloning in the generation of recombinant manufacturing cell lines based on a single round. The use of confidence intervals instead of point estimates allow practitioners to account for the uncertainty present in the data when assessing whether an estimated level of monoclonality is consistent with regulatory requirements. In other cases, one round may not be sufficient and two consecutive rounds are required to assess monoclonality. When two consecutive subclonings are required, we improved the present methodology by reducing the infinite series proposed by Coller and Coller (Hybridoma 1983;2:91-96) to a simpler series. The proposed simpler series provides more accurate and reliable results. It also reduces the level of computation and can be easily implemented in any spreadsheet program like Microsoft Excel. © 2016 American Institute of Chemical Engineers Biotechnol. Prog., 32:1061-1068, 2016.
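
    The Poisson reasoning that underlies limiting-dilution monoclonality arguments can be sketched as follows; the well counts are invented, and this is not the article's exact series or interval procedure.

    ```python
    import math
    from scipy import stats

    # If cells per well ~ Poisson(lam), then P(empty well) = exp(-lam) and,
    # among wells with growth, the monoclonal fraction is
    # lam * exp(-lam) / (1 - exp(-lam)).
    wells, empty = 960, 840                  # plated wells / wells w/o growth

    # Clopper-Pearson 95% bounds on P(empty)
    p_lo = stats.beta.ppf(0.025, empty, wells - empty + 1)
    p_hi = stats.beta.ppf(0.975, empty + 1, wells - empty)

    for p_empty in (p_hi, p_lo):             # high P(empty) -> low lam
        lam = -math.log(p_empty)
        p_mono = lam * math.exp(-lam) / (1 - math.exp(-lam))
        print(f"lam = {lam:.3f} -> P(single cell | growth) = {p_mono:.3f}")
    ```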

  7. Risk assessment of Kermanshah gas storage tanks by energy trace and barrier analysis (2014)

    Directory of Open Access Journals (Sweden)

    M. Ghanbari Kakavandi

    2016-12-01

    Background: Despite their cost and the loss of life they cause, industrial accidents are often preventable through risk assessment methods and control measures. Objective: To assess the safety of gas storage tanks in the Kermanshah oil refinery by Energy Trace and Barrier Analysis (ETBA). Methods: This descriptive case study was conducted on the gas storage tanks of the Kermanshah oil refinery. An energy checklist was used for the identification of energy types. Energy flows were tracked, and management and administrative procedures and personal protective equipment were considered as safeguards. Exposed and vulnerable targets were also specified. Preliminary risk levels were determined by combining severity and likelihood. After corrective actions were suggested for the unacceptable risks, the risk assessment was repeated. Identified risks were expressed using descriptive statistics such as frequency and percentage. Findings: Overall, 74 risks and 121 dangerous energies were identified. Of these, 25 risks were unacceptable, 46 were low risk, and 3 were acceptable with revision. The frequency of risks related to electric power was 20, followed by risks of displacement-pressure-volume, potential and chemical energies, with frequencies of 13, 12 and 9, respectively. Conclusion: Given the environmental and protection conditions of the tanks, and the high percentage of damaging risks in this industry, the use of appropriate control measures to prevent future disasters will be inevitable.

  8. Using Habitat Equivalency Analysis to Assess the Cost Effectiveness of Restoration Outcomes in Four Institutional Contexts

    Science.gov (United States)

    Scemama, Pierre; Levrel, Harold

    2016-01-01

    At the national level, with a fixed amount of resources available for public investment in the restoration of biodiversity, it is difficult to prioritize alternative restoration projects. One way to do this is to assess the level of ecosystem services delivered by these projects and to compare them with their costs. The challenge is to derive a common unit of measurement for ecosystem services in order to compare projects which are carried out in different institutional contexts having different goals (application of environmental laws, management of natural reserves, etc.). This paper assesses the use of habitat equivalency analysis (HEA) as a tool to evaluate ecosystem services provided by restoration projects developed in different institutional contexts. This tool was initially developed to quantify the level of ecosystem services required to compensate for non-market impacts coming from accidental pollution in the US. In this paper, HEA is used to assess the cost effectiveness of several restoration projects in relation to different environmental policies, using case studies based in France. Four case studies were used: the creation of a market for wetlands, public acceptance of a port development project, the rehabilitation of marshes to mitigate nitrate loading to the sea, and the restoration of streams in a protected area. Our main conclusion is that HEA can provide a simple tool to clarify the objectives of restoration projects, to compare the cost and effectiveness of these projects, and to carry out trade-offs, without requiring significant amounts of human or technical resources.

  9. Efficiency assessment of coal mine safety input by data envelopment analysis

    Institute of Scientific and Technical Information of China (English)

    TONG Lei; DING Ri-jia

    2008-01-01

    In recent years improper allocation of safety input has prevailed in coal mines in China, which has resulted in frequent accidents in coal mining operations. A comprehensive assessment of the input efficiency of coal mine safety should lead to improved efficiency in the use of funds and management resources. This helps government and enterprise managers better understand how safety inputs are used and optimize the allocation of resources. A study of the efficiency assessment of coal mine safety input was conducted in this paper. A C2R model with a non-Archimedean infinitesimal vector based on output is established after consideration of the input characteristics and the model properties. An assessment of an operating mine was done using a specific set of input and output criteria. It is found that the safety input was efficient in 2002 and 2005 and was weakly efficient in 2003. However, the efficiency was relatively low in both 2001 and 2004. The safety input resources can be optimized and adjusted by means of projection theory. Such analysis shows that, on average in 2001 and 2004, 45% of the expended funds could have been saved. Likewise, 10% of the safety management and technical staff could have been eliminated, and working hours devoted to safety could have been reduced by 12%. These conditions could have given the same results.
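
    The C2R (CCR) model named here is, in its input-oriented envelopment form, a small linear program solved once per decision-making unit (DMU). Below is a sketch with invented mine-year data; the study's actual inputs, outputs and non-Archimedean refinement are not reproduced.

    ```python
    import numpy as np
    from scipy.optimize import linprog

    X = np.array([[3.0, 2.0, 4.0, 5.0, 2.5],      # inputs (rows) x DMUs (cols):
                  [30., 25., 45., 40., 28.]])     # safety funds, staff hours
    Y = np.array([[96., 98., 93., 95., 97.]])     # output: safety score

    def ccr_efficiency(j0: int) -> float:
        # minimize theta s.t. lam'X <= theta*x0, lam'Y >= y0, lam >= 0
        m, n = X.shape
        s = Y.shape[0]
        c = np.r_[1.0, np.zeros(n)]                  # variables: [theta, lam]
        A_in = np.hstack([-X[:, [j0]], X])           # lam'X - theta*x0 <= 0
        A_out = np.hstack([np.zeros((s, 1)), -Y])    # -lam'Y <= -y0
        A_ub = np.vstack([A_in, A_out])
        b_ub = np.r_[np.zeros(m), -Y[:, j0]]
        res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                      bounds=[(None, None)] + [(0, None)] * n)
        assert res.success
        return res.fun

    for j, year in enumerate(range(2001, 2006)):
        print(year, f"efficiency = {ccr_efficiency(j):.3f}")
    ```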

  10. Parkinson's disease assessment based on gait analysis using an innovative RGB-D camera system.

    Science.gov (United States)

    Rocha, Ana Patrícia; Choupina, Hugo; Fernandes, José Maria; Rosas, Maria José; Vaz, Rui; Silva Cunha, João Paulo

    2014-01-01

    Movement-related diseases, such as Parkinson's disease (PD), progressively affect motor function, often leading to severe motor impairment and a dramatic loss of the patients' quality of life. Human motion analysis techniques can be very useful to support the clinical assessment of this type of disease. In this contribution, we present an RGB-D camera (Microsoft Kinect) system and its evaluation for PD assessment. Based on skeleton data extracted from the gait of three PD patients treated with deep brain stimulation and three control subjects, several gait parameters were computed and analyzed, with the aim of discriminating between non-PD and PD subjects, as well as between two PD states (stimulator ON and OFF). We verified that among the several quantitative gait parameters, the variance of the center shoulder velocity presented the highest discriminative power to distinguish between non-PD, PD ON and PD OFF states (p = 0.004). Furthermore, we have shown that our low-cost portable system can be easily mounted in any hospital environment for evaluating patients' gait. These results demonstrate the potential of using an RGB-D camera as a PD assessment tool.
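
    The discriminative feature named here, the variance of the center-shoulder velocity, could be computed from Kinect skeleton trajectories roughly as follows; the 30 fps rate and the synthetic tracks are assumptions, and the paper's exact preprocessing is not reproduced:

```python
import numpy as np

def velocity_variance(positions: np.ndarray, fps: float = 30.0) -> float:
    """Variance of a joint's speed from an (n_frames, 3) trajectory
    of skeleton coordinates in metres."""
    velocity = np.diff(positions, axis=0) * fps      # m/s between frames
    speed = np.linalg.norm(velocity, axis=1)
    return float(np.var(speed))

# Synthetic centre-shoulder tracks: a smooth walk vs. a jittery one.
t = np.linspace(0, 5, 150)
smooth = np.c_[0.8 * t, np.full_like(t, 1.4), 0.02 * np.sin(2 * np.pi * t)]
rng = np.random.default_rng(0)
jittery = smooth + rng.normal(scale=0.01, size=smooth.shape)
print(velocity_variance(smooth), velocity_variance(jittery))
```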

  11. Procedure for conducting probabilistic safety assessment: level 1 full power internal event analysis

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Won Dae; Lee, Y. H.; Hwang, M. J. [and others]

    2003-07-01

    This report provides guidance on conducting a Level 1 PSA for internal events in NPPs, based on the method and procedure used in the PSA for the design of the Korea Standard Nuclear Plants (KSNPs). A Level 1 PSA delineates the accident sequences leading to core damage and estimates their frequencies. It has been directly used for assessing and modifying system safety and reliability as a key and basic part of PSA. Level 1 PSA also provides insights into design weaknesses and into ways of preventing core damage, which in most cases is the precursor to major accidents. Level 1 PSA has therefore been used as the essential technical basis for risk-informed applications in NPPs. The report covers six major procedural steps for Level 1 PSA: familiarization with the plant, initiating event analysis, event tree analysis, system fault tree analysis, reliability data analysis, and accident sequence quantification. The report is intended to assist technical persons performing Level 1 PSA for NPPs. A particular aim is to promote a standardized framework, terminology, and form of documentation for PSAs. The report should also be useful for managers and regulatory personnel involved in risk-informed regulation, and for conducting PSA in other industries.
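
    The final step, accident sequence quantification, multiplies each initiating-event frequency by the branch probabilities along an event-tree path; a toy sketch with invented numbers, not taken from any plant PSA:

```python
# Minimal accident-sequence quantification: a sequence frequency is the
# initiating-event frequency times the probabilities along its event-tree
# branches (failure probabilities supplied by the system fault trees).
INITIATING_EVENT_FREQ = 1e-2   # /year, e.g. loss of feedwater (illustrative)

core_damage_sequences = {
    # sequence name: branch probabilities along the path
    "IE * AFW fails * Feed&Bleed fails": [1e-3, 1e-2],
    "IE * AFW fails * Feed&Bleed ok * RHR fails": [1e-3, 1.0 - 1e-2, 1e-3],
}

cdf = 0.0   # total core damage frequency contribution
for name, branches in core_damage_sequences.items():
    freq = INITIATING_EVENT_FREQ
    for p in branches:
        freq *= p
    print(f"{name}: {freq:.2e} /yr")
    cdf += freq
print(f"total CDF contribution: {cdf:.2e} /yr")
```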

  12. The Statistical Analysis and Assessment of the Solvency of Forest Enterprises

    Directory of Open Access Journals (Sweden)

    Vyniatynska Liudmila V.

    2016-05-01

    Full Text Available The aim of the article is to conduct a statistical analysis of the solvency of forest enterprises through a system of statistical indicators using the sampling method (the sampling is based on the forest cover percentage of the regions of Ukraine). Financial statements of forest enterprises, which form the system of information and analytical support for the statistical analysis, were used to analyze and evaluate the level of solvency of forestry in Ukraine for 2009-2015. With the help of the developed recommended values, the results of the statistical analysis of the forest enterprises' solvency under conditions of self-financing and commercial consideration have been summarized and systematized. Using the methodology of the statistical analysis of the forest enterprises' solvency, conducted on a corresponding conceptual framework that is relevant and meets current needs, a system of statistical indicators has been calculated, enabling assessment of the level of solvency of forest enterprises and identification of the reasons for its low level.

  13. Toward a low-cost gait analysis system for clinical and free-living assessment.

    Science.gov (United States)

    Ladha, Cassim; Del Din, Silvia; Nazarpour, Kianoush; Hickey, Aodhan; Morris, Rosie; Catt, Michael; Rochester, Lynn; Godfrey, Alan

    2016-08-01

    Gait is an important clinical assessment tool since changes in gait may reflect changes in general health. Measurement of gait is a complex process which, until recently, has been restricted to bespoke clinical facilities. The use of inexpensive wearable technologies is an attractive alternative and offers the potential to assess gait in any environment. In this paper we present the development of a low-cost gait analysis system built entirely from open-source components. The system is used to capture spatio-temporal gait characteristics derived from an existing conceptual model, sensitive to ageing and neurodegenerative pathology (e.g. Parkinson's disease). We demonstrate that the system is suitable for use in a clinical unit and will lead to pragmatic use in a free-living (home) environment. The system consists of a wearable (tri-axial accelerometer and gyroscope) with a Raspberry Pi module for data storage and analysis. This forms part of ongoing work to develop gait as a low-cost diagnostic in modern healthcare.

  14. Probabilistic fragility analysis: A tool for assessing design rules of RC buildings

    Institute of Scientific and Technical Information of China (English)

    Nikos D Lagarost

    2008-01-01

    In this work, fragility analysis is performed to assess two groups of reinforced concrete structures. The first group is composed of buildings that implement three common design practices, namely fully infilled, weak ground story, and short columns. The three design practices are applied during the design process of a reinforced concrete building. The structures of the second group vary according to the value of the behavior factor used to define the seismic forces, as specified in design procedures. Most seismic design codes belong to the class of prescriptive procedures, where if certain constraints are fulfilled, the structure is considered safe. Prescriptive design procedures express the ability of the structure to absorb energy through inelastic deformation using the behavior factor. The basic objective of this work is to assess both groups of structures with reference to the limit-state probability of exceedance. Thus, four limit-state fragility curves are developed on the basis of nonlinear static analysis for both groups of structures. Moreover, the 95% confidence intervals of the fragility curves are also calculated, taking into account two types of random variables that influence structural capacity and seismic demand.
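
    A limit-state fragility curve is conventionally modeled as a lognormal CDF of the intensity measure; a minimal sketch, where the medians and dispersions are illustrative values rather than the paper's results:

```python
import numpy as np
from scipy.stats import norm

def fragility(im, theta, beta):
    """Lognormal fragility: P(limit state exceeded | IM = im), with
    median capacity theta and logarithmic standard deviation beta."""
    return norm.cdf(np.log(im / theta) / beta)

# Illustrative medians (g, PGA) and dispersions for four limit states:
limit_states = {"slight": (0.15, 0.45), "moderate": (0.30, 0.45),
                "extensive": (0.55, 0.50), "complete": (0.90, 0.55)}
pga = np.array([0.1, 0.3, 0.6])
for ls, (theta, beta) in limit_states.items():
    print(ls, np.round(fragility(pga, theta, beta), 3))
```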

  15. Gait analysis, bone and muscle density assessment for patients undergoing total hip arthroplasty

    Directory of Open Access Journals (Sweden)

    Benedikt Magnússon

    2012-12-01

    Full Text Available Total hip arthroplasty (THA) is performed with or without the use of bone cement. Given the lack of reliable clinical guidelines for deciding whether a patient should receive THA with or without bone cement, a joint clinical and engineering approach is proposed here, with the objective of assessing patient recovery by developing monitoring techniques based on gait analysis, measurements of bone mineral density, and structural and functional changes of the quadriceps muscles. A clinical trial was conducted with 36 volunteer patients undergoing THA surgery for the first time: 18 receiving a cemented implant and 18 receiving a non-cemented implant. The patients were scanned with Computed Tomography (CT) prior to, immediately after, and 12 months after surgery. The CT data were further processed to segment muscles and bones for calculating bone mineral density (BMD). A Hounsfield unit (HU) based quadriceps muscle density value was calculated from the segmented file for the healthy and operated legs before and after THA surgery. Furthermore, clinical assessment was performed using gait analysis technologies such as a sensing carpet, wireless electrodes, and video. Patients underwent these measurements prior to, 6 weeks after, and 52 weeks after surgery. The preliminary results indicate that the computational tools and methods are able to quantitatively analyze the patient's condition pre- and post-surgery: spatial parameters such as step length and stride length increased 6 weeks post-op in the patient group receiving the cemented implant, while the angle in the toe-in/out parameter decreased in both patient groups.
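
    The HU-based muscle density value reduces, in essence, to averaging the CT volume over the segmentation mask; a toy sketch on synthetic data (the real pipeline's segmentation step is assumed to have produced the mask):

```python
import numpy as np

def mean_hu(ct_volume: np.ndarray, mask: np.ndarray) -> float:
    """Mean Hounsfield value of a CT volume inside a binary mask
    (e.g. the quadriceps label from the segmentation step)."""
    return float(ct_volume[mask > 0].mean())

# Toy 3-D volume: muscle (~50 HU) embedded in fat (~-100 HU).
ct = np.full((40, 64, 64), -100.0)
mask = np.zeros_like(ct, dtype=np.uint8)
mask[:, 20:40, 20:40] = 1
rng = np.random.default_rng(1)
ct[mask > 0] = 50.0 + rng.normal(0, 5, size=int(mask.sum()))
print(f"quadriceps density: {mean_hu(ct, mask):.1f} HU")
```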

  16. Frequency Domain Analysis for Assessing Fluid Responsiveness by Using Instantaneous Pulse Rate Variability

    Directory of Open Access Journals (Sweden)

    Pei-Chen Lin

    2016-02-01

    Full Text Available In the ICU, fluid therapy is a conventional strategy for patients in shock. However, only half of ICU patients respond well to fluid therapy, and fluid loading in non-responsive patients delays definitive therapy. Prediction of fluid responsiveness (FR) has become an intense topic in clinical practice. Most conventional FR prediction methods are based on time-domain analysis and have limited ability to indicate FR. This study proposes a method that predicts FR based on frequency-domain analysis, named instantaneous pulse rate variability (iPRV). iPRV provides a new indication in the very-high-frequency (VHF) range (0.4-0.8 Hz) of the spectrum for peripheral responses. Twenty-six healthy subjects participated in this study, and the photoplethysmography signal was recorded in supine baseline, during head-up tilt (HUT), and during passive leg raising (PLR), which induce variations in venous return and allow quantitative assessment of FR in each individual. The results showed that the spectral power of the VHF band decreased during HUT (573.96±756.36 ms² at baseline; 348.00±434.92 ms² during HUT) and increased during PLR (573.96±756.36 ms² at baseline; 718.92±973.70 ms² during PLR), reflecting the compensatory regulation of venous return and FR. This study provides an effective indicator for assessing FR in the frequency domain and has the potential to become a reliable system in the ICU.
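
    The VHF-band spectral power of an iPRV series can be estimated with a standard Welch periodogram; a minimal sketch, where the 4 Hz resampling rate and the synthetic series are assumptions rather than the study's protocol:

```python
import numpy as np
from scipy.integrate import trapezoid
from scipy.signal import welch

def vhf_power(series: np.ndarray, fs: float, band=(0.4, 0.8)) -> float:
    """Spectral power (ms^2) of an evenly resampled pulse-interval
    series in the very-high-frequency band."""
    f, psd = welch(series, fs=fs, nperseg=min(256, len(series)))
    mask = (f >= band[0]) & (f <= band[1])
    return float(trapezoid(psd[mask], f[mask]))

# Synthetic pulse-interval series (ms) with a 0.6 Hz oscillation.
fs = 4.0
t = np.arange(0, 300, 1 / fs)
rng = np.random.default_rng(0)
series = 800 + 20 * np.sin(2 * np.pi * 0.6 * t) + rng.normal(0, 5, t.size)
print(f"VHF power: {vhf_power(series, fs):.1f} ms^2")
```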

  17. Operational Implementation of a Pc Uncertainty Construct for Conjunction Assessment Risk Analysis

    Science.gov (United States)

    Newman, L.; Hejduk, M.; Johnson, L.

    2016-09-01

    Earlier this year the NASA Conjunction Assessment and Risk Analysis (CARA) project presented the theoretical and algorithmic aspects of a method to include the uncertainties in the calculation inputs when computing the probability of collision (Pc) between two space objects, principally uncertainties in the covariances and the hardbody radius. The output of this calculation approach is, rather than a single Pc value, an entire probability density function representing the range of possible Pc values given the uncertainties in the inputs, bringing CA risk analysis methodologies more in line with modern risk management theory. The present study provides results from exercising this method against an extended dataset of satellite conjunctions in order to determine the effect of its use on the evaluation of conjunction assessment (CA) event risk posture. The effects are found to be considerable: a good number of events are downgraded from or upgraded to a serious risk designation on the basis of the Pc uncertainty. The findings counsel the integration of the developed methods into NASA CA operations.
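
    CARA's production computation is analytic, but the idea of propagating input uncertainty into a distribution of Pc values can be illustrated with a brute-force Monte Carlo sketch; every encounter figure below is invented:

```python
import numpy as np

rng = np.random.default_rng(42)

def pc_sample(miss, cov, hbr, n=20_000):
    """Monte-Carlo probability of collision in the 2-D encounter plane:
    fraction of relative-position samples inside the hardbody circle."""
    pts = rng.multivariate_normal(miss, cov, size=n)
    return np.mean(np.hypot(pts[:, 0], pts[:, 1]) < hbr)

# Outer loop: propagate *input* uncertainty (covariance scale, hardbody
# radius) into a distribution of Pc values instead of a single number.
miss = np.array([150.0, 80.0])                 # metres, encounter plane
base_cov = np.diag([120.0**2, 60.0**2])
pcs = []
for _ in range(300):
    k = rng.lognormal(mean=0.0, sigma=0.3)     # covariance realism factor
    hbr = rng.uniform(15.0, 25.0)              # hardbody radius, m
    pcs.append(pc_sample(miss, k * base_cov, hbr))
pcs = np.array(pcs)
print(f"median Pc = {np.median(pcs):.2e}, "
      f"95th percentile = {np.percentile(pcs, 95):.2e}")
```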

  18. Sustainability assessment of tertiary wastewater treatment technologies: a multi-criteria analysis.

    Science.gov (United States)

    Plakas, K V; Georgiadis, A A; Karabelas, A J

    2016-01-01

    The multi-criteria analysis gives researchers, designers and decision-makers the opportunity to examine decision options in a multi-dimensional fashion. On this basis, four tertiary wastewater treatment (WWT) technologies were assessed regarding their sustainability performance in producing recycled wastewater, considering a 'triple bottom line' approach (i.e. economic, environmental, and social). These are powdered activated carbon adsorption coupled with ultrafiltration membrane separation (PAC-UF), reverse osmosis, ozone/ultraviolet-light oxidation, and heterogeneous photo-catalysis coupled with low-pressure membrane separation (photocatalytic membrane reactor, PMR). The participatory method called the simple multi-attribute rating technique exploiting ranks (SMARTER) was employed for assigning weights to the selected sustainability indicators. This sustainability assessment approach resulted in the development of a composite index as a final metric for each WWT technology evaluated. The PAC-UF technology appears to be the most appropriate technology, attaining the highest composite value of sustainability performance. A scenario analysis confirmed the results of the original scenario in five out of seven cases. In parallel, the PMR was highlighted as the technology with the least variability in its performance. Nevertheless, additional actions and approaches are proposed to strengthen the objectivity of the final results.
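
    SMARTER converts criterion ranks into rank-order-centroid weights before the weighted composite index is formed; a minimal sketch, where the indicator scores are invented:

```python
def roc_weights(n: int) -> list[float]:
    """Rank-order-centroid weights used by SMARTER: the i-th ranked
    criterion (1 = most important) gets w_i = (1/n) * sum_{k=i..n} 1/k."""
    return [sum(1.0 / k for k in range(i, n + 1)) / n for i in range(1, n + 1)]

# Composite sustainability index = weighted sum of normalised indicator
# scores; scores here are invented (ranked economic > environmental > social).
weights = roc_weights(3)
scores = {"PAC-UF": [0.8, 0.7, 0.9], "RO": [0.6, 0.5, 0.7],
          "O3/UV": [0.5, 0.6, 0.6], "PMR": [0.7, 0.65, 0.7]}
for tech, s in scores.items():
    print(tech, round(sum(w * x for w, x in zip(weights, s)), 3))
```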

  19. Assessment of hydrocephalus in children based on digital image processing and analysis

    Directory of Open Access Journals (Sweden)

    Fabijańska Anna

    2014-06-01

    Full Text Available Hydrocephalus is a pathological condition of the central nervous system which often affects neonates and young children. It manifests itself as an abnormal accumulation of cerebrospinal fluid within the ventricular system of the brain, with subsequent progression. One of the most important diagnostic methods for identifying hydrocephalus is Computed Tomography (CT); the enlarged ventricular system is clearly visible on CT scans. However, the assessment of disease progression usually relies on the radiologist's judgment and manual measurements, which are subjective, cumbersome and of limited accuracy. This paper therefore addresses the problem of semi-automatic assessment of hydrocephalus using image processing and analysis algorithms. In particular, automated determination of popular indices of disease progression is considered. Algorithms for the detection, semi-automatic segmentation and numerical description of the lesion are proposed. Specifically, disease progression is determined using shape analysis algorithms. Numerical results provided by the introduced methods are presented and compared with those calculated manually by a radiologist and a trained operator. The comparison proves the correctness of the introduced approach.
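
    One widely used progression index, which may or may not be among those the paper automates, is the Evans index; a deliberately simplified sketch on binary masks, with widths measured crudely as per-row pixel counts:

```python
import numpy as np

def evans_index(ventricle_mask: np.ndarray, skull_mask: np.ndarray) -> float:
    """Crude Evans index on an axial slice: maximal frontal-horn width
    divided by maximal inner-skull width, both along the image rows.
    Values above roughly 0.3 conventionally suggest enlargement."""
    horn_width = (ventricle_mask > 0).sum(axis=1).max()
    skull_width = (skull_mask > 0).sum(axis=1).max()
    return horn_width / skull_width

# Toy binary masks standing in for the segmentation step's output.
skull = np.zeros((128, 128), np.uint8); skull[14:114, 14:114] = 1
vent = np.zeros_like(skull);            vent[40:60, 46:82] = 1
print(f"Evans index: {evans_index(vent, skull):.2f}")
```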

  20. Rapid ecotoxicological assessment of heavy metal combined polluted soil using canonical analysis

    Institute of Scientific and Technical Information of China (English)

    CHEN Su-hua; ZHOU Qi-xing; SUN Tie-heng; LI Pei-jun

    2003-01-01

    Quick, simple, and cheap biomarkers were combined in a rapid assessment approach to measure the effects of the metal pollutants Cu, Cd, Pb and Zn in meadow burozem on wheat. Analysis of the orthogonal design showed that the significant zinc factor affected both the inhibition rate of shoot mass and that of root elongation (P < 0.05 and P < 0.01, respectively). The first toxicity canonical variable (TOXI), formed from the toxicity data set, explained 49% of the total variance in the toxicity data set; the first biological canonical variable (BIOL) explained 42% of the total variation in the biological data set. The correlation between the first canonical variables TOXI and BIOL (the canonical correlation) was 0.94 (P < 0.0001). It is therefore reliable and feasible to assess the toxicity of soil combined-polluted by heavy metals using canonical analysis. The toxicity of such soil to the plant community was estimated by comparing IC50 values, i.e., the concentrations needed to cause a 50% decrease in growth rate relative to no metal addition. The environmental quality standard for soils prescribes that all the tested concentrations of heavy metals in soil should ultimately cause neither hazard nor pollution, whereas the results indicated that second-grade soils caused inhibition rates of wheat growth of around 50%. The environmental quality standard for soils could therefore be modified to include other features.
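
    The canonical-correlation step can be reproduced with scikit-learn's CCA; a sketch on synthetic stand-ins for the toxicity and biological data sets:

```python
import numpy as np
from sklearn.cross_decomposition import CCA

rng = np.random.default_rng(0)
n = 60
# Invented stand-ins: a toxicity set (metal doses) and a biological set
# (shoot-mass and root-elongation inhibition rates) sharing a latent factor.
latent = rng.normal(size=(n, 1))
tox = np.hstack([latent + 0.5 * rng.normal(size=(n, 1)) for _ in range(4)])
bio = np.hstack([latent + 0.7 * rng.normal(size=(n, 1)) for _ in range(2)])

cca = CCA(n_components=1).fit(tox, bio)
toxi, biol = cca.transform(tox, bio)           # first canonical variables
r = np.corrcoef(toxi[:, 0], biol[:, 0])[0, 1]  # canonical correlation
print(f"first canonical correlation: {r:.2f}")
```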

  1. Assessing Credit Default using Logistic Regression and Multiple Discriminant Analysis: Empirical Evidence from Bosnia and Herzegovina

    Directory of Open Access Journals (Sweden)

    Deni Memić

    2015-01-01

    Full Text Available This article aims to assess credit default prediction in the banking market of Bosnia and Herzegovina nationwide as well as in its constitutional entities (the Federation of Bosnia and Herzegovina and Republika Srpska). The ability to classify companies into different predefined groups, or to find an appropriate tool that would replace human assessment in classifying companies into good and bad buckets, has long been one of the main interests of risk-management researchers. We investigated the possibility and accuracy of default prediction using the traditional statistical methods of logistic regression (logit) and multiple discriminant analysis (MDA) and compared their predictive abilities. The results show that the created models have high predictive ability. For the logit models, some variables are more influential on default prediction than others. Return on assets (ROA) is statistically significant in all four periods prior to default, having very high regression coefficients, i.e. a strong impact on the model's ability to predict default. Similar results are obtained for the MDA models. It is also found that predictive ability differs between logistic regression and multiple discriminant analysis.
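
    Both classifiers compared in the article are available off the shelf; a sketch benchmarking them on synthetic stand-in data, since the real financial ratios are not public:

```python
from sklearn.datasets import make_classification
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for financial ratios (e.g. ROA, leverage, liquidity)
# with a default/non-default label; defaults made rare on purpose.
X, y = make_classification(n_samples=500, n_features=8, n_informative=4,
                           weights=[0.85], random_state=0)

for name, model in [("logit", LogisticRegression(max_iter=1000)),
                    ("MDA/LDA", LinearDiscriminantAnalysis())]:
    auc = cross_val_score(model, X, y, cv=5, scoring="roc_auc").mean()
    print(f"{name}: mean AUC = {auc:.3f}")
```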

  2. Improving water quality assessments through a hierarchical Bayesian analysis of variability.

    Science.gov (United States)

    Gronewold, Andrew D; Borsuk, Mark E

    2010-10-15

    Water quality measurement error and variability, while well-documented in laboratory-scale studies, is rarely acknowledged or explicitly resolved in most model-based water body assessments, including those conducted in compliance with the United States Environmental Protection Agency (USEPA) Total Maximum Daily Load (TMDL) program. Consequently, proposed pollutant loading reductions in TMDLs and similar water quality management programs may be biased, resulting in either slower-than-expected rates of water quality restoration and designated use reinstatement or, in some cases, overly conservative management decisions. To address this problem, we present a hierarchical Bayesian approach for relating actual in situ or model-predicted pollutant concentrations to multiple sampling and analysis procedures, each with distinct sources of variability. We apply this method to recently approved TMDLs to investigate whether appropriate accounting for measurement error and variability will lead to different management decisions. We find that required pollutant loading reductions may in fact vary depending not only on how measurement variability is addressed but also on which water quality analysis procedure is used to assess standard compliance. As a general strategy, our Bayesian approach to quantifying variability may represent an alternative to the common practice of addressing all forms of uncertainty through an arbitrary margin of safety (MOS).
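
    A hierarchical model relating a latent true concentration to procedure-specific measurement error might look as follows in PyMC (assuming PyMC ≥ 4; this is a simplified illustrative structure, not the authors' actual model):

```python
import numpy as np
import pymc as pm

rng = np.random.default_rng(1)
n_methods, n_per = 3, 20
true_conc = 2.0                              # latent true log-concentration
method_sd = np.array([0.1, 0.25, 0.5])       # per-procedure error (assumed)
method_idx = np.repeat(np.arange(n_methods), n_per)
y_obs = true_conc + rng.normal(0, method_sd[method_idx])

with pm.Model():
    mu = pm.Normal("mu", 0.0, 10.0)                      # true concentration
    sigma = pm.HalfNormal("sigma", 1.0, shape=n_methods) # method variability
    pm.Normal("y", mu, sigma[method_idx], observed=y_obs)
    idata = pm.sample(1000, tune=1000, progressbar=False)

print("posterior mean of mu:", float(idata.posterior["mu"].mean()))
```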

  3. Microarray analysis reveals the actual specificity of enrichment media used for food safety assessment.

    Science.gov (United States)

    Kostić, Tanja; Stessl, Beatrix; Wagner, Martin; Sessitsch, Angela

    2011-06-01

    Microbial diagnostic microarrays are tools for the simultaneous detection and identification of microorganisms in food, clinical, and environmental samples. In comparison to classic methods, microarray-based systems have the potential for high throughput, parallelism, and miniaturization. High specificity and high sensitivity of detection have been demonstrated. A microbial diagnostic microarray for the detection of the most relevant bacterial food- and waterborne pathogens and indicator organisms was developed and thoroughly validated. The microarray platform, based on sequence-specific end labeling of oligonucleotides and the phylogenetically robust gyrB marker gene, allowed highly specific (resolution at genus and/or species level) and sensitive (0.1% relative and 10^4 CFU absolute sensitivity) detection of the target pathogens. In initial challenge studies of the applicability of microarray-based food analysis, we obtained results demonstrating the questionable specificity of standardized culture-dependent microbiological detection methods. Taking into consideration the importance of reliable food safety assessment methods, comprehensive performance assessment is essential. The results demonstrate the potential of this new pathogen diagnostic microarray to evaluate culture-based standard methods in microbiological food analysis.

  4. ASSESSMENT OF OIL PALM PLANTATION AND TROPICAL PEAT SWAMP FOREST WATER QUALITY BY MULTIVARIATE STATISTICAL ANALYSIS

    Directory of Open Access Journals (Sweden)

    Seca Gandaseca

    2014-01-01

    Full Text Available This study reports the spatio-temporal changes in river and canal water quality at peat swamp forest and oil palm plantation sites in Sarawak, Malaysia. To investigate temporal changes, 192 water samples were collected at four stations of BatangIgan, an oil palm plantation site of Sarawak, during July-November 2009 and April-July 2010. Nine water quality parameters were analysed: Electrical Conductivity (EC), pH, Turbidity (TER), Dissolved Oxygen (DO), Temperature (TEMP), Chemical Oxygen Demand (COD), five-day Biochemical Oxygen Demand (BOD5), ammonia-Nitrogen (NH3-N), and Total Suspended Solids (TSS). To investigate spatial changes, 432 water samples were collected from six different sites including BatangIgan during June-August 2010, and six water quality parameters (pH, DO, COD, BOD5, NH3-N and TSS) were analysed to see the spatial variations. The parameters contributing most significantly to the spatio-temporal variations were assessed by statistical techniques such as Hierarchical Agglomerative Cluster Analysis (HACA), Factor Analysis/Principal Components Analysis (FA/PCA) and Discriminant Function Analysis (DFA). HACA identified three different classes of sites (Relatively Unimpaired, Impaired and Less Impaired Regions) on the basis of the similarity of physicochemical characteristics and pollutant levels between the sampling sites. DFA produced the best results for identifying the main variables for temporal analysis, separating three parameters (EC, TER, COD), and identified three parameters for spatial analysis (pH, NH3-N and BOD5). The results signify that the parameters identified by the statistical analyses were responsible for water quality change and suggest agricultural and oil palm plantation activities as a possible source of pollutants. The results suggest a dire need for proper watershed management measures to restore the water quality of this tributary for a
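
    The HACA step, standardizing the site-by-parameter matrix and cutting a Ward dendrogram into site classes, can be sketched as follows; all monitoring values are invented:

```python
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage
from scipy.stats import zscore

# Invented station-by-parameter matrix (rows: sampling sites; columns:
# pH, DO, COD, BOD5, NH3-N, TSS) standing in for the monitoring data.
rng = np.random.default_rng(3)
clean = rng.normal([7.2, 7.5, 15, 2, 0.1, 20],
                   [0.2, 0.5, 3, 0.5, 0.05, 5], (4, 6))
dirty = rng.normal([6.3, 3.0, 60, 9, 1.2, 90],
                   [0.2, 0.5, 8, 1.5, 0.3, 15], (4, 6))
X = zscore(np.vstack([clean, dirty]), axis=0)    # standardise parameters

Z = linkage(X, method="ward")                    # agglomerative clustering
labels = fcluster(Z, t=2, criterion="maxclust")  # cut into two site classes
print("site classes:", labels)
```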

  5. Assessment of Student Skills for Critiquing Published Primary Scientific Literature Using a Primary Trait Analysis Scale

    Directory of Open Access Journals (Sweden)

    Manuel F. Varela

    2009-12-01

    Full Text Available Instructor evaluation of progressive student skills in the analysis of primary literature is critical for the development of these skills in young scientists. Students in a senior- or graduate-level one-semester course in Immunology at a Masters-level comprehensive university were assessed for their abilities (primary traits) to recognize and evaluate the following elements of a scientific paper: Hypothesis and Rationale, Significance, Methods, Results, Critical Thinking and Analysis, and Conclusions. We tested the hypotheses that average recognition scores vary among elements and that scores change with time differently by trait. Recognition scores (scaled 1 to 5) and differences in scores were analyzed using analysis of variance (ANOVA), regression, and analysis of covariance (ANCOVA) (n = 10 papers over 103 days). By multiple comparisons testing, we found that recognition scores statistically fell into two groups: high scores (for Hypothesis and Rationale, Significance, Methods, and Conclusions) and low scores (for Results and Critical Thinking and Analysis). Recognition scores changed significantly with time (increased) only for Hypothesis and Rationale and for Results. ANCOVA showed that changes in recognition scores for these elements were not significantly different in slope (F1,16 = 0.254, P = 0.621), but the Results trait was significantly lower in elevation (F1,17 = 12.456, P = 0.003). Thus, students improved with similar trajectories, but started and ended with lower Results scores. We conclude that students have the greatest difficulty evaluating Results and critically evaluating scientific validity. Our findings show extant student skills, and the significant increase in some traits shows learning. This study demonstrates that students start with variable recognition skills and that student skills may be learned at differential rates. Faculty can use these findings or the primary trait analysis scoring scale to focus on specific paper elements for which

  6. Assessment of genetic stability in micropropagules of Jatropha curcas genotypes by RAPD and AFLP analysis

    KAUST Repository

    Sharma, Sweta K.

    2011-07-01

    Jatropha curcas (Euphorbiaceae), a drought-resistant, non-edible oil-yielding plant, has acquired significant importance as an alternative renewable energy source. Low and inconsistent yields found in field plantations prompted the identification of high-yielding clones and their large-scale multiplication by vegetative propagation to obtain true-to-type plants. In the current investigation, plantlets of J. curcas generated by axillary bud proliferation (micropropagation) using nodal segments obtained from selected high-yielding genotypes were assessed for their genetic stability using Randomly Amplified Polymorphic DNA (RAPD) and Amplified Fragment Length Polymorphism (AFLP) analyses. For RAPD analysis, 21 out of 52 arbitrary decamer primers screened gave clear, reproducible bands. In the micropropagated plantlets obtained from the 2nd sub-culture, 4 out of a total of 177 bands scored were polymorphic, but in the 8th and 16th sub-cultures (culture cycles) no polymorphisms were detected. AFLP analysis revealed 0.63%, 0% and 0% polymorphism in the 2nd, 8th and 16th generations, respectively. When different genotypes, viz. IC 56557 16, IC 56557 34 and IC 56557 13, were assessed by AFLP, 0%, 0.31% and 0.47% polymorphisms were found, respectively, indicating a difference in genetic stability among the genotypes. To the best of our knowledge this is the first report on the assessment of genetic stability of micropropagated plantlets in J. curcas, and it suggests that axillary shoot proliferation can safely be used as an efficient micropropagation method for mass propagation of J. curcas. © 2011 Elsevier B.V.

  7. Applications of life cycle assessment and cost analysis in health care waste management

    Energy Technology Data Exchange (ETDEWEB)

    Soares, Sebastiao Roberto, E-mail: soares@ens.ufsc.br [Department of Sanitary Engineering, Federal University of Santa Catarina, UFSC, Campus Universitario, Centro Tecnologico, Trindade, PO Box 476, Florianopolis, SC 88040-970 (Brazil); Finotti, Alexandra Rodrigues, E-mail: finotti@ens.ufsc.br [Department of Sanitary Engineering, Federal University of Santa Catarina, UFSC, Campus Universitario, Centro Tecnologico, Trindade, PO Box 476, Florianopolis, SC 88040-970 (Brazil); Prudencio da Silva, Vamilson, E-mail: vamilson@epagri.sc.gov.br [Department of Sanitary Engineering, Federal University of Santa Catarina, UFSC, Campus Universitario, Centro Tecnologico, Trindade, PO Box 476, Florianopolis, SC 88040-970 (Brazil); EPAGRI, Rod. Admar Gonzaga 1347, Itacorubi, Florianopolis, Santa Catarina 88034-901 (Brazil); Alvarenga, Rodrigo A.F., E-mail: alvarenga.raf@gmail.com [Department of Sanitary Engineering, Federal University of Santa Catarina, UFSC, Campus Universitario, Centro Tecnologico, Trindade, PO Box 476, Florianopolis, SC 88040-970 (Brazil); Ghent University, Department of Sustainable Organic Chemistry and Technology, Coupure Links 653/9000 Gent (Belgium)

    2013-01-15

    Highlights: • Three Health Care Waste (HCW) scenarios were assessed through environmental and cost analysis. • HCW treatment using microwave oven had the lowest environmental impacts and costs in comparison with autoclave and lime. • Lime had the worst environmental and economic results for HCW treatment, in comparison with autoclave and microwave. - Abstract: The establishment of rules to manage Health Care Waste (HCW) is a challenge for the public sector. Regulatory agencies must ensure the safety of waste management alternatives for two very different profiles of generators: (1) hospitals, which concentrate the production of HCW, and (2) small establishments, such as clinics, pharmacies and other sources, that generate dispersed quantities of HCW and are scattered throughout the city. To assist in developing sector regulations for the small generators, we evaluated three management scenarios using decision-making tools. They consisted of a disinfection technique (microwave, autoclave and lime) followed by landfilling, where transportation was also included. The microwave, autoclave and lime techniques were tested at the laboratory to establish the operating parameters that ensure their efficiency in disinfection. Using a life cycle assessment (LCA) and cost analysis, the decision-making tools aimed to determine the technique with the best environmental performance. This consisted of evaluating the eco-efficiency of each scenario. Based on the life cycle assessment, microwaving had the lowest environmental impact (12.64 Pt) followed by autoclaving (48.46 Pt). The cost analyses indicated values of US$ 0.12 kg⁻¹ for the waste treated with microwaves, US$ 1.10 kg⁻¹ for the waste treated by the autoclave and US$ 1.53 kg⁻¹ for the waste treated with lime. The microwave disinfection presented the best eco-efficiency performance among those studied and provided a feasible

  8. A Support Analysis Framework for mass movement damage assessment: applications to case studies in Calabria (Italy

    Directory of Open Access Journals (Sweden)

    O. Petrucci

    2009-03-01

    Full Text Available The analysis of data describing damage caused by mass movements in Calabria (Italy) allowed the organisation of the Support Analysis Framework (SAF), a spreadsheet that converts damage descriptions into numerical indices expressing direct, indirect, and intangible damage.

    The SAF assesses damage indices of past mass movements and the potential outcomes of dormant phenomena re-activations. It is based on the effects on damaged elements and is independent of both physical and geometric phenomenon characteristics.

    SAF sections that assess direct damage encompass several lines, each describing an element characterised by a value fixed on a relative arbitrary scale. The levels of loss are classified as: L4: complete; L3: high; L2: medium; or L1: low. For a generic line l, the SAF multiplies the value of a damaged element by its level of loss, obtaining dl, the contribution of the line to the damage.

    Indirect damage is appraised by two sections accounting for: (a) actions aiming to overcome emergency situations and (b) actions aiming to restore pre-movement conditions. The level of loss depends on the number of people involved (a) or the cost of actions (b).

    For intangible damage, the level of loss depends on the number of people involved.

    We examined three phenomena, assessing damage using the SAF and SAFL, customised versions of SAF based on the elements actually present in the analysed municipalities that consider the values of elements in the community framework. We show that in less populated, inland, and affluent municipalities, the impact of mass movements is greater than in coastal areas.

    The SAF can be useful to sort groups of phenomena according to their probable future damage, supplying results significant either for insurance companies or for local authorities involved in both disaster management and planning of defensive measures.
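
    The direct-damage arithmetic described above (d_l as the value of an element times its level of loss, summed over lines) reduces to a few lines of code; the element values and the numeric mapping of L1-L4 are illustrative assumptions:

```python
# Direct-damage section of a SAF-like sheet: each line holds an element's
# relative value and a loss level; d_l = value_l * loss_l.
LOSS = {"L1": 0.25, "L2": 0.5, "L3": 0.75, "L4": 1.0}   # assumed mapping

lines = [("road section", 6, "L3"),
         ("private house", 8, "L4"),
         ("aqueduct", 4, "L2")]

direct_damage = sum(value * LOSS[level] for _, value, level in lines)
print(f"direct damage index: {direct_damage}")
```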

  9. Assessing a new gene expression analysis technique for radiation biodosimetry applications

    Energy Technology Data Exchange (ETDEWEB)

    Manning, Grainne; Kabacik, Sylwia; Finnon, Paul; Paillier, Francois; Bouffler, Simon [Cancer Genetics and Cytogenetics, Biological Effects Department, Centre for Radiation, Chemical and Environmental Hazards, Health Protection Agency, Chilton, Didcot, Oxfordshire OX11 ORQ (United Kingdom); Badie, Christophe, E-mail: christophe.badie@hpa.org.uk [Cancer Genetics and Cytogenetics, Biological Effects Department, Centre for Radiation, Chemical and Environmental Hazards, Health Protection Agency, Chilton, Didcot, Oxfordshire OX11 ORQ (United Kingdom)

    2011-09-15

    The response to any radiation accident or incident involving actual or potential ionising radiation exposure requires accurate and rapid assessment of the doses received by individuals. The techniques available today for biodosimetry purposes are not fully adapted to rapid, high-throughput measurements of exposures in large numbers of individuals. A recently emerging technique is based on gene expression analysis, as a number of genes are radiation-responsive in a dose-dependent manner. The present work aimed to assess a new technique which allows the detection of the expression level of up to 800 genes without the need for enzymatic reactions. To this end, human peripheral blood was exposed ex vivo to a range of X-ray doses from 5 mGy to 4 Gy, and the transcriptional expression of five radiation-responsive genes (PHPT1, PUMA, CCNG1, DDB2 and MDM2) was studied by both the nCounter Digital Analyzer and Multiplex Quantitative Real-Time Polymerase Chain Reaction (MQRT-PCR) as the benchmark technology. Results from both techniques showed good correlation for all genes, with R² values ranging between 0.8160 and 0.9754. The reproducibility of the nCounter Digital Analyzer was also assessed in independent biological replicates and proved to be good. Although the slopes of the correlation between the techniques suggest that MQRT-PCR is more sensitive than the nCounter Digital Analyzer, the nCounter Digital Analyzer provides sensitive and reliable data on modifications in gene expression in human blood exposed to radiation without enzymatic amplification of RNA prior to analysis.

  10. Confusion assessment method: a systematic review and meta-analysis of diagnostic accuracy

    Directory of Open Access Journals (Sweden)

    Shi Q

    2013-09-01

    Full Text Available Qiyun Shi,1,2 Laura Warren,3 Gustavo Saposnik,2 Joy C MacDermid1 1Health and Rehabilitation Sciences, Western University, London, Ontario, Canada; 2Stroke Outcomes Research Center, Department of Medicine, St Michael's Hospital, University of Toronto, Toronto, Ontario, Canada; 3Dalla Lana School of Public Health, University of Toronto, Toronto, Ontario, Canada Background: Delirium is common in the early stages of hospitalization for a variety of acute and chronic diseases. Objectives: To evaluate the diagnostic accuracy of two delirium screening tools, the Confusion Assessment Method (CAM) and the Confusion Assessment Method for the Intensive Care Unit (CAM-ICU). Methods: We searched MEDLINE, EMBASE, and PsychInfo for relevant articles published in English up to March 2013. We compared the two screening tools against Diagnostic and Statistical Manual of Mental Disorders IV criteria. Two reviewers independently assessed studies to determine their eligibility, validity, and quality. Sensitivity and specificity were calculated using a bivariate model. Results: Twenty-two studies (n = 2,442 patients) met the inclusion criteria. All studies demonstrated that these two scales can be administered within ten minutes by trained clinical or research staff. The pooled sensitivity and specificity were 82% (95% confidence interval [CI]: 69%–91%) and 99% (95% CI: 87%–100%) for CAM, and 81% (95% CI: 57%–93%) and 98% (95% CI: 86%–100%) for CAM-ICU, respectively. Conclusion: Both CAM and CAM-ICU are validated instruments for the diagnosis of delirium in a variety of medical settings. However, both present higher specificity than sensitivity; therefore, the use of these tools should not replace clinical judgment. Keywords: confusion assessment method, diagnostic accuracy, delirium, systematic review, meta-analysis

  11. A limited assessment of the ASEP human reliability analysis procedure using simulator examination results

    Energy Technology Data Exchange (ETDEWEB)

    Gore, B.R.; Dukelow, J.S. Jr.; Mitts, T.M.; Nicholson, W.L. [Pacific Northwest Lab., Richland, WA (United States)

    1995-10-01

    This report presents a limited assessment of the conservatism of the Accident Sequence Evaluation Program (ASEP) human reliability analysis (HRA) procedure described in NUREG/CR-4772. In particular, the ASEP post-accident, post-diagnosis, nominal HRA procedure is assessed within the context of an individual's performance of critical tasks on the simulator portion of requalification examinations administered to nuclear power plant operators. An assessment of the degree to which operator performance during simulator examinations accurately reflects operator performance during actual accident conditions was outside the scope of this project; therefore, no direct inference can be made from this report about such performance. The data for this study are derived from simulator examination reports from the NRC requalification examination cycle. A total of 4071 critical tasks were identified, of which 45 had been failed. The ASEP procedure was used to estimate human error probability (HEP) values for critical tasks, and the HEP results were compared with the failure rates observed in the examinations. The ASEP procedure was applied by PNL operator license examiners who supplemented the limited information in the examination reports with expert judgment based upon their extensive simulator examination experience. ASEP analyses were performed for a sample of 162 critical tasks selected randomly from the 4071, and the results were used to characterize the entire population. ASEP analyses were also performed for all 45 failed critical tasks. Two tests were performed to assess the bias of the ASEP HEPs compared with the data from the requalification examinations. The first compared the average of the ASEP HEP values with the fraction of the population actually failed, and it found a statistically significant factor-of-two bias on average.

  12. Integrated genomic and BMI analysis for type 2 diabetes risk assessment

    Science.gov (United States)

    Lebrón-Aldea, Dayanara; Dhurandhar, Emily J.; Pérez-Rodríguez, Paulino; Klimentidis, Yann C.; Tiwari, Hemant K.; Vazquez, Ana I.

    2015-01-01

    Type 2 Diabetes (T2D) is a chronic disease arising from the development of insulin absence or resistance within the body, and a complex interplay of environmental and genetic factors. The incidence of T2D has increased throughout the last few decades, together with the occurrence of the obesity epidemic. The consideration of variants identified by Genome Wide Association Studies (GWAS) in risk assessment models for T2D could aid in the identification of at-risk patients who could benefit from preventive medicine. In this study, we built several risk assessment models, evaluated with two different classification approaches (Logistic Regression and Neural Networks), to measure the effect of including genetic information in the prediction of T2D. We used data from the Original and the Offspring cohorts of the Framingham Heart Study, which provides phenotypic and genetic information for 5245 subjects (4306 controls and 939 cases). Models were built using several covariates: gender, exposure time, cohort, body mass index (BMI), and 65 SNPs associated with T2D. We fitted Logistic Regressions and Bayesian Regularized Neural Networks and then assessed their predictive ability using ten-fold cross-validation. We found that the inclusion of genetic information in the risk assessment models increased the predictive ability by 2%, compared to the baseline model. Furthermore, the models that included BMI at the onset of diabetes as a possible effector gave an improvement of 6% in the area under the curve derived from the ROC analysis. The highest AUC achieved (0.75) belonged to the model that included BMI and a genetic score based on the 65 established T2D-associated SNPs. Finally, the inclusion of SNPs and BMI raised predictive ability in all models as expected; however, AUC results for Neural Networks and Logistic Regression did not differ significantly in their prediction accuracy. PMID:25852736
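
    The modeling comparison (baseline covariates versus covariates plus a genetic score, scored by cross-validated AUC) can be sketched as follows; the simulated cohort below only mimics the structure of the Framingham data, and all coefficients are invented:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 2000
bmi = rng.normal(27, 4, n)
snps = rng.binomial(2, 0.3, (n, 65))             # 65 risk-allele counts
gen_score = snps @ rng.normal(0.05, 0.02, 65)    # simple additive score
logit = -11 + 0.22 * bmi + 1.5 * gen_score       # invented true model
y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

for name, X in [("BMI only", bmi[:, None]),
                ("BMI + genetic score", np.c_[bmi, gen_score])]:
    auc = cross_val_score(LogisticRegression(max_iter=1000), X, y,
                          cv=10, scoring="roc_auc").mean()
    print(f"{name}: AUC = {auc:.2f}")
```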

  13. Integrated genomic and BMI analysis for type 2 diabetes risk assessment.

    Directory of Open Access Journals (Sweden)

    Dayanara Lebrón-Aldea

    2015-03-01

    Full Text Available Type 2 Diabetes (T2D) is a chronic disease arising from the development of insulin absence or resistance within the body, and a complex interplay of environmental and genetic factors. The incidence of T2D has increased throughout the last few decades, together with the occurrence of the obesity epidemic. The consideration of variants identified by Genome Wide Association Studies (GWAS) in risk assessment models for T2D could aid in the identification of at-risk patients who could benefit from preventive medicine. In this study, we built several risk assessment models and evaluated them with two different classification approaches (Logistic Regression and Neural Networks) to measure the effect of including genetic information in the prediction of T2D. We used data from the Original and the Offspring cohorts of the Framingham Heart Study, which provides phenotypic and genetic information for 5,245 subjects (4,306 controls and 939 cases). Models were built using several covariates: gender, exposure time, cohort, body mass index (BMI), and 65 established T2D-associated SNPs. We fitted Logistic Regressions and Bayesian Regularized Neural Networks and then assessed their predictive ability using ten-fold cross-validation. We found that the inclusion of genetic information in the risk assessment models increased the predictive ability by 2%, compared to the baseline model. Furthermore, the models that included BMI at the onset of diabetes as a possible effector gave an improvement of 6% in the area under the curve derived from the ROC analysis. The highest AUC achieved (0.75) belonged to the model that included BMI and a genetic score based on the 65 established T2D-associated SNPs. Finally, the inclusion of SNPs and BMI raised predictive ability in all models as expected; however, AUC results for Neural Networks and Logistic Regression did not differ significantly in their prediction accuracy.

  14. Application of texture analysis to DAT SPECT imaging: Relationship to clinical assessments

    Directory of Open Access Journals (Sweden)

    Arman Rahmim

    2016-01-01

    Full Text Available Dopamine transporter (DAT) SPECT imaging is increasingly utilized for diagnostic purposes in suspected Parkinsonian syndromes. We performed a cross-sectional study to investigate whether assessment of texture in DAT SPECT radiotracer uptake enables enhanced correlations with the severity of motor and cognitive symptoms in Parkinson's disease (PD), with the long-term goal of extending the clinical utility of DAT SPECT imaging beyond standard diagnostic tasks to the tracking of progression in PD. Quantitative analysis in routine DAT SPECT imaging, if performed at all, has been restricted to assessment of mean regional uptake. We applied a framework wherein textural features were extracted from the images. Notably, the framework did not require registration to a common template and worked in the subject-native space. Image analysis included registration of SPECT images onto corresponding MRI images and automatic region-of-interest (ROI) extraction on the MRI images, followed by computation of Haralick texture features. We analyzed 141 subjects from the Parkinson's Progressive Marker Initiative (PPMI) database, including 85 PD patients and 56 healthy controls (HC) (baseline scans with accompanying 3 T MRI images). We performed univariate and multivariate regression analyses between the quantitative metrics and different clinical measures, namely (i) the UPDRS (part III) motor score, disease duration as measured from (ii) time of diagnosis (DD-diag.) and (iii) time of appearance of symptoms (DD-sympt.), as well as (iv) the Montreal Cognitive Assessment (MoCA) score. For conventional mean uptake analysis in the putamen, we showed significant correlations with clinical measures only when both HC and PD were included (Pearson correlation r = −0.74, p-value < 0.001). However, this was not significant when applied to PD subjects only (r = −0.19, p-value = 0.084), and no such correlations were observed in the caudate. By contrast, for the PD subjects, significant correlations
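
    Haralick features are derived from a grey-level co-occurrence matrix of the quantised ROI; a 2-D sketch using scikit-image (assuming version ≥ 0.19 for graycomatrix; the paper works on registered 3-D volumes, which this simplification ignores):

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops

def texture_features(roi: np.ndarray, levels: int = 32) -> dict:
    """Haralick-style features from a grey-level co-occurrence matrix
    of an ROI quantised to `levels` grey levels."""
    q = np.digitize(roi, np.linspace(roi.min(), roi.max(), levels - 1))
    glcm = graycomatrix(q.astype(np.uint8), distances=[1],
                        angles=[0, np.pi / 2], levels=levels,
                        symmetric=True, normed=True)
    return {p: float(graycoprops(glcm, p).mean())
            for p in ("contrast", "homogeneity", "energy", "correlation")}

# Random 2-D patch standing in for a putamen uptake ROI.
roi = np.random.default_rng(0).integers(0, 255, (32, 32)).astype(float)
print(texture_features(roi))
```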

  15. Authenticity assessment of beef origin by principal component analysis of matrix-assisted laser desorption/ionization mass spectrometric data.

    Science.gov (United States)

    Zaima, Nobuhiro; Goto-Inoue, Naoko; Hayasaka, Takahiro; Enomoto, Hirofumi; Setou, Mitsutoshi

    2011-06-01

    It has become necessary to assess the authenticity of beef origin because of concerns regarding human health hazards. In this study, we used a metabolomic approach involving matrix-assisted laser desorption/ionization imaging mass spectrometry to assess the authenticity of beef origin. Highly accurate data were obtained for samples of extracted lipids from beef of different origin; the samples were grouped according to their origin. The analysis of extracted lipids in this study ended within 10 min, suggesting this approach can be used as a simple authenticity assessment before a definitive identification by isotope analysis.

  16. Short-Term Assessment of Risk and Treatability (START): systematic review and meta-analysis.

    Science.gov (United States)

    O'Shea, Laura E; Dickens, Geoffrey L

    2014-09-01

    This article describes a systematic review of the psychometric properties of the Short-Term Assessment of Risk and Treatability (START) and a meta-analysis assessing its predictive efficacy for the 7 risk domains identified in the manual (violence to others, self-harm, suicide, substance abuse, victimization, unauthorized leave, and self-neglect) among institutionalized patients with mental disorder and/or personality disorder. Comprehensive terms were used to search 5 electronic databases up to January 2013; additional articles were located by examining reference lists and hand-searching. Twenty-three papers were selected for inclusion in the narrative review of START's properties, whereas 9 studies involving 543 participants were included in the meta-analysis. Studies of the feasibility and utility of the tool had positive results but lacked comparators. START ratings demonstrated high internal consistency, interrater reliability, and convergent validity with other risk measures. There was a lack of information about the variability of START ratings over time. Its use in an intervention to reduce violence in forensic psychiatric outpatients was not better than standard care. START risk estimates demonstrated strong predictive validity for various aggressive outcomes and good predictive validity for self-harm. Predictive validity for self-neglect and victimization was no better than chance, whereas evidence for the remaining outcomes is derived from a single, small study. Only 3 of the studies included in the meta-analysis were rated as being at low risk of bias. Future research should aim to investigate the predictive validity of the START for the full range of adverse outcomes, using well-designed methodologies and validated outcome tools.

  17. LAVA (Los Alamos Vulnerability and Risk Assessment Methodology): A conceptual framework for automated risk analysis

    Energy Technology Data Exchange (ETDEWEB)

    Smith, S.T.; Lim, J.J.; Phillips, J.R.; Tisinger, R.M.; Brown, D.C.; FitzGerald, P.D.

    1986-01-01

    At Los Alamos National Laboratory, we have developed an original methodology for performing risk analyses on subject systems characterized by a general set of asset categories, a general spectrum of threats, a definable system-specific set of safeguards protecting the assets from the threats, and a general set of outcomes resulting from threats exploiting weaknesses in the safeguards system. The Los Alamos Vulnerability and Risk Assessment Methodology (LAVA) models complex systems having large amounts of "soft" information about both the system itself and occurrences related to the system. Its structure lends itself well to automation on a portable computer, making it possible to analyze numerous similar but geographically separated installations consistently and in as much depth as the subject system warrants. LAVA is based on hierarchical systems theory, event trees, fuzzy sets, natural-language processing, decision theory, and utility theory. LAVA's framework is a hierarchical set of fuzzy event trees that relate the results of several embedded (or sub-) analyses: a vulnerability assessment providing information about the presence and efficacy of system safeguards, a threat analysis providing information about static (background) and dynamic (changing) threat components coupled with an analysis of asset "attractiveness" to the dynamic threat, and a consequence analysis providing information about the outcome spectrum's severity measures and impact values. By using LAVA, we have modeled our widely used computer security application as well as LAVA/CS systems for physical protection, transborder data flow, contract awards, and property management. It is presently being applied for modeling risk management in embedded systems, survivability systems, and weapons systems security. LAVA is especially effective in modeling subject systems that include a large human component.

  18. Environmental assessment of policy through budgetary analysis: Conceptual framework and methods. Manuscript report No. MR5-93

    Energy Technology Data Exchange (ETDEWEB)

    Jerrett, M.L.B.

    1993-01-01

    The need to apply environmental assessment to government policies that encourage environmentally damaging human behaviour is now generally accepted by leading environmental experts, although few methods exist for conducting them. Part of the problem lies in the complexity of the policy-making process, which confuses the identification of policies. This problem of policy identification can be partially overcome by focusing on the budget. This report begins with a rationale for why government budgets are the most appropriate unit of analysis for performing environmental assessment on government policy. A literature review is conducted on the methods for assessing the environmental effects of policy, followed by an analysis of the interactions occurring among budgetary policies, human economic behaviour, natural life support systems, and political systems. Methods of assessing the environmental implications of government policy through government analysis are proposed and there is a discussion of promising directions for future research.

  19. Elimination Method of Multi-Criteria Decision Analysis (MCDA: A Simple Methodological Approach for Assessing Agricultural Sustainability

    Directory of Open Access Journals (Sweden)

    Byomkesh Talukder

    2017-02-01

    Full Text Available In the present world context, there is a need to assess the sustainability of agricultural systems, and various methods have been proposed for doing so. As in many other fields, Multi-Criteria Decision Analysis (MCDA) has recently been used as a methodological approach for the assessment of agricultural sustainability. In this paper, an attempt is made to apply Elimination, an MCDA method, to an agricultural sustainability assessment and to investigate its benefits and drawbacks. The article starts by explaining the importance of agricultural sustainability. Common MCDA types are discussed, with a description of the state-of-the-art method for incorporating multiple criteria and reference values into agricultural sustainability assessment. A generic description of the Elimination Method is then provided, and its modeling approach is applied to a case study in coastal Bangladesh. An assessment of the results is provided, and the issues that need consideration before applying Elimination to agricultural sustainability are examined. While it has some limitations, the case study shows that the method is applicable to agricultural sustainability assessments and to ranking the sustainability of agricultural systems. The assessment is quick compared to other assessment methods and is shown to be helpful for agricultural sustainability assessment. It is a relatively simple and straightforward analytical tool that could be widely and easily applied. However, appropriate care must be taken to ensure the successful use of the Elimination Method during the assessment process.
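
    A generic elimination-style screening, considering criteria in order of importance and dropping alternatives that fail each cut-off, can be sketched as follows; criteria, cut-offs, and scores are invented, and this is not necessarily the exact variant the paper applies:

```python
# Elimination-style screening: criteria are applied in order of importance
# and alternatives failing a criterion's cut-off are dropped.
criteria = [("water use", "min", 0.6),      # (name, direction, cut-off 0..1)
            ("yield stability", "max", 0.5),
            ("input cost", "min", 0.5)]

farms = {"farm A": [0.4, 0.7, 0.3], "farm B": [0.7, 0.6, 0.4],
         "farm C": [0.5, 0.8, 0.6]}

remaining = dict(farms)
for i, (name, direction, cut) in enumerate(criteria):
    passes = (lambda v: v <= cut) if direction == "min" else (lambda v: v >= cut)
    remaining = {f: s for f, s in remaining.items() if passes(s[i])}
    print(f"after '{name}': {sorted(remaining)}")
```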

  20. Content Analysis in Computer-Mediated Communication: Analyzing Models for Assessing Critical Thinking through the Lens of Social Constructivism

    Science.gov (United States)

    Buraphadeja, Vasa; Dawson, Kara

    2008-01-01

    This article reviews content analysis studies aimed to assess critical thinking in computer-mediated communication. It also discusses theories and content analysis models that encourage critical thinking skills in asynchronous learning environments and reviews theories and factors that may foster critical thinking skills and new knowledge…

  1. Assessment as Text Production: Drawing on Systemic Functional Linguistics to Frame the Design and Analysis of Assessment Tasks

    Science.gov (United States)

    Hughes, Clair

    2009-01-01

    The plentiful and steadily increasing literature on teaching and learning in higher education has produced a number of helpful frameworks and guidelines that can be applied to the development and communication of assessment practice. The continued prevalence of much imprecise, unclear and otherwise confusing terminology around the discussion of…

  2. Situation analysis: assessing family planning and reproductive health services. Quality of care.

    Science.gov (United States)

    1997-01-01

    This issue of Population Briefs contains articles on research conducted by the Population Council concerning the delivery of quality of care, contraceptive development, safe abortion, family planning, demography, and medical anthropology. The cover story focuses on a systematic data collection tool called Situation Analysis that helps managers in program evaluation. This tool has a handbook entitled "The Situation Analysis Approach to Assessing Family Planning and Reproductive Health Services" that contains all the information needed to conduct a Situation Analysis study. The second article reports on a new contraceptive method, the two-rod levonorgestrel implant, which was developed at the Population Council and was recently approved by the US Food and Drug Administration. The third article reports on a medical abortion procedure that was proven to be safe, effective, and acceptable to women in developing countries. The fourth article presents initial findings of the Community Health and Family Planning Project conducted in Northern Ghana. The fifth article discusses the paper written by Population Council demographer Mark Montgomery entitled "Learning and lags in mortality perceptions". Finally, the sixth article deals with another paper that reports on women's health perceptions and reproductive health in the Middle East.

  3. Economic analysis and assessment of syngas production using a modeling approach

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Hakkwan; Parajuli, Prem B.; Yu, Fei; Columbus, Eugene P.

    2011-08-10

    Economic analysis and modeling are essential for the development of current feedstock and process technology for bio-gasification. The objective of this study was to develop an economic model and apply it to predict the unit cost of syngas production from a micro-scale bio-gasification facility. The economic model was programmed in the C++ programming language and developed using a parametric cost approach, which included processes to calculate the total capital costs and the total operating costs. The model used measured economic data from the bio-gasification facility at Mississippi State University. The modeling results showed that the unit cost of syngas production was $1.217 for a 60 Nm³ h⁻¹ capacity bio-gasifier. The operating cost was the major part of the total production cost. The equipment purchase cost and the labor cost were the largest parts of the total capital cost and the total operating cost, respectively. Sensitivity analysis indicated that labor cost ranks highest, followed by equipment cost, loan life, feedstock cost, interest rate, utility cost, and waste treatment cost. The unit cost of syngas production increased with increases in all parameters, with the exception of loan life. The annual costs of equipment, labor, feedstock, waste treatment, and utilities showed a linear relationship with percent changes, while loan life and annual interest rate showed a non-linear relationship. This study provides useful information for the economic analysis and assessment of syngas production using a modeling approach.
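
    To make the parametric cost structure concrete, the following is a minimal Python sketch, assuming hypothetical inputs rather than the study's calibrated C++ model: unit cost is computed as annualized capital plus annual operating cost, divided by annual syngas output.

        # Minimal parametric unit-cost sketch (all input values hypothetical).
        # unit cost = (annualized capital + operating cost) / annual output

        def annualized_capital(capital_cost, interest_rate, loan_life_years):
            """Capital recovery factor applied to the total capital cost."""
            i, n = interest_rate, loan_life_years
            crf = i * (1 + i) ** n / ((1 + i) ** n - 1)
            return capital_cost * crf

        capital = 250_000.0        # total capital cost, $ (hypothetical)
        operating = 90_000.0       # annual operating cost, $/yr (hypothetical)
        capacity = 60.0            # gasifier capacity, Nm3/h
        hours_per_year = 6000.0    # assumed annual operating hours

        annual_output = capacity * hours_per_year            # Nm3/yr
        total_annual = annualized_capital(capital, 0.06, 15) + operating
        print(f"unit cost: {total_annual / annual_output:.3f} $/Nm3")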

  4. BATCH-GE: Batch analysis of Next-Generation Sequencing data for genome editing assessment.

    Science.gov (United States)

    Boel, Annekatrien; Steyaert, Woutert; De Rocker, Nina; Menten, Björn; Callewaert, Bert; De Paepe, Anne; Coucke, Paul; Willaert, Andy

    2016-07-27

    Targeted mutagenesis by the CRISPR/Cas9 system is currently revolutionizing genetics. The ease of this technique has enabled genome engineering in vitro and in a range of model organisms and has pushed experimental dimensions to unprecedented proportions. Due to its tremendous progress in terms of speed, read length, throughput and cost, Next-Generation Sequencing (NGS) has been increasingly used for the analysis of CRISPR/Cas9 genome editing experiments. However, the current tools for genome editing assessment lack flexibility and fall short in the analysis of large amounts of NGS data. Therefore, we designed BATCH-GE, an easy-to-use bioinformatics tool for batch analysis of NGS-generated genome editing data, available from https://github.com/WouterSteyaert/BATCH-GE.git. BATCH-GE detects and reports indel mutations and other precise genome editing events and calculates the corresponding mutagenesis efficiencies for a large number of samples in parallel. Furthermore, this new tool provides flexibility by allowing the user to adapt a number of input variables. The performance of BATCH-GE was evaluated in two genome editing experiments, aiming to generate knock-out and knock-in zebrafish mutants. This tool will not only contribute to the evaluation of CRISPR/Cas9-based experiments, but will be of use in any genome editing experiment and has the ability to analyze data from every organism with a sequenced genome.

  5. Analysis of criteria weights for the assessment of corporate sustainability: a case study in sugar manufacturing

    Directory of Open Access Journals (Sweden)

    Panitas Sureeyatanapas

    2016-08-01

    The assessment of sustainability performance has become a topic widely discussed by business practitioners. The complexity of this issue is highlighted by the incorporation of a large number of criteria. Several methods under the umbrella of Multiple Criteria Decision Analysis (MCDA) have been employed to facilitate the aggregation of various criteria and to provide a guideline for decision making. As most MCDA methods assume that each criterion plays a role equal to its weight, this paper investigates the weight of each criterion in the evaluation of corporate sustainability, focusing on the sugar industry in order to address the lack of MCDA and sustainability studies in this sector. The weighting is analysed in terms of relative importance based upon interviews and the direct rating technique, and a statistical analysis is also conducted. The results of this empirical research indicate priorities among sustainability criteria and demonstrate the diversity of concerns within the industry when deciding on sustainability policies and strategies. This encourages practitioners to incorporate uncertain weights of sustainability criteria into decision making. Possible reasons for variations or changes in weights are also discussed, enabling practitioners to perform sensitivity analysis in a more realistic way.

  6. Statistical Analysis of Meteorological Data to Assess Evapotranspiration and Infiltration at the Rifle Site, CO, USA

    Science.gov (United States)

    Faybishenko, B.; Long, P. E.; Tokunaga, T. K.; Christensen, J. N.

    2015-12-01

    Net infiltration to the vadose zone, especially in arid or semi-arid climates, is an important control on microbial activity and solute and greenhouse gas fluxes. To assess net infiltration, we performed a statistical analysis of meteorological data as the basis for hydrological and climatic investigations and predictions for the Rifle site, Colorado, USA, located within a floodplain in a mountainous region along the Colorado River, with a semi-arid climate. We carried out a statistical analysis of a 30-year meteorological time series (1985-2015), including: (1) precipitation data, taking snowmelt into account, (2) evaluation of evapotranspiration (reference and actual), (3) estimation of the multi-time-scale Standardized Precipitation-Evapotranspiration Index (SPEI), (4) evaluation of the net infiltration rate, and (5) corroborative analysis of the calculated net infiltration rate against groundwater recharge inferred from radioisotopic measurements of samples collected in 2013. We determined that annual net infiltration varies from 4.7% to ~18% of precipitation, with a mean of ~10%, and concluded that calculations of net infiltration based on long-term meteorological data are comparable with those from strontium isotopic investigations. The evaluation of the SPEI showed an intermittent pattern of droughts and wet periods over the past 30 years, with a detectable decrease in the duration of droughts with time. Local measurements within the floodplain indicate a recharge gradient, with increased recharge closer to the Colorado River.
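
    A toy sketch of the water-balance arithmetic behind a net-infiltration percentage, with hypothetical annual values rather than the site's data:

        # Toy annual water balance (hypothetical values): net infiltration
        # estimated as precipitation minus actual evapotranspiration and
        # runoff, expressed as a percentage of precipitation, as in the
        # ~4.7-18% range reported above.

        precip_mm = 280.0        # annual precipitation incl. snowmelt
        actual_et_mm = 240.0     # actual evapotranspiration estimate
        runoff_mm = 12.0         # surface runoff estimate

        net_infiltration_mm = max(precip_mm - actual_et_mm - runoff_mm, 0.0)
        print(f"net infiltration: {net_infiltration_mm:.0f} mm "
              f"({100 * net_infiltration_mm / precip_mm:.1f}% of precipitation)")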

  7. Development of PIRT and Assessment Matrix for Verification and Validation of Sodium Fire Analysis Codes

    Science.gov (United States)

    Ohno, Shuji; Ohshima, Hiroyuki; Tajima, Yuji; Ohki, Hiroshi

    The thermodynamic consequences of liquid sodium leak and fire accidents are among the important issues to be evaluated when considering the safety of a fast reactor plant building. The authors are therefore initiating a systematic verification and validation (V&V) activity to assure and demonstrate the reliability of numerical simulation tools for sodium fire analysis. The V&V activity is in progress, with the main focus on the already developed sodium fire analysis codes SPHINCS and AQUA-SF. The events to be evaluated are hypothetical sodium spray, pool, or combined fire accidents and the subsequent thermodynamic behaviors postulated in a plant building. The present paper describes how a ‘Phenomena Identification and Ranking Table (PIRT)’ is first developed to clarify the important validation points in the sodium fire analysis codes, and how an ‘Assessment Matrix’ is proposed that summarizes both separate effect tests and integral effect tests for validating the computational models or the whole code for important phenomena. Furthermore, the paper presents a practical validation using a separate effect test, in which the spray droplet combustion model of SPHINCS and AQUA-SF predicts the burned amount of a falling sodium droplet with errors mostly below 30%.

  8. Expert assessments and content analysis of crew communication during ISS missions

    Science.gov (United States)

    Yusupova, Anna

    During the last seven years, we have analyzed the communication patterns between ISS crewmembers and mission control personnel and identified a number of different communication styles between these two groups (Gushin et al., 2005). In this paper, we report on an external validity check that compares our findings with those of another study using the same research material. For many years, the group of psychologists at the Medical Center of Space Flight Control (TCUMOKO) at the Institute for Biomedical Problems (IBMP) in Moscow has been analyzing audio communication sessions between Russian space crews and ground-based Mission Control during long-duration spaceflight. We compared, week by week, the texts of the standard weekly monitoring reports made by the TsUP psychological group and the audio communication of space crews with mission control centers. Expert assessments of the crewmembers' psychological state are made by IBMP psychoneurologists on the basis of daily schedule fulfillment, video and audio materials, and psychophysiological data from on board. The second approach was based on crew-ground communication analysis. For both populations of messages we applied two corresponding schemas of content analysis. All statements made in communication sessions and weekly reports were divided into three groups in terms of their communication function (Lomov, 1981): 1) informative function (e.g., demands for information, requests, professional slang); 2) socio-regulatory function (e.g., rational consent or discord, operational complaint, refusal to cooperate); and 3) affective (emotional) function (e.g., encouragement, sympathy, emotional consent or discord). The number of statements in the audio communication sessions correlated with the corresponding functions (informative, regulatory, affective) of communication in the weekly monitoring reports made by experts. The crewmembers' verbal behavior expresses their psycho-emotional state, which is formulated by expert…

  9. Extraneous carbon assessment in ultra-microscale radiocarbon analysis using benzene polycarboxylic acids (BPCA)

    Science.gov (United States)

    Hanke, Ulrich M.; McIntyre, Cameron P.; Schmidt, Michael W. I.; Wacker, Lukas; Eglinton, Timothy I.

    2016-04-01

    Measurements of the natural abundance of radiocarbon (14C) in inorganic and organic carbon-containing materials can be used to investigate their date of origin. In particular, the biogeochemical cycling of specific compounds in the environment may be investigated by applying molecular marker analyses. However, the isolation of specific molecules from environmental matrices requires a complex processing procedure, resulting in small sample sizes that often contain less than 30 μg C. Such small samples are sensitive to extraneous carbon (Cex) that is introduced during the purification of the compounds (Shah and Pearson, 2007). We present a thorough radiocarbon blank assessment for benzene polycarboxylic acids (BPCA), a proxy for combustion products that are formed during the oxidative degradation of condensed polyaromatic structures (Wiedemeier et al., in press). The extraneous carbon assessment includes reference material for (1) chemical extraction, (2) preparative liquid chromatography, and (3) wet chemical oxidation, which are subsequently measured with gas-ion-source AMS (Accelerator Mass Spectrometry, 5-100 μg C). We always use pairs of reference materials, radiocarbon-depleted (14Cfossil) and modern (14Cmodern), to determine the fraction modern (F14C) of Cex. Our results include detailed information about the quantification of Cex in radiocarbon molecular marker analysis using BPCA. Error propagation calculations indicate that ultra-microscale samples (20-30 μg) are feasible with uncertainties of less than 10%. Calculations of the constant contamination reveal important information about the source (F14C) and mass (μg) of Cex (Wacker and Christl, 2011) for each sub-procedure. An external correction of compound-specific radiocarbon data is essential for robust results that allow a high degree of confidence in the 14C results. References: Shah and Pearson, 2007. Ultra-microscale (5-25 μg C) analysis of individual lipids by 14C AMS: Assessment and…
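
    A minimal sketch of the constant-contamination (blank) correction in the spirit of the mass-balance approach of Wacker and Christl (2011); the numbers are hypothetical:

        # Constant-contamination correction sketch (hypothetical values).
        # Mass balance: F_meas * m_meas = F_sample * m_sample + F_blank * m_blank,
        # with m_sample = m_meas - m_blank.

        def blank_corrected_f14c(f_meas, m_meas, f_blank, m_blank):
            """Remove a constant extraneous-carbon contribution.

            f_meas, m_meas   : measured F14C and sample mass (ug C)
            f_blank, m_blank : F14C and mass (ug C) of the constant blank
            """
            return (f_meas * m_meas - f_blank * m_blank) / (m_meas - m_blank)

        # e.g. a 25 ug C sample measured at F14C = 0.250 with a
        # 1.5 ug C blank of F14C = 0.60:
        print(blank_corrected_f14c(0.250, 25.0, 0.60, 1.5))   # ~0.228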

  10. Limit Load and Buckling Analysis for Assessing Hanford Single-Shell Tank Dome Structural Integrity - 12278

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, Ken I.; Deibler, John E.; Karri, Naveen K.; Pilli, Siva P. [Pacific Northwest National Laboratory, Richland, Washington 99352 (United States); Julyk, Larry J. [M and D Professional Services, Inc., Richland, Washington 99352 (United States)

    2012-07-01

    The U.S. Department of Energy, Office of River Protection has commissioned a structural analysis of record for the Hanford single-shell tanks to assess their structural integrity. The analysis used finite element techniques to predict the tank response to historical thermal and operating loads, and also addressed the potential tank response to a postulated design basis earthquake. The combined response to static and seismic loads was then evaluated against the design requirements of the American Concrete Institute standard ACI-349-06 for nuclear safety-related concrete structures. Further analysis was conducted to estimate the plastic limit load and the elastic-plastic buckling capacity of the tanks. The limit load and buckling analyses estimate the margin between the applied loads and the limiting load capacities of the tank structure. The potential for additional dome loads from waste retrieval equipment and the addition of large dome penetrations to accommodate retrieval equipment has generated additional interest in these analyses. This paper summarizes the structural analysis methods that were used to evaluate the limit load and buckling limit states of the underground single-shell tanks at the Hanford site. The limit loads were calculated using nonlinear finite element models that capture the progressive deformation and damage to the concrete as it approaches the limit load. Both uniform and concentrated loads over the tank dome were considered, and the analysis shows how adding a penetration in the center of the tank would affect the limit loads. For uniform surface loads, the penetration does not affect the limit load because concrete crushing and rebar yielding initiate first at the top of the wall, away from the penetration. For concentrated loads, crushing initiates at the center of the…

  11. Spiritual Assessment within Clinical Interventions Focused on Quality of Life Assessment in Palliative Care: A Secondary Analysis of a Systematic Review

    Directory of Open Access Journals (Sweden)

    Gianluca Catania

    2016-03-01

    One of the most crucial palliative care challenges is determining how patients' needs are defined and assessed. Although physical and psychological needs are commonly documented in patients' charts, spiritual needs are less frequently reported. The aim of this review was to determine whether explicit, longitudinal documentation of spiritual concerns would sufficiently affect clinical care to alleviate spiritual distress or promote spiritual wellbeing. A secondary analysis of a systematic review originally aimed at appraising the effectiveness of complex interventions focused on quality of life (QoL) in palliative care was conducted. Five databases were searched for articles reporting interventions focused on QoL that included at least two QoL dimensions, and a narrative synthesis was performed. In total, 10 studies were included. Only three studies included an assessment of spiritual wellbeing, and the tools used differed between studies: the Hospital QoL Index 14; the Spiritual Needs Inventory; the Missoula-Vitas QoL Index; and the Needs Assessment Tool: Progressive Disease-Cancer. Only one study reported a training session for healthcare professionals in the use of the QoL tool. Two of the three studies showed an improvement in participants' spiritual wellbeing, but the changes in spiritual wellbeing scores were not significant. Overall, patients receiving interventions focused on QoL assessment experienced improvements both in their QoL and in their spiritual needs. Although the spiritual changes were not significant, the results provide evidence that a spiritual need exists and that spiritual care should be appropriately planned and delivered. Spiritual needs assessment precedes spiritual caring. It is essential that interventions focused on QoL assessment in palliative care include training on how to conduct a spiritual assessment and appropriate interventions to be offered to patients to address their…

  12. Systematic analysis of natural hazards along infrastructure networks using a GIS-tool for risk assessment

    Science.gov (United States)

    Baruffini, Mirko

    2010-05-01

    Due to the topographical conditions in Switzerland, highways and railway lines are frequently exposed to natural hazards such as rockfalls, debris flows, landslides, avalanches and others. With the rising incidence of these natural hazards, protection measures have become an important political issue. However, they are costly, and maximal protection is most probably not economically feasible; furthermore, risks are distributed in space and time. Important decision problems therefore arise for public-sector decision makers, calling for a high level of surveillance and preservation along the transalpine lines. Efficient protection alternatives can consequently be obtained by applying the concept of integral risk management. Risk analysis, the central part of risk management, has gradually become a generally accepted approach for the assessment of current and future scenarios (Loat & Zimmermann 2004). The procedure aims at risk reduction, which can be reached by conventional mitigation on the one hand and the implementation of land-use planning on the other: a combination of active and passive mitigation measures is applied to prevent damage to buildings, people and infrastructures. With a Geographical Information System adapted to run with a tool developed for risk analysis, it is possible to survey the data in time and space, yielding an important system for managing natural risks. As a framework, we adopt the Swiss system for risk analysis of gravitational natural hazards (BUWAL 1999). It offers a complete framework for the analysis and assessment of risks due to natural hazards, ranging from hazard assessment for gravitational natural hazards, such as landslides, collapses, rockfalls, floodings, debris flows and avalanches, to vulnerability assessment and risk analysis, and the integration into land-use planning at the cantonal and municipality level. The scheme is limited to the direct consequences of natural hazards. Thus, we develop a…

  13. Development of a quantitative morphological assessment of toxicant-treated zebrafish larvae using brightfield imaging and high-content analysis.

    Science.gov (United States)

    Deal, Samantha; Wambaugh, John; Judson, Richard; Mosher, Shad; Radio, Nick; Houck, Keith; Padilla, Stephanie

    2016-09-01

    One of the rate-limiting procedures in a developmental zebrafish screen is the morphological assessment of each larva. Most researchers opt for a time-consuming, structured visual assessment by trained human observer(s). The present studies were designed to develop a more objective, accurate and rapid method for screening zebrafish for dysmorphology. Instead of the very detailed human assessment, we have developed the computational malformation index, which combines the use of high-content imaging with a very brief human visual assessment. Each larva was quickly assessed by a human observer (basic visual assessment), killed, fixed and assessed for dysmorphology with the Zebratox V4 BioApplication using the Cellomics® ArrayScan® V(TI) high-content image analysis platform. The basic visual assessment adds in-life parameters, and the high-content analysis assesses each individual larva for various features (total area, width, spine length, head-tail length, length-width ratio, perimeter-area ratio). In developing the computational malformation index, a training set of hundreds of embryos treated with hundreds of chemicals was visually assessed using the basic or detailed method. In the second phase, we assessed both the stability of these high-content measurements and their performance using a test set of zebrafish treated with a dose range of two reference chemicals (trans-retinoic acid or cadmium). We found the measures were stable for at least 1 week, and comparison of these automated measures with detailed visual inspection of the larvae showed excellent congruence. Our computational malformation index provides an objective means of rapid phenotypic brightfield assessment of individual larvae in a developmental zebrafish assay. Copyright © 2016 John Wiley & Sons, Ltd.

  14. Problems and prospects of modern methods of business analysis in the process of assessment of solvency of borrowers

    Directory of Open Access Journals (Sweden)

    Aptekar Saveliy S.

    2013-03-01

    The goal of the article is a comparative analysis of modern methods of business analysis used to assess the solvency of borrowers of Ukrainian commercial banks, and a study of the prospects and problems of using these methods in the credit process. The article systematises and reviews the conduct of the credit process in Ukrainian commercial banks. The study makes clear that a single assessment of a borrower's solvency cannot be obtained by generalising numerical and non-numerical data; a justified assessment of solvency requires the judgment of qualified analysts in addition to information represented in numbers. Improving approaches to the assessment of borrowers' solvency, and adapting existing foreign experience in this field to the specific features of Ukrainian borrowers, are important tasks for the Ukrainian banking system. Prospects for further study in this direction include establishing the importance of business analysis and its key role in assessing the solvency of borrowers as a main instrument for minimising credit risk. Improvement of this sphere of analytical work in Ukrainian banks should be carried out in the following main directions: study and analysis of qualitative indicators of business activity; analysis of the main sections of the business plan; expansion of the set of financial analysis indicators used to obtain information; analysis of possible sources of repayment of loan liabilities; and active use of cash flow analysis of the enterprise.

  15. Effective modelling, analysis and fatigue assessment of welded structures; Effektive Modellbildung, Analyse und Bewertung fuer die rechnerische Lebensdaueranalyse geschweisster Strukturen

    Energy Technology Data Exchange (ETDEWEB)

    Rauch, R.; Schiele, S. [CADFEM GmbH, Stuttgart (Germany); Rother, K.

    2007-07-01

    The analysis of welded structures is a challenge for the analyst. Improvements in software and hardware now enable analyses that encompass full assemblies. For welded structures in particular, these capabilities offer significant benefits, leading to more detailed descriptions of the flux of forces while reducing the effort required of the engineer. This paper covers methods for the modeling, structural analysis and fatigue assessment of welded structures using finite element analysis. A hierarchical concept is presented that localizes highly stressed regions using a global model and then applies a local approach based on notch stress analysis. (orig.)

  16. Problem formulation and option assessment (PFOA) linking governance and environmental risk assessment for technologies: a methodology for problem analysis of nanotechnologies and genetically engineered organisms.

    Science.gov (United States)

    Nelson, Kristen C; Andow, David A; Banker, Michael J

    2009-01-01

    Societal evaluation of new technologies, specifically nanotechnology and genetically engineered organisms (GEOs), challenges current practices of governance and science. Employing environmental risk assessment (ERA) for governance and oversight assumes we have a reasonable ability to understand consequences and predict adverse effects. However, traditional ERA has come under considerable criticism for its many shortcomings and current governance institutions have demonstrated limitations in transparency, public input, and capacity. Problem Formulation and Options Assessment (PFOA) is a methodology founded on three key concepts in risk assessment (science-based consideration, deliberation, and multi-criteria analysis) and three in governance (participation, transparency, and accountability). Developed through a series of international workshops, the PFOA process emphasizes engagement with stakeholders in iterative stages, from identification of the problem(s) through comparison of multiple technology solutions that could be used in the future with their relative benefits, harms, and risk. It provides "upstream public engagement" in a deliberation informed by science that identifies values for improved decision making.

  17. Credit Risk Assessment Model Based Using Principal component Analysis And Artificial Neural Network

    Directory of Open Access Journals (Sweden)

    Hamdy Abeer

    2016-01-01

    Credit risk assessment for bank customers has gained increasing attention in recent years, and several models for credit scoring have been proposed in the literature. The accuracy of such a model is crucial for any financial institution's profitability. This paper provides a high-accuracy credit scoring model that can be used with small and large datasets, based on a principal component analysis (PCA) breakdown of the significance of the attributes commonly used in credit scoring models. The proposed model applies PCA to acquire the main attributes of the credit scoring data and then uses an artificial neural network (ANN) classifier to determine the creditworthiness of an individual applicant. The performance of the proposed model was compared with that of other models in terms of accuracy and training time. Results based on the German credit dataset showed that the proposed model is superior to the others and computationally cheaper; it is thus a potential candidate for future credit scoring systems.
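
    A minimal sketch of the PCA-then-ANN pipeline using scikit-learn on synthetic data; the component count and network size are illustrative assumptions, not the paper's settings:

        # PCA -> neural-network credit-scoring pipeline on synthetic data
        # (the paper used the German credit dataset; parameters here are
        # illustrative only).

        from sklearn.datasets import make_classification
        from sklearn.decomposition import PCA
        from sklearn.model_selection import train_test_split
        from sklearn.neural_network import MLPClassifier
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler

        X, y = make_classification(n_samples=1000, n_features=24, random_state=0)
        X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

        model = make_pipeline(
            StandardScaler(),
            PCA(n_components=10),          # keep the main attributes
            MLPClassifier(hidden_layer_sizes=(16,), max_iter=500, random_state=0),
        )
        model.fit(X_tr, y_tr)
        print("accuracy:", model.score(X_te, y_te))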

  18. Assessment of Smolt Condition for Travel Time Analysis Project, 1987-1997 Project Review.

    Energy Technology Data Exchange (ETDEWEB)

    Schrock, Robin M.; Hans, Karen M.; Beeman, John W. [US Geological Survey, Western Fisheries Research Center, Columbia River Research Laboratory, Cook, WA

    1997-12-01

    The Assessment of Smolt Condition for Travel Time Analysis Project (Bonneville Power Administration Project 87-401) monitored attributes of salmonid smolt physiology in the Columbia and Snake River basins from 1987 to 1997, under the Northwest Power Planning Council Fish and Wildlife Program, in cooperation with the Smolt Monitoring Program of the Fish Passage Center. The primary goal of the project was to investigate the physiological development of juvenile salmonids in relation to migration rates. The assumption was made that the level of smolt development, interacting with environmental factors such as flow, would be reflected in travel times. The Fish Passage Center applied the physiological measurements of smolt condition to Water Budget management, regulating flows so as to decrease travel time and increase survival.

  19. Assessment of pipeline stability in the Gulf of Mexico during hurricanes using dynamic analysis

    Directory of Open Access Journals (Sweden)

    Yinghui Tian

    2015-03-01

    Pipelines are the critical link between major offshore oil and gas developments and the mainland. Any inadequate on-bottom stability design could result in disruption and failure, having a devastating impact on the economy and environment. Predicting the stability behavior of offshore pipelines in hurricanes is therefore vital to the assessment of both new design and existing assets. The Gulf of Mexico has a very dense network of pipeline systems constructed on the seabed. During the last two decades, the Gulf of Mexico has experienced a series of strong hurricanes, which have destroyed, disrupted and destabilized many pipelines. This paper first reviews some of these engineering cases. Following that, three case studies are retrospectively simulated using an in-house developed program. The study utilizes the offshore pipeline and hurricane details to conduct a Dynamic Lateral Stability analysis, with the results providing evidence as to the accuracy of the modeling techniques developed.

  20. DECISION ANALYSIS AND TECHNOLOGY ASSESSMENTS FOR METAL AND MASONRY DECONTAMINATION TECHNOLOGIES

    Energy Technology Data Exchange (ETDEWEB)

    M.A. Ebadian, Ph.D.

    1999-01-01

    The purpose of this investigation was to conduct a comparative analysis of innovative technologies for the non-aggressive removal of coatings from metal and masonry surfaces and the aggressive removal of one-quarter to one-inch thickness of surface from structural masonry. The technologies tested should be capable of being used in nuclear facilities. Innovative decontamination technologies are being evaluated under standard, non-nuclear conditions at the FIU-HCET technology assessment site in Miami, Florida. This study is being performed to support the OST, the Deactivation and Decommissioning (D&D) Focus Area, and the environmental restoration of DOE facilities throughout the DOE complex by providing objective evaluations of currently available decontamination technologies.

  1. Computational psycholinguistic analysis and its application in psychological assessment of college students

    Directory of Open Access Journals (Sweden)

    Kučera Dalibor

    2015-06-01

    The paper deals with computational psycholinguistic analysis (CPA) and its experimental application in basic psychological and pedagogical assessment. CPA is a new method which may potentially provide interesting, psychologically relevant information about the author of a particular text, regardless of the text’s factual (semantic) content and without the need to obtain additional materials. As part of our QPA-FPT research we studied the link between the linguistic form of texts by Czech college students and their personality characteristics obtained from a psychodiagnostic test battery. The article also discusses the basis of the method, opportunities for practical application, and potential use within psychological and pedagogical disciplines.

  2. Life cycle assessment on biogas production from straw and its sensitivity analysis.

    Science.gov (United States)

    Wang, Qiao-Li; Li, Wei; Gao, Xiang; Li, Su-Jing

    2016-02-01

    This study investigates the overall environmental impacts and Global Warming Potentials (GWPs) of a straw-based biogas production process via the cradle-to-gate life cycle assessment (LCA) technique, using Eco-indicator 99 (H) and IPCC 2007 GWPs with three time horizons. The results indicate that the biogas production process is beneficial for the environment overall but harmful in terms of GWPs, and its harmful global-warming effects strengthen with time. Use of gas-fired power, which burns the self-produced natural gas (NG), can create a more sustainable process. Moreover, sensitivity analysis indicated that total electricity consumption and the CO2 absorbent in the purification unit have the largest environmental sensitivities. Hence, more effort should be devoted to more efficient use of electricity and wiser selection of the CO2 absorbent.

  3. Assessing climate model software quality: a defect density analysis of three models

    Directory of Open Access Journals (Sweden)

    J. Pipitone

    2012-08-01

    A climate model is an executable theory of the climate; the model encapsulates climatological theories in software so that they can be simulated and their implications investigated. Thus, in order to trust a climate model, one must trust that the software it is built from is built correctly. Our study explores the nature of software quality in the context of climate modelling. We performed an analysis of defect reports and defect fixes in several versions of leading global climate models by collecting defect data from bug tracking systems and version control repository comments. We found that the climate models all have very low defect densities compared to well-known, similarly sized open-source projects. We discuss the implications of our findings for the assessment of climate model software trustworthiness.
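
    Defect density in such studies is simply reported defects normalized by code size; a tiny Python sketch with hypothetical counts:

        # Defect density = reported defects per thousand source lines of
        # code (KSLOC).  Model names and counts below are hypothetical.

        def defect_density(defect_count, sloc):
            return defect_count / (sloc / 1000.0)

        models = {"model_A": (320, 400_000), "model_B": (95, 250_000)}
        for name, (defects, sloc) in models.items():
            print(f"{name}: {defect_density(defects, sloc):.2f} defects/KSLOC")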

  4. Risk assessment for enterprise resource planning (ERP) system implementations: a fault tree analysis approach

    Science.gov (United States)

    Zeng, Yajun; Skibniewski, Miroslaw J.

    2013-08-01

    Enterprise resource planning (ERP) system implementations are often characterised by large capital outlay, long implementation duration, and high risk of failure. In order to avoid ERP implementation failure and realise the benefits of the system, sound risk management is key. This paper proposes a probabilistic risk assessment approach for ERP system implementation projects based on fault tree analysis, which models the relationship between ERP system components and specific risk factors. Unlike traditional risk management approaches, which have mostly focused on meeting project budget and schedule objectives, the proposed approach addresses the risks that may cause ERP system usage failure. The approach can be used to identify the root causes of ERP system usage failure and to quantify the impact of critical component failures or critical risk events in the implementation process.
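
    A minimal sketch of fault-tree gate arithmetic for independent basic events; the tree and probabilities below are hypothetical, not the paper's ERP model:

        # Fault-tree evaluation with independent basic events.
        # For independent inputs: P(OR) = 1 - prod(1 - p_i), P(AND) = prod(p_i).

        from math import prod

        def p_or(*ps):
            return 1.0 - prod(1.0 - p for p in ps)

        def p_and(*ps):
            return prod(ps)

        p_bad_data = 0.05          # basic events (hypothetical probabilities)
        p_config_error = 0.03
        p_no_training = 0.10
        p_no_mgmt_support = 0.02

        # top event: usage failure if data/config fails OR both
        # organisational factors fail together
        p_top = p_or(p_or(p_bad_data, p_config_error),
                     p_and(p_no_training, p_no_mgmt_support))
        print(f"P(top event) = {p_top:.4f}")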

  5. Multifractal analysis of surface EMG signals for assessing muscle fatigue during static contractions

    Institute of Scientific and Technical Information of China (English)

    WANG Gang; REN Xiao-mei; LI Lei; WANG Zhi-zhong

    2007-01-01

    This study assessed muscle fatigue during a static contraction using multifractal analysis and found that surface electromyographic (SEMG) signals exhibit multifractality during a static contraction. By applying the method of direct determination of the f(α) singularity spectrum, the area of the multifractal spectrum of the SEMG signals was computed. The results showed that the spectrum area significantly increased during muscle fatigue; the area could therefore be used as an indicator of muscle fatigue. Compared with the median frequency (MDF), the most popular indicator of muscle fatigue, the spectrum area presented here showed higher sensitivity during a static contraction. The singularity spectrum area is thus considered a more effective indicator than the MDF for estimating muscle fatigue.
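
    For reference, a sketch of the comparator indicator, the median frequency (MDF) of the EMG power spectrum, computed here on synthetic data (the multifractal spectrum-area computation itself is not reproduced):

        # Median frequency (MDF): the frequency splitting the EMG power
        # spectrum into two halves of equal power.  The signal is synthetic
        # noise, standing in for an SEMG epoch.

        import numpy as np
        from scipy.signal import welch

        fs = 1000.0                              # sampling rate, Hz
        rng = np.random.default_rng(0)
        emg = rng.standard_normal(10 * int(fs))  # stand-in for an SEMG epoch

        freqs, psd = welch(emg, fs=fs, nperseg=1024)
        cum_power = np.cumsum(psd)
        mdf = freqs[np.searchsorted(cum_power, cum_power[-1] / 2.0)]
        print(f"median frequency: {mdf:.1f} Hz")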

  6. Analysis report for WIPP colloid model constraints and performance assessment parameters

    Energy Technology Data Exchange (ETDEWEB)

    Mariner, Paul E.; Sassani, David Carl

    2014-03-01

    An analysis of the Waste Isolation Pilot Plant (WIPP) colloid model constraints and parameter values was performed. The focus of this work was primarily on intrinsic colloids, mineral fragment colloids, and humic substance colloids, with a lesser focus on microbial colloids. Comments by the US Environmental Protection Agency (EPA) concerning intrinsic Th(IV) colloids and Mg-Cl-OH mineral fragment colloids were addressed in detail, assumptions and data used to constrain colloid model calculations were evaluated, and inconsistencies between data and model parameter values were identified. This work resulted in a list of specific conclusions regarding model integrity, model conservatism, and opportunities for improvement related to each of the four colloid types included in the WIPP performance assessment.

  7. A multi-criteria decision analysis assessment of waste paper management options.

    Science.gov (United States)

    Hanan, Deirdre; Burnley, Stephen; Cooke, David

    2013-03-01

    The use of Multi-criteria Decision Analysis (MCDA) was investigated in an exercise using a panel of local residents and stakeholders to assess the options for managing waste paper on the Isle of Wight. Seven recycling, recovery and disposal options were considered by the panel who evaluated each option against seven environmental, financial and social criteria. The panel preferred options where the waste was managed on the island with gasification and recycling achieving the highest scores. Exporting the waste to the English mainland for incineration or landfill proved to be the least preferred options. This research has demonstrated that MCDA is an effective way of involving community groups in waste management decision making.
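
    A minimal weighted-sum sketch of panel-style option scoring; the options, criterion weights and scores are hypothetical, and the study's actual MCDA protocol may have differed:

        # Weighted-sum MCDA scoring of waste management options
        # (hypothetical scores; criteria: environmental, financial, social).

        options = {
            "recycle_on_island":   [0.8, 0.6, 0.9],
            "gasification":        [0.7, 0.7, 0.8],
            "export_incineration": [0.4, 0.5, 0.2],
            "export_landfill":     [0.2, 0.6, 0.1],
        }
        weights = [0.5, 0.2, 0.3]   # must sum to 1

        scores = {name: sum(w * s for w, s in zip(weights, vals))
                  for name, vals in options.items()}
        for name, score in sorted(scores.items(), key=lambda kv: -kv[1]):
            print(f"{name}: {score:.2f}")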

  8. Neutron activation analysis for assessing the concentrations of trace elements in laboratory detergents

    Energy Technology Data Exchange (ETDEWEB)

    Iskander, F.Y.

    1986-01-01

    Nondestructive instrumental neutron activation analysis was used to assess the concentrations of 20 elements in the following laboratory detergents: Micro, Cavi-Clean liquid, RBS-35, Liqui-Nox, Treg-A-Zyme, Alcojet, Alconox, Alcotabs and Radiacwash, and in a detergent additive, Cavi-Clean additive. The upper detection limits or concentration ranges for the detergents are (element concentration in μg/g): Ba, <20; Ce, <0.8; Cl, 27-10000; Co, <0.1; Cr, <1; Cs, <0.6; Eu, <0.009; Fe, <3-45; Hf, <0.07; Mn, <10; Ni, <5; Rb, <0.08-0.89; Sb, <0.006-1.8; Sc, <0.0003-0.008; Se, <0.05; Sr, <30; Th, <0.6; U, <0.1; V, <10; Zn, <0.2-2.0. The concentrations of trace elements in the examined laboratory detergents are below those reported in the literature for household detergents.

  9. Initial assessment of facial nerve paralysis based on motion analysis using an optical flow method.

    Science.gov (United States)

    Samsudin, Wan Syahirah W; Sundaraj, Kenneth; Ahmad, Amirozi; Salleh, Hasriah

    2016-01-01

    An initial assessment method is proposed that can classify facial nerve paralysis and categorize its severity into one of six levels according to the House-Brackmann (HB) system, based on facial landmark motion measured with an Optical Flow (OF) algorithm. The desired landmarks were obtained from video recordings of 5 normal and 3 Bell's Palsy subjects and tracked using the Kanade-Lucas-Tomasi (KLT) method. A new scoring system based on motion analysis using area measurement is proposed; it uses the individual scores from the facial exercises and grades the paralysis according to the HB system. The proposed method has obtained promising results and may play a pivotal role in improved rehabilitation programs for patients.
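
    A minimal OpenCV sketch of KLT-style tracking between two frames; the input file name and automatic corner selection are assumptions (the study tracked manually chosen facial landmarks):

        # KLT landmark tracking between two consecutive video frames.
        # "facial_exercise.avi" is a hypothetical input file.

        import cv2
        import numpy as np

        cap = cv2.VideoCapture("facial_exercise.avi")
        ok, frame0 = cap.read()
        ok2, frame1 = cap.read()
        assert ok and ok2, "need at least two frames"

        gray0 = cv2.cvtColor(frame0, cv2.COLOR_BGR2GRAY)
        gray1 = cv2.cvtColor(frame1, cv2.COLOR_BGR2GRAY)

        pts0 = cv2.goodFeaturesToTrack(gray0, maxCorners=50,
                                       qualityLevel=0.01, minDistance=10)
        pts1, status, _err = cv2.calcOpticalFlowPyrLK(gray0, gray1, pts0, None)

        # displacement of successfully tracked points, in pixels
        motion = np.linalg.norm((pts1 - pts0)[status.flatten() == 1], axis=2)
        print(f"mean landmark displacement: {motion.mean():.2f} px")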

  10. A Systematic Approach to Explorative Scenario Analysis in Emergy Assessment with Emphasis on Resilience

    DEFF Research Database (Denmark)

    Kamp, Andreas; Østergård, Hanne

    2016-01-01

    …the future may bring. We develop a systematic approach to explorative scenario analysis and attempt to quantify aspects of resilience specifically for emergy assessment (EmA) of production systems. We group system inputs into five categories: (1) fossil fuels, their derivatives, metals and minerals, (2) on-site renewable inputs, (3) slowly renewable inputs, (4) direct labour and (5) indirect labour. We consider the existing EmA indicators of biophysical efficiency (the unit emergy value, UEV), the degree of dependence on free, renewable, natural flows of energy (%R) and the degree of dependence on local inputs … by corresponding narratives. We analyse the aggregated effect on UEVs of these scenarios for production systems that differ with respect to how the emergy flow is distributed among the five input categories. We find that for most production systems, scenario conditions significantly affect the UEV. The production…

  11. Central blood pressure assessment using 24-hour brachial pulse wave analysis

    Directory of Open Access Journals (Sweden)

    Muiesan ML

    2014-10-01

    This review describes the use of central blood pressure (BP) measurements during ambulatory monitoring, using noninvasive devices. The principles of measuring central BP by applanation tonometry and by oscillometry are reported, and information on device validation studies is described. The pathophysiological basis for the differences between brachial and aortic pressure is discussed. The currently available methods for central aortic pressure measurement are relatively accurate, and their use has important clinical implications, such as improving diagnostic and prognostic stratification of hypertension and providing a more accurate assessment of the effect of treatment on BP. Keywords: aortic blood pressure measurements, ambulatory monitoring, pulse wave analysis

  12. Gene set analysis for GWAS: assessing the use of modified Kolmogorov-Smirnov statistics.

    Science.gov (United States)

    Debrabant, Birgit; Soerensen, Mette

    2014-10-01

    We discuss the use of modified Kolmogorov-Smirnov (KS) statistics in the context of gene set analysis and review the corresponding null and alternative hypotheses. In particular, we show that, when enhancing the impact of highly significant genes in the calculation of the test statistic, the corresponding test can be considered to infer the classical self-contained null hypothesis. We use simulations to estimate the power for different kinds of alternatives and to assess the impact of the weight parameter of the modified KS statistic on the power. Finally, we show the analogy between the weight parameter and the genesis and distribution of the gene-level statistics, and illustrate the effects of differential weighting in a real-life example.
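
    A sketch of a weighted KS-type running-sum statistic (GSEA-style) on synthetic gene-level statistics; setting the weight parameter p = 0 recovers the classical unweighted KS form:

        # Weighted KS-like enrichment score: genes are ranked by decreasing
        # statistic; set genes push a running sum up in proportion to
        # |stat|**p, other genes push it down; the score is the maximum
        # deviation of the running sum.  Data are synthetic.

        import numpy as np

        def weighted_ks(stats, in_set, p=1.0):
            order = np.argsort(stats)[::-1]           # decreasing statistic
            hits = in_set[order].astype(float)
            w = np.abs(stats[order]) ** p * hits
            up = w / w.sum()                          # increments at set genes
            down = (1.0 - hits) / (1.0 - hits).sum()  # decrements elsewhere
            running = np.cumsum(up - down)
            return running[np.argmax(np.abs(running))]

        rng = np.random.default_rng(1)
        stats = rng.standard_normal(2000)             # gene-level statistics
        in_set = np.zeros(2000, dtype=bool)
        in_set[rng.choice(2000, 50, replace=False)] = True
        print("enrichment score:", round(weighted_ks(stats, in_set, p=1.0), 3))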

  13. Life Cycle Assessment of Bio-diesel Production—A Comparative Analysis

    Science.gov (United States)

    Chatterjee, R.; Sharma, V.; Mukherjee, S.; Kumar, S.

    2014-04-01

    This work deals with a comparative analysis of the environmental impacts of bio-diesel produced from Jatropha curcas, Rapeseed and Palm oil, applying life cycle assessment and eco-efficiency concepts. The environmental impact indicators considered in the present paper include global warming potential (GWP, CO2 equivalent), acidification potential (AP, SO2 equivalent) and eutrophication potential (EP, NO3 equivalent). Different weighting techniques have been used to present and evaluate the environmental characteristics of bio-diesel, and eco-efficiency was demonstrated with the assistance of normalization values. The results indicate that the energy consumption of bio-diesel production is lowest for Jatropha, while AP and EP are higher for Jatropha than for Rapeseed and Palm oil.

  14. Condition assessment of transformer insulation using dielectric frequency response analysis by artificial bee colony algorithm

    Directory of Open Access Journals (Sweden)

    Bigdeli Mehdi

    2016-03-01

    Transformers are among the most important components of the power system, and it is important to maintain them and assess their condition. Transformer lifetime depends on the life of its insulation, and insulation life is in turn strongly influenced by moisture in the insulation. Given the importance of this issue, this paper introduces a new method for determining the moisture content of the transformer insulation system using dielectric response analysis in the frequency domain, based on the artificial bee colony algorithm. First, the master curve of the dielectric response is modeled. Then, using the proposed method, the master curve and the measured dielectric response curves are compared. By analyzing the results of the comparison, the moisture content of the paper insulation, the electrical conductivity of the insulating oil, and the dielectric model dimensions are determined. Finally, the proposed method is applied to several practical samples to demonstrate its capabilities compared with a well-known conventional method.

  15. Assessment of TEES® applications for Wet Industrial Wastes: Energy benefit and economic analysis report

    Energy Technology Data Exchange (ETDEWEB)

    Elliott, D.C.; Scheer, T.H.

    1992-02-01

    Fundamental work on catalyzed biomass pyrolysis/gasification led to the Thermochemical Environmental Energy System (TEES®) concept, a means of converting moist biomass feedstocks to high-value fuel gases such as methane. A low-temperature (350°C), pressurized (3100 psig) reaction environment and a nickel catalyst are used to reduce the volume of very high-moisture wastes, such as food processing byproducts, while producing useful quantities of energy. A study was conducted to assess the economic viability of a range of potential applications of the process. The cases examined included feedstocks of cheese whey, grape pomace, spent grain, and an organic chemical waste stream. The analysis indicated that only the organic chemical waste process is economically attractive in the existing energy/economic environment. However, the food processing cases will become attractive as alternative disposal practices are curtailed and energy prices rise.

  16. Assessment of oil weathering by gas chromatography-mass spectrometry, time warping and principal component analysis

    DEFF Research Database (Denmark)

    Malmquist, Linus M.V.; Olsen, Rasmus R.; Hansen, Asger B.

    2007-01-01

    Detailed characterization and understanding of oil weathering at the molecular level is an essential part of tiered approaches for forensic oil spill identification, for risk assessment of terrestrial and marine oil spills, and for evaluating the effects of bioremediation initiatives. Here, a chemometric-based method is applied to data from two in vitro experiments in order to distinguish the effects of evaporation and dissolution processes on oil composition. The potential of the method for obtaining detailed chemical information on the effects of evaporation and dissolution processes, determining weathering state, and distinguishing between various weathering processes is investigated and discussed. The method is based on comprehensive and objective chromatographic data processing followed by principal component analysis (PCA) of concatenated sections of gas chromatography–mass spectrometry…

  17. MECHANICAL PROPERTIES ANALYSIS AND RELIABILITY ASSESSMENT OF LAMINATED VENEER LUMBER (LVL) HAVING DIFFERENT PATTERNS OF ASSEMBLY

    Directory of Open Access Journals (Sweden)

    Bing Xue,

    2012-02-01

    Laminated Veneer Lumber (LVL) panels made from poplar (Populus ussuriensis Kom.) and birch (Betula platyphylla Suk.) veneers were tested for mechanical properties. The effects of the assembly pattern on the modulus of elasticity (MOE) and modulus of rupture (MOR) of the LVL under vertical load testing were investigated using three analytical methods: composite material mechanics, computer simulation, and static testing. The reliability of the different LVL assembly patterns was assessed using the Monte-Carlo method. The results showed that the theoretical and ANSYS analysis results for the LVL MOE and MOR were very close to the static test results, with the largest proportional error not greater than 5%. For the same number of veneers, the strength and reliability of LVL made with birch veneers on the top and bottom were much greater than those of LVL made with poplar veneers there. Good assembly patterns can improve the utility value of wood.
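
    A minimal Monte-Carlo reliability sketch in the spirit of the assessment: estimate P(load effect exceeds resistance) by sampling. The distributions below are hypothetical, not the paper's fitted values:

        # Monte-Carlo estimate of failure probability P(load >= resistance)
        # with hypothetical normal distributions for MOR and bending stress.

        import numpy as np

        rng = np.random.default_rng(42)
        n = 1_000_000
        resistance = rng.normal(loc=60.0, scale=6.0, size=n)   # MOR, MPa
        load_effect = rng.normal(loc=35.0, scale=8.0, size=n)  # stress, MPa

        p_fail = np.mean(load_effect >= resistance)
        print(f"estimated failure probability: {p_fail:.2e}")
        print(f"reliability: {1.0 - p_fail:.5f}")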

  18. Interprofessional service improvement learning and patient safety: a content analysis of pre-registration students' assessments.

    Science.gov (United States)

    Machin, Alison I; Jones, Diana

    2014-02-01

    A culture of continuous service improvement underpins safe, efficient and cost-effective health and social care. This paper reports a qualitative research study of assessment material from one cohort of final-year pre-registration health and social care students' interprofessional service improvement learning experience. Initially introduced to the theory of service improvement, students were linked with an interprofessional buddy group and subsequently planned and, where possible, implemented a small-scale service improvement project within a practice placement setting. Assessment was by oral project presentation and written reflection on learning. Summative assessment materials from 150 students were subjected to content analysis to identify: service user triggers for service improvement; ideas to address the identified areas for improvement; and perceptions of service improvement learning. Triggers for service improvements included service user disempowerment, poor communication, gaps in service provision, poor transitions, lack of information, lack of role clarity and role duplication, and differed between professions. Ideas for improvement included both the implementation of evidence-based best practice protocols in a local context and innovative approaches to problem solving. Students described both intrapersonal and interprofessional learning as a result of engaging with service improvement theory and practice. Service improvement learning in an interprofessional context has positive learning outcomes for health and social care students. Students can identify improvement opportunities that may otherwise go undetected. Engaging positively in interprofessional service improvement learning as a student is an important rehearsal for life as a qualified practitioner; it can help students develop the ability to challenge unsafe practice elegantly, thereby acting as advocates for the people in their care. Universities can play a key support role by working…

  19. Exposure-response analysis to assess concentration–QTc relationship of CC-122

    Directory of Open Access Journals (Sweden)

    Li Y

    2016-09-01

    CC-122 hydrochloride is a novel pleiotropic pathway modifier that binds cereblon, a substrate receptor of the Cullin 4 RING E3 ubiquitin ligase complex. CC-122 has multiple activities, including modulation of immune cells, antiproliferative activity against multiple myeloma and lymphoma cells, and antiangiogenic activity, and is being developed as an oncology treatment for hematologic malignancies and advanced solid tumors. Cardiovascular and vital sign assessments of CC-122 have been conducted in hERG assays in vitro and in a 28-day good laboratory practice monkey study, with negative signals. To assess the potential concentration–QTc relationship in humans and to ascertain or exclude a small QT effect of CC-122, a plasma concentration exposure- and ΔQTcF-response model of CC-122 was developed. Intensive CC-122 concentration data and paired triplicate electrocardiogram data from a single ascending dose study were included in the analysis. The parameters included in the final linear exposure-response model are intercept, slope, and treatment effect. The slope estimate of 0.0201, with 90% CI (0.009, 0.035), indicates a weak relationship between ΔQTcF and CC-122 concentration. The upper bounds of the 90% CI of the model-predicted ΔΔQTcF effect at Cmax for the 4 mg clinical dose and the supratherapeutic dose of 15 mg (1.18 ms and 8.76 ms, respectively) are below the 10 ms threshold, suggesting that the risk of a CC-122 QT prolongation effect in the relevant therapeutic dose range of 1 mg to 4 mg is low. Keywords: cardiovascular assessment, QT prolongation effect
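
    A minimal sketch of fitting a linear exposure-response model and predicting the effect at Cmax; the data are simulated and only the general model form (intercept plus slope times concentration) follows the abstract:

        # Fit dQTcF = intercept + slope * concentration by ordinary least
        # squares on simulated data, then predict at a hypothetical Cmax.

        import numpy as np

        rng = np.random.default_rng(7)
        conc = rng.uniform(0.0, 400.0, size=200)            # ng/mL, hypothetical
        dqtcf = 0.5 + 0.02 * conc + rng.normal(0, 4, 200)   # simulated ms response

        X = np.column_stack([np.ones_like(conc), conc])
        (intercept, slope), *_ = np.linalg.lstsq(X, dqtcf, rcond=None)

        cmax = 300.0                                        # hypothetical Cmax
        print(f"slope = {slope:.4f} ms per ng/mL; "
              f"predicted dQTcF at Cmax = {intercept + slope * cmax:.2f} ms")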

  1. Assessment of Cr(VI)-Induced Cytotoxicity and Genotoxicity Using High Content Analysis

    Science.gov (United States)

    Thompson, Chad M.; Fedorov, Yuriy; Brown, Daniel D.; Suh, Mina; Proctor, Deborah M.; Kuriakose, Liz; Haws, Laurie C.; Harris, Mark A.

    2012-01-01

    Oral exposure to high concentrations of hexavalent chromium [Cr(VI)] induces intestinal redox changes, villus cytotoxicity, crypt hyperplasia, and intestinal tumors in mice. To assess the effects of Cr(VI) in a cell model relevant to the intestine, undifferentiated (proliferating) and differentiated (confluent) Caco-2 cells were treated with Cr(VI), hydrogen peroxide or rotenone for 2–24 hours. DNA damage was then assessed by nuclear staining intensity of 8-hydroxydeoxyguanosine (8-OHdG) and phosphorylated histone variant H2AX (γ-H2AX) measured by high content analysis methods. In undifferentiated Caco-2, all three chemicals increased 8-OHdG and γ-H2AX staining at cytotoxic concentrations, whereas only 8-OHdG was elevated at non-cytotoxic concentrations at 24 hr. Differentiated Caco-2 were more resistant to cytotoxicity and DNA damage than undifferentiated cells, and there were no changes in apoptotic markers p53 or annexin-V. However, Cr(VI) induced a dose-dependent translocation of the unfolded protein response transcription factor ATF6 into the nucleus. Micronucleus (MN) formation was assessed in CHO-K1 and A549 cell lines. Cr(VI) increased MN frequency in CHO-K1 only at highly cytotoxic concentrations. Relative to the positive control Mitomycin-C, Cr(VI) only slightly increased MN frequency in A549 at mildly cytotoxic concentrations. The results demonstrate that Cr(VI) genotoxicity correlates with cytotoxic concentrations, and that H2AX phosphorylation occurs at higher concentrations than oxidative DNA damage in proliferating Caco-2 cells. The findings suggest that in vitro genotoxicity of Cr(VI) is primarily oxidative in nature at low concentrations. Implications for in vivo intestinal toxicity of Cr(VI) will be discussed. PMID:22905163

  2. Operational Modal Analysis and the Performance Assessment of Vehicle Suspension Systems

    Directory of Open Access Journals (Sweden)

    L. Soria

    2012-01-01

    Full Text Available Comfort, road holding and safety of passenger cars are mainly influenced by an appropriate design of suspension systems. Improvements in dynamic behaviour can be achieved by implementing semi-active or active suspension systems; in these cases, the correct design of a well-performing suspension control strategy is of fundamental importance to obtain satisfying results. Operational Modal Analysis (OMA) allows experimental structural identification under real operating conditions: starting from output-only data, it leads to modal models linearised around the working points of interest and, in the case of controlled systems, provides the information needed for the optimal design and verification of controller performance. All these characteristics are needed for the experimental assessment of vehicle suspension systems. In the paper, two suspension architectures equipping the same car type are considered: the former is a semi-active commercial system, the latter a novel prototype active system. For the assessment of suspension performance, two different kinds of tests have been considered: proving-ground tests on different road profiles and laboratory four-poster rig tests. By OMA-processing the signals acquired in the different testing conditions and comparing the results, it is shown how this tool can be effectively utilised to verify the operation and the performance of those systems by only carrying out a simple, cost-effective road test.

  3. Dependence Assessment in Human Reliability Analysis Using Evidence Theory and AHP.

    Science.gov (United States)

    Su, Xiaoyan; Mahadevan, Sankaran; Xu, Peida; Deng, Yong

    2015-07-01

    Dependence assessment among human errors in human reliability analysis (HRA) is an important issue. Many of the dependence assessment methods in HRA rely heavily on expert opinion and are therefore subjective and sometimes inconsistent. In this article, we propose a computational model based on the Dempster-Shafer evidence theory (DSET) and the analytic hierarchy process (AHP) method to handle dependence in HRA. First, the factors influencing dependence among human tasks are identified, and the weights of the factors are determined by experts using the AHP method. Second, a judgment on each factor is given by the analyst with reference to anchors and linguistic labels. Third, the judgments are represented as basic belief assignments (BBAs) and are integrated into a fused BBA by weighted-average combination in DSET. Finally, the conditional human error probability (CHEP) is calculated based on the fused BBA. The proposed model can deal with ambiguity and the degree of confidence in the judgments, and is able to reduce subjectivity and improve consistency in the evaluation process.
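
    As a rough sketch of the combination scheme just described (weighted-average BBA fusion followed by a CHEP computation), the Python fragment below uses invented factor names and masses, and assumes the classical THERP dependence equations as anchors for the dependence levels; none of the numbers come from the paper.

```python
# Minimal sketch of weighted-average BBA fusion for HRA dependence.
LEVELS = ["ZD", "LD", "MD", "HD", "CD"]  # zero ... complete dependence

def fuse_bbas(bbas, weights):
    """Weighted-average combination of basic belief assignments.
    bbas    : list of dicts mapping focal sets (tuples of levels) to masses
    weights : AHP-derived factor weights, summing to 1
    """
    fused = {}
    for w, m in zip(weights, bbas):
        for focal, mass in m.items():
            fused[focal] = fused.get(focal, 0.0) + w * mass
    return fused

def pignistic(fused):
    """Spread mass on multi-element focal sets evenly over singletons."""
    p = dict.fromkeys(LEVELS, 0.0)
    for focal, mass in fused.items():
        for lv in focal:
            p[lv] += mass / len(focal)
    return p

def chep(p, hep=1e-3):
    """CHEP via the classical THERP dependence equations (assumed anchors)."""
    eq = {"ZD": hep, "LD": (1 + 19 * hep) / 20, "MD": (1 + 6 * hep) / 7,
          "HD": (1 + hep) / 2, "CD": 1.0}
    return sum(p[lv] * eq[lv] for lv in LEVELS)

# Two hypothetical factors, e.g. "time relation" and "task similarity"
bbas = [{("ZD",): 0.6, ("ZD", "LD"): 0.4},
        {("LD",): 0.7, ("MD",): 0.3}]
print(chep(pignistic(fuse_bbas(bbas, weights=[0.65, 0.35]))))
```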

  4. Source term assessment with ASTEC and associated uncertainty analysis using SUNSET tool

    Energy Technology Data Exchange (ETDEWEB)

    Chevalier-Jabet, K., E-mail: karine.chevalier-jabet@irsn.fr; Cousin, F.; Cantrel, L.; Séropian, C.

    2014-06-01

    Several accident scenarios have been simulated using the ASTEC integral IRSN-GRS code for a French 1300 MWe PWR, covering several break sizes and locations and highlighting the effect of safety systems and of iodine chemistry in the reactor coolant system (RCS) and in the containment on iodine source term evaluations. Iodine chemistry in the RCS and in the containment is still subject to significant uncertainties and is therefore studied in on-going R and D programs. To assess the impact of these uncertainties, ASTEC has been coupled to SUNSET, the IRSN uncertainty propagation and sensitivity analysis tool. Focusing on a steam generator loss-of-feedwater accident, ASTEC/SUNSET calculations have been performed to assess the effect of the remaining uncertainties in iodine behaviour on the source term. The calculations show that the postulated lack of knowledge may change the iodine source term to the environment by at least one order of magnitude, confirming the importance of the on-going R and D programs to improve the knowledge of iodine chemistry.

  5. Life cycle assessment and economic analysis of a low concentrating photovoltaic system.

    Science.gov (United States)

    De Feo, G; Forni, M; Petito, F; Renno, C

    2016-10-01

    Many new photovoltaic (PV) applications, such as concentrating PV (CPV) systems, are appearing on the market. The main characteristic of CPV systems is that they concentrate sunlight onto a receiver by means of optical devices, decreasing the solar cell area required. A low-concentration CPV (LCPV) system optimizes the PV effect, substantially increasing the generated electric power while decreasing the active surface area. In this paper, an economic analysis and a life cycle assessment (LCA) of a particular LCPV scheme are presented, and its environmental impacts are compared with those of a traditional PV system. The LCA study was performed with the software tool SimaPro 8.0.2, using the Ecoinvent 3.1 database. A functional unit of 1 kWh of electricity produced was chosen. Carbon Footprint, Ecological Footprint and ReCiPe 2008 were the methods used to assess the environmental impacts of the LCPV plant compared with a corresponding traditional system. All the methods demonstrated the environmental advantage of the LCPV system: the innovative system saved 16.9% of CO2 equivalent in comparison with the traditional PV plant, 17% in terms of Ecological Footprint, and, finally, 15.8% with the ReCiPe method.

  6. [Application of uncertainty assessment in NIR quantitative analysis of traditional Chinese medicine].

    Science.gov (United States)

    Xue, Zhong; Xu, Bing; Liu, Qian; Shi, Xin-Yuan; Li, Jian-Yu; Wu, Zhi-Sheng; Qiao, Yan-Jiang

    2014-10-01

    The near infrared (NIR) spectra of Liuyi San samples were collected during the mixing process, and quantitative models were generated by the partial least squares (PLS) method for the quantification of the concentration of glycyrrhizin. The PLS quantitative model had good calibration and prediction performance (r(cal) = 0.998 5, RMSEC = 0.044 mg · g(-1); r(val) = 0.947 4, RMSEP = 0.124 mg · g(-1)), indicating that NIR spectroscopy can be used as a rapid determination method for the concentration of glycyrrhizin in Liuyi San powder. After the validation tests were designed, the Liao-Lin-Iyer approach based on Monte Carlo simulation was used to estimate β-content-γ-confidence tolerance intervals. The uncertainty was then calculated, and the uncertainty profile was drawn. The NIR analytical method was considered valid when the concentration of glycyrrhizin was above 1.56 mg · g(-1), since the uncertainty fell within the acceptable limits (λ = ± 20%). The results showed that uncertainty assessment can be used in NIR quantitative models of glycyrrhizin at different concentrations and provides a reference for completing the uncertainty assessment of NIR quantitative analysis for other traditional Chinese medicines.
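
    A minimal sketch of the calibration/validation workflow named above (a PLS model with RMSEC reported on the calibration set and RMSEP on the validation set), using synthetic spectra in place of the NIR data; the component count and data shapes are assumptions.

```python
# Sketch of PLS calibration and validation with synthetic "spectra".
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(80, 200))                         # 80 spectra x 200 wavelengths
y = X[:, 40] * 0.8 + rng.normal(scale=0.05, size=80)   # stand-in for glycyrrhizin, mg/g

X_cal, X_val, y_cal, y_val = train_test_split(X, y, random_state=0)
pls = PLSRegression(n_components=5).fit(X_cal, y_cal)  # component count assumed

def rmse(y_true, y_pred):
    return float(np.sqrt(np.mean((y_true - np.ravel(y_pred)) ** 2)))

print("RMSEC:", rmse(y_cal, pls.predict(X_cal)))       # calibration error
print("RMSEP:", rmse(y_val, pls.predict(X_val)))       # prediction error
```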

  7. Assessing breast cancer masking risk with automated texture analysis in full field digital mammography

    DEFF Research Database (Denmark)

    Kallenberg, Michiel Gijsbertus J; Lillholm, Martin; Diao, Pengfei

    2015-01-01

    status in a five-fold cross validation. To assess the interaction of the texture scores with breast density, Volpara Density Grade (VDG) was determined for each image using Volpara, Matakina Technology, New Zealand. RESULTS We grouped women into low (VDG 1/2) versus high (VDG 3/4) dense, and low...... for the high texture score group (as compared to the low texture score group) this OR was 2.19 (1.37-3.49). Women who were classified as low dense but had a high texture score had a higher masking risk (OR 1.66 (0.53-5.20)) than women with dense breasts but a low texture score. CONCLUSION Mammographic texture...... is associated with breast cancer masking risk. We were able to identify a subgroup of women who are at an increased risk of having a cancer that is not detected due to textural masking, even though their breasts are non-dense. CLINICAL RELEVANCE/APPLICATION Automatic texture analysis enables assessing the risk...

  8. Multi-observation PET image analysis for patient follow-up quantitation and therapy assessment

    Energy Technology Data Exchange (ETDEWEB)

    David, S; Visvikis, D; Roux, C; Hatt, M, E-mail: simon.david@etudiant.univ-brest.fr [INSERM U650, LaTIM, Brest, F-29200 (France)

    2011-09-21

    In positron emission tomography (PET) imaging, an early therapeutic response is usually characterized by variations of semi-quantitative parameters restricted to the maximum SUV measured in PET scans during the treatment. Such measurements do not reflect overall tumor volume and radiotracer uptake variations. The proposed approach is based on multi-observation image analysis, merging several PET acquisitions to assess tumor metabolic volume and uptake variations. The fusion algorithm is based on iterative estimation using a stochastic expectation maximization (SEM) algorithm. The proposed method was applied to simulated and clinical follow-up PET images. We compared the multi-observation fusion performance to threshold-based methods proposed for the assessment of the therapeutic response based on functional volumes. On simulated datasets, the adaptive threshold applied independently to both images led to higher errors than the ASEM fusion; on clinical datasets, it failed to provide coherent measurements for four patients out of seven due to aberrant delineations. The ASEM method demonstrated improved and more robust estimation, leading to more pertinent measurements. Future work will consist in extending the methodology and applying it to clinical multi-tracer datasets in order to evaluate its potential impact on biological tumor volume definition for radiotherapy applications.

  9. A computer-based image analysis method for assessing the severity of hip joint osteoarthritis

    Energy Technology Data Exchange (ETDEWEB)

    Boniatis, Ioannis [Department of Medical Physics, University of Patras, School of Medicine, 265 00 Patras (Greece); Costaridou, Lena [Department of Medical Physics, University of Patras, School of Medicine, 265 00 Patras (Greece); Cavouras, Dionisis [Department of Medical Instrumentation Technology, Technological Educational Institute of Athens, 122 10 Athens (Greece); Panagiotopoulos, Elias [Department of Orthopaedics, School of Medicine, University of Patras, 265 00 Patras (Greece); Panayiotakis, George [Department of Medical Physics, University of Patras, School of Medicine, 265 00 Patras (Greece)]. E-mail: panayiot@upatras.gr

    2006-12-20

    A computer-based image analysis method was developed for assessing the severity of hip osteoarthritis (OA). Eighteen pelvic radiographs of patients with verified unilateral hip OA were digitized and enhanced employing custom-developed software. Two ROIs corresponding to the osteoarthritic and contralateral-physiological radiographic Hip Joint Spaces (HJSs) were determined on each radiograph. Textural features were extracted from the HJS-ROIs utilizing run-length matrices and Laws' textural measures. A k-Nearest Neighbour based hierarchical tree structure was designed for classifying hips into three OA severity categories labeled 'Normal', 'Mild/Moderate', and 'Severe'. Employing the run-length features, the overall classification accuracy of the hierarchical tree structure was 86.1%. The utilization of Laws' textural measures improved the system classification performance, providing an overall classification accuracy of 94.4%. The proposed method may be of value to physicians in assessing the severity of hip OA.
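
    A sketch of the two-stage (hierarchical) kNN idea, with synthetic stand-ins for the run-length/Laws' feature vectors; class sizes, feature counts and neighbour count are assumptions, not the study's configuration.

```python
# Sketch of a hierarchical kNN classifier on texture features.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(9)
# 36 HJS ROIs x 6 texture features, grouped by severity (synthetic)
X = np.vstack([rng.normal(loc=m, size=(12, 6)) for m in (0.0, 1.0, 2.0)])
y = np.repeat(["Normal", "Mild/Moderate", "Severe"], 12)

# Stage 1 separates Normal from osteoarthritic hips ...
stage1 = KNeighborsClassifier(n_neighbors=3).fit(X, y == "Normal")
# ... stage 2 grades the osteoarthritic ones.
oa_mask = y != "Normal"
stage2 = KNeighborsClassifier(n_neighbors=3).fit(X[oa_mask], y[oa_mask])

sample = X[20:21]                        # a "Mild/Moderate" ROI
if not stage1.predict(sample)[0]:
    print("grade:", stage2.predict(sample)[0])
```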

  10. Fuzzy-logic-based network for complex systems risk assessment: application to ship performance analysis.

    Science.gov (United States)

    Abou, Seraphin C

    2012-03-01

    In this paper, a new interpretation of intuitionistic fuzzy sets in the advanced framework of the Dempster-Shafer theory of evidence is extended to monitor the performance of safety-critical systems. Not only is the proposed approach more effective, but it also takes into account fuzzy rules that deal with imperfect knowledge/information and therefore differs from the classical Takagi-Sugeno fuzzy system, which assumes that the rule (the knowledge) is perfect. We provide an analytical solution to the practical and important problem of a conceptual probabilistic approach for formal ship safety assessment using fuzzy set theory, which involves uncertainties associated with the reliability input data. Thus, the overall safety of the ship engine is investigated as an object of risk analysis using a fuzzy mapping structure that considers uncertainty and partial truth in the input-output mapping. The proposed method integrates direct evidence on the frame of discernment and is demonstrated through examples where fuzzy set models are informative. These simple applications illustrate how to assess conflict in sensor information fusion for a sufficient cooling power system of vessels under extreme operating conditions. It was found that propulsion engine safety systems are not only a function of many environmental and operational profiles but are also dynamic and complex.

  11. Assessment of metal sorption mechanisms by aquatic macrophytes using PIXE analysis

    Energy Technology Data Exchange (ETDEWEB)

    Módenes, A.N., E-mail: anmodenes@yahoo.com.br [Department of Chemical Engineering-Postgraduate Program, West Parana State University, Campus of Toledo, rua da Faculdade 645, Jd. La Salle, 85903-000 Toledo, PR (Brazil); Espinoza-Quiñones, F.R.; Santos, G.H.F.; Borba, C.E. [Department of Chemical Engineering-Postgraduate Program, West Parana State University, Campus of Toledo, rua da Faculdade 645, Jd. La Salle, 85903-000 Toledo, PR (Brazil); Rizzutto, M.A. [Physics Institute, University of São Paulo, Rua do Matão s/n, Travessa R 187, 05508-900 São Paulo, SP (Brazil)

    2013-10-15

    Highlights: • Divalent metal ion removals by Egeria densa biosorbent. • Multielements concentrations in biosorbent samples by PIXE analysis. • Elements mass balance in liquid and solid phase before and after metal removals. • Assessment of the mechanisms involved in Cd{sup 2+} and Zn{sup 2+} removal by biosorbent. • Confirmation of the signature of ion exchange process in metal removal. -- Abstract: In this work, a study of the metal sorption mechanism by dead biomass has been performed. All batch metal biosorption experiments were performed using the aquatic macrophyte Egeria densa as biosorbent. Divalent cadmium and zinc solutions were used to assess the sorption mechanisms involved. Using a suitable equilibrium time of 2 h and a mixture of 300 mg biosorbent and 50 mL metal solution at pH 5, monocomponent sorption experiments were performed. In order to determine the residual amounts of metals in the aqueous solutions and the concentrations of removed metals in the dry biomass, Particle Induced X-ray Emission (PIXE) measurements in thin and thick target samples were carried out. Based on the strong experimental evidence from the mass balance among the major elements participating in the sorption processes, an ion exchange process was identified as the mechanism responsible for metal removal by the dry biomass.

  12. Assessing short summaries with human judgments procedure and latent semantic analysis in narrative and expository texts.

    Science.gov (United States)

    León, José A; Olmos, Ricardo; Escudero, Inmaculada; Cañas, José J; Salmerón, Lalo

    2006-11-01

    In the present study, we tested a computer-based procedure for assessing very concise summaries (50 words long) of two types of text (narrative and expository) using latent semantic analysis (LSA) in comparison with the judgments of four human experts. LSA was used to estimate semantic similarity using six different methods: four holistic (summary-text, summary-summaries, summary-expert summaries, and pregraded-ungraded summary) and two componential (summary-sentence text and summary-main sentence text). A total of 390 Spanish middle and high school students (14-16 years old) and six experts read a narrative or expository text and later summarized it. The results support the viability of developing a computerized assessment tool using human judgments and LSA, although the correlation between human judgments and LSA was higher in the narrative text than in the expository, and LSA correlated more with human content ratings than with human coherence ratings. Finally, the holistic methods were found to be more reliable than the componential methods analyzed in this study.
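
    The holistic "summary-text" method essentially reduces to a cosine similarity in an LSA space. The toy sketch below shows the mechanics, with a hypothetical corpus and scikit-learn's TruncatedSVD standing in for a full LSA pipeline; it is not the authors' implementation.

```python
# Sketch of LSA-based summary scoring: cosine similarity in a reduced space.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.metrics.pairwise import cosine_similarity

corpus = [
    "the storm flooded the valley and damaged the crops",
    "farmers lost their harvest in the heavy rains",
    "the new policy reduced taxes for small businesses",
    "parliament debated the tax reform for months",
    "rainfall and flooding ruined agricultural production",
    "small firms welcomed the lower tax rates",
]
source_text = "heavy rain caused floods that destroyed the harvest"
summary = "floods from rain ruined the crops"

vec = TfidfVectorizer().fit(corpus + [source_text, summary])
svd = TruncatedSVD(n_components=2, random_state=0).fit(vec.transform(corpus))

s_text, s_sum = svd.transform(vec.transform([source_text, summary]))
score = cosine_similarity([s_text], [s_sum])[0, 0]
print(f"LSA similarity (proxy grade): {score:.2f}")
```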

  13. Electric Power quality Analysis in research reactor: Impacts on nuclear safety assessment and electrical distribution reliability

    Energy Technology Data Exchange (ETDEWEB)

    Touati, Said; Chennai, Salim; Souli, Aissa [Nuclear Research Centre of Birine, Ain Oussera, Djelfa Province (Algeria)

    2015-07-01

    The increased requirements on supervision, control, and performance in modern power systems make power quality monitoring a common practice for utilities. Large databases are created, and automatic processing of the data is required for fast and effective use of the available information. The aim of the work presented in this paper is the development of tools for the analysis of power quality monitoring data, in particular measurements of voltages and currents at various levels of the electrical power distribution. The study is extended to evaluate the reliability of the electrical system in a nuclear plant. Power quality is a measure of how well a system supports reliable operation of its loads. A power disturbance or event can involve voltage, current, or frequency, and can originate in consumer power systems, consumer loads, or the utility. The effect of power quality problems is loss of power supply, leading to severe damage to equipment, so we try to track and improve system reliability. The assessment focuses on the impact of short circuits on the system, harmonic distortion, power factor improvement, and the effects of transient disturbances on the electrical system during motor starting and power system fault conditions. We also review the electrical system design against the Nuclear Directorate Safety Assessment principles, including those extended after the Fukushima nuclear accident. The simplified configuration of the required system can be extended from this basic scheme. To achieve these studies, we have used demo ETAP power station software for several simulations. (authors)
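
    As a small illustration of one routine power-quality computation mentioned above, the sketch below estimates total harmonic distortion (THD) of a synthetic 50 Hz voltage waveform via the FFT; it is a generic textbook calculation, not part of the authors' ETAP workflow.

```python
# Sketch of a THD estimate from a sampled voltage waveform.
import numpy as np

fs, f0 = 10_000, 50                       # sampling rate and fundamental (Hz)
t = np.arange(0, 0.2, 1 / fs)             # 10 cycles of the fundamental
v = 230 * np.sqrt(2) * np.sin(2 * np.pi * f0 * t) \
    + 15 * np.sin(2 * np.pi * 5 * f0 * t)  # synthetic 5th-harmonic pollution

spec = np.abs(np.fft.rfft(v))
freqs = np.fft.rfftfreq(len(v), 1 / fs)

def mag(f):
    """Spectral magnitude at the bin closest to frequency f."""
    return spec[np.argmin(np.abs(freqs - f))]

harmonics = [mag(k * f0) for k in range(2, 11)]
thd = np.sqrt(sum(h ** 2 for h in harmonics)) / mag(f0)
print(f"THD = {thd:.1%}")                  # ~4.6% for this waveform
```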

  14. A Cost-Benefit and Accurate Method for Assessing Microalbuminuria: Single versus Frequent Urine Analysis

    Directory of Open Access Journals (Sweden)

    Roholla Hemmati

    2013-01-01

    Full Text Available Background. The purpose of this study was to determine whether a single test for microalbuminuria yields a reliable conclusion, leading to cost savings. Methods. This cross-sectional study included a total of 126 consecutive persons. Microalbuminuria was assessed by collecting two fasting random urine specimens, one on arrival at the clinic and one in the morning one week later. Results. Overall, 17 of the 126 participants had microalbuminuria; among them, 12 subjects were also diagnosed with microalbuminuria when this factor was assessed once, giving a sensitivity of 70.6%, a specificity of 100%, a PPV of 100%, an NPV of 95.6%, and an accuracy of 96.0%. The measured sensitivity, specificity, PPV, NPV, and accuracy in hypertensive patients were 73.3%, 100%, 100%, 94.8%, and 95.5%, respectively; in the non-hypertensive group, these rates were 50.0%, 100%, 100%, 97.3%, and 97.4%, respectively. According to the ROC curve analysis, a single measurement of UACR had a high value for discriminating impaired from normal renal function (c = 0.989). Urinary albumin concentration in a single measurement also had a high discriminative value for the diagnosis of kidney damage (c = 0.995). Conclusion. Single testing of both UACR and urine albumin level, rather than frequent testing, provides high diagnostic sensitivity, specificity, and accuracy, as well as high predictive values, in the total population and in hypertensive subgroups.
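
    The reported indices can be checked directly from the 2x2 table implied by the text, taking the two-sample protocol as the reference standard (TP = 12, FN = 5, FP = 0, TN = 109):

```python
# Worked check of the reported diagnostic indices.
tp, fn, fp, tn = 12, 5, 0, 109

sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
ppv = tp / (tp + fp)
npv = tn / (tn + fn)
accuracy = (tp + tn) / (tp + tn + fp + fn)

print(f"sensitivity {sensitivity:.1%}, specificity {specificity:.1%}, "
      f"PPV {ppv:.1%}, NPV {npv:.1%}, accuracy {accuracy:.1%}")
# -> sensitivity 70.6%, specificity 100.0%, PPV 100.0%, NPV 95.6%, accuracy 96.0%
```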

  15. Seamless Level 2/Level 3 probabilistic risk assessment using dynamic event tree analysis

    Science.gov (United States)

    Osborn, Douglas Matthew

    The current approach to Level 2 and Level 3 probabilistic risk assessment (PRA) using the conventional event-tree/fault-tree methodology requires pre-specification of event order occurrence which may vary significantly in the presence of uncertainties. Manual preparation of input data to evaluate the possible scenarios arising from these uncertainties may also lead to errors from faulty/incomplete input preparation and their execution using serial runs may lead to computational challenges. A methodology has been developed for Level 2 analysis using dynamic event trees (DETs) that removes these limitations with systematic and mechanized quantification of the impact of aleatory uncertainties on possible consequences and their likelihoods. The methodology is implemented using the Analysis of Dynamic Accident Progression Trees (ADAPT) software. For the purposes of this work, aleatory uncertainties are defined as those arising from the stochastic nature of the processes under consideration, such as the variability of weather, in which the probability of weather patterns is predictable but the conditions at the time of the accident are a matter of chance. Epistemic uncertainties are regarded as those arising from the uncertainty in the model (system code) input parameters (e.g., friction or heat transfer correlation parameters). This work conducts a seamless Level 2/3 PRA using a DET analysis. The research helps to quantify and potentially reduce the magnitude of the source term uncertainty currently experienced in Level 3 PRA. Current techniques have been demonstrated with aleatory uncertainties for environmental releases of radioactive materials. This research incorporates epistemic and aleatory uncertainties in a phenomenologically consistent manner through use of DETs. The DETs were determined using the ADAPT framework and linking ADAPT with MELCOR, MELMACCS, and the MELCOR Accident Consequence Code System, Version 2. Aleatory and epistemic uncertainties incorporated

  16. Groundwater quality assessment using chemometric analysis in the Adyar River, South India.

    Science.gov (United States)

    Venugopal, T; Giridharan, L; Jayaprakash, M

    2008-08-01

    A multivariate statistical technique has been used to assess the factors responsible for the chemical composition of the groundwater near the highly polluted Adyar River. Basic chemical parameters of the groundwater were pooled to evaluate and interpret a few empirical factors controlling the chemical nature of the water. Twenty-three groundwater samples were collected in the vicinity of the Adyar River. Box-whisker plots were drawn to evaluate the chemical variation and the seasonal effect on the variables. R-mode factor analysis and cluster analysis were applied to the geochemical parameters of the water to identify the factors affecting the chemical composition of the groundwater. Dendrograms for both seasons give two major clusters reflecting the groups of polluted and unpolluted stations. The other two minor clusters, and the movement of stations from one cluster to another, clearly bring out the seasonal variation in the chemical composition of the groundwater. The results of the R-mode factor analysis reveal that the groundwater chemistry of the study area reflects the influence of anthropogenic activities, rock-water interactions, saline water intrusion into the river water, and subsequent percolation into the groundwater. The complex geochemical data of the groundwater were interpreted by reducing them to seven major factors, and the seasonal variation in the chemistry of the water was clearly brought out by these factors. The higher concentrations of heavy metals such as Fe and Cr are attributed to rock-water interaction and effluents from industries such as tanning, chrome-plating, and dyeing. In the urban area, the Pb concentration is high due to industrial and urban runoff as well as atmospheric deposition from automobile pollution. Factor score analysis was used successfully to relate the stations under study to the contributing factors, and the seasonal effect on the sample stations was identified and evaluated.
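
    A minimal sketch of this chemometric workflow (standardization, R-mode factor analysis, Ward clustering of stations); the ion data below are synthetic placeholders, not the Adyar measurements, and the factor count is an assumption.

```python
# Sketch of R-mode factor analysis plus hierarchical clustering of stations.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from sklearn.decomposition import FactorAnalysis
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
X = rng.normal(size=(23, 8))   # 23 stations x 8 ions (e.g. Na, Cl, Fe, Cr, Pb)

Z = StandardScaler().fit_transform(X)
fa = FactorAnalysis(n_components=3, random_state=0).fit(Z)
print("loadings (factors x ions):\n", fa.components_.round(2))

# Ward linkage on stations; two major clusters ~ polluted vs. unpolluted
labels = fcluster(linkage(Z, method="ward"), t=2, criterion="maxclust")
print("station clusters:", labels)
```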

  17. Analysis of risk factors and risk assessment for ischemic stroke recurrence

    Directory of Open Access Journals (Sweden)

    Xiu-ying LONG

    2016-08-01

    Full Text Available Objective. To screen the risk factors for recurrence of ischemic stroke and to assess the risk of recurrence. Methods. The Essen Stroke Risk Score (ESRS) was used to evaluate the risk of recurrence in 176 patients with ischemic stroke (96 cases of first onset and 80 cases of recurrence). Univariate and multivariate stepwise logistic regression analysis was used to screen risk factors for recurrence of ischemic stroke. Results. There were significant differences between the first-onset group and the recurrence group in age, the proportion of patients > 75 years old, hypertension, diabetes, coronary heart disease, peripheral angiopathy, transient ischemic attack (TIA) or ischemic stroke, drinking, and ESRS score (P < 0.05 for all). The first-onset group included one case with ESRS 0 (1.04%), 8 cases with ESRS 1 (8.33%), 39 cases with ESRS 2 (40.63%), 44 cases with ESRS 3 (45.83%), and 4 cases with ESRS 4 (4.17%). The recurrence group included 2 cases with ESRS 3 (2.50%), 20 cases with ESRS 4 (25%), 37 cases with ESRS 5 (46.25%), 18 cases with ESRS 6 (22.50%), and 3 cases with ESRS 7 (3.75%). The difference between the two groups was significant (Z = -11.376, P = 0.000). Logistic regression analysis showed that ESRS > 3 was an independent risk factor for recurrence of ischemic stroke (OR = 31.324, 95%CI: 3.934-249.430; P = 0.001). Conclusions. ESRS > 3 is an independent risk factor for recurrence of ischemic stroke. It is important to strengthen risk assessment of recurrence of ischemic stroke; screening and controlling risk factors is the key to secondary prevention of ischemic stroke. DOI: 10.3969/j.issn.1672-6731.2016.07.011
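
    For illustration, an odds ratio such as the one reported for ESRS > 3 can be read off a fitted logistic model as exp(coefficient). The sketch below uses simulated data with an assumed effect size, not the cohort data, and an essentially unpenalized fit so the coefficient approximates the maximum-likelihood estimate.

```python
# Sketch of deriving an odds ratio from a logistic regression coefficient.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
esrs_gt3 = rng.integers(0, 2, size=176)        # indicator: ESRS > 3 (synthetic)
logit = -2.0 + 3.4 * esrs_gt3                  # assumed true effect
recurrence = rng.random(176) < 1 / (1 + np.exp(-logit))

# Large C ~ effectively no regularization, so exp(coef) ~ the odds ratio
model = LogisticRegression(C=1e6).fit(esrs_gt3.reshape(-1, 1), recurrence)
print("OR for ESRS > 3:", float(np.exp(model.coef_[0, 0])))
```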

  18. Reduced-complexity modeling of braided rivers: Assessing model performance by sensitivity analysis, calibration, and validation

    Science.gov (United States)

    Ziliani, L.; Surian, N.; Coulthard, T. J.; Tarantola, S.

    2013-12-01

    This paper addresses an important question in the modeling of stream dynamics: how may numerical models of braided stream morphodynamics be rigorously and objectively evaluated against a real case study? Using simulations from the Cellular Automaton Evolutionary Slope and River (CAESAR) reduced-complexity model (RCM) of a 33 km reach of a large gravel-bed river (the Tagliamento River, Italy), this paper aims to (i) identify a sound strategy for calibration and validation of RCMs, (ii) investigate the effectiveness of multiperformance model assessments, and (iii) assess the potential of using CAESAR at mesospatial and mesotemporal scales. The approach used has three main steps: first sensitivity analysis (using a screening method and a variance-based method), then calibration, and finally validation. This approach allowed us to analyze 12 input factors initially and then to focus calibration only on the factors identified as most important. Sensitivity analysis and calibration were performed on a 7.5 km subreach, using a hydrological time series of 20 months, while validation covered the whole 33 km study reach over a period of 8 years (2001-2009). CAESAR was able to reproduce the macromorphological changes of the study reach and gave good results for annual bed-load sediment estimates, which turned out to be consistent with measurements in other large gravel-bed rivers, but showed a poorer performance in reproducing the characteristics of the braided channel (e.g., braiding intensity). The approach developed in this study can be effectively applied in other similar RCM contexts, allowing RCMs to be used not only in an explorative manner but also for obtaining quantitative results and scenarios.
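
    A sketch of the screening step, under the assumption that a tool like SALib's Morris implementation is used; the factor names, bounds and the stand-in model are invented, and a real application would run CAESAR in place of the toy function.

```python
# Sketch of Morris screening over model input factors (SALib assumed).
import numpy as np
from SALib.sample import morris as morris_sample
from SALib.analyze import morris as morris_analyze

problem = {
    "num_vars": 3,
    "names": ["erosion_rate", "lateral_erosion", "grain_size"],  # assumed
    "bounds": [[1e-7, 1e-5], [1e-6, 1e-4], [0.001, 0.05]],
}

def model(x):
    """Placeholder for a CAESAR run returning, e.g., bedload flux."""
    return x[0] * 1e6 + np.sqrt(x[1]) * 50 + 1.0 / x[2]

X = morris_sample.sample(problem, 50, num_levels=4)
Y = np.apply_along_axis(model, 1, X)
Si = morris_analyze.analyze(problem, X, Y, num_levels=4)
# mu_star ranks factors by overall influence; keep the top ones for calibration
print(dict(zip(problem["names"], np.round(Si["mu_star"], 2))))
```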

  19. Extended defense systems :I. adversary-defender modeling grammar for vulnerability analysis and threat assessment.

    Energy Technology Data Exchange (ETDEWEB)

    Merkle, Peter Benedict

    2006-03-01

    Vulnerability analysis and threat assessment require systematic treatments of adversary and defender characteristics. This work addresses the need for a formal grammar for the modeling and analysis of adversary and defender engagements of interest to the National Nuclear Security Administration (NNSA). Analytical methods treating both linguistic and numerical information should ensure that neither aspect has disproportionate influence on assessment outcomes. The adversary-defender modeling (ADM) grammar employs classical set theory and notation. It is designed to incorporate contributions from subject matter experts in all relevant disciplines, without bias. The Attack Scenario Space U{sub S} is the set universe of all scenarios possible under physical laws. An attack scenario is a postulated event consisting of the active engagement of at least one adversary with at least one defended target. Target Information Space I{sub S} is the universe of information about targets and defenders. Adversary and defender groups are described by their respective Character super-sets, (A){sub P} and (D){sub F}. Each super-set contains six elements: Objectives, Knowledge, Veracity, Plans, Resources, and Skills. The Objectives are the desired end-state outcomes. Knowledge is comprised of empirical and theoretical a priori knowledge and emergent knowledge (learned during an attack), while Veracity is the correspondence of Knowledge with fact or outcome. Plans are ordered activity-task sequences (tuples) with logical contingencies. Resources are the a priori and opportunistic physical assets and intangible attributes applied to the execution of associated Plans elements. Skills for both adversary and defender include the assumed general and task competencies for the associated plan set, the realized value of competence in execution or exercise, and the opponent's planning assumption of the task competence.

  20. The impact of expert knowledge on natural hazard susceptibility assessment using spatial multi-criteria analysis

    Science.gov (United States)

    Karlsson, Caroline; Kalantari, Zahra; Mörtberg, Ulla; Olofsson, Bo; Lyon, Steve

    2016-04-01

    Road and railway networks are among the key factors in a country's economic growth. Inadequate infrastructure can be detrimental to a society if transport between locations is hindered or delayed. Logistical hindrances can often be avoided, whereas natural hindrances are more difficult to control. One natural hindrance that can have a severe adverse effect on both infrastructure and society is flooding. Intense and heavy rainfall events can also trigger other natural hazards such as landslides and debris flows. Disruptions caused by landslides are similar to those caused by floods and increase maintenance costs considerably. The effect of natural disasters on society is likely to increase under a changing climate with increasing precipitation; there is therefore a need for risk prevention and mitigation of natural hazards. Determining susceptible areas and incorporating them in the decision process may reduce infrastructural harm. Spatial multi-criteria analysis (SMCA) is a branch of decision analysis which provides a set of procedures for analysing complex decision problems through a Geographic Information System (GIS). The aim of this study was to evaluate the usefulness of expert judgements for inundation, landslide and debris flow susceptibility assessments through an SMCA approach using hydrological, geological and land use factors. The sensitivity of the SMCA model was tested in relation to each perspective and its impact on the resulting susceptibility. A least-cost-path function was used to compare new alternative road lines with the existing ones. This comparison was undertaken to identify the resulting differences in the susceptibility assessments using expert judgements, as well as historic incidences of flooding and landslides, in order to discuss the usefulness of the model in road planning.
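
    At its core, a raster-based SMCA is a weighted linear combination of normalized criterion layers. A minimal sketch with invented criteria and expert weights:

```python
# Sketch of a weighted-linear-combination SMCA on toy raster layers.
import numpy as np

rng = np.random.default_rng(3)
layers = {                       # criteria normalized to 0-1 susceptibility
    "slope": rng.random((4, 4)),
    "soil_type": rng.random((4, 4)),
    "land_use": rng.random((4, 4)),
    "flow_accumulation": rng.random((4, 4)),
}
weights = {"slope": 0.40, "soil_type": 0.25,
           "land_use": 0.15, "flow_accumulation": 0.20}  # expert judgement

susceptibility = sum(w * layers[k] for k, w in weights.items())
print(susceptibility.round(2))   # higher cells = more susceptible
```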

  1. ENVIRONMENTAL IMPACT ASSESSMENT OF ROSIA JIU OPENCAST AREA USING AN INTEGRATED SAR ANALYSIS

    Directory of Open Access Journals (Sweden)

    V. D. Poenaru

    2016-06-01

    Full Text Available Satellite data provide a new perspective for analysing and interpreting environmental impact assessment as a function of topography and vegetation. The main goal of this paper is to investigate the capabilities of the new Staring Spotlight TerraSAR-X mode to monitor land degradation in the Rosia Jiu opencast area, taking into account mining engineering standards and specifications. The second goal is to relate mining activities to the spatio-temporal dynamics of land degradation by using differential Synthetic Aperture Radar interferometry (DInSAR). The experimental analysis was carried out on data acquired in the framework of the LAN_2277 scientific proposal during the 2014-2015 period. A set of 25 very high resolution SAR images gathered in VV polarisation mode with a resolution of 0.45 m x 0.16 m and an incidence angle of 37° has been used in this study. Preliminary results showed that the altered terrain topography, with steep slopes and deep pits, has led to layover of the radar signal. Initially, ambiguous results were obtained due to the highly dynamic character of the subsidence induced by activities that imply mass mining methods. By increasing the number of SAR scenes, the land degradation assessment has been improved. Most of the interferometric pairs have low coherence, so the product coherence threshold was set to 0.3. A coherent and non-coherent analysis is performed to delineate land cover changes and complement the deformation model. Thus, the environmental impact of mining activities is better studied. Moreover, the monitoring of changes in pit depths, heights of stock-piles and waste dumps, and levels of tailing dumps provides additional information about production data.

  2. Environmental Impact Assessment of Rosia Jiu Opencast Area Using an Integrated SAR Analysis

    Science.gov (United States)

    Poenaru, V. D.; Negula, I. F. Dana; Badea, A.; Cuculici, R.

    2016-06-01

    Satellite data provide a new perspective for analysing and interpreting environmental impact assessment as a function of topography and vegetation. The main goal of this paper is to investigate the capabilities of the new Staring Spotlight TerraSAR-X mode to monitor land degradation in the Rosia Jiu opencast area, taking into account mining engineering standards and specifications. The second goal is to relate mining activities to the spatio-temporal dynamics of land degradation by using differential Synthetic Aperture Radar interferometry (DInSAR). The experimental analysis was carried out on data acquired in the framework of the LAN_2277 scientific proposal during the 2014-2015 period. A set of 25 very high resolution SAR images gathered in VV polarisation mode with a resolution of 0.45 m x 0.16 m and an incidence angle of 37° has been used in this study. Preliminary results showed that the altered terrain topography, with steep slopes and deep pits, has led to layover of the radar signal. Initially, ambiguous results were obtained due to the highly dynamic character of the subsidence induced by activities that imply mass mining methods. By increasing the number of SAR scenes, the land degradation assessment has been improved. Most of the interferometric pairs have low coherence, so the product coherence threshold was set to 0.3. A coherent and non-coherent analysis is performed to delineate land cover changes and complement the deformation model. Thus, the environmental impact of mining activities is better studied. Moreover, the monitoring of changes in pit depths, heights of stock-piles and waste dumps, and levels of tailing dumps provides additional information about production data.
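
    Interferometric coherence, the quantity thresholded at 0.3 in the study, is the magnitude of the windowed complex cross-correlation of two co-registered SAR images. A sketch on synthetic data (the window size and decorrelation model are arbitrary):

```python
# Sketch of interferometric coherence estimation with a moving window.
import numpy as np
from scipy.ndimage import uniform_filter

def coherence(a, b, w=5):
    """Magnitude of the complex cross-correlation over a w x w window."""
    cross_r = uniform_filter(np.real(a * np.conj(b)), w)
    cross_i = uniform_filter(np.imag(a * np.conj(b)), w)
    power = uniform_filter(np.abs(a) ** 2, w) * uniform_filter(np.abs(b) ** 2, w)
    return np.sqrt(cross_r ** 2 + cross_i ** 2) / np.sqrt(power)

rng = np.random.default_rng(8)
s1 = rng.normal(size=(64, 64)) + 1j * rng.normal(size=(64, 64))
noise = rng.normal(size=(64, 64)) + 1j * rng.normal(size=(64, 64))
s2 = 0.8 * s1 + 0.6 * noise        # partially decorrelated second acquisition

gamma = coherence(s1, s2)
print(f"fraction of pixels above 0.3: {(gamma > 0.3).mean():.2f}")
```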

  3. Value-Based Assessment of New Medical Technologies: Towards a Robust Methodological Framework for the Application of Multiple Criteria Decision Analysis in the Context of Health Technology Assessment.

    Science.gov (United States)

    Angelis, Aris; Kanavos, Panos

    2016-05-01

    In recent years, multiple criteria decision analysis (MCDA) has emerged as a likely alternative to address shortcomings in health technology assessment (HTA) by offering a more holistic perspective to value assessment and acting as an alternative priority setting tool. In this paper, we argue that MCDA needs to subscribe to robust methodological processes related to the selection of objectives, criteria and attributes in order to be meaningful in the context of healthcare decision making and fulfil its role in value-based assessment (VBA). We propose a methodological process, based on multi-attribute value theory (MAVT) methods comprising five distinct phases, outline the stages involved in each phase and discuss their relevance in the HTA process. Importantly, criteria and attributes need to satisfy a set of desired properties, otherwise the outcome of the analysis can produce spurious results and misleading recommendations. Assuming the methodological process we propose is adhered to, the application of MCDA presents three very distinct advantages to decision makers in the context of HTA and VBA: first, it acts as an instrument for eliciting preferences on the performance of alternative options across a wider set of explicit criteria, leading to a more complete assessment of value; second, it allows the elicitation of preferences across the criteria themselves to reflect differences in their relative importance; and, third, the entire process of preference elicitation can be informed by direct stakeholder engagement, and can therefore reflect their own preferences. All features are fully transparent and facilitate decision making.
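
    Under MAVT, the value of an option is typically an additive aggregation of partial value scores, V(a) = w1·v1(a1) + ... + wn·vn(an). A minimal sketch with hypothetical criteria, swing weights and scores (none of these come from the paper):

```python
# Sketch of an additive MAVT value model for two hypothetical technologies.
criteria = ["efficacy", "safety", "quality_of_evidence", "disease_burden"]
weights = [0.40, 0.25, 0.15, 0.20]           # swing weights, summing to 1

# Partial value scores v_i in [0, 100] from the scoring phase (invented)
performance = {"drug_A": [70, 80, 60, 90],
               "drug_B": [85, 60, 75, 70]}

for option, scores in performance.items():
    value = sum(w * v for w, v in zip(weights, scores))
    print(option, round(value, 1))           # higher = preferred
```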

  4. Gait and function as tools for the assessment of fracture repair - the role of movement analysis for the assessment of fracture healing.

    Science.gov (United States)

    Rosenbaum, Dieter; Macri, Felipe; Lupselo, Fernando Silva; Preis, Osvaldo Cristiano

    2014-06-01

    Assessment of gait and function might be a sensitive tool for monitoring the progress of fracture healing. Currently available assessment tools for function use instrumented three-dimensional gait analysis or pedobarography. The analysis focuses on gait or movement parameters and seeks to identify abnormalities or asymmetries between legs or arms. The additional inclusion of muscle function by electromyography can further elucidate functional performance and its temporal development. Alternative approaches abstain from directly assessing function in the laboratory and instead determine the amount of activities of daily living or the mere ability to perform defined tasks such as walking, stair climbing or running. Some of these methods have been applied to determine recovery after orthopaedic interventions, including fracture repair. The combination of lab-based functional measurements and assessment of physical activities in daily life may offer valuable information about the gait quality and quantity of individual patients, shedding light on functional limitations and on the rehabilitation of gait and mobility after a disease or injury and the respective conservative, medical or surgical treatment.

  5. Sensitivity analysis in a life cycle assessment of an aged red wine production from Catalonia, Spain

    Energy Technology Data Exchange (ETDEWEB)

    Meneses, M., E-mail: montse.meneses@uab.cat [Universitat Autònoma de Barcelona, Systems Engineering and Telecomunication Department, ESE, 08193 Bellaterra (Spain); Torres, C.M.; Castells, F. [Universitat Rovira i Virgili, Departament d' Enginyeria Química, Environmental Analysis and Management Group, AGA, Av. Paisos Catalans 26, 43007 Tarragona (Spain)

    2016-08-15

    Sustainability in agriculture and food processing is an issue of clearly growing interest, especially for products whose consumers are particularly aware of their environmental profile. This is the case for the wine industry, which depends on grape production, winemaking and bottling. Viticulture, and agricultural production in general, is also significantly affected by climate variations. The aim of this article is to determine the environmental load of an aged red wine from a winery in Catalonia, Spain, over its entire life cycle, including a sensitivity analysis of the main parameters related to cultivation, vinification and bottling. The life cycle assessment (LCA) methodology is used for the environmental analysis. In a first step, life cycle inventory (LCI) data were collected by questionnaires and interviews with the winemaker; all data are actual operating data, and all the stages involved in the production have been taken into account (viticulture, vinification, bottling and the disposal subsystem). The data were then used to determine the environmental profile through a life cycle impact assessment using the ReCiPe method. Annual variability in environmental performance stresses the importance of including timeline analysis in the wine sector. This study is therefore accompanied by a sensitivity analysis, carried out by a Monte Carlo simulation that takes into account the uncertainty and variability of the parameters used. In this manner, the results are presented with confidence intervals to provide a wider view of the environmental issues arising from the activities of the studied wine estate, regardless of the eventualities of a specific harvesting year. Since the beverage packaging has an important influence in this case, a dataset for the production of green glass was adapted to reflect the actual recycling situation in Spain. Furthermore, a hypothetical variation of the glass-recycling rate in glass production completes this article, as a key variable
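
    A minimal sketch of the Monte Carlo step: lognormal uncertainty on two inventory flows is propagated to a carbon-footprint estimate with a confidence interval. The flows and characterization factors below are illustrative assumptions, not the study's inventory.

```python
# Sketch of Monte Carlo uncertainty propagation in an LCA result.
import numpy as np

rng = np.random.default_rng(4)
n = 10_000
glass_kg = rng.lognormal(mean=np.log(0.6), sigma=0.10, size=n)       # per bottle
electricity_kwh = rng.lognormal(mean=np.log(0.3), sigma=0.15, size=n)

GWP_GLASS = 0.90    # kg CO2-eq per kg glass (assumed factor)
GWP_ELEC = 0.35     # kg CO2-eq per kWh (assumed factor)

co2 = glass_kg * GWP_GLASS + electricity_kwh * GWP_ELEC
lo, hi = np.percentile(co2, [2.5, 97.5])
print(f"carbon footprint: {co2.mean():.2f} kg CO2-eq (95% CI {lo:.2f}-{hi:.2f})")
```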

  6. Body electrical loss analysis (BELA in the assessment of visceral fat: a demonstration

    Directory of Open Access Journals (Sweden)

    Blomqvist Kim H

    2011-11-01

    Full Text Available Abstract Background Body electrical loss analysis (BELA) is a new non-invasive way to assess visceral fat depot size through the use of electromagnetism. BELA has worked well in phantom measurements, but the technology is not yet fully validated. Methods Ten volunteers (5 men and 5 women, age: 22-60 y, BMI: 21-30 kg/m2, waist circumference: 73-108 cm) were measured with the BELA instrument and with cross-sectional magnetic resonance imaging (MRI) at the navel level, navel +5 cm and navel -5 cm. The BELA signal was compared with visceral and subcutaneous fat areas calculated from the MR images. Results The BELA signal did not correlate with subcutaneous fat area at any level, but correlated significantly with visceral fat area at the navel level and navel +5 cm. The correlation was best at the level of navel +5 cm (R2 = 0.74, LOOCV = 40.1 cm2, where LOOCV is the root mean squared error of leave-one-out style cross-validation). The average estimate of repeatability of the BELA signal observed through the study was ±9.6%. One of the volunteers had an exceptionally large amount of visceral fat, which was underestimated by BELA. Conclusions The correlation of the BELA signal with the visceral but not with the subcutaneous fat area as measured by MRI is promising. The lack of correlation with subcutaneous fat suggests that subcutaneous fat has a minor influence on the BELA signal. Further research will show whether it is possible to develop a reliable low-cost method for the assessment of visceral fat, either using BELA alone or combining it, for example, with bioelectrical impedance measurement. The combination of these measurements may help in assessing visceral fat over a wide range of body compositions. Before large-scale clinical testing and ROC analysis, the initial BELA instrumentation requires improvements; the accuracy of the present equipment is not sufficient for such new technology.
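
    The two error measures used in the study, the standard error of the estimate (SEE) and the leave-one-out RMSE, can be computed for a linear fit of visceral fat area on the BELA signal as sketched below; the ten data points are synthetic stand-ins.

```python
# Sketch of SEE and leave-one-out RMSE for a simple linear calibration.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import LeaveOneOut, cross_val_predict

rng = np.random.default_rng(5)
signal = rng.uniform(0.2, 1.0, size=10).reshape(-1, 1)       # BELA signal
vat = 180 * signal.ravel() + rng.normal(scale=20, size=10)   # VAT area, cm^2

fit = LinearRegression().fit(signal, vat)
resid = vat - fit.predict(signal)
see = np.sqrt(np.sum(resid ** 2) / (len(vat) - 2))  # SEE, 2 fitted parameters

pred = cross_val_predict(LinearRegression(), signal, vat, cv=LeaveOneOut())
loocv = np.sqrt(np.mean((vat - pred) ** 2))
print(f"SEE = {see:.1f} cm2, LOOCV RMSE = {loocv:.1f} cm2")
```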

  7. Objective assessment of motor fatigue in multiple sclerosis using kinematic gait analysis: a pilot study

    Directory of Open Access Journals (Sweden)

    Sehle Aida

    2011-10-01

    Full Text Available Abstract Background Fatigue is a frequent and serious symptom in patients with Multiple Sclerosis (MS). However, to date there are only few methods for the objective assessment of fatigue. The aim of this study was to develop a method for the objective assessment of motor fatigue using kinematic gait analysis based on treadmill walking and an infrared-guided system. Patients and methods Fourteen patients with clinically definite MS participated in this study. Fatigue was defined according to the Fatigue Scale for Motor and Cognition (FSMC). Patients underwent a physical exertion test involving walking at their pre-determined, patient-specific preferred walking speed until they reached complete exhaustion. Gait was recorded using a video camera and a three-line-scanning camera system with 11 infrared sensors. Step length, width and height, maximum circumduction of the right and left leg, maximum knee flexion angle of the right and left leg, and trunk sway were measured and compared using paired t-tests (α = 0.005). In addition, the variability of these parameters during one-minute intervals was examined. The fatigue index was defined as the number of significant mean and SD changes from the beginning to the end of the exertion test relative to the total number of gait kinematic parameters. Results Clearly, for some patients the mean gait parameters were more affected than the variability of their movements, while other patients had smaller differences in mean gait parameters with greater increases in variability. Finally, for other patients gait changes with physical exertion manifested both in changes in mean gait parameters and in altered variability. The variability and fatigue indices correlated significantly with the motoric but not with the cognitive dimension of the FSMC score (R = -0.602 and R = -0.592, respectively). Conclusions Changes in gait patterns following a physical exertion test in patients with MS suffering from motor fatigue can be
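
    A sketch of the fatigue index as defined in the text: the share of gait parameters whose values change significantly (paired t-tests, α = 0.005) between the start and end of the exertion test. The data are simulated, and for brevity only mean changes are counted, whereas the study also counts SD changes.

```python
# Sketch of a paired-t-test-based fatigue index on simulated gait data.
import numpy as np
from scipy.stats import ttest_rel

rng = np.random.default_rng(6)
n_params, n_patients = 9, 14
start = rng.normal(size=(n_params, n_patients))                 # first minute
end = start + rng.normal(loc=0.3, scale=0.5, size=(n_params, n_patients))

alpha = 0.005
significant = sum(ttest_rel(end[i], start[i]).pvalue < alpha
                  for i in range(n_params))
fatigue_index = significant / n_params
print(f"fatigue index = {fatigue_index:.2f}")
```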

  8. Multi-criteria decision analysis for assessment and appraisal of orphan drugs

    Directory of Open Access Journals (Sweden)

    Georgi Iskrov

    2016-09-01

    Full Text Available Background: Limited resources and expanding expectations push all countries and types of health systems to adopt new approaches to priority setting and resource allocation. Despite best efforts, it is difficult to reconcile all competing interests, and trade-offs are inevitable. This is why multi-criteria decision analysis (MCDA) has played a major role in the recent uptake of value-based reimbursement. The MCDA framework enables exploration of stakeholders' preferences, as well as explicit organization of the broad range of criteria on which real-world decisions are made. Assessment and appraisal of orphan drugs tend to be among the most complicated health technology assessment (HTA) tasks. Access to market-approved orphan therapies remains an issue. Early constructive dialogue among rare disease stakeholders and the elaboration of orphan drug-tailored decision support tools could set the scene for ongoing accumulation of evidence, as well as for proper reimbursement decision-making. Objective: The objective of this study was to create an MCDA value measurement model to assess and appraise orphan drugs. This was achieved by exploring preferences on decision criteria weights and performance scores through a stakeholder-representative survey and a focus group discussion, both organized in Bulgaria. Results/Conclusions: Decision criteria that describe the health technology's characteristics were unanimously agreed to be the most important group of reimbursement considerations. This outcome, combined with the high individual weights of the disease severity and disease burden criteria, underlines some of the fundamental principles of healthcare: equity and fairness. Our study showed that strength of evidence may be a key criterion in orphan drug assessment and appraisal. Evidence is not only used to shape reimbursement decision-making, but also to lend legitimacy to the policies pursued. The need for real-world data on orphan drugs was strongly stressed

  9. Higher Education End-of-Course Evaluations: Assessing the Psychometric Properties Utilizing Exploratory Factor Analysis and Rasch Modeling Approaches

    Directory of Open Access Journals (Sweden)

    Kelly D. Bradley

    2016-07-01

    Full Text Available This paper offers a critical assessment of the psychometric properties of a standard higher education end-of-course evaluation. Using both exploratory factor analysis (EFA) and Rasch modeling, the authors investigate (a) an overall assessment of dimensionality using EFA, (b) a secondary assessment of dimensionality using a principal components analysis (PCA) of the residuals when the items are fit to the Rasch model, and (c) an assessment of item-level properties using the item-level statistics provided when the items are fit to the Rasch model. The results support the use of the scale as a supplement to high-stakes decision making such as tenure. However, the lack of precise targeting of item difficulty to person ability, combined with the low person separation index, renders rank-ordering professors according to minuscule differences in overall subscale scores a highly questionable practice.

  10. Chemical analysis of World Trade Center fine particulate matter for use in toxicologic assessment.

    Science.gov (United States)

    McGee, John K; Chen, Lung Chi; Cohen, Mitchell D; Chee, Glen R; Prophete, Colette M; Haykal-Coates, Najwa; Wasson, Shirley J; Conner, Teri L; Costa, Daniel L; Gavett, Stephen H

    2003-06-01

    The catastrophic destruction of the World Trade Center (WTC) on 11 September 2001 caused the release of high levels of airborne pollutants into the local environment. To assess the toxicity of fine particulate matter [particulate matter with a mass median aerodynamic diameter < 2.5 µm; PM2.5] in WTC dust, the PM2.5 fraction was isolated on filters. Here we report the chemical and physical properties of PM2.5 derived from these samples and compare them with PM2.5 fractions of three reference materials that range in toxicity from relatively inert to acutely toxic (Mt. St. Helens PM; Washington, DC, ambient air PM; and residual oil fly ash). X-ray diffraction of very coarse sieved WTC PM indicated the presence of gypsum and calcite in this fraction. Analysis of WTC PM2.5 using X-ray fluorescence, neutron activation analysis, and inductively coupled plasma spectrometry showed high levels of calcium (range, 22-33%) and sulfur (37-43% as sulfate) and much lower levels of transition metals and other elements. Aqueous extracts of WTC PM2.5 were basic (pH range, 8.9-10.0) and had no evidence of significant bacterial contamination. Levels of carbon were relatively low, suggesting that combustion-derived particles did not form a significant fraction of these samples recovered in the immediate aftermath of the destruction of the towers. Because gypsum and calcite are known to cause irritation of the mucus membranes of the eyes and respiratory tract, inhalation of high doses of WTC PM2.5 could potentially cause toxic respiratory effects.

  11. Heroin shortage in Coastal Kenya: A rapid assessment and qualitative analysis of heroin users’ experiences

    Science.gov (United States)

    Mital, Sasha; Miles, Gillian; McLellan-Lemal, Eleanor; Muthui, Mercy; Needle, Richard

    2016-01-01

    Introduction While relatively rare, abrupt disruptions in heroin availability have a significant impact on morbidity and mortality risk among those who are heroin dependent. A heroin shortage occurred in Coast Province, Kenya from December 2010 to March 2011. This qualitative analysis describes the shortage events and consequences from the perspective of heroin users, along with implications for health and other public sectors. Methods As part of a rapid assessment, 66 key informant interviews and 15 focus groups among heroin users in Coast Province, Kenya were conducted. A qualitative thematic analysis was undertaken in Atlas.ti to identify salient themes related to the shortage. Results Overall, participant accounts were rooted in a theme of desperation and uncertainty, with emphasis on six sub-themes: (1) withdrawal and strategies for alleviating withdrawal, including use of medical intervention and other detoxification attempts; (2) challenges of dealing with unpredictable drug availability, cost, and purity; (3) changes in drug use patterns, and actions taken to procure heroin and other drugs; (4) modifications in drug user relationship dynamics and networks, including introduction of risky group-level injection practices; (5) family and community response; and (6) new challenges with the heroin market resurgence. Conclusions The heroin shortage led to a series of consequences for drug users, including increased risk of morbidity, mortality and disenfranchisement at social and structural levels. Availability of evidence-based services for drug users and emergency preparedness plans could have mitigated this impact. PMID:26470646

  12. ANALYSIS OF METHODS OF RISK ASSESSMENT OF INVESTMENT IN PERSONNEL DEVELOPMENT

    Directory of Open Access Journals (Sweden)

    О. Domkina

    2015-06-01

    Full Text Available In this article, we study the main approaches to the assessment of investment risk, aiming to find the most appropriate method for estimating the risks of investment in personnel development while considering the human factor. We analyze the pros and cons of the existing methods. As a result, we suggest using a combination of expert and ranking methods, as it provides wide opportunities for risk factor analysis in situations of data scarcity, despite these methods' limitation of subjective expert judgments, which can, however, be reduced by some of the advanced expert methods; a sketch of this combination is given below. Additionally, we consider the application of the analytical method, which provides factor analysis and a foundation for the further management of these risk factors. The use of the statistical group of methods, although promising, is not yet feasible in practice because of the paucity of the required data and the difficulty of obtaining it from companies, which have no incentive to provide such sensitive information. Logically, the next step of the research should be a practical application of the listed methods, a test of the presented hypotheses, and an evaluation of the obtained results, with emphasis on the quality of risk indicators, data demands, utility, and the complexity of the methods' practical application.
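
    A minimal sketch of the suggested expert-plus-ranking combination: experts rank risk factors, and mean ranks are inverted into normalized weights. The factor names and ranks are invented for illustration.

```python
# Sketch of aggregating expert rankings into risk-factor weights.
import numpy as np

factors = ["turnover risk", "skill obsolescence", "low training quality"]
# ranks from three experts (1 = most important)
ranks = np.array([[1, 2, 3],
                  [1, 3, 2],
                  [2, 1, 3]])

mean_rank = ranks.mean(axis=0)
weights = mean_rank.max() + 1 - mean_rank   # invert: low rank -> high weight
weights /= weights.sum()                    # normalize to sum to 1

for f, w in zip(factors, weights.round(2)):
    print(f, w)
```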

  13. Quantitative Analysis of Ductile Iron Microstructure – A Comparison of Selected Methods for Assessment

    Directory of Open Access Journals (Sweden)

    B. Mrzygłód

    2013-07-01

    Full Text Available Stereological description of dispersed microstructure is not an easy task and remains the subject of continuous research. In its practical aspect, a correct stereological description of this type of structure is essential for the analysis of coagulation and spheroidisation processes, and for studies of relationships between structure and properties. One of the most frequently used methods for estimating the density Nv and size distribution of particles is the Scheil-Schwartz-Saltykov method. In this article, the authors present selected methods for the quantitative assessment of ductile iron microstructure: the Scheil-Schwartz-Saltykov method, which allows a quantitative description of three-dimensional sets of solids using measurements and counts performed on two-dimensional cross-sections of these sets (microsections), and the quantitative description of three-dimensional sets of solids by X-ray computed microtomography, which is an interesting alternative to traditional methods of microstructure imaging for structural studies, since the analysis provides three-dimensional imaging of the microstructures examined.
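
    For spheres, the Scheil-Schwartz-Saltykov unfolding follows from the geometric probability that a sphere of diameter D_j, when cut by a random plane, yields a section whose diameter falls in class i: P(i|j) = (sqrt(D_j^2 - d_i^2) - sqrt(D_j^2 - d_{i+1}^2)) / D_j. Together with the fact that a unit-area plane intersects N_V(j)·D_j spheres of class j per unit volume, this gives a triangular system for the volumetric densities. A compact sketch with synthetic section counts:

```python
# Sketch of Saltykov unfolding: 2-D section counts -> 3-D sphere densities.
import numpy as np

def saltykov(n_a, d_max, n_bins):
    """n_a: per-area counts of section diameters per class (largest last)."""
    edges = np.linspace(0.0, d_max, n_bins + 1)
    D = edges[1:]                                  # class upper diameters
    P = np.zeros((n_bins, n_bins))
    for j in range(n_bins):                        # sphere class j
        for i in range(j + 1):                     # section class i <= j
            a = np.sqrt(D[j] ** 2 - edges[i] ** 2)
            b = np.sqrt(D[j] ** 2 - edges[i + 1] ** 2) if i < j else 0.0
            P[i, j] = (a - b) / D[j]               # section-class probability
    # N_A[i] = sum_j P[i, j] * D[j] * N_V[j]  ->  triangular system
    M = P * D[np.newaxis, :]
    return np.linalg.solve(M, n_a)

n_a = np.array([8.0, 15.0, 11.0, 4.0])             # sections per unit area
# Negative outputs would indicate inconsistent counts (usually set to zero).
print(saltykov(n_a, d_max=40.0, n_bins=4))         # spheres per unit volume
```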

  14. Kinetic and Kinematic Analysis for Assessing the Differences in Countermovement Jump Performance in Rugby Players.

    Science.gov (United States)

    Floría, Pablo; Gómez-Landero, Luis A; Suárez-Arrones, Luis; Harrison, Andrew J

    2016-09-01

    Floría, P, Gómez-Landero, LA, Suárez-Arrones, L, and Harrison, AJ. Kinetic and kinematic analysis for assessing the differences in countermovement jump performance in rugby players. J Strength Cond Res 30(9): 2533-2539, 2016. The aim of this study was to ascertain the differences in kinetic and kinematic profiles between better and poorer performers of the vertical jump within a homogeneous group of trained adults. Fifty rugby players were divided into low-scoring (LOW) and high-scoring (HIGH) groups based on their performance in the vertical jump. The force-, velocity-, displacement-, and rate of force development (RFD)-time curves were analyzed to determine the differences between groups. The analysis of the data showed differences in all the patterns of the ensemble mean curves of the HIGH and LOW groups. During the eccentric phase, the differences in the HIGH group with respect to the LOW group were a lower crouch position, higher downward velocity, and higher force and RFD during the braking of the downward movement. During the concentric phase, the HIGH group achieved higher upward velocity, higher force at the end of the phase, and a higher position at takeoff. The higher jump performances seem to be related to a more effective stretch-shortening cycle function, characterized by a deeper and faster countermovement with higher eccentric forces applied to decelerate the downward movement, leading to enhanced force generation during the concentric phase.
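
    Force-time curves from a force plate are typically turned into the velocity and displacement curves analyzed in such studies by impulse-momentum integration. A minimal sketch, assuming a vertical-force record that starts from quiet standing; the 10 N takeoff threshold and the file name are illustrative assumptions, not the study's exact processing:

        import numpy as np

        def cmj_metrics(force, fs, mass, g=9.81):
            """Integrate vertical ground-reaction force into velocity and
            displacement curves and derive simple CMJ metrics."""
            acc = force / mass - g                  # net vertical acceleration
            vel = np.cumsum(acc) / fs               # assumes v = 0 at start
            disp = np.cumsum(vel) / fs
            rfd = np.gradient(force) * fs           # rate of force development, N/s
            takeoff = np.flatnonzero(force < 10.0)[0]  # assumed 10 N threshold
            v_to = vel[takeoff - 1]                 # takeoff velocity
            return {"jump_height_m": v_to**2 / (2 * g),
                    "countermovement_depth_m": -disp.min(),
                    "peak_rfd_N_per_s": rfd.max()}

        # Hypothetical 1000-Hz recording of a 90-kg player:
        # force = np.loadtxt("force_plate_trial.txt")   # vertical force, N
        # print(cmj_metrics(force, fs=1000, mass=90.0))

    Jump height follows from the takeoff velocity as v^2 / 2g, which is why differences in the concentric velocity curve translate directly into performance differences.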

  15. Automatic roof plane detection and analysis in airborne lidar point clouds for solar potential assessment.

    Science.gov (United States)

    Jochem, Andreas; Höfle, Bernhard; Rutzinger, Martin; Pfeifer, Norbert

    2009-01-01

    A relative height threshold is defined to separate potential roof points from the point cloud, followed by a segmentation of these points into homogeneous areas fulfilling the defined constraints of roof planes. The normal vector of each laser point is an excellent feature for decomposing the point cloud into segments describing planar patches. An object-based error assessment is performed to determine the accuracy of the presented classification; it results in 94.4% completeness and 88.4% correctness. Once all roof planes are detected in the 3D point cloud, solar potential analysis is performed for each point. Shadowing effects of nearby objects are taken into account by calculating the horizon of each point within the point cloud. Effects of cloud cover are also considered by using data from a nearby meteorological station. As a result, the annual sum of the direct and diffuse radiation for each roof plane is derived. The presented method uses the full 3D information for both feature extraction and solar potential analysis, which offers a number of new applications in fields where natural processes are influenced by the incoming solar radiation (e.g., evapotranspiration, distribution of permafrost). The method fully automatically detected a subset of 809 of the 1,071 roof planes where the arithmetic mean of the annual incoming solar radiation is more than 700 kWh/m².
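
    The completeness and correctness figures quoted above follow the standard object-based definitions, TP/(TP+FN) and TP/(TP+FP). A minimal sketch with hypothetical plane IDs, assuming the matching of detected to reference planes has already been done:

        def object_based_accuracy(detected, reference):
            """Completeness and correctness for object-based error assessment.
            `detected` and `reference` are sets of plane IDs; a detected
            plane counts as a true positive when it matches a reference one."""
            tp = len(detected & reference)
            completeness = tp / len(reference)      # TP / (TP + FN)
            correctness = tp / len(detected)        # TP / (TP + FP)
            return completeness, correctness

        # Hypothetical matched roof-plane IDs:
        print(object_based_accuracy(detected={1, 2, 3, 5},
                                    reference={1, 2, 3, 4}))  # -> (0.75, 0.75)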

  16. Hair analysis: another approach for the assessment of human exposure to selected persistent organochlorine pollutants.

    Science.gov (United States)

    Covaci, Adrian; Tutudaki, Maria; Tsatsakis, Aristidis M; Schepens, Paul

    2002-01-01

    Hair analysis was used for the assessment of exposure to organochlorine pollutants in specimens from Greece, Romania and Belgium. A simple method (using 3 N HCl as incubation reagent, liquid-liquid extraction with hexane/dichloromethane (DCM), alumina/acid silica clean-up and GC-ECD/GC-MS analysis) was used for screening of specimens. The highest organochlorine load (up to 148 ng/g hair for the sum of PCB, DDT and hexachlorocyclohexane (HCH) isomers) was found in samples from a group of Greek women with past occupational exposure to pesticides. DDTs were the main organochlorine pollutants in Greek samples (up to 70%), while in Belgian hair samples their contribution was reduced to 40%. PCB mean concentration was higher in Belgian specimens (up to 14 ng/g hair). Lindane (γ-HCH) was the main HCH isomer found in the samples (up to 82% in the Greek samples). The contribution of p,p'-DDT to the sum of DDTs was higher in Greek samples and indicates recent exposure to technical DDT. Similar PCB 153/sum PCBs ratios were found for each of the three countries, suggesting similar sources of pollution with PCBs (mainly dietary). Artificially coloured hair samples were found to have lower, though not statistically significantly lower, concentrations of organochlorine pollutants than non-coloured hair.

  17. Hair analysis used to assess chronic exposure to the organophosphate diazinon: a model study with rabbits.

    Science.gov (United States)

    Tutudaki, Maria; Tsakalof, Andreas K; Tsatsakis, Aristidis M

    2003-03-01

    The main purpose of the present study was to determine whether hair analysis would be a suitable method to assess chronic exposure of rabbits to the pesticide diazinon. A controlled study was designed in which white rabbits of the New Zealand variety were systemically exposed to two dosage levels (15 mg/kg per day and 8 mg/kg per day) of the pesticide, through their drinking water, for a period of 4 months. Hair samples from the backs of the rabbits were taken before commencing the experiment and at the end of the dosing period. Parallel experiments with spiked hair were carried out in order to design a simple and efficient method for extracting diazinon from hair. The hair was pulverized in a ball-mill homogenizer, incubated in methanol at 37 °C overnight, liquid-liquid extracted with ethyl acetate, and measured by gas chromatography (GC-NPD, with GC-MS for confirmation). The concentration of diazinon in the hair of the exposed animals ranged from 0.11 to 0.26 ng/mg hair. It was concluded that there is a relationship between the administered dose and the detected pesticide concentration in hair. Hair analysis may therefore be used to investigate chronic exposure to the pesticide.

  18. A method for rapid quantitative assessment of biofilms with biomolecular staining and image analysis.

    Science.gov (United States)

    Larimer, Curtis; Winder, Eric; Jeters, Robert; Prowant, Matthew; Nettleship, Ian; Addleman, Raymond Shane; Bonheyo, George T

    2016-01-01

    The accumulation of bacteria in surface-attached biofilms can be detrimental to human health, dental hygiene, and many industrial processes. Natural biofilms are soft and often transparent, and they have heterogeneous biological composition and structure over micro- and macroscales. As a result, it is challenging to quantify the spatial distribution and overall intensity of biofilms. In this work, a new method was developed to enhance the visibility and quantification of bacterial biofilms. First, broad-spectrum biomolecular staining was used to enhance the visibility of the cells, nucleic acids, and proteins that make up biofilms. Then, an image analysis algorithm was developed to objectively and quantitatively measure biofilm accumulation from digital photographs and results were compared to independent measurements of cell density. This new method was used to quantify the growth intensity of Pseudomonas putida biofilms as they grew over time. This method is simple and fast, and can quantify biofilm growth over a large area with approximately the same precision as the more laborious cell counting method. Stained and processed images facilitate assessment of spatial heterogeneity of a biofilm across a surface. This new approach to biofilm analysis could be applied in studies of natural, industrial, and environmental biofilms.
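
    A minimal sketch of this kind of image-based quantification, assuming stained biofilm renders darker than the substrate and using a global Otsu threshold; the file name and the darkness assumption are illustrative, not the authors' algorithm:

        import numpy as np
        from skimage import io, color, filters

        def biofilm_coverage(path):
            """Estimate stained-biofilm coverage and mean stain intensity
            from a digital photograph (darker pixels assumed stained)."""
            gray = color.rgb2gray(io.imread(path))   # 0 (black) .. 1 (white)
            th = filters.threshold_otsu(gray)        # global threshold
            mask = gray < th                         # stained pixels
            coverage = mask.mean()                   # fraction of area covered
            intensity = (1.0 - gray[mask]).mean() if mask.any() else 0.0
            return coverage, intensity

        # cov, inten = biofilm_coverage("coupon_day7.jpg")  # hypothetical image

    Averaging the mask over tiles of the image, rather than globally, would give the spatial-heterogeneity map the abstract mentions.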

  19. Assessment of Slope Instability and Risk Analysis of Road Cut Slopes in Lashotor Pass, Iran

    Directory of Open Access Journals (Sweden)

    Mohammad Hossein Taherynia

    2014-01-01

    Full Text Available Assessment of the stability of natural and artificial rock slopes is an important topic in rock mechanics. One of the most widely used methods for this purpose is classification of the slope rock mass. In recent decades, several rock slope classification systems have been presented by many researchers. Each of these systems uses different parameters and ratings, differences that arise from the diversity of the influencing parameters and their degree of influence on rock slope stability. Another important aspect of rock slope stability is hazard appraisal and risk analysis, in which the degree of danger of rock slope instability is determined. The Lashotor pass is located on the Shiraz-Isfahan highway in Iran. Field surveys indicate a high potential for instability in the road cut slopes of the Lashotor pass. In the current paper, the stability of the rock slopes in the Lashotor pass is studied comprehensively with different classification methods. For the risk analysis, we estimated the endangered area using the RocFall software. Furthermore, the danger of falling rocks to vehicles passing through the Lashotor pass is estimated according to the rockfall hazard rating system.

  20. APPLICATION OF LASER SCANNING SURVEYING TO ROCK SLOPES RISK ASSESSMENT ANALYSIS

    Directory of Open Access Journals (Sweden)

    M. Corsetti

    2014-01-01

    Full Text Available Methods for understanding rock instability mechanisms and for evaluating potential destructive scenarios are of great importance in risk assessment analysis dedicated to the establishment of appropriate prevention and mitigation actions. When the portion of the unstable rock mass is very large, effective actions to counteract the risks are complex and expensive. In these conditions, an optimal risk management cannot ignore procedures able to acquire, rapidly and accurately, (i) geometrical data for modeling the geometry of the rock walls and implementing reliable forecasting models and (ii) monitoring data able to describe the magnitude and direction of deformation processes. These data contribute to the prediction of the behavior of a landslide if the measurements are acquired frequently and reliable numerical models can be implemented. Innovative geomatic techniques, based on GPS, Terrestrial Laser Scanning Surveying (TLS), automated total stations, and satellite and ground SAR interferometry, have recently been applied to define the geometry and monitor the displacements of unstable slopes. Among these, TLS is mainly adopted to generate detailed 3D models useful for reconstructing rock wall geometry and contributing to the estimation of geo-mechanical parameters, that is, the orientation, persistence, and apparent spacing of rock discontinuities. Two examples of application of the TLS technique are presented: the analysis of a large front in a quarry and of a rock shoulder of a dam.

  1. Comparative analysis of assessment methods for operational and anesthetic risks in ulcerative gastroduodenal bleeding

    Directory of Open Access Journals (Sweden)

    Potakhin S.N.

    2015-09-01

    Full Text Available Aim of the investigation: to conduct a comparative analysis of methods for evaluating surgical and anesthetic risks in ulcerative gastroduodenal bleeding. Materials and methods. A retrospective analysis of the extent of the surgical and anesthetic risks and the results of treatment of 71 patients with peptic ulcer bleeding was conducted. To evaluate the surgical and anesthetic risks, classification trees, the T.A. Rockall scale, and the System for Prognosis of Rebleeding (SPRK) proposed by N.V. Lebedev et al. in 2009, which enables evaluation of the probability of a fatal outcome, were used. To compare the efficacy of the methods, the following indicators were used: sensitivity, specificity, and positive predictive value. Results. The study compared the results of assessing the risk of emergency surgery by these methods with the outcome of the operation. The comparison of sensitivity leads to the conclusion that the T.A. Rockall scale and SPRK are worse than the classification-tree method in recognizing patients with a poor outcome of surgery. Conclusion. The method of classification trees can be considered the most accurate method for evaluating surgical and anesthetic risks in ulcerative gastroduodenal bleeding.
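
    The three comparison indicators reduce to simple counts of predicted versus observed outcomes. A minimal sketch with hypothetical binary data (1 = poor outcome), not the study's patient data:

        def diagnostic_metrics(predicted, observed):
            """Sensitivity, specificity, and positive predictive value
            for a binary risk prediction."""
            pairs = list(zip(predicted, observed))
            tp = sum(p and o for p, o in pairs)
            tn = sum(not p and not o for p, o in pairs)
            fp = sum(p and not o for p, o in pairs)
            fn = sum(not p and o for p, o in pairs)
            return {"sensitivity": tp / (tp + fn),
                    "specificity": tn / (tn + fp),
                    "ppv": tp / (tp + fp)}

        # Hypothetical predictions for 8 patients vs. actual poor outcomes:
        print(diagnostic_metrics([1, 1, 0, 0, 1, 0, 0, 1],
                                 [1, 0, 0, 0, 1, 1, 0, 1]))
        # -> {'sensitivity': 0.75, 'specificity': 0.75, 'ppv': 0.75}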

  2. Multiscale Entropy Analysis of Heart Rate Variability for Assessing the Severity of Sleep Disordered Breathing

    Directory of Open Access Journals (Sweden)

    Wen-Yao Pan

    2015-01-01

    Full Text Available Obstructive sleep apnea (OSA) is an independent cardiovascular risk factor to which autonomic nervous dysfunction has been reported to be an important contributor. Ninety subjects recruited from the sleep center of a single medical center were divided into four groups: normal snoring subjects without OSA (apnea hypopnea index, AHI < 5, n = 11), mild OSA (5 ≤ AHI < 15, n = 10), moderate OSA (15 ≤ AHI < 30, n = 24), and severe OSA (AHI ≥ 30, n = 45). Demographic (i.e., age, gender), anthropometric (i.e., body mass index, neck circumference), and polysomnographic (PSG) data were recorded and compared among the different groups. For each subject, R-R intervals (RRI) from 10 segments of 10-minute electrocardiogram recordings during non-rapid eye movement sleep at stage N2 were acquired and analyzed for heart rate variability (HRV) and sample entropy using a multiscale entropy index (MEI) that was divided into small scale (MEISS, scales 1–5) and large scale (MEILS, scales 6–10). Our results not only demonstrated that MEISS could successfully distinguish normal snoring subjects and those with mild OSA from those with moderate and severe disease, but also revealed good correlation between MEISS and AHI with Spearman correlation analysis (r = −0.684, p < 0.001). Therefore, using the two parameters of EEG and ECG, MEISS may serve as a simple preliminary screening tool for assessing the severity of OSA before proceeding to PSG analysis.
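
    Multiscale entropy is sample entropy computed on successively coarse-grained copies of the RRI series. A minimal sketch, assuming the common defaults m = 2 and a tolerance of 0.15 standard deviations, and assuming MEISS/MEILS are means of sample entropy over scales 1-5 and 6-10 (the paper's exact settings are not restated here); the naive O(n^2) pair count is adequate for 10-minute RRI segments:

        import numpy as np

        def sample_entropy(x, m=2, r_factor=0.15):
            """Naive sample entropy with Chebyshev distance."""
            x = np.asarray(x, float)
            r = r_factor * x.std()
            n_t = len(x) - m                  # same template count for m, m+1
            def pairs(mm):
                emb = np.lib.stride_tricks.sliding_window_view(x, mm)[:n_t]
                d = np.abs(emb[:, None, :] - emb[None, :, :]).max(axis=2)
                iu = np.triu_indices(n_t, k=1)    # exclude self-matches
                return np.count_nonzero(d[iu] <= r)
            return -np.log(pairs(m + 1) / pairs(m))

        def coarse_grain(x, tau):
            n = len(x) // tau
            return np.asarray(x[:n * tau], float).reshape(n, tau).mean(axis=1)

        def mei(rri, scales):
            return np.mean([sample_entropy(coarse_grain(rri, t)) for t in scales])

        # rri = np.loadtxt("rri_stage_n2.txt")      # hypothetical RRI series
        # mei_ss, mei_ls = mei(rri, range(1, 6)), mei(rri, range(6, 11))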

  3. Assessment of trabecular bone changes around endosseous implants using image analysis techniques: A preliminary study

    Energy Technology Data Exchange (ETDEWEB)

    Zuki, Mervet El [Dept. of Oral Medicine and Radiology, Benghazi University College of Dentistry, Benghazi (Libya); Omami, Galal [Oral Diagnosis and Polyclinics, Faculty of Dentistry, The University of Hong Kong (Hong Kong); Horner, Keith [Dept. of Oral Radiology, University Dental Hospital of Manchester, Manchester (United Kingdom)

    2014-06-15

    The objective of this study was to assess the trabecular bone changes that occurred around functional endosseous dental implants by means of radiographic image analysis techniques. Immediate preoperative and postoperative periapical radiographs of de-identified implant patients at the University Dental Hospital of Manchester were retrieved, screened for specific inclusion criteria, digitized, and quantified for structural elements of the trabecular bone around the endosseous implants, by using image analysis techniques. Data were analyzed using SPSS version 11.5. P values of less than 0.05 were considered statistically significant. A total of 12 implants from 11 patients were selected for the study, and 26 regions of interest were obtained. There was a significant increase in the bone area in terms of the mean distance between nodes (p=0.006) and a significant decrease in the marrow area in terms of the bone area (p=0.006) and the length of marrow spaces (p=0.032). It appeared that the bone around the implant underwent remodeling that resulted in a net increase in bone after implant placement.

  4. Assessment of Infrared Sounder Radiometric Noise from Analysis of Spectral Residuals

    Science.gov (United States)

    Dufour, E.; Klonecki, A.; Standfuss, C.; Tournier, B.; Serio, C.; Masiello, G.; Tjemkes, S.; Stuhlmann, R.

    2016-08-01

    For the preparation and performance monitoring of the future generation of hyperspectral infrared sounders dedicated to the precise vertical profiling of the atmospheric state, such as the Meteosat Third Generation hyperspectral InfraRed Sounder (MTG-IRS), a reliable assessment of the instrument radiometric error covariance matrix is needed. Ideally, an in-flight estimation of the radiometric noise is recommended, as certain sources of noise can be driven by the spectral signature of the observed Earth/atmosphere radiance. Also, unknown correlated noise sources, generally related to incomplete knowledge of the instrument state, can be present, so a characterisation of the noise spectral correlation is also needed. A methodology, relying on the analysis of post-retrieval spectral residuals, is designed and implemented to derive the covariance matrix in flight on the basis of Earth scene measurements. This methodology is successfully demonstrated using IASI observations as MTG-IRS proxy data and made it possible to highlight anticipated correlation structures explained by apodization and micro-vibration effects (ghosts). This analysis is corroborated by a parallel estimation based on an IASI black body measurement dataset and the results of an independent micro-vibration model.
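
    Given a stack of post-retrieval spectral residuals, the radiometric error covariance and its correlation structure follow directly. A minimal sketch, assuming residuals arranged as an n_spectra x n_channels array; variable names are illustrative:

        import numpy as np

        def noise_covariance(residuals):
            """Estimate the radiometric error covariance and correlation
            from post-retrieval spectral residuals (n_spectra x n_channels)."""
            cov = np.cov(residuals, rowvar=False)   # channel-by-channel covariance
            sd = np.sqrt(np.diag(cov))
            corr = cov / np.outer(sd, sd)           # spectral noise correlation
            return cov, corr

        # residuals = observed_spectra - forward_model(retrieved_state)  # hypothetical
        # cov, corr = noise_covariance(residuals)
        # Off-diagonal structure in `corr` reveals, e.g., apodization-induced
        # correlation between neighbouring channels.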

  5. Uncertainty Analysis for Peer Assessment: Oral Presentation Skills for Final Year Project

    Science.gov (United States)

    Kim, Ho Sung

    2014-01-01

    Peer assessment plays an important role in engineering education, promoting active involvement in the assessment process, developing autonomy, enhancing reflection, and building understanding of how to achieve the learning outcomes. Peer assessment uncertainty for oral presentation skills as part of the final year project (FYP) assessment is studied. Validity and reliability for…

  6. Sensitivity analysis in a life cycle assessment of an aged red wine production from Catalonia, Spain.

    Science.gov (United States)

    Meneses, M; Torres, C M; Castells, F

    2016-08-15

    Sustainability in agriculture and food processing is an issue of clearly growing interest, especially for products whose consumers are particularly aware of their environmental profile. This is the case for the wine industry, which depends on grape production, winemaking, and bottling. Viticulture, and agricultural production generally, is also significantly affected by climate variations. The aim of this article is to determine the environmental load of an aged red wine from a winery in Catalonia, Spain, over its entire life cycle, including a sensitivity analysis of the main parameters related to cultivation, vinification, and bottling. The life cycle assessment (LCA) methodology is used for the environmental analysis. In a first step, life cycle inventory (LCI) data were collected by questionnaires and interviews with the winemaker; all data are actual operating data, and all the stages involved in production have been taken into account (viticulture, vinification, bottling, and the disposal subsystem). The data were then used to determine the environmental profile by a life cycle impact assessment using the ReCiPe method. Annual variability in environmental performance stresses the importance of including timeline analysis in the wine sector. This study is therefore accompanied by a sensitivity analysis, carried out by a Monte Carlo simulation, that takes into account the uncertainty and variability of the parameters used. In this manner, the results are presented with confidence intervals to provide a wider view of the environmental issues derived from the activities of the studied wine estate regardless of the eventualities of a specific harvesting year. Since the beverage packaging has an important influence in this case, a dataset for the production of green glass was adapted to reflect the actual recycling situation in Spain. Furthermore, a hypothetical variation of the glass-recycling rate in the glass production completes this article, as a key variable
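
    A Monte Carlo sensitivity analysis of an LCA propagates parameter distributions through the impact model and reports percentile intervals. A minimal sketch with a toy linear impact model and invented parameter values; the study's actual inventory and the ReCiPe characterization factors are not reproduced here:

        import numpy as np

        rng = np.random.default_rng(42)
        N = 10_000                                     # Monte Carlo draws

        # Hypothetical inventory parameters per bottle (illustrative only):
        glass_kg = rng.normal(0.55, 0.05, N)           # bottle mass
        recycled = rng.uniform(0.4, 0.7, N)            # glass recycling rate
        diesel_l = rng.triangular(0.02, 0.03, 0.05, N) # vineyard diesel use

        # Toy linear impact model (kg CO2-eq per unit of each input):
        impact = glass_kg * (1.2 * (1 - recycled) + 0.6 * recycled) + diesel_l * 3.2

        lo, hi = np.percentile(impact, [2.5, 97.5])
        print(f"GWP per bottle: {impact.mean():.3f} kg CO2-eq "
              f"(95% CI {lo:.3f} to {hi:.3f})")

    Varying one parameter's distribution at a time, such as the recycling rate, while holding the others fixed reproduces the kind of glass-recycling sensitivity test the abstract describes.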

  7. Bayesian Analysis for Food-Safety Risk Assessment: Evaluation of Dose-Response Functions within WinBUGS

    OpenAIRE

    Williams, Michael S.; Ebel, Eric D.; Hoeting, Jennifer A.

    2011-01-01

    Bayesian methods are becoming increasingly popular in the field of food-safety risk assessment. Risk assessment models often require the integration of a dose-response function over the distribution of all possible doses of a pathogen ingested with a specific food. This requires the evaluation of an integral for every sample for a Markov chain Monte Carlo analysis of a model. While many statistical software packages have functions that allow for the evaluation of the integral, this functional...
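
    The integral in question, the expected illness probability over the dose distribution, can be evaluated by Monte Carlo; inside a Bayesian sampler it must be recomputed at every MCMC iteration, which is why its cost matters. A minimal sketch with an exponential dose-response model and invented parameters, not the paper's models:

        import numpy as np

        rng = np.random.default_rng(7)

        def expected_risk(r, mu, sigma, n=100_000):
            """Integrate an exponential dose-response P(ill|d) = 1 - exp(-r d)
            over a lognormal dose distribution by Monte Carlo."""
            doses = rng.lognormal(mean=mu, sigma=sigma, size=n)  # CFU/serving
            return np.mean(1.0 - np.exp(-r * doses))

        # Hypothetical parameters (illustrative only):
        print(expected_risk(r=1e-3, mu=2.0, sigma=1.5))  # mean illness probability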

  8. Application of a utility analysis to evaluate a novel assessment tool for clinically oriented physiology and pharmacology.

    Science.gov (United States)

    Cramer, Nicholas; Asmar, Abdo; Gorman, Laurel; Gros, Bernard; Harris, David; Howard, Thomas; Hussain, Mujtaba; Salazar, Sergio; Kibble, Jonathan D

    2016-09-01

    Multiple-choice questions are a gold-standard tool in medical school for the assessment of knowledge and are the mainstay of licensing examinations. However, multiple-choice question items can be criticized for lacking the ability to test higher-order learning or integrative thinking across multiple disciplines. Our objective was to develop a novel assessment that would address understanding of pathophysiology and pharmacology, evaluate learning at the levels of application, evaluation, and synthesis, and allow students to demonstrate clinical reasoning. The rubric assesses student write-ups of clinical case problems. The method is based on the physician's traditional post-encounter Subjective, Objective, Assessment and Plan note. Students were required to correctly identify subjective and objective findings in authentic clinical case problems, to ascribe pathophysiological as well as pharmacological mechanisms to these findings, and to justify a list of differential diagnoses. A utility analysis was undertaken to evaluate the new assessment tool by appraising its reliability, validity, feasibility, cost effectiveness, acceptability, and educational impact using a mixed-method approach. The Subjective, Objective, Assessment and Plan assessment tool scored highly in terms of validity and educational impact and had acceptable levels of statistical reliability, but was limited in terms of acceptance, feasibility, and cost effectiveness due to high time demands on expert graders and workload concerns from students. We conclude by making suggestions for improving the tool and recommend deployment of the instrument for low-stakes summative assessment or formative assessment.

  9. Application of Cluster Analysis in Assessment of Dietary Habits of Secondary School Students

    Directory of Open Access Journals (Sweden)

    Zalewska Magdalena

    2014-12-01

    Full Text Available Maintenance of proper health and prevention of diseases of civilization are now significant public health problems. Nutrition is an important factor in the development of youth, as well as in the current and future state of health. The aim of the study was to show the benefits of applying cluster analysis to assess the dietary habits of high school students. The survey was carried out on 1,631 eighteen-year-old students in seven randomly selected secondary schools in Bialystok using a self-prepared anonymous questionnaire. For the surveyed students, the times of day at which meals were eaten and the number of meals consumed were evaluated. The cluster analysis allowed characteristic patterns of dietary habits in the observed population to be distinguished. Four clusters were identified, characterized by relative internal homogeneity and substantial variation in the number of meals during the day and the times of their consumption. The most important characteristics of cluster 1 were a daily food ration concentrated in 2 or 3 meals and long intervals between meals. Cluster 2 was characterized by eating the recommended number of 4 or 5 meals a day. In the 3rd cluster, students ate 3 meals a day with large intervals between them, and in the 4th they had four meals a day while maintaining proper intervals between them. Dietary mistakes occurred in all clusters, but most were related to clusters 1 and 3. Cluster analysis allowed for the identification of major flaws in nutrition, which may include irregular eating and skipping meals, and indicated possible connections between eating patterns and disturbances of body weight in the examined population.
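
    A minimal sketch of how such a cluster analysis might be run, assuming k-means on standardized per-student features; the feature set and values are invented, and the paper's actual clustering algorithm is not specified here:

        import numpy as np
        from sklearn.cluster import KMeans
        from sklearn.preprocessing import StandardScaler

        # Hypothetical per-student features: meals per day, longest gap
        # between meals (h), hour of first meal.
        X = np.array([[3, 7.0, 10], [5, 3.5, 7], [2, 9.0, 12],
                      [4, 4.0, 8], [5, 3.0, 7], [2, 10.0, 13]])

        km = KMeans(n_clusters=4, n_init=10, random_state=0)
        labels = km.fit_predict(StandardScaler().fit_transform(X))
        print(labels)                # cluster membership per student
        print(km.cluster_centers_)   # cluster profiles in standardized units

    Inspecting the cluster centers is what yields profiles like "few meals, long gaps" versus "4-5 regular meals" in the abstract.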

  10. Progress in pesticide and POPs hair analysis for the assessment of exposure.

    Science.gov (United States)

    Tsatsakis, Aristidis; Tutudaki, Maria

    2004-10-29

    The present paper reviews the work that has been done in the field of pesticide and persistent organic pollutant (POP) hair analysis during the last 15 years. It summarizes the compounds of interest, the methods of analyte extraction from the hair matrix, the analytical techniques employed, and the results obtained. The most widely studied POPs are the polychlorinated dibenzodioxins (PCDDs), the dibenzofurans (PCDFs), the co-planar biphenyls (co-PCBs), and total biphenyls (PCBs). The most widely studied pesticides are the organochlorines, like the hexachlorocyclohexanes and the DDTs, which nowadays are only found as environmental pollutants, some organophosphates, selected pyrethroids, and the carbamate methomyl. The most widely applied technique was gas chromatography (GC) coupled to mass spectrometry (MS). Other detectors, like the ECD in the case of organochlorine analysis and the NPD in the case of organophosphate analysis, were also used. The presented data concern human and animal studies. The levels detected in hair were 19–400 ng/g for DDTs, 0.27–0.45 ng/g for co-PCBs, 5–13 ng/g for total PCBs, 0.1–10 pg/g for PCDDs and PCDFs, 20–400 ng/g for lindane, 14–40 ng/g for HCHs, 110–520 ng/g for diazinon, and 900–1800 ng/g for methomyl. These results strongly support the possibility of using hair as a suitable indicator for the assessment of long-term exposure to POPs and pesticides.

  11. Assessing Progress towards Public Health, Human Rights, and International Development Goals Using Frontier Analysis.

    Science.gov (United States)

    Luh, Jeanne; Cronk, Ryan; Bartram, Jamie

    2016-01-01

    Indicators to measure progress towards achieving public health, human rights, and international development targets, such as 100% access to improved drinking water or a zero maternal mortality ratio, generally focus on status (i.e., level of attainment or coverage) or trends in status (i.e., rates of change). However, these indicators do not account for the different levels of development that countries experience, making it difficult to compare progress between countries. We describe a recently developed use of frontier analysis and apply this method to calculate country performance indices in three areas: maternal mortality ratio, poverty headcount ratio, and primary school completion rate. Frontier analysis is used to identify the maximum achievable rates of change, defined by the historically best-performing countries, as a function of coverage level. Performance indices are calculated by comparing a country's rate of change against the maximum achievable rate at the same coverage level. A country's performance can be positive or negative, corresponding to progression or regression, respectively. The calculated performance indices allow countries to be compared against each other regardless of whether they have only begun to make progress or whether they have almost achieved the target. This paper is the first to use frontier analysis to determine the maximum achievable rates as a function of coverage level and to calculate performance indices for public health, human rights, and international development indicators. The method can be applied to multiple fields and settings, for example health targets such as smoking cessation or specific vaccine immunizations, and offers both a new approach to analyzing existing data and a new data source for consideration when assessing progress achieved.
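
    A minimal sketch of the performance-index idea, assuming the frontier is taken as the maximum observed rate of change within coverage bins; the binning scheme and the data are illustrative, not the authors' exact estimator:

        import numpy as np

        def performance_index(coverage, rate, bins=10):
            """Score each country's rate of change against the frontier:
            the maximum observed rate at a similar coverage level."""
            coverage, rate = np.asarray(coverage), np.asarray(rate)
            idx = np.clip((coverage * bins).astype(int), 0, bins - 1)
            frontier = np.array([rate[idx == b].max() if np.any(idx == b)
                                 else np.nan for b in range(bins)])
            # 1.0 = best performer at that coverage; negative = regression
            return rate / frontier[idx]

        # Hypothetical primary-school completion (fraction) and annual
        # percentage-point change for six countries:
        cov = [0.52, 0.55, 0.81, 0.83, 0.95, 0.96]
        chg = [2.0, 0.8, 1.5, -0.3, 0.4, 0.1]
        print(performance_index(cov, chg).round(2))
        # -> [ 1.    0.4   1.   -0.2   1.    0.25]

    Normalizing by the frontier is what makes a country near the target, where large gains are no longer achievable, comparable with one that is just starting out.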

  12. Assessment of paclitaxel induced sensory polyneuropathy with "Catwalk" automated gait analysis in mice.

    Directory of Open Access Journals (Sweden)

    Petra Huehnchen

    Full Text Available Neuropathic pain as a symptom of sensory nerve damage is a frequent side effect of chemotherapy. The most common behavioral observation in animal models of chemotherapy induced polyneuropathy is the development of mechanical allodynia, which is quantified with von Frey filaments. The data from one study, however, cannot be easily compared with other studies owing to influences of environmental factors, inter-rater variability and differences in test paradigms. To overcome these limitations, automated quantitative gait analysis was proposed as an alternative, but its usefulness for assessing animals suffering from polyneuropathy has remained unclear. In the present study, we used a novel mouse model of paclitaxel induced polyneuropathy to compare results from electrophysiology and the von Frey method to gait alterations measured with the Catwalk test. To mimic recently improved clinical treatment strategies of gynecological malignancies, we established a mouse model of dose-dense paclitaxel therapy on the common C57Bl/6 background. In this model paclitaxel treated animals developed mechanical allodynia as well as reduced caudal sensory nerve action potential amplitudes indicative of a sensory polyneuropathy. Gait analysis with the Catwalk method detected distinct alterations of gait parameters in animals suffering from sensory neuropathy, revealing a minimized contact of the hind paws with the floor. Treatment of mechanical allodynia with gabapentin improved altered dynamic gait parameters. This study establishes a novel mouse model for investigating the side effects of dose-dense paclitaxel therapy and underlines the usefulness of automated gait analysis as an additional easy-to-use objective test for evaluating painful sensory polyneuropathy.

  13. Nondestructive Damage Assessment of Composite Structures Based on Wavelet Analysis of Modal Curvatures: State-of-the-Art Review and Description of Wavelet-Based Damage Assessment Benchmark

    Directory of Open Access Journals (Sweden)

    Andrzej Katunin

    2015-01-01

    Full Text Available The application of composite structures as elements of machines and vehicles working under various operational conditions causes degradation and the occurrence of damage. Considering that composites are often used for critical elements, for example, parts of aircraft and other vehicles, it is extremely important to maintain them properly and to detect, localize, and identify damage occurring during their operation at the earliest possible stage of its development. Of the great variety of nondestructive testing methods developed to date, the vibration-based methods appear to be among the least expensive and, with appropriate processing of the measurement data, simultaneously effective. Over the last decades, wavelet analysis has gained great popularity in vibration-based structural testing due to its high sensitivity to damage. This paper presents an overview of the results of the numerous researchers working in the area of vibration-based damage assessment supported by wavelet analysis, and a detailed description of the Wavelet-based Structural Damage Assessment (WavStructDamAs) Benchmark, which summarizes the author's 5-year research in this area. The benchmark covers example problems of damage identification in various composite structures with various damage types using numerous wavelet transforms and supporting tools. The benchmark is openly available and allows performing the analysis on the example problems as well as on one's own problems using the available analysis tools.
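
    A minimal sketch of wavelet-based damage localization on a modal curvature, using PyWavelets with a Mexican-hat continuous wavelet; the mode shape, the seeded damage signature, and the scale range are synthetic illustrations, not taken from the benchmark:

        import numpy as np
        import pywt

        # Hypothetical first mode shape of a beam, 200 samples, with a small
        # stiffness-loss signature seeded near x = 0.6 (synthetic data).
        x = np.linspace(0.0, 1.0, 200)
        mode = np.sin(np.pi * x)
        mode[115:125] += 5e-4 * np.hanning(10)

        curvature = np.gradient(np.gradient(mode, x), x)   # modal curvature
        coeffs, _ = pywt.cwt(curvature, scales=np.arange(1, 16), wavelet="mexh")
        energy = np.abs(coeffs).sum(axis=0)                # aggregate over scales

        print(x[energy.argmax()])   # expected near 0.6, the seeded damage

    The curvature (second spatial derivative) amplifies the local perturbation that damage leaves in an otherwise smooth mode shape, and the fine-scale wavelet coefficients localize it.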

  14. Analysis of Frequency of Use of Different Scar Assessment Scales Based on the Scar Condition and Treatment Method

    OpenAIRE

    Bae, Seong Hwan; Bae, Yong Chan

    2014-01-01

    Analysis of scars in various conditions is essential, but no consensus has been reached on which scar assessment scale to select for a given condition. We reviewed papers to determine how the scar assessment scale selected depends on the scar condition and treatment method. We searched PubMed for articles published since 2000 that evaluated scars using a scar assessment scale, restricted to journals with a Journal Citation Reports impact factor >0.5. Among them, 96 articles that conducted a scar evalu...

  15. Simulation for Prediction of Entry Article Demise (SPEAD): An Analysis Tool for Spacecraft Safety Analysis and Ascent/Reentry Risk Assessment

    Science.gov (United States)

    Ling, Lisa

    2014-01-01

    For the purpose of performing safety analysis and risk assessment for a potential off-nominal atmospheric reentry resulting in vehicle breakup, a synthesis of trajectory propagation coupled with thermal analysis and the evaluation of node failure is required to predict the sequence of events, the timeline, and the progressive demise of spacecraft components. To provide this capability, the Simulation for Prediction of Entry Article Demise (SPEAD) analysis tool was developed. The software and methodology have been validated against actual flights, telemetry data, and validated software, and safety/risk analyses were performed for various programs using SPEAD. This report discusses the capabilities, modeling, validation, and application of the SPEAD analysis tool.

  16. Analysis of third-party certification approaches using an occupational health and safety conformity-assessment model.

    Science.gov (United States)

    Redinger, C F; Levine, S P

    1998-11-01

    The occupational health and safety conformity-assessment model presented in this article was developed (1) to analyze 22 public and private programs to determine the extent to which these programs use third parties in conformity-assessment determinations, and (2) to establish a framework to guide future policy developments related to the use of third parties in occupational health and safety conformity-assessment activities. The units of analysis for this study included select Occupational Safety and Health Administration programs and standards, International Organization for Standardization-based standards and guidelines, and standards and guidelines developed by nongovernmental bodies. The model is based on a 15-cell matrix that categorizes first-, second-, and third-party activities in terms of assessment, accreditation, and accreditation-recognition activities. The third-party component of the model has three categories: industrial hygiene/safety testing and sampling; product, equipment, and laboratory certification; and, occupational health and safety management system registration/certification. Using the model, 16 of the 22 programs were found to have a third-party component in their conformity-assessment structure. The analysis revealed that (1) the model provides a useful means to describe and analyze various third-party approaches, (2) the model needs modification to capture aspects of traditional governmental conformity-assessment/enforcement activities, and (3) several existing third-party conformity-assessment systems offer robust models that can guide future third-party policy formulation and implementation activities.

  17. Assessment of stress reactions of recruits based on quantitative analysis of the characteristics of fingertip photoplethysmographic

    Directory of Open Access Journals (Sweden)

    Li-jun XIAO

    2011-09-01

    Full Text Available Objective To investigate the feasibility of stress assessment based on specific quantification techniques and analysis of stress status with photoplethysmographic (PPG) signals. Methods PPG signals were stratified and randomly sampled from 58 recruits before and after stress induced by International Affective Picture System (IAPS) images of negative emotion. The signals were collected and processed in a tool called HC2180-D, an enhanced solution of a blood flow monitoring system, through which the characteristic parameters of the pulsatile waveform were derived as components of bioinformation for quantitative and comparative study. A concise mental rating scale, the "brief profile of mood state" (BPOMS), was used as the criterion. Results Significant increases in values were observed in both the vasoconstriction/vasodilation fraction (CDF) and the ordinate of the area under the pulse contour of the vasoconstriction phase (Y2) when the recruits were shown IAPS images of negative emotion (P < 0.05), while no significant changes in the remaining parameters (P > 0.05) were shown. The values of both CDF and Y2 were positively correlated in the mid-range with tension, depression, and anger (P < 0.05), and in the low range with fatigue and confusion (P < 0.05). The values were negatively correlated in the mid-range with vigor (P < 0.05). Conclusion The characteristic parameter and biomarker derived from the pulsatile waveform obtained at the fingertip are highly sensitive and can be utilized as measures in the quantitative assessment of the stress response among recruits.

  18. Using Probabilistic Seismic Hazard Analysis in Assessing Seismic Risk for Taipei City and New Taipei City

    Science.gov (United States)

    Hsu, Ming-Kai; Wang, Yu-Ju; Cheng, Chin-Tung; Ma, Kuo-Fong; Ke, Siao-Syun

    2016-04-01

    In this study, we evaluate the seismic hazard and risk for Taipei City and New Taipei City, which are important municipalities and the most populous cities in Taiwan. The evaluation of seismic risk involves the combination of three main components: a probabilistic seismic hazard model, an exposure model defining the spatial distribution of elements exposed to the hazard, and vulnerability functions capable of describing the distribution of percentage of loss for a set of intensity measure levels. Seismic hazard for Taipei City and New Taipei City is presented as hazard maps in terms of ground motion values expected to be exceeded at a 10% probability level in 50 years (return period 475 years) and a 2% probability level in 50 years (return period 2,475 years), according to the Taiwan Earthquake Model (TEM), which provides two seismic hazard models for Taiwan. The first model adopted the source parameters of 38 seismogenic structures identified by the TEM geologists. The other model considered 33 active faults and was published by the Central Geological Survey (CGS), Taiwan, in 2010. Grid-based building data at 500 m by 500 m resolution were selected for the evaluation, as they provide detailed information about the location, value, and vulnerability classification of the exposed elements. The results from this study were evaluated with the OpenQuake engine, the open-source software for seismic risk and hazard assessment developed within the Global Earthquake Model (GEM) initiative. Our intention is to make a first attempt at modeling seismic risk from hazard on an open platform for Taiwan. An analysis through disaggregation of hazard components will also be made to prioritize the risk for further policy making.
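
    The quoted probability levels and return periods are related through the usual Poisson-occurrence assumption, P = 1 - exp(-t/T). A quick check of the figures above:

        import math

        def return_period(p_exceed, t_years=50.0):
            """Return period T from exceedance probability P over t years,
            assuming Poissonian occurrence: P = 1 - exp(-t/T)."""
            return -t_years / math.log(1.0 - p_exceed)

        print(round(return_period(0.10)))  # -> 475 years
        print(round(return_period(0.02)))  # -> 2475 years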

  19. Uncertainty treatment and sensitivity analysis of the European Probabilistic Seismic Hazard Assessment

    Science.gov (United States)

    Woessner, J.; Danciu, L.; Giardini, D.

    2013-12-01

    Probabilistic seismic hazard assessment (PSHA) aims to characterize the best available knowledge on the seismic hazard of a study area, ideally taking into account all sources of uncertainty. The EC-FP7-funded project Seismic Hazard Harmonization in Europe (SHARE) generated a time-independent community-based hazard model for the European region for ground motion parameters spanning spectral ordinates from PGA to 10 s and annual exceedance probabilities from one-in-ten to one-in-ten-thousand years. The results will serve as a reference to define engineering applications within Eurocode 8 and provide homogeneous input for state-of-the-art seismic safety assessment of critical infrastructure. The SHARE model accounts for uncertainties, whether aleatory or epistemic, via a logic tree. Epistemic uncertainties within the seismic source model are represented by three source models, including a traditional area source model, a model that characterizes fault sources, and an approach that uses kernel smoothing for seismicity and fault source moment release. Activity rates and maximum magnitudes in the source models are treated as aleatory uncertainties. For practical implementation and computational purposes, some of the epistemic uncertainties in the source model (i.e., dip and strike angles) are treated as aleatory, and a mean seismicity model is considered. Epistemic uncertainties for ground motions are considered via multiple Ground Motion Prediction Equations as a function of tectonic setting and are treated as being correlated. The final results contain the full distribution of ground motion variability. We show how we used the logic-tree approach to consider the alternative models and how, based on the degree of belief in the models, we defined the weights of the single branches. This contribution features results and sensitivity analysis of the entire European hazard model and selected sites.

  20. Applications of life cycle assessment and cost analysis in health care waste management.

    Science.gov (United States)

    Soares, Sebastião Roberto; Finotti, Alexandra Rodrigues; da Silva, Vamilson Prudêncio; Alvarenga, Rodrigo A F

    2013-01-01

    The establishment of rules to manage Health Care Waste (HCW) is a challenge for the public sector. Regulatory agencies must ensure the safety of waste management alternatives for two very different profiles of generators: (1) hospitals, which concentrate the production of HCW, and (2) small establishments, such as clinics, pharmacies and other sources, that generate dispersed quantities of HCW and are scattered throughout the city. To assist in developing sector regulations for the small generators, we evaluated three management scenarios using decision-making tools. They consisted of a disinfection technique (microwave, autoclave, or lime) followed by landfilling, with transportation also included. The microwave, autoclave, and lime techniques were tested in the laboratory to establish the operating parameters that ensure their efficiency in disinfection. Using a life cycle assessment (LCA) and cost analysis, the decision-making tools aimed to determine the technique with the best environmental performance, that is, to evaluate the eco-efficiency of each scenario. Based on the life cycle assessment, microwaving had the lowest environmental impact (12.64 Pt), followed by autoclaving (48.46 Pt). The cost analyses indicated values of US$0.12/kg for waste treated with microwaves, US$1.10/kg for waste treated by autoclave, and US$1.53/kg for waste treated with lime. Microwave disinfection presented the best eco-efficiency performance among those studied and provides a feasible alternative to subsidize the formulation of the policy for small generators of HCW.